It’s like reading my mind.
It’s magic.
We’re innovating
at a scale like never before.
It takes what you prompt it with
and just runs with it.
Everything is linked.
Everything is easy to use.
We’re accelerating how fast we can deliver
solutions to our fans, to our business.
It takes the hands-on time
that the provider has to spend
with the computer
and puts that time back with the patient.
It allows for the patient and doctor
to be a patient and a doctor
like it used to be.
It’s fundamentally
about amplifying what every person is able to do.
This opens up the opportunity
for that farmer,
for that girl in the village
who’s trying to go to college
to find out what more
they can do with their life.
Good morning.
Good morning.
It’s fantastic to see you all.
Welcome to Build, all of those
who made it here.
You know, it’s
fantastic to be back together
physically and everyone joining online.
You know, these developer conferences are special
times, special places to be,
especially when platform shifts are in the air.
I distinctly remember my first PDC in 1991,
driving up 101 into Moscone Center,
and my life changed after that
developer conference.
So it’s exciting to be able
to come back to Build 2023 with that same sense
of anticipation of something big that is shifting
around us as developers.
To put this in perspective:
last summer I was reading
Mitchell Waldrop’s The Dream Machine
while I was playing with DV3 -
Da Vinci 3, as GPT-4 was called then -
and it brought into perspective
what this is all about.
I think that that concept of “Dream Machine”
perhaps best communicates
what we have really been doing
over the last 70 years, right?
All the way from
what Vannevar Bush wrote in his
seminal paper,
As We May Think,
with concepts
like associative memory,
to Licklider, who was the first
to conceptualize
human-computer symbiosis,
to the Mother of All Demos
in ’68, to the Xerox Alto,
and then, of course, the PDC that I attended,
the PC-server one in ’91,
’93, when we had the Mosaic moment,
then the iPhone and the cloud -
all of it one continuous journey.
And then, in fact, the other thing I always loved
is Jobs’s
description of computers
as bicycles for the mind.
It’s sort of a beautiful metaphor.
I think it captures
the essence of what computing is.
But then last November,
we got an upgrade, right?
We went from the bicycle to the steam engine.
With the launch of ChatGPT,
it was like the mosaic moment for this generation
of the AI platform.
And now we look forward to, as developers,
what we can do going forward.
And so it’s an exciting time.
And in fact, every layer of the software stack
is going to be changed forever.
And no better place to start
than the actual developer stack, right?
How we, as developers, build
is fundamentally changing.
In fact, when I think about how we build,
I think first about Codespaces, right?
Being able to set up that environment
in seconds versus minutes.
Or Dev Box:
instead of waiting a day
for your managed
dev box to be set up,
you have it in less than an hour.
If you think about Copilot, what it does for you,
and Copilot X,
in terms of driving the overall flow
and productivity - which is up, what, 54% or so.
And then of course GitHub Actions
and Azure Deployment Environments,
really making it possible for you to stay in that flow.
In fact, one of the things I’ve kept
bugging Scott Guthrie about for years is,
“Hey, I want to stay in VS Code,
I want to stay in the command line,
and let me do everything there.”
We are close, close to that dream.
And that to me is bringing back
both the joy of programming
and the flow of programming.
That ability to stay on
task - it’s just so wonderful to see.
So how we build software is radically different,
but what we are going to build as developers
is really the story of this developer conference.
And what we build -
you know,
it’s not like I came in on January 1st and said,
“Let’s start doing press releases,”
but it does feel like that.
It does feel like every week
there is something new, and
we’re infusing
this new AI stack across every layer, right?
So we started with tooling in GitHub -
or rather, Copilot in GitHub.
We did Copilot in Power Platform.
And when it comes to productivity,
Copilot in Microsoft 365,
Copilot in Viva.
With business process, it’s
Copilot in Dynamics 365.
And when it comes to industry workflows,
there’s what Nuance has done with DAX,
or the Security Copilot.
Or the copilot,
of course, for the web in Bing and Edge,
the AI-driven features in LinkedIn,
and of course the AI infrastructure
with Azure OpenAI APIs
and everything else around it, right?
So every layer of the stack
is profoundly changing.
And we today are going to -
as part of this developer conference,
we’re going to have 50+ more announcements,
but I want to highlight five of them.
The first is that we are bringing
Bing search grounding to ChatGPT.
We are very excited about this.
Yeah, you can clap!
Look, ChatGPT is the fastest-growing
consumer app we’ve ever seen,
and search grounding is a key feature, right?
So that all the information is current
and grounded in what you have
from the crawl and the index.
And so we’re excited that
it’s going to launch in ChatGPT Plus
immediately, and it’s quickly coming
even to the free tier.
And this is just the start of what we plan to do
with our partners in OpenAI
to bring the best of Bing to the ChatGPT experience.
Next, we are bringing
the copilot to the biggest canvas of all:
Windows.
You’re going to hear a lot from Panos
tomorrow about it, but I think that this is going
to make every user a power user of Windows.
Let’s roll the video.
It’s so cool.
So we’re going to talk a lot more about Windows,
you know, tomorrow when Panos is up here.
The other thing that we’re also very excited to
launch is the Copilot stack, right?
After all, we’ve built all these copilots
with one common architectural stack.
We want to make that available
so that everyone here
can build their own copilot
for their applications.
We will have everything from the AI
infrastructure to the foundation models to the
AI orchestration all the way up to your copilot
and its extensibility.
In fact, the other thing that we’re going to do
is have common
extensibility across
all of these surfaces, right?
Whether it is ChatGPT,
Bing Chat, Microsoft 365 Copilot,
or all of the Microsoft copilots,
and of course, your own copilots,
we can share the same extensibility model.
This is one of the most powerful things
for developers -
for every developer to be able to write a plugin
and have it reach billions of users
across all of these surface areas.
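As a rough illustration of that shared extensibility model - a sketch loosely following the publicly documented ai-plugin.json manifest format that ChatGPT-style plugins use, with hypothetical names and URLs - a plugin describes itself with a small manifest that points to an OpenAPI description of its REST API:

```python
import json

# Hypothetical manifest for a "Contoso Homes" plugin, loosely following the
# ai-plugin.json format used by ChatGPT-style plugins. Field names here are
# illustrative; consult the official plugin documentation for the real schema.
plugin_manifest = {
    "schema_version": "v1",
    "name_for_human": "Contoso Homes",
    "name_for_model": "contoso_homes",
    "description_for_human": "Search home listings by city, price, and features.",
    "description_for_model": (
        "Use this plugin when the user asks about home listings. "
        "Call the search endpoint with city, max_price, and bedrooms."
    ),
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        # The OpenAPI spec tells the copilot which operations exist
        # and what parameters they take.
        "url": "https://plugins.contoso.example/openapi.yaml",
    },
    "logo_url": "https://plugins.contoso.example/logo.png",
    "contact_email": "dev@contoso.example",
    "legal_info_url": "https://contoso.example/legal",
}

# In principle, this one manifest plus its OpenAPI description is what lets a
# single plugin surface in ChatGPT, Bing Chat, and the Microsoft copilots alike.
print(json.dumps(plugin_manifest, indent=2))
```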
So to show you everything
in action,
from the plugin extensibility
to all of the copilots we announced,
let me invite up on stage Yusuf Mehdi
to show you all of this.
Yusuf, let me throw it over to you.
Hi, everybody.
We’re making fast progress on
delivering our vision of your copilot
for the web and for business.
And today, as Satya said,
we’re excited to announce
that we’re going to bring
ChatGPT and Bing together
as the default search experience
to give you higher quality answers
and more timely answers.
Let’s take a look.
Here I am in ChatGPT.
And as you can see now,
Bing is the default
and when I come in and select it,
I can now ask sort of real-time queries.
For example,
let’s ask what I should expect to hear
about Build and .NET.
And what you can see is the results
now are more up to date.
They include fresh content
and they include citations.
In fact, if you can see the links
on that page there,
you can click those
and those will take you straight
to a web page that’s sourced by Bing.
We’re also excited -
yeah, absolutely, you can clap!
We’re also excited to announce
that we’re going to bring
interoperability between ChatGPT and Bing
for plugins.
So you write them once
and they’re going to run everywhere.
So as you can see here in ChatGPT,
I’ve got Zillow and Instacart enabled,
but I want to show them to you
here in Bing Chat.
So we’ll flip over
and you can see again,
I’ve got the same plugins now
in both Bing Chat and in ChatGPT.
And what we’re going to show you now is
I’ll do a search here for
houses in Chicago.
And I can ask for a set of criteria,
I’ll learn a little bit about
the neighborhoods.
And now I can automatically call Zillow
by saying,
hey, give me three houses
in a certain price range
that meet my criteria.
And what you can see
is now I get these great options
and I’m also going to get
all of the other great things
you get with Bing, like
helpful city guides and maps and prompts.
I’m going to tell you now how
we’re going to further add value
to the plugins that you write.
They’re going to work
not just in Bing Chat and ChatGPT,
they’re going to work
across the entire web,
courtesy of the Edge browser.
So here’s an example.
I’m on a web page
here checking out a recipe for a cake,
and now I can call Bing Chat
and ask it to tell me,
hey, give me the ingredients
from this web page.
And notice Bing can read
the context of the Web page,
understand those ingredients,
put them into chat,
and then I can say,
hey, give me a shopping list for this,
and it’ll automatically call
the Instacart plugin,
take those ingredients
right off the page,
and put them into an Instacart shopping cart.
And with one click,
I can get those now
delivered to my house.
This is an incredible
productivity benefit for people.
Let’s show you how you’ll be
more productive at work.
Here I’m going to use
Microsoft 365 Copilot.
Now I’m in Microsoft Word
and I’m going to need some help
drafting a legal contract.
I’ve got a legal contract here
and I need some help with California law.
So I’m going to call three plugins
from Thomson Reuters
to edit this document.
First thing is, I’ll go into Copilot
and I’ll pull it up and I’ll say,
hey, help me understand
how to edit the limitation of liability
using the practical law plugin.
It’ll read the document,
find the paragraph, and make that change.
Next, I want to know
if this is enforceable
under California law,
so I’ll call the Westlaw plugin,
which will do that analysis
and come back
and give me an analysis
from a legal perspective.
And finally,
since we’re making lots of changes,
I’d like to know
the summary of all of these changes.
And with Document Intelligence,
I get a simple table
that shows you all of those changes
in an easy-to-read format.
By joining the power of
Microsoft 365 Copilot in Word
with the support
of really powerful plugins
like Thomson Reuters’,
you can now draft a legal contract
in a much more powerful way.
Let me show you one more.
Here I am in Teams chat
and I’m engaging with
Microsoft 365 Copilot
to track website changes.
Copilot will just call the Atlassian plugin
to help.
Atlassian Jira specializes in project
and issue tracking.
So it’ll pull the Jira ticket
automatically with the plugin.
And now all I have to do is
assign an owner
using Azure Active Directory,
and that’s it.
It’s all done.
So with plugins
and Microsoft 365 Copilot,
you can intelligently reason across
all of your business apps
and the data stored
in the Microsoft Graph
to keep you in your flow.
Finally, as Satya shared,
we’re excited to announce the
Windows Copilot.
I think it’s going to change
how you use your PC forever.
Let me show it to you.
Here I am, in a coding project on my PC,
but I want to configure my PC
to help me be more creative
and more productive.
All I have to do now
is invoke the Windows Copilot.
I now just come down here to the taskbar.
I click on that
and now we’ll pop up the
Windows Copilot on the right.
This side pane here will be consistent
across every app that you use on your PC.
And just like with Bing Chat,
I can now ask it questions like,
how can I adjust my system
to get work done?
And not only will I get
a bunch of great suggestions,
but watch this.
I can now, with one click,
take action on those suggestions.
For example, I can put my PC into focus mode.
I also know that as developers,
we like dark mode -
there’s a suggestion here for dark mode -
so with one click,
I’m now here on the dark side
and to really get going,
I want to get that coding playlist going.
So I’ll pull the plugin from Spotify
and say, give me a great coding playlist.
In this case, Chill Vibes will come up,
and now I’ll have it ready to go.
And finally,
there’s a suggestion here
that says, hey, to organize your PC,
let’s take advantage of snap.
So with one click,
it snaps all the windows
right into the place I need them,
so I can be super productive.
What do you think?
So as you can see,
we have an incredible array
of AI-powered copilots.
We’ve got over 50 plugins
already available
for customers and thousands more coming.
I can’t wait to see
what you’re all going to build.
Thank you very much.
Thanks, Yusuf.
Thank you, Yusuf.
We have, as Yusuf said,
fantastic momentum already building.
And this is about really creating
that opportunity for developers
to reach all users across
all of these surface areas
and we are so excited
to see how you go about exploiting
that opportunity in the weeks and months to come.
Of course, when we talk about the
AI platform and the Copilot stack,
the next thing for us,
which is really exciting, is AI Studio.
This is the full lifecycle toolchain
for you to be able
to build your intelligent AI apps
and your copilots.
Everything from being able
to train your own models,
to being able to ground them - whether it’s OpenAI
or any open source model - with data
that you bring.
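To make that grounding pattern concrete, here is a minimal retrieval-augmented generation sketch - assuming the 2023-era (pre-1.0) openai Python SDK pointed at an Azure OpenAI deployment, with a hypothetical retrieve_documents() helper standing in for a vector-index lookup; it illustrates the general pattern, not the AI Studio implementation itself.

```python
import os
import openai

# Point the pre-1.0 openai SDK at an Azure OpenAI resource (values are placeholders).
openai.api_type = "azure"
openai.api_base = os.environ["AZURE_OPENAI_ENDPOINT"]
openai.api_version = "2023-05-15"
openai.api_key = os.environ["AZURE_OPENAI_KEY"]


def retrieve_documents(query: str, top: int = 3) -> list[str]:
    """Hypothetical stand-in for a vector-index lookup (e.g. Azure Cognitive
    Search); a real app would return the most relevant chunks of your own
    data for the query."""
    return ["<doc 1 text>", "<doc 2 text>", "<doc 3 text>"][:top]


def grounded_answer(question: str, deployment: str = "gpt-35-turbo") -> str:
    # Retrieval-augmented generation: put retrieved context into the prompt
    # so the model answers from your data rather than from memory alone.
    context = "\n\n".join(retrieve_documents(question))
    messages = [
        {"role": "system",
         "content": "Answer only from the provided context. "
                    "If the answer is not in the context, say you don't know.\n\n"
                    f"Context:\n{context}"},
        {"role": "user", "content": question},
    ]
    response = openai.ChatCompletion.create(engine=deployment, messages=messages)
    return response["choices"][0]["message"]["content"]


print(grounded_answer("What is our vacation carryover policy?"))
```

The point of AI Studio is that this retrieval, prompt construction, and orchestration come built in rather than hand-rolled.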
Built-in vector indexing in Azure Search,
built-in support
for RAG,
or retrieval-augmented generation,
built-in support for
prompt engineering
with prompt flow and orchestration,
and of course, built-in support
for perhaps the most important feature,
which is AI safety.
One of the things we’ve been hard
at work on is building AI safety into the toolchain.
We’ve been at work on AI safety
for the last five years.
We have principles,
which we have translated
into our core set of processes that we implement
across our engineering stack.
And then, of course, we have
all of the compliance and oversight.
But the real challenge
is not just to have these things
outside the engineering process,
but to build it into the everyday toolchain.
And that’s what we are doing with AI Studio.
And it starts with testing.
There is the Responsible AI dashboard
that helps you during the testing phase to ensure
that what you’re developing is safe.
We have grounding.
In fact,
the prompt flow is perhaps
one of the best features
for you to be able to ground your models.
You have media provenance
support for images and videos,
and watermarking for your neural voice,
all of which is going to be
available to all of you
as you build your applications.
And deployment time
is perhaps one of the most critical
points. We have taken
all of the safety work
we did, for example, for the launch of Bing Chat
and made it available
as just a set of features
for any developer to use, right?
You can take an OSS model
and use the Azure AI Content Safety service
to make it safe at deployment time.
And of course,
then you can even monitor the model
for model drift.
And that way you can make sure
it’s not just a one-time check;
you are continuously looking to make sure
that you have a safe deployment.
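As a rough sketch of what such a deployment-time check can look like - assuming the azure-ai-contentsafety Python SDK, with placeholder endpoint and key, and noting that the exact result fields vary by SDK version - you can screen model output before it reaches users:

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions

# Placeholders for a real Azure AI Content Safety resource endpoint and key.
client = ContentSafetyClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com",
    credential=AzureKeyCredential("<key>"),
)


def is_safe(model_output: str, max_severity: int = 2) -> bool:
    """Screen a model's raw output before returning it to the user.

    NOTE: the result shape below follows the GA azure-ai-contentsafety SDK
    (a categories_analysis list); earlier previews expose per-category
    attributes instead, so adjust to the SDK version you install.
    """
    result = client.analyze_text(AnalyzeTextOptions(text=model_output))
    return all(item.severity <= max_severity
               for item in result.categories_analysis
               if item.severity is not None)


raw = "<whatever the OSS model generated>"
print(raw if is_safe(raw) else "Sorry, I can't share that response.")
```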
So we’re very, very excited about AI Studio
helping every developer out here
to be able to build AI applications,
but build them with safety first.
Let’s roll the video.
Introducing Azure AI Studio, a full life cycle tool
to build, customize, train, evaluate,
and deploy
the latest next-generation models responsibly.
With just a few clicks,
developers can ground AI models
with their structured
and unstructured data
to quickly and easily build customized, cutting edge,
conversational experiences for their customers.
Developers can take advantage
of a new model catalog that works
with the popular models organizations use,
including those from Azure OpenAI Service,
Hugging Face, and many other open source models.
With prompt flow,
developers can combine
relevant data from their organization
and create a detailed prompt to get better results.
Prompt flow works
with foundation models,
internally developed models, or open source models,
and works with popular open source tools
like LangChain and Semantic Kernel.
And because the AI systems
we build are designed to support our AI principles,
with Azure
AI Content Safety, we are making it easier for you
to test and evaluate your AI deployments for safety.
This is Azure AI Studio,
the trusted tools
you need to build the next generation
of AI applications.
Cool.
And of course,
all AI applications start with data.
And we are really thrilled
to be announcing Microsoft Fabric.
This is a product that we’ve been
working very, very hard on over multiple years,
and it’s finally coming together.
It’s perhaps the biggest launch of a data product
from Microsoft since the launch of SQL Server.
You know, it
really brings together compute and storage -
it unifies compute and storage.
It unifies
the product experiences
across the full analytics stack.
It brings together governance,
so it unifies governance with analytics.
And, most interestingly, it unifies
the business model:
across all the different types
of analytics workloads,
whether they’re SQL, machine learning,
whatever job you want,
you can use the same compute
infrastructure.
And this unification at the end of the day
is what I think will fuel
the next generation of AI applications.
Let’s roll the video.
Introducing Microsoft Fabric,
a unified data analytics platform,
one product,
one experience,
one architecture,
one business model.
Unified data is stored in OneLake,
a SaaS data lake for the entire organization.
Data is integrated and stored in an open format,
allowing one copy to be used to train
machine learning models,
visualize data,
and run SQL queries on the lake
and data warehouse.
A unified experience brings together all the tools
data professionals need -
pipelines for orchestrating data movement,
experiments for training machine learning models,
semantic models for defining key metrics,
and much more.
And for business users, Fabric brings together
data for collaborating
and doing ad hoc analysis in Microsoft 365.
Unified governance,
security,
and compliance is built in for all your data.
And with Copilot for Microsoft Fabric,
AI helps everyone be more productive,
whether it’s writing SQL statements,
building reports,
or setting up automations based on triggers.
All your data,
all your teams,
all in one place.
This is Microsoft Fabric.
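To give a feel for that "one copy of data" idea, here is a minimal sketch of working against a lakehouse table from a Fabric Spark notebook; the table and column names are hypothetical, and the exact surface depends on your workspace.

```python
# Inside a Microsoft Fabric notebook, a SparkSession named `spark` is already
# provided and the default lakehouse's Delta tables are addressable by name.
# Table and column names below are hypothetical.

# One copy of the data in OneLake, read as a Spark DataFrame...
orders = spark.read.table("sales_orders")

# ...used for ad hoc SQL on the same lake...
top_products = spark.sql("""
    SELECT product_id, SUM(quantity) AS units_sold
    FROM sales_orders
    GROUP BY product_id
    ORDER BY units_sold DESC
    LIMIT 10
""")
top_products.show()

# ...and handed to a machine learning library without copying it anywhere else.
training_rows = orders.select("product_id", "quantity", "unit_price").toPandas()
```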
You know, what the AI
supercomputer did for the infrastructure layer,
Microsoft Fabric will do for the data
layer for this next generation
of AI applications.
Very, very exciting.
These are just five of the 50-plus,
so we have 45 more to discover
throughout the conference.
We’re, again,
moving at a really fast pace to build things
that help us
build this next generation of applications,
but build them with safety first.
Now,
one of the things
that I think we should ask ourselves
as developers is why do we build?
Why do we build technology?
You know, this relationship
between economic progress,
economic growth, and technology
has been there for a long time.
In fact, when you see this graph,
it’s pretty
stunning that for most of human history
we didn’t have much economic growth,
nor did we have much technology.
And then something happened
250 years ago, right?
It was long in the making,
by the way - from perhaps the Enlightenment
to the Scientific Revolution to the
Industrial Revolution was close to 400 years -
but then there was real progress.
You see that slope going upwards.
And then, of course, over the last
70 years, information
technology has played a role across
all of those sort of seminal moments
on the march towards that dream machine.
And of course, we now enter the age of AI
and we get to define
what that slope looks like
going forward for economic growth.
But it’s not even just economic growth
on its own, right?
We don’t build
just because we want economic growth.
We want economic growth
so that we can have human
development index growth.
We want the lifespans to go up.
We want education and prosperity
and standard of living to go up everywhere.
That’s why we build, that’s why we innovate.
That’s why technology exists.
It’s not for technology’s sake,
but it is for that broad impact.
And to me,
this all came together -
you know,
in January when I was visiting India,
I had a chance to see this demo,
and it had a very profound impact on me,
because at some level
it motivated me to go into this next wave
with that much more rigor, to ensure
that this time around, this technology
reaches everybody in the world.
There are two things that stood out for me,
right? That the things we build
can in fact make a difference
to 8 billion people,
not some small group of people.
And that we can do that
with diffusion
that takes days and weeks,
not years and centuries,
because we want that equitable growth.
We want trust in technology.
We want to ensure that we protect
those fundamental rights that we care about,
and that we do this
in such a way that we manage
our energy transition,
given the finite resources we have on our planet.
That’s, at the end of the day,
what grounds us in our mission
to empower every person
and every organization
on the planet to achieve more.
And so I want to leave you with this video,
because what you as developers are going to do
in the days and weeks
and months and years
to come
is going to have the most profound impact
of any technology
on 8 billion people all around the planet.
Thank you very much and have a fantastic Build.
I’m a farmer.
In our village, we have two acres of land where we grow crops.
I grow rabi and kharif crops, which include wheat,
jowar, millet, sesame, and mustard.
But here in Mewat, many villages don’t have access
to newspapers, so our access to information is limited.
There are many government schemes available to us,
but we often don’t know about them.
These are all communities living in media-dark areas,
which means that penetration of traditional media
like television and radio is very sparse.
There are also literacy challenges,
so things like newspapers have very low reach in these areas.
In India, we have 22 constitutionally-recognized languages.
But apart from that, we have more than 100,
almost 120, languages.
Being able to provide justice
to each and every citizen in their own language,
it’s a challenge, but it’s also an opportunity.
Jugalbandi is a chatbot that we built to solve two problems:
the linguistic divide and the technological divide.
We collected millions of parallel sentences
for machine translation.
We collected thousands of hours of data
for speech recognition,
and then we built state-of-the-art models on top of it,
using Bhashini speech recognition models and Azure OpenAI.
So these capabilities allow us to understand language,
transcribe it, translate it,
and also synthesize it back in the form of speech.
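As a rough sketch of how those capabilities can compose - with hypothetical helper functions standing in for the Bhashini speech, translation, and TTS models and for Azure OpenAI, since the actual Jugalbandi code isn't shown here - the flow is: local-language speech in, local-language answer out.

```python
# Hypothetical pipeline sketch for a Jugalbandi-style assistant. Each helper
# below stands in for a real service (Bhashini ASR/translation/TTS, Azure
# OpenAI); none of these function names come from the actual project.

def transcribe(audio_bytes: bytes, language: str) -> str:
    """Speech recognition: local-language audio -> local-language text."""
    raise NotImplementedError("call a Bhashini ASR model here")

def translate(text: str, source: str, target: str) -> str:
    """Machine translation between the local language and English."""
    raise NotImplementedError("call a Bhashini translation model here")

def answer_about_schemes(question_en: str) -> str:
    """Ground an LLM (e.g. via Azure OpenAI) on government-scheme documents."""
    raise NotImplementedError("retrieval + chat completion here")

def synthesize(text: str, language: str) -> bytes:
    """Text-to-speech back into the user's language."""
    raise NotImplementedError("call a TTS model here")

def handle_voice_query(audio_bytes: bytes, language: str = "hi") -> bytes:
    question_local = transcribe(audio_bytes, language)
    question_en = translate(question_local, source=language, target="en")
    answer_en = answer_about_schemes(question_en)
    answer_local = translate(answer_en, source="en", target=language)
    return synthesize(answer_local, language)
```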
Uncle, this is Jugalbandi.
We can write or ask about government schemes
that we want to know about
and it will give us information about it.
I am a farmer and I want to know about
the government scheme for farming.
My name is Jugalbandi.
I am an assistant that will help you to know about
your government scheme and apply for it.
The answer came in both text and audio,
along with which documents and forms are required.
The Ministry of Electronics and IT in the country
has been working on this for the last five years now,
and when ChatGPT was launched,
that was really the catalyst for us
being able to bring these pieces together.
We’ve been able to categorize the diversity of India
as an innovation opportunity.
And if you’re able to do that successfully,
you’re able to do it for the world.