ANNOUNCER: Please take your seats. The show will begin in five minutes.
[Music playing]. SPEAKER: Do you know where we’re
going? SPEAKER: We have this idea that
in the future, you can get help wherever you are, for whatever
you need. It’s almost like it’s in the air.
SPEAKER: Hey, what’s up? SPEAKER: How’s it going?
SPEAKER: Have you seen any of this stuff?
SPEAKER: Never been here. SPEAKER: Come on in.
SPEAKER: Thank you. SPEAKER: I’m not necessarily
designing this for myself. I’m designing it for people out
there who really could use an assistant in their home.
SPEAKER: There’s a lot of sensors and processors and
machine learning, things that are uniquely Google.
SPEAKER: When you combine the ultimate piece of technology and
something so human, that’s where magic happens.
SPEAKER: This vision to me, it’s really total because we can
create a new generation of products that are truly helpful.
SPEAKER: It helps you from the background, not the foreground.
And your foreground is your life.
SPEAKER: Big picture. What’s the end game?
SPEAKER: It’s about making it easier every day.
SPEAKER: Making what easier? SPEAKER: Life. Making life easier.
SPEAKER: Yes. SPEAKER: Let’s take a look.
SPEAKER: Here we go. [Cheers and Applause]. RICK: Good morning! Good
morning! Thanks so much for joining us here in New York
City, and for those on the livestream, for joining us
around the world. Thanks so much. We’re going to spend the
next hour talking about the problems we’re working to solve
for our users, and the ways we’re delivering help for the
way people need it, when they need it. We’ll also take you
into our labs with writer and cultural commentator Baratunde
Thurston, who will hear from the folks at Google who personally
develop, design and bring these products to life. Now, if you
look across all of Google’s products, from Search to Maps, Gmail to Photos, our mission is to bring a more helpful Google
for you. Creating tools that help you increase your
knowledge, success, health and happiness. Now, when we apply
that mission to hardware and services, it means creating
products like these. [Cheers and Applause]. New Pixel phones, wearables,
laptops and Nest devices for the home. Each one is
thoughtfully and responsibly designed to help you in your
everyday, without intruding on your life. Now, in
the mobile era, smartphones changed the world. It’s super
useful to have a powerful computer everywhere you are.
But it’s even more useful when computing is anywhere you need
it, always available to help.
Now, you heard me talk about this idea with Baratunde that
helpful computing can be all around you, ambient computing.
Your devices work together with services and AI so help is
anywhere you want it, and it’s fluid. The technology just
fades into the background when you don’t need it. So the devices aren’t the center
of the system. You are. That’s our vision for ambient
computing. The Google Assistant plays a
critical role here. It pulls everything together and gives
you a familiar, natural way to get the help you need. Our users tell
us they find the Google Assistant to be smart, user
friendly and reliable, and that’s so important for ambient
technology. Interactions have to feel natural and intuitive.
Here’s an example. If you want to listen to music, the
experience should be the same, whether you’re in the kitchen,
you’re driving in your car or hanging out with friends. No
matter what you’re doing, you should just be able to say the
name of a song and the music just plays, without you having
to pull out a phone and tap on screens or push buttons. So think about
how this vision plays out in the home, where ambient technology
can make life so much easier. When you wake up in the morning,
your home knows what you need to start your day. You can get
your commute, find out when your first meeting starts, maybe play
some music on whatever speaker or screen is nearby. And when
you leave your house, your lights, thermostat, door locks,
security cameras, they all just know what to do. And your
devices go silent and turn off notifications at night when you
want to relax without technology interrupting or distracting you. So throughout your home,
technology works as a single system, instead of a bunch of
devices doing their own thing.
Now, we can bring this ambient computing vision to gaming as
well. With Stadia, our new generation cloud gaming
platform, we’re aiming to deliver the best games ever made
to almost any screen in your life. So I’m excited to share
an update with you all. Stadia will be available on November 19th.
[Cheers and Applause]. So you’ll be able to play games
wherever you want, on your TV, your laptop, even your Pixel,
which will be the first phone to support Stadia when it launches.
We’re also creating new ways to have more human interactions with technology, like Motion Sense and the new
Google Assistant for Pixel 4.
So instead of being glued to your phone, you can use quick
gestures and voice commands, and then get back to your day. That push for
quicker, more natural interactions, is leading us in
new hardware directions, too, extending the phone’s
capabilities in new ways. Let’s take a look.
SPEAKER: This is clearly a time machine. And you’re pretending
you’re using it to test earbuds. That’s a great cover story.
SPEAKER: Right, left, up, down. Hello!
SPEAKER: Hi. SPEAKER: Isabel?
SPEAKER: I know you and your team led the design for the
earbuds. SPEAKER: We really wanted it to
just be a simple, tiny little dot floating in your ear. What
is a simpler form than a circle? And how insanely tiny can we
make it? Because there’s, like, two computers in there.
SPEAKER: These are floating computers in your head.
SPEAKER: Yes. Yeah.
[Laughter] SPEAKER: Do you remember how you
felt when you first got the design brief for what these
earbuds were supposed to do? SPEAKER: I think it’s crazy.
SPEAKER: Certainly the assembly is a really challenging part of
this. All of these pieces have to go together with
submillimeter precision. I don’t think I would have
imagined being able to build things with this kind of
processing power this small. SPEAKER: A lot of sensors,
processors. A little bit like building a ship in a bottle.
What we’ve managed to do here is not just make great headphones,
but really putting in all of the other things that are uniquely
Google about this, the ability to process your voice and to
make a call clear even when you’re riding a bicycle down a
sidewalk. A lot of software. A lot of machine learning. It’s
the magic that powers the product. It turns a great set of headphones into a Google set of headphones.
RICK: All right! [Applause]. That was a sneak peek at the
all-new Google Pixel Buds. So you can start to get an idea of
what ambient computing feels like. With Pixel Buds, help is
there when you want it, and the experience just comes to you.
Even when your phone is not in your hand. For instance, you
can get hands-free access to the assistant. So instead of
turning to your phone for quick tasks, you can just say, hey,
Google, and ask the assistant for whatever you need. Resume
your podcast, send a quick text, get directions, or even
understand another language with Google Translate. Pixel Buds even have
a long-range Bluetooth connection, which keeps you
connected even when your phone isn’t by your side. So you can
wear them in the yard when your phone might be charging inside,
or leave your phone in a locker if you’re working out in a gym.
Indoors, Pixel Buds will stay connected up to three rooms
away. And outside, they’ll work across an entire football field.
Of course, Pixel Buds won’t be truly helpful unless they’re
also great headphones. They have to have excellent sound
quality, they’ve gotta be comfortable to wear all the
time, and they need to last long enough to be useful. That’s a
lot to ask of a pair of headphones, especially because
they also need to be unobtrusive, too. So we did
some intricate origami with Pixel Buds to make sure
everything fit. Custom speakers, sensors, custom
battery. That’s usually what makes these wireless earbuds
stick so far out of your ears. But Pixel Buds give you plenty
of battery life to get through your day. You’ll have five
hours of continuous listening time on a single charge, and up
to 24 hours when you’re using a wireless charging case. Now,
even with all those components and long battery life, you can
see Pixel Buds fit almost flush with the ear. They’re so small
and light, it’s easy to forget you’re wearing them. At the
same time, Pixel Buds deliver excellent sound quality. Now,
you typically have to choose between great sound and
awareness of the world around you, but Pixel Buds give you
both with a unique hybrid design. The earbuds gently seal
the ear for rich bass and clear highs and the spatial vent
underneath reduces that plugged-ear feeling, and lets
through just the right amount of environmental sound. On the
software side, Pixel Buds respond to your surroundings,
with the new adaptive sound. The volume dynamically adjusts
as you move from the quiet of your home to a subway or a noisy
cafe, and you don’t have to constantly raise or lower the
volume. When you’re on a call, beam-forming mics focus on your voice while a voice accelerometer detects speech through your jawbone, so a loud restaurant or a windy day won’t
get in the way of your conversation. Pixel
Buds will be available in the spring of next year, and we’ll
share more details in the coming months, including a few of the
helpful experiences that make good use of the on-device machine learning chips. So as
you can see, this ambient computing era is going to bring
all kinds of new interfaces, services and devices. But it’s
also introducing new challenges. When computing is always
available, designing for security and privacy becomes
more important than ever. You need to know that your data is
safe. Protecting your data and respecting your privacy are at
the core of everything we do.
We’ve designed strong protections across our hardware family, like the Titan security chip in our phones and laptops. Titan protects your most personal on-device information: your OS data, passwords, even information in
third-party apps. And we know that privacy is personal, which
is why you have the controls so that you can choose the settings
you want that are right for you. We make it easy to access simple
on/off controls, including turning cameras and mics on your
Nest devices off. And you can now delete assistant data just by asking. Everything is
designed with your privacy in mind. And you’ll see examples
of that throughout today’s presentation. Now,
we’re also going to talk today about our work to create more
sustainable products and processes.
Developing sustainable solutions to mass production and
consumption is one of the biggest challenges we face today
as an industry. It impacts all of us, and it will for
generations to come. Now, we believe Google has both the
ability and the responsibility to create systemic change. As a
company, we’ve been focused on sustainability for a long time.
Google’s operations have been carbon neutral since 2007, and
for the past two years, we’ve matched all of Google’s energy
consumption with a hundred percent renewable energy. Now,
we’re continuing to expand access to clean energy to more
people, including our suppliers and the communities where our products are made. So today,
we’re announcing that Google is committing to invest another $150 million in renewable energy
projects in key manufacturing regions. Our investment —
[Applause]. Our investment, alongside
financial and manufacturing partners, aims to catalyze $1.5 billion of
capital. Now, this will generate approximately the same
amount of renewable energy as the electricity used to
manufacture Made By Google products. So when you choose to
buy hardware products from Google, you’re contributing to
bringing renewable energy to communities around the world.
Sustainable, secure and private, and, of course, helpful. That’s
the Google way to make hardware and services. Now, we’re
excited to share with you how we build these principles into our
products. And here’s Ivy Ross, who leads our design team, who’s
going to talk about some of our recent work in responsible manufacturing and design.
[Applause]. IVY: Thanks, Rick. I’m happy to
be back in New York to discuss our design philosophy at Google
and tell you about a few things that we’ve been working on. I
grew up not too far from here in the Bronx, and my dad was a
designer, industrial designer, too, and he worked for the
legendary industrial designer Raymond Loewy on automobiles and
a lot of other consumer goods. When I was little, he even made
me my own little roadster. I can remember spending hours in
his studio as a kid tinkering with different tools and
materials, and something I learned early on is that at its
core, design is about solving problems for people. Whether
you’re designing a building, an automobile, packaging or even a
phone, the goal is to create unique solutions to the world’s challenges. And
sustainability is one of the fundamental challenges of our
generation. You know, when you look at how
most things are made today, it just doesn’t make sense. In all too many cases, devices are manufactured with dirty energy, from precious minerals and materials that are rapidly depleted, and with technology that becomes obsolete in a short time and is then thrown away. Right now, we’re truly looking
at sustainability from every angle. For years, we’ve been
pushing what’s possible in design, manufacturing and new
materials. We’ve been able to include recycled plastics in
products like Chromecast and the new Stadia controller, and
today, I’m happy to share that all of our Nest products
launching in 2019 include recycled plastics. Instead of
these materials ending up in the ocean or in landfill, we’re
giving them a new life. We’ve designed and engineered the
fabric on our Nest mini speaker so it’s made from 100 percent
recycled plastic bottles. A single half-liter bottle
produces enough textiles to cover more than two Nest minis,
and we didn’t compromise on aesthetics or function. We
created beautiful recycled fabrics in colors that blend
into your home while hitting the same rigorous technical and acoustical requirements. We
continue to focus on products that empower people to reduce
their own environmental impact, as well. Our Nest team has been
at the forefront of these efforts since 2011, and as of
this month, Nest thermostats have helped consumers save more
than 41 billion kilowatt-hours of energy, enough to power all
of Denver’s electricity needs for six years.
[Applause]. Rick just filled you in on our
new renewable energy investment, and as of last month, Google is
offsetting 100 percent of the carbon generated by our shipping
partners for all customer orders. We have so much more to
do, but by working with our suppliers and manufacturers on
these initiatives, our goal is to clear the way for the entire
industry and our planet to benefit. Another
sustainability goal is simply reducing the amount of hardware
you need to buy in the first place. What if you didn’t need
to upgrade a bulky new game console every few years? With
Stadia, we’re actually consolidating devices so the
only hardware you need is a controller and a screen to play
your games anywhere, any time. To give people a great gaming
experience, we designed the first cloud-based controller.
You know, great design isn’t just about how something looks,
it’s also about how it feels, and subtle design differences
can have a profound effect, and we wanted the controller to be
comfortable in the hands of all gamers. We found design
inspiration in some unlikely places.
SPEAKER: How’s it going? SPEAKER: Good, man.
SPEAKER: Nice to meet you. SPEAKER: Come on in.
SPEAKER: All right. SPEAKER: I go to these really
nice kitchens. They all have these simple knives. Like, none
of them look like the grocery store knives, with all the grips
and details. It’s really uncomfortable if you rotate your
hand around. The reason why most professional kitchens have
knives like this is you can use it in many ways, so that’s a
starting point for the controller. We literally took a
knife handle. We bent it. It’s like, oh, like we’re on to
something here. SPEAKER: Yeah. Okay.
SPEAKER: So from there, that one ancestor basically had hundreds
and hundreds of kids. I kept building on it patiently until
it became that. And it’s made for small and large hands. It’s
super usable for a large segment of gamers that aren’t always
appreciated. SPEAKER: Jason was very, very
insistent that we have our non-visible screw design.
SPEAKER: Super important to give it that nice, clean look instead
of punching a bunch of holes on to the back.
SPEAKER: That was one of the biggest challenges for product
design. We think it’s really worth it. It makes it very
comfortable in the hand, and seamless.
SPEAKER: Oh, wow! That was impressive! Good work!
IVY: We worked with thousands of people playing hours and hours
of games to test our controller against all of its limits. It
needs to feel right for as many people as possible. Putting people at
the center of our design is integral to our process and our
principles, whether it’s hardware or software, creating
truly helpful products for people starts with empathy. One
of the earliest projects we tackled within the hardware team was
designing a new kind of laptop that could deliver performance
and versatility in a truly beautiful form. We wanted to
physically embody the speed and simplicity that people love
about Chrome OS. The result was the original Pixelbook. The
response was great. People really loved the award-winning
design, the keyboard, and the speed. So over the past
couple of years, we’ve been working really hard to bring
that kind of experience to even more people at a more affordable price. I actually
believe that you can be more creative when designing within
constraints. So once again, we started with our users’ needs,
especially portability and battery life. We wanted to
create a thin and light laptop that was really fast and also
have it last all day. And of course, we wanted it to look and
feel beautiful. We landed on Pixelbook Go. The design is so distinctive, with an incredibly light magnesium housing that lets us create a very smooth matte finish in great colors.
Pixelbook Go comes in Just Black and Not Pink, one of the iconic colors we introduced on Pixel 3. And we created a new rippled,
wavy bottom that’s easy to grip. Pixelbook Go is lighter than
Pixelbook, but we still managed to add a battery that is 15 percent larger, making it
easier to keep working all day. We also spent a lot of time
making sure the keyboard is comfortable and quiet. We took
all of our learnings from the original Pixelbook and really
refined the design. We ended up with keys that feel great to
use, and are even quieter than the original.
And with Chrome OS, Pixelbook Go is always fast, secure and all
your devices stay in sync with each other. Everything about
Pixelbook Go is designed to address real user needs for an
affordable price. You can pre-order it now in Just Black,
with Not Pink coming soon. Next up, my colleague Rishi Chandra
will tell you about the work we’ve been doing to make life at
home a little easier. Thank you. [Applause].
RISHI: Hey, everyone. I’m excited to give an update on
Google Nest and our mission to create the helpful home. So
last month, we launched Nest Hub Max, which is a great example of
the power of ambient computing. See your photos come to life
with a screen that automatically adjusts to your lighting
conditions, pause your music and videos with a simple hand
gesture, and it automatically adjusts the information and
controls based on your proximity to the device. At Nest, we want
to put people first and build technology around their needs.
It’s the difference between just being smart and being truly
helpful. So while the rest of the industry is focused on
stand-alone devices, our focus is on building whole home
solutions that bring together technology to provide real help for real homes. And the most
important place to get this right is privacy. It’s your
home. The most personal, private space in your life. So
in May, we published a clear set of privacy commitments, which
helps you understand how our technology works. Today we want
to share how these commitments extend beyond Google to our
third-party ecosystem of partners. So we’re announcing
an update to our Works with Google Assistant program. We’re
working with partners to migrate their existing works with Nest
integrations that people know and love, but doing it built on
a foundation of privacy and security. For example, we’re
requiring partners to pass a security review before they can
even request access to your Nest devices. You should have
confidence in how Google and its partners are protecting your home data. And then you
can focus on, instead, the great benefits of the helpful home.
For example, let’s talk about home audio. It used to take thousands of dollars and a professional installation if you wanted seamless audio throughout your home. Well, Google changed all that, with a whole home audio solution that is simple, affordable and sounds great. It started several years ago with
the launch of Chromecast, making it easy to use your phone or
your voice to play content on your favorite devices. And with
Google Home Mini, home audio got even more affordable, with a
great-sounding speaker with multi-room support. And with
Nest Hub Max, you now have a home media control center right
on your smart display. And it all works seamlessly together
with stream transfer. Where you can naturally move content
around your home. So, for example, I can start a play list
or watch a show on my Nest Hub Max in the kitchen, and when I’m
done cooking, just say, hey, Google, move this to the living
room TV. And it will pick up right where I left off. It’s really easy.
Now, for a lot of people, Google Home Mini was a perfect starter
kit for your audio system. And today, we’re introducing the
next generation, Nest Mini. It’s even more capable with the
same affordable price point and the same iconic design. So
let’s start with the design. Colors really helped Mini blend
naturally into your home, and you now have a new color option
called Sky. And as Ivy mentioned, all of our fabric is
made from 100 percent recycled plastic bottles. Now, we also
heard from you that you want a little more flexibility in where to place Mini, so we added a simple wall mount. It really looks
great anywhere in your home. Now, the original Mini
was designed to pack in great sound in a really small form,
and with Nest Mini, you get even better quality sound, with 2X stronger bass and even clearer, more natural sound. And
for those times when your home gets loud, like it does at mine,
we added a third mic to hear you better in noisy environments.
Nest Mini also got a really cool new super power. There’s a
dedicated machine learning chip with up to one TeraOPS of
compute. So for the first time, core experiences of the Google Assistant can move out of the data center to run locally on your device. Simply put, things are going to
get a lot faster, as it learns your family’s most frequent commands.
Finally, Nest Mini also powers an amazing home communication
system, a home intercom, so you can talk room to room. A home
alert system, telling you who’s at the front door. A home
phone, allowing you to call anyone in the world for free
using Google Duo. I can even use my phone to call my Nest
devices. It works great for those times I’m leaving work and
I want to ask the family what they want for dinner. So that’s
the new Nest Mini, our next step in bringing seamless audio and
communication to more homes around the world. Okay.
Now, let’s talk about home awareness. One of our core
products is Nest Aware, which, combined with our Nest Cams, provides intelligent alerts and camera history. Now, lots of
our users have multiple cameras, and we’ve heard from you that
our Nest Aware pricing can get a little expensive and
complicated. So today we’re announcing a new whole home
pricing model. For one monthly rate, you get Nest Aware support
across all your Nest devices in your home. So whether you have
two cameras or you have ten cameras, you pay the same
monthly rate. And you can choose between two different
pricing plans, depending on your needs. We even added more video
history. The new Nest Aware will be rolling out early next
year, and it will be easy to switch over your existing plan. Now, as part of the new Nest
Aware subscription, we’re also unlocking the power of speakers
and displays to be part of your home awareness system. So
devices like Nest Mini or Nest Hub can be your ears when you’re
on the road or on vacation. We use on-device AI, sound
detection AI, to pick up critical sounds like barking
dogs or smoke and carbon monoxide alarms, and we send an
alert to your phone. So now in one go, even those basic smoke
alarms become smart smoke alarms. And when you get an
alert, you have the option to hear the alert or listen live to
confirm the alarm. Now, if it is an emergency, the home app
can directly connect you to the 911 call center closest to you,
regardless of where in the world you are. So in those critical
moments, the last thing you want to do is scramble to find a
local emergency dispatcher. Now, these notifications will be
part of the new home app, which actually includes a new feature
called the home feed. It brings together all of the
notifications and snippets from your devices, organizes them and
highlights the important stuff. So you can quickly see priority
items or you get a general recap of the day. So that’s the
new Nest Aware, more affordable, with more features, and support
for more devices. Okay. Finally, let’s talk about home
connectivity. You can have the best home set-up in the world,
but it’s nothing without great wifi coverage. That’s why we
launched Google wifi three years ago, and since launch, it has
been the number one-selling mesh wifi system on the market and in
2019, it is the top-selling router of any kind. And it’s a
router that actually gets better over time, with automatic
updates to add parental controls, improve performance,
and enable Google’s latest security features. Well, today, we’re also updating
the hardware with Nest wifi. Now, the Nest wifi system is
actually two devices. The router plugs into your modem and
creates a powerful home network. The point expands your coverage.
Now, working together, they create a single, strong wifi
connection throughout your entire home. And our
updated hardware and software delivers up to 2X the speed, and
up to 25 percent better coverage. So now the Nest wifi
system only needs one router and one point to cover around 85
percent of homes in the U.S. Now, we’re also solving a common
problem you find with routers today. Most of them get hidden
in a closet or cabinet because truthfully, they’re pretty ugly,
which can reduce their signal strength by up to 50 percent. Nest
wifi is designed to be out in the open, where it performs at
its best, with a range of colors that will naturally blend into
your home. And, of course, it’s really
simple to use. With the Google Home app, you can set up your
Nest wifi network in minutes and once you’re set up, it’s easy to
share your wifi password, manage your network, set a schedule for
the kids or create a guest network. Nest wifi also provides a
foundation for your smart home connectivity. We are working
with a growing list of partners to enable seamless set-up in the
home app, and with support for BLE and Thread, we can talk to
smart home devices locally, so you don’t have to buy a separate
hub. Stay tuned for even more partner announcements over the next few months. Lastly, we
added a Google Assistant smart speaker to the Nest wifi point
so it does everything the Nest Mini does, plays your music with
great sound, provides answers to your questions, and lets you
control smart devices with just your voice. So now you can
broadcast a message to your kids that it’s time for dinner. And
if that doesn’t work, try saying, “Hey, Google, pause wifi
for kids’ devices.” I’m pretty sure that will work. So that’s
the new Nest wifi. Better coverage, smart home support,
and the Google Assistant. It will be available starting on November 4th. With new,
affordable home solutions for audio, awareness and
connectivity, everyone now can start building their own helpful
home. Thank you. [Applause]. SPEAKER: Right now in Mountain
View, it’s 68 and sunny. Today, it will be sunny with a
forecasted high of 72° and — SPEAKER: Hey, Google. Volume 10.
SPEAKER: Hey, John, can you come here for a second?
SPEAKER: So you use this room to test.
SPEAKER: Right. So we use this type of setting to really stress
the microphones. Right now I’m talking to it this way, but
sometimes, our devices are higher and sometimes they’re
lower. SPEAKER: And I might be down
here. SPEAKER: Yeah. Exactly. You
know, because it’s going to be in some areas, it’s sort of a
privilege to — we gotta get this right, you know. I come
from a family of immigrants, and they all have accents, and it’s
important for me to design products that, you know, my
parents can use, and that it works for everyone. At Google,
like, everyone sort of has their slogan of what their passion is,
and on mine, it’s actually “Be heard.” It obviously goes into
the stuff that I work on, but also, you know, speaking up when
things don’t feel right. What this represents is an entire
Google team’s voice of we got here because we worked together.
SPEAKER: You know, what’s kind of cool about that is when
multiple voices come together to express sound in a coherent and
beautiful way, we call that harmony.
SPEAKER: There you go. Harmony. With wifi. [Laughter] [Applause].
SABRINA: Hi. I’m Sabrina from the Pixel Team. Now, let’s talk
about how Google’s ambient computing vision comes to life
when you’re on the go. Pixel 4 introduces entirely new, helpful
experiences, with more natural interactions. It’s completely redesigned, with a new look, a new color, and a beautiful new finish. And Pixel 4
includes camera features and sensors that you’re not going to
find on any other phone. Let’s start there.
Five years ago, our advanced technologies team began Project
Soli to investigate radar capabilities. Radar has been
around for a long time, and it’s still one of the best ways to
sense motion. It’s precise, low power, and it’s fast. There were lots
of exciting possibilities. But, here’s what our first sensor
looked like when we started working on Soli.
Radar sensors have always been way too big to fit in a phone,
so we shrunk it down into a tiny chip, but that still wasn’t
small enough, so we had to shrink it down even more. Pixel
4 is the first smartphone with a radar sensor. It powers the new
motion sense capabilities for more human interactions with
your phone. For instance, Pixel 4 has the fastest secure face
unlock on a smartphone, because the process starts before you’ve
even picked up your phone. Motion sense prepares the camera
when you reach for your Pixel 4, so you don’t need to tap the
screen. It’s so much faster and smoother. Motion sense can
power down your phone when you walk away and turn it back on
when you approach your phone.
It also lets you control your Pixel with simple gestures.
Swipe to skip a song, silence a call, wave hello to Pikachu.
And the Soli team is working on a wide range of helpful new
features from gaming to personal wellness. Here’s a quick look.
SPEAKER: There’s a very famous saying that any sufficiently advanced technology is indistinguishable from magic. That’s one of the things we talked about with Soli, the magical sensor.
SPEAKER: I did it! I touched without touching!
SPEAKER: Radar has a lot of very interesting properties that
would be very useful for human-computer interaction. You can
shrink it down. SPEAKER: All of this is now in
there. SPEAKER: That’s right. Yep. It
can sense materials. It’s extremely sensitive for motion.
SPEAKER: And so we built this new interaction paradigm based
on the understanding of body language, distances, and
gestures. SPEAKER: How can we make the
language of interacting with technology closer to what we do
naturally in the real world? SPEAKER: Then, of course, we
really need to test to distinguish between intentional
and unintentional gestures. SPEAKER: Just because I wave on
top of a device doesn’t mean I want to skip a song.
SPEAKER: If I pick up a coffee cup, this gesture is very
similar to a swipe, and it is very important to distinguish
between the two, and it actually works. With Soli, we can try to
understand more about the implicit behaviors that happen
around the device. SPEAKER: The phone knows earlier
what your intention is. SPEAKER: Exactly.
SPEAKER: Okay. SPEAKER: Let’s say the alarm
goes off. Really annoying. As you reach, we can lower the
volume. The phone is more polite, and then you can just do
a gesture to shut it off. These moments of understanding each
other happen all the time between us but they never happen
with technology. What we can do with radar is to actually
capture aspects of nonverbal communication, and the first
step, in Pixel 4 with Motion Sense, is to get as close as
possible to the intuitiveness of nonverbal interaction.
SPEAKER: Silence. SABRINA: Since the Soli sensor
can detect the environment around Pixel 4, privacy had to
be built in from the start. You can turn Motion Sense on or off
at any time, and when it’s on, all of the sensor’s data is
processed right on your Pixel. It’s never saved or shared with
other Google services. And motion sense isn’t the only way
we’re making your phone interactions faster and more
natural. The Google Assistant is now deeply integrated into
Pixel 4’s OS and across your apps. You can quickly open
apps, search across your phone, share what’s on your screen and
a lot more. The assistant can simplify multitasking, too, with
a clean, new interface. Check this out. Just give Pixel 4 a quick squeeze. “Show me Maggie
Rogers on Twitter.” “What are her concert dates?” “Share this with Vivian.” “Reply, let’s go see
her.” “Open ticketmaster.com.” “Search for Maggie Rogers events.”
A key way we’re making the assistant this fast is with an
on-device version of our language models that run in our
data center. So they can run locally, right on your Pixel 4.
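The resulting on-device/cloud split can be sketched as a small routing function. The intent names and routing rule below are assumptions made up for illustration, not the Assistant’s actual logic.

```python
# Hypothetical sketch of hybrid request routing: simple, frequent
# requests are served by the on-device model; the rest go to the cloud.
ON_DEVICE_INTENTS = {"set_timer", "open_app", "toggle_flashlight"}

def route(intent: str) -> str:
    """Return where a request would be served in a hybrid model."""
    return "on_device" if intent in ON_DEVICE_INTENTS else "cloud"

print(route("set_timer"))      # on_device
print(route("flight_status"))  # cloud
```

The design benefit of this split is latency: on-device requests never pay a network round trip.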
This means the new assistant uses a hybrid model. It can
respond to many day-to-day requests on device, like
starting a timer, while connecting to the cloud for requests
like, “Is my flight on time?” You also have new ways
to manage your data. Choose a time limit for how long you want
your activity data to be saved in your Google account, or, just
tell the assistant to delete everything you said to it today
or this week, and it will. You’re in control, and you can
get more details by asking, “Hey Google, how do you
keep my data safe?” We’re taking the same care to protect your
on-device data, too. With Titan M and other security features,
last year’s Pixel 3 scored the highest for built-in security
for a smartphone, according to Gartner. We built Titan M into
Pixel 4 as well to protect your most sensitive on-device data,
like your passwords, your OS data, and now your face unlock
model. Your phone has some of your most personal, private
information, and we have a responsibility to keep it safe and secure. Now, how many
of you have tried a voice recorder app? I know I’ve tried
a few, thinking I’ll be able to get organized by recording notes
to myself, interesting lectures, important events. But then, I
end up with a bunch of untitled audio clips that I really don’t
know what to do with. So we created a new kind of audio
recorder that taps into our speech recognition and AI.
Let’s see it in action. We’ve had a Pixel 4
recording the show for the past few minutes. As you can see,
with one tap, I can get Recorder transcribing my words in
real-time as I’m saying them. [Applause]. Now, to show this is live, it is
now 10:44. And it’s pretty accurate. This means you can
transcribe meetings, lectures, interviews or anything you want
to save. Eric, backstage, is going to save this recording,
and now, I can go into the search bar and find whatever I’m
looking for. I can search for sounds, words, phrases. Let’s
see all the times I’ve mentioned “Pixel” across my entire library
of recordings. The places where the word “Pixel” is said are
highlighted in yellow in the playback bar so you can dive
into the exact part of the recording you’re looking for.
It’s pretty cool. And you’ll notice this phone is actually in
airplane mode. All this recorder functionality happened on-device. [Cheers and Applause].
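The word search Sabrina demos can be sketched as a lookup over a timestamped transcript. The data layout and function below are illustrative assumptions, not how Recorder actually indexes audio.

```python
# Toy sketch of searching a timestamped transcript, in the spirit of
# the Recorder demo: each recognized word carries its time in seconds.
transcript = [
    (0.5, "today"), (1.2, "we"), (1.6, "announce"), (2.1, "pixel"),
    (5.0, "the"), (5.3, "pixel"), (5.8, "camera"),
]

def find_word(word, words):
    """Return the timestamps (seconds) where `word` occurs."""
    w = word.lower()
    return [t for t, token in words if token == w]

print(find_word("Pixel", transcript))  # [2.1, 5.3]
```

Those timestamps are what would drive the yellow highlights in the playback bar, letting you jump straight to each occurrence.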
Now, I want to take a minute to talk about Pixel 4’s OLED
display. DisplayMate has awarded Pixel 4 XL their highest
score, an A+ rating, together with a Best Smartphone
Display award. In five key areas like color accuracy and
image contrast, DisplayMate classified Pixel 4 XL’s display
as visually indistinguishable from perfect. Pixel 4 is also
our first smartphone with a 90 hertz refresh rate, and we’ve
added some smarts. The refresh rate adjusts on its own
depending on what you are doing. So you get a great visual
experience while still preserving battery life. Pixel
4 brings together so many helpful new technologies and
capabilities, and you’ll get the best Android experience with
Android 10, and you’re the first in line to get the latest OS
updates and features. We also want to make sure you get the
best experience out of the box, so Pixel 4 comes with three
months of Google One for new eligible members. You get lots
of premium features, including Pro Sessions, for one-on-one
virtual help. So if you have a question about your settings or
want a few tips for the camera, we’re there for you. The new
Pixel comes in three colors: Just Black, Clearly White, and a
limited edition called Oh So Orange. It also comes in two
sizes, both with the same features and both available for
pre-order starting today. Shipping starts on October 24th. And we’re excited that people
will be able to find Pixel in even more places. We’re
expanding our carrier partnerships, so Pixel 4 is now
available through every major U.S. carrier.
[Applause]. Now, we didn’t forget about the
camera. For the past three years, Pixel set the standard
for smartphone cameras with incredible capabilities like
HDR+, Super Res Zoom, Top Shot, and, of course, Night Sight.
With Pixel 4, we’re raising that bar yet again, and it all starts
with this little square. Basically a miniaturized camera
rig right on the back of your phone. You can see the rear
wide and telephoto cameras, a hyperspectral sensor, a mic for
your videos and Instagram stories, and a flash that we
hope you’ll use mostly as a flashlight. But it’s there just
in case. But the hardware isn’t what makes our camera so much
better. The special sauce that makes our Pixel camera unique is
our computational photography, and who better to talk about it
than professor Marc Levoy from Google Research.
[Applause]. MARC LEVOY: Thanks, Sabrina.
It’s great to be here. There’s a saying among photographers
that what’s important to taking a great picture is, in order,
subject, lighting, lens, and the camera body. It’s their way of
saying that it doesn’t matter which SLR body you use unless
you get the first three elements right. Well, here’s a slightly
different take on this list. Subject, lighting, lens, software. So by software I mean
computational photography. So what does that mean? It means
doing less with hard-wired circuitry, and more with code.
I like to call it a software-defined camera. It
typically means capturing and combining multiple pictures to
make a single, better picture. One version of this is HDR+,
the technology we’ve used for taking photos on every Pixel phone. When you tap
the shutter button, we capture a burst of up to nine pictures.
These pictures are deliberately under-exposed to avoid blowing
out highlights. We align them, using software, and average
them, which reduces noise in the shadows. This lets us brighten
the shadows, giving you detail in both the highlights and the shadows. In fact,
there’s a simple formula. Noise goes down as the square root of
the number of images you average together.
So if you use nine images, you get one-third as much noise. This isn’t mad science. It’s
just simple physics. By the way, on the left is our RAW
output, if you enable that in the app. There’s something else
about this list. It says the lens is important. Without
quibbling about the order on the list, some subjects are farther
away than you’d like, so it does help telephoto shots to have a
telephoto lens. So Pixel 4 has a roughly 2x telephoto lens plus
our Super Res Zoom technology. In other words, a hybrid of
optical and digital zoom, which we use on both the main and
telephoto lenses so you get sharp imagery throughout the
zoom range. Here’s an example. You probably think this is a 1x
photo. It’s not. It’s a zoom, taken from way back here.
By the way, Super Res Zoom is real multi-frame super
resolution, meaning that pinch zooming, before you take the
shot, gives you a sharper photo than cropping afterwards. So
don’t crop like this. Compose the shot you want by pinch
zooming. Also, by the way, most popular
SLR lenses do magnify scenes, not shrink them, so while
wide-angle can be fun, we think telephoto is more important. So what new
computational photography features are we launching with
Pixel 4? Four of them. First, Live HDR+.
Everyone here is familiar with HDR+’s signature look, and its
ability to capture extreme brights and darks in a way that
looks crisp and natural. But even phones with good HDR
solutions can’t compute them in real time, so the viewfinder
often looks different from the final image. In this example,
the window is blown out in the viewfinder, which
might tempt you into fiddling with the exposure. This year,
we’re using machine learning to approximate HDR+ in the
viewfinder, so you get our signature look while you compose
your shot. We call this feature Live HDR+. So the industry’s
most successful HDR solution is now real-time and WYSIWYG:
what you see is what you get. Now, if we have an intrinsically HDR+ camera, we
should have controls for it, so Pixel 4 has dual controls.
Here’s an example. This is a nice HDR+ shot, but maybe you
would like to try it as a silhouette, so you tap on the
screen and lower the brightness slider a bit.
That mainly changes the capture exposure. Then you lower the
shadows slider a lot. That mainly changes the tone mapping.
And voila, you get a different artistic vision. Try doing that
with any other cell phone. So separate sliders for brightness
and shadows while you compose your shot. It’s a different way
of thinking about controlling exposure in a camera. Second, white balancing in
photography is a hard problem. Mathematicians call it an
ill-posed problem. Is this snow blue, the way this SLR
originally captured it? Or is it white snow illuminated by a
blue sky? We know that snow is white; with enough training, so
can the camera. We’ve been using learning-based white
balancing in Night Sight since Pixel 3. In Pixel 4, we’re
using it in all photo modes. So you get truer colors, especially
in tricky lighting. Here’s a tough case. An ice
cave. It’s blue light, but not a blue person. And here’s what
it looks like with Pixel 4’s white balancing. Third, we’ve
continued to improve portrait mode. With our dual-pixel or
split-pixel technology, we’ve always been good at portraits
and at macro shots. This year, we’re computing depth, again
using machine learning, from both dual pixels and dual
cameras, which gives us accurate depth farther from the camera.
This extends portrait mode to large objects, and
stand-further-back portraits. We also have a luscious new
SLR-like bokeh. That’s the shape of the blur. Look at
the lights on either side of her head. We’re doing better on
hair and dog fur, which are hard. And, of course, we still do
great selfie portraits. Fourth, and last, we have continued to
improve Night Sight, in many ways, and extended it to a use
case that has always been sort of a holy grail for me. You
could have taken this dusk shot using Pixel 3 last year. Using
Pixel 4, you can take this nighttime picture from the same viewpoint. [Applause].
In the year since we launched it, Night Sight has been called
everything from fake to sorcery. Well, it’s neither. Think back
to the mathematics that I explained at the beginning.
Astrophotography is about taking longer exposures and more
of them. Up to 16 seconds times 15 exposures. That’s four
minutes. But it’s a single shutter press, and it’s fully
automatic. By the way, you can’t do this with a single long
exposure. In four minutes, the stars do move, and trees wave in
the wind. So you need robust alignment and merging of multiple pictures. And, for a
four-minute exposure, we do recommend a tripod, or you can
prop your phone on a rock. Is there machine
learning? Yes. We use it for white balancing, as I mentioned.
We also use semantic segmentation in all our photo
modes, and have for years, to brighten faces in HDR+, a
feature we call synthetic fill flash, to separate foregrounds
from backgrounds in portrait shots, and to darken and denoise
skies in Night Sight. Is there computational photography?
There’s lots of that, too. Digital sensors are prone to hot
pixels that are stuck at red, green or blue. The longer the
exposure, the more hot pixels. Our exposures are pretty long,
so we need some clever algorithms to remove those hot
pixels. By the way, that’s our astrophotography field-testing
team, and yes, they sat still for a long time for this shot.
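One simple way to suppress a stuck pixel, in the spirit of the hot-pixel cleanup Marc mentions, is a median filter: a pixel stuck at a high value is an outlier against its neighbors, so the neighborhood median removes it. This toy 1-D version is an illustration, not Pixel’s actual algorithm:

```python
# Toy 1-D hot-pixel suppression: replace each interior pixel with the
# median of itself and its two neighbors, which rejects single-pixel
# outliers while leaving smooth regions almost unchanged.
def median3(row):
    """Median-of-three filter; edge pixels pass through unchanged."""
    out = list(row)
    for i in range(1, len(row) - 1):
        out[i] = sorted(row[i - 1:i + 2])[1]
    return out

row = [10, 11, 255, 12, 10]   # 255 is a "hot" pixel stuck at maximum
print(median3(row))           # [10, 11, 12, 12, 10]
```

Real pipelines can also exploit the burst itself: a pixel that is bright in every frame, regardless of scene motion, is a strong hot-pixel candidate.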
So where does this game stop? What can’t we capture using
Pixel 4? Well, we can capture the moon, which, by the way,
required some fiddling with those dual exposure controls I
told you about, and we can capture a moonlit landscape.
This is not daytime. It’s the middle of the night, and the
landscape is illuminated only by the moon. See the stars? But what we can’t
do, including on Pixel 4 today, is capture both at once in the
same picture. The problem here is that the
moon is blown out, and the Marin Headlands at the bottom are just
a silhouette. The dynamic range, the difference in
brightness, between a full moon and a moonlit landscape is 19
f-stops. That’s 19 doublings, about half a million times
brighter, way beyond the range of any consumer camera, even an
SLR. So is this scene forever impossible with a cell phone?
Remember what I said at the beginning about software-defined
camera? Pixel is committed to making its cameras better with
software updates, so stay tuned on this one. To sum up, four
new computational photography features. Live HDR+ with dual
exposure controls, learning-based white balancing,
wider-range portrait mode with an SLR-like bokeh, and Night
Sight with astrophotography. Oh, and remember, you can use Night
Sight for many things besides stars. Many things. So go out there
and be creative with Pixel 4. [Applause]. Now, it’s my honor to introduce
one of my favorite artists, who has spent her career creating
some of the most memorable photographs of the last 50
years. Twelve months ago, we gave her a Pixel, and she’s
taken it all over the country to build a new collection of
portraits. She also gives us suggestions and candid feedback,
which we’ve taken to heart in the tuning of the Pixel 4
camera. So please welcome my friend, Annie Leibovitz, along
with our own Lily Lin. [Applause].
LILY: Thank you, Marc. Hi, Annie. Thanks for joining us
today. ANNIE: Thank you, Marc.
[Applause].
This was an extraordinary
opportunity that Google gave me, and I’ve always been interested
in the camera phone, and, you know, what it could do and what
— you know, what its potential was, and Google came to me and
said, we’d like to support you in some sort of artistic
endeavor, and we thought of this project, and, I mean, it’s
obvious what’s interesting about a camera phone. I mean, it’s —
you can carry it in your pocket, for example. But go ahead, I’m
sorry. LILY: No. No. It’s great. I
know that you’ve been using the camera for over a year now to
shoot a collection of photographs, some of which we’re
seeing behind the scenes here. Can you tell us more about the
project? ANNIE: Well, we started really
with the Pixel 3, and, you know, I was very, you know,
suspicious, and, you know, very, you know, careful with it, and
it really became an exercise in light and composition and
content, and then when the Pixel 4 came along, I was kind of very
impressed about how I relaxed with it, and just glided with it
and used it and really just enjoyed taking pictures. I’m
really towards the end — I mean, we’re going to be doing
more work, but towards the end of the work that we were doing.
I felt like I was just beginning to sort of get it, and I just
let the camera do the work, quite honestly, and really
enjoyed myself. But the project, the people —
LILY: Some of which we have here today, Noor and Chase and
Iddris. ANNIE: Such as Noor and Chase.
I just — it’s — I mean, the people, I mean the people made
the project. I mean, made — we really turned to people who care
and people who matter and people who are doing things that give
us hope, and across the board, and, you know, every single
person that we photographed is doing something that they care
about what they’re doing and they represent, you know, great
parts of us who are getting on with it, you know.
LILY: Change-makers I think is what I’ve heard you call them.
True change-makers around the country. So you’ve been
traveling, speaking of country, traveling across the country
shooting these amazing subjects. ANNIE: When Google first came to
me, you know, they sort of totally seduced me by saying
would you like to drive across the country, and then, you know,
that turned into, of course, you know, going back to people.
I don’t know if you see what I did was I decided to take two
photographs to create a portrait, and because it’s
hard to say what you want to say about a person, especially these
extraordinary people, in one picture, and so I made it a
diptych, and took two photographs, for example, with
Sarah Zorn from the Citadel. There’s a photograph of her, you
know, almost on graduation day in her uniform, but next to her
is a photograph of the boots she wore for four years, you know,
every single day. LILY: So I have to ask, I’m so
curious, because you have access to the world’s best camera
equipment. So how is it different with this project,
just having what you have in your pocket now? What is that
experience like? ANNIE: Well, I’ve been using
like everyone else, you know, camera phones for a while, and
the whole idea was can you use it to go out and do work as a
photographer. And I was dying for this opportunity — to be
given this opportunity by Google to sort of develop, you know, the
camera phone for a photographer, and had to use it — and, you
know, as I said before, it was a little bit of a rough start, and
then I just relaxed, and I really totally enjoyed myself,
one of the last shoots with Meg, the soccer player, it really
felt like we were just floating. I mean, she was just really
beautiful — anyway, the views were just beautiful,
and I took these photographs. I wasn’t really
thinking about the camera, or thinking, you know, just really
composing, and the light was beautiful, and she had that red
shock of hair, and, you know — it was great.
LILY: That’s great. Well, so before we go, since I have you,
I have to ask, what pro tips you have for all of us here who want
to take beautiful images like this with the phone in your
pocket. ANNIE: Oh, it’s all inside you.
I mean, you just go out and you do it. It’s all there. I think
what’s great about, you know, the camera phone, I mean, my
children use this camera, and, I mean, we all are using this
camera, and it’s a brand new language, and, you know, if you
want to do something more specific, then — you know, then
you may fall into another category and you’re a
photographer, but it’s just really great that this is
available for everyone to use. LILY: Yeah, the democratization of the camera I
think is what you called it. So thank you. I know you, Noor and
Chase and Iddris are going to be sticking around, so you guys,
Annie Leibovitz. ANNIE: Thank you!
[Applause]. RICK: Thank you so much, Annie.
Amazing project. We’re all huge fans. That was awesome. Well,
as you’ve seen today, our vision for ambient computing is to
create a single consistent experience across your home,
your work, and on the go. It’s available anywhere you want it
whenever you need it. With the introduction of our new Pixel
phone, Pixel Buds, Pixelbook Go, Nest Mini, and Nest Wifi, we’re
taking a big step towards this vision, with much more to come.
Now, we couldn’t get to all the product experiences today, so if
you’re here with us in New York, there will be a lot more product
details to see upstairs in person, and for those on the
live stream, please go to the Google store online and see a
lot more. Thanks so much for joining us today, and we’ll see
you again soon. Thank you. [Applause]. [Music playing].

Made by Google ’19