
Google IO 2019 live blog: keynote news in real time

Our live blog from Google IO 2019 is underway, as we report on the developer conference's nearly two-hour keynote. We have a rundown of the keynote news below – straight from Mountain View, California.

So far, Google has announced Android Q beta 3 with new features including Dark Theme, plus the Nest Hub Max (and a price drop for the original Google Home Hub, which is now called the Nest Hub).

Today we picked up our badges and are delivering minute-by-minute updates on Android Q, the Pixel 3a and possible updates for Nest and Google Home devices.

And of course we'll be the first to report on any other surprises. You can always watch the Google IO live stream video, but for people at work (supposedly working), stay locked in here for the latest live updates.

  • How to watch Google IO 2019

Google IO live blog: real-time updates

All times in Pacific Daylight Time

11:45: That's a wrap. Our Google IO live blog is complete, and all the announcements are below.

11:40: The final minutes of the Google IO keynote are slowing down in terms of software and hardware announcements, but they dig deeper into areas where people need the most help. Google is now talking about how it uses neural networks and physics models to predict floods.

If you're looking for Android Q announcements, Google Pixel 3a features or the Nest Hub Max, scroll down.

11:34: The Google IO 2019 keynote has been running for an hour and a half, and we haven't really slowed down. Google is talking about TensorFlow, machine learning, medicine and the Google AI healthcare team, accompanied by a moving video about improving patients' lives with technology-assisted care and diagnoses.

11:27: The Pixel 3a is coming to more than just Verizon in the US. In addition to Verizon, the entire Pixel series is coming to T-Mobile, Sprint and US Cellular. It's also compatible with Google Fi, because it's sold unlocked in the Google Store.

As for Pixel 3a specifications, it has a Snapdragon 670 chipset (as opposed to a flagship chip), a Full HD screen, 4GB of RAM, 64GB of storage and a polycarbonate body. Those are the big differences from the Pixel 3. The one constant is that great Pixel 3 camera.

11:26: Google is talking about the Google Pixel 3a's battery life, and it looks like it will last all day, like the Pixel 3: Google is claiming 30 hours on a single charge, and 7 hours from just 15 minutes of fast charging. You'll probably get even more battery life from the Pixel 3a XL than the Pixel 3a.

11:25: The Google Maps AR feature, teased last year at Google IO, is finally coming to consumers and, according to Google, will be available first on Pixel phones. It's something I want for when I exit the New York subway and try to figure out which direction I'm facing.

11:24: The Night Sight mode of the Google Pixel 3a camera is shown against the iPhone XS, and the differences are stark. The Pixel 3a photos look just as good as shots from the Pixel 3 camera, and they come with free backups via Google Photos.

11:23: It has a 3.5mm headphone jack, which makes more sense on a budget phone than on flagships. (Though honestly, I'd like to see it on flagships too.)

11:22: The Google Pixel 3a and Pixel 3a XL are official. They promise to be almost as good as the Google Pixel 3 for nearly half the price. They're made of plastic, but look very similar, and come in three colors: Just Black, Clearly White and the new Purple-ish.

11:20: Here are the Nest Hub Max price details, straight from the Google IO keynote. It costs $229, and the original Home Hub (now called Nest Hub) drops to $129. They're coming to twelve new markets with support for nine new languages.

11:19: Want to feel like you have 'The Force'? With the Hub Max, you can make a gesture to pause whatever is playing, which, judging by the demo, looks a lot easier than shouting over a loud speaker.

11:17: Face Match sounds great, with personalized recommendations, calendar reminders and even morning greetings based on who's in front of the 10-inch display. Google is really using the camera to power this more advanced version of the Hub.

11:14: Google Home Hub is being renamed Nest Hub, and it's now joined by the Nest Hub Max. That's a new product with a camera and a 10-inch display, designed to be the center of your home. You can use it as a Nest Cam and check in on your home through the Nest app. The wide-angle lens gives you a good field of view. For privacy, there's a green camera recording light and an on/off switch for the camera on the back.

11:13: On to Google Home, with more about… you guessed it: Google Assistant. It's the software Google is pushing to every device. Google is also talking about respecting your privacy, in big, bold letters.

11:12: No release date for Android Q (though we're pretty sure it will be early August) and no official name (your guess is as good as ours).

11:10: Android Q beta 3 is available on 21 devices from 12 OEMs. We're uploading a photo of all the third-party company logos now. The list includes OnePlus and Nokia, among others – definitely an improvement on last year's seven beta participants.

11:08: Google just announced Focus Mode. It's like an advanced Do Not Disturb mode, and it's coming to Android P and Q devices 'this fall', so probably around August or September.

11:07: Digital wellbeing and privacy are what Google is talking about right now. You'll have more control over your location (if you order pizza, you can let the app know your location, but it won't keep tracking you indefinitely).

11:02: Dark Theme is coming to Android Q. It's officially one of the more than 50 features of Android Q. It will light up fewer pixels (on OLED screens, anyway). You can enable it from the notification shade ('shade' takes on a whole new meaning) via a Dark Theme tile, or via Battery Saver mode. You heard it here first on our Google IO 2019 live blog.

11:01: Here's another Live Caption demo, but this time the twist is that Google's speech recognition technology works offline (the demo was run in airplane mode). The machine learning happens on-device, which also protects user privacy.

10:59: We knew about this Android Q feature ahead of our Google IO live blog: Continuity prepares Android for foldables. When you switch from the folded to the unfolded state, the app you're using adjusts seamlessly. Samsung built this into its Android Pie phones, but it comes baked into Android Q.

10:57: Android Q is next, and Google announces that there are more than 2.5 billion active Android devices from 180 device makers around the world. And now foldable phones are coming to Android OEMs.

10:55: Live Transcribe, Live Caption, Live Relay and Project Euphonia are among the advanced accessibility features Google is working on in 2019.

10:52: Live Relay is a similar on-device feature that lets you handle a phone call through text while the other side of the conversation is transcribed for you to read.

10:51: "It's a simple feature, but it has a big impact on me," said a person with hearing loss in a Google video. The use cases for Live Caption are groundbreaking for the deaf. And for those who can hear but happen to be on the subway, it's useful too, Google says.

10:49: Live Caption is new – with one tap you can enable captions for a web video, a podcast or even a video you record at home.

10:45: Federated learning is how Google uses anonymized data to improve a global model. Here's an example: take Gboard, Google's keyboard. When new words become popular – because people type BTS or YOLO – after thousands (or, in the case of BTS, millions) of people type them, Gboard can learn from that data without tapping into any individual user's data. It sounds a lot like Apple's differential privacy approach.
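For the curious, the core idea behind federated learning can be sketched in a few lines of Python. This is a toy illustration under our own assumptions, not Google's actual Gboard pipeline: each device computes a model update on its own data, and only the averaged updates reach the server.

```python
# Toy sketch of federated averaging (illustrative only, not Gboard's real system).
# Each device trains on its own data locally; the server only ever sees
# the resulting model updates, never the raw data itself.

def local_update(global_weights, local_data, lr=0.1):
    """Stand-in for on-device training: nudge each weight toward the
    mean of this device's private data."""
    target = sum(local_data) / len(local_data)
    return [w + lr * (target - w) for w in global_weights]

def federated_average(global_weights, all_device_data):
    """Server step: average the locally computed models into a new global model."""
    updates = [local_update(global_weights, data) for data in all_device_data]
    n = len(updates)
    return [sum(u[i] for u in updates) / n for i in range(len(global_weights))]

# Three "devices", each with data that never leaves it.
global_weights = [0.0, 0.0]
device_data = [[1.0, 3.0], [2.0, 4.0], [0.0, 2.0]]
new_weights = federated_average(global_weights, device_data)
print(new_weights)
```

The privacy-relevant point is in the data flow: `federated_average` only touches the per-device weight updates, so the server improves the shared model without ever collecting the underlying keystrokes.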

10:43: Incognito mode is coming to Google Maps (so it will now be in Chrome, YouTube and Maps), along with one-tap access to your Google account from Chrome, Search, YouTube and Maps.

10:40: Google is talking about privacy and user control. Incognito in Chrome is ten years old, and Google Takeout remains a valuable service for exporting your data. Google says it knows its work isn't done. It's making all privacy settings easily accessible from your profile: you can view and manage your recent activity and even change your privacy settings from there. This goes along with the auto-delete controls for activity tracking (3-month and 18-month options) announced last week. It's rolling out in the coming weeks.

10:36: Breaking news: you can now ask a Google Home speaker to turn off an alarm without saying 'Hey Google' first. Just say "Stop" and the annoying alarm shuts off. So helpful. It's coming to English-speaking regions today, according to Google, so look for it soon.

10:35: Driving Mode will be available this summer on every Android phone with the Google Assistant. That means more than a billion devices in more than 80 countries. Google's mission is to offer the fastest, most personal way to get things done.

10:33: The Google Assistant is coming to Waze in the coming weeks, and there will be a driving mode in the Assistant app with personalized suggestions and shortcuts (such as dinner menus, top contacts or a podcast you want to resume in the car). Phone calls and music appear unobtrusively, so you can get things done without leaving the navigation screen.

10:32: Google is extending the Assistant with 'personal references'. It'll understand, "Hey Google, what's the weather at Mom's house this weekend?" You always have control over what it knows about you, your family and your personal information. Google suggests this will be very handy on the road with Android Auto.

10:30: The next-generation Google Assistant is coming to newer Pixel phones later this year.

10:29: Google showed a more complex speech-to-text scenario where the Assistant composes an email – and can tell the difference between commands to complete an action (opening and sending an email) and content to be dictated. Today's Google Assistant can't do that without you tapping the screen. The future is touch-free.

10:27: Now for a more practical (and fast) demo: sending messages, looking up a photo to send to a friend and replying with that photo. It's all done without needing to touch the screen, and it includes multitasking.

10:26: So far the Assistant's speed seems noticeably improved. This is exactly how we always thought it should work. It's getting a lot of applause from the crowd (which the Google Assistant probably won't understand right away).

10:24: Time for the next generation of the Google Assistant. Bold vision: what if the Assistant were so fast that tapping to operate your phone would seem slow by comparison? Google wants it to be 10x faster.

10:22: Importantly, Duplex on the web doesn't require any input from companies. It works automatically, according to Google's CEO. This is part of how the company is working to build a more helpful Assistant.

10:21: Google is tackling painfully slow online bookings. Say "Book a National car rental for my trip" and Google will start filling in the details for you. It acts on your behalf, though you always have control over the flow: make, model, color, whether you want a car seat or not. It's far fewer inputs and selections, with adjustments only where you need them.

10:20: Sundar is back on stage talking about last year's big surprise: Google Duplex. It's Google's AI voice assistant that calls restaurants to make reservations. Google is now extending Duplex to tasks on the web.

10:20: The Google Lens updates will roll out later this month, so you should see them before the end of May.

10:17: Google Lens is getting new translation features at Google IO 2019. If you see a sign in a foreign language, it will translate it and even read the words aloud. The coolest part: it highlights the foreign words as it speaks them, so you can follow along (and maybe learn a little).

10:16: Wondering what a dish actually looks like? Google Lens can surface an image based on the words in a menu. No word on where it gets this photo (whether it's based on actual photos from the restaurant, or on what the dish generally looks like).

10:14: Another Google Lens example: point your phone's camera at a restaurant menu, and it highlights the restaurant's popular dishes and tells you what other people say about them (Google Maps reviews appear to be the source). Lens can also help you pay for the meal, even helping you calculate the tip.

10:13: People have used Lens more than a billion times so far. It's indexing the physical world, much like Search indexes the billions of pages on the web.

10:11: There's a 3D shark on the Google IO 2019 stage. How? The AR shark, toothy jaws and all (with the audience behind it), came from a Google search: tap a 3D model in the results and, using the phone's camera, it's placed in the real world, combining 3D objects with your surroundings.

10:09: The camera is coming to Google Search. Search for 'muscle flexion' and you can see 3D models in the search results and place them in your apartment.

10:08: Google wants to bring 'the right information in the right context'. This may be code for 'we're doing our part to beat fake news and hoaxes'.

10:05: Sundar says, "Our goal is to build a more helpful Google for everyone. And when we say helpful, we mean giving you the tools to increase your knowledge, success, health and happiness." So far, it's been a recap of what Google is doing now. We're still waiting to hear what it's doing next.

10:03: Google is using AR in its official Google IO 2019 app to help users navigate this outdoor developer conference.

10:03: Sundar has just taken the stage. "I would like to welcome you in all the languages our users speak, but we want to keep this keynote under two hours."

10:00: Google starts with a video looking back at different technology, from the original mobile phone to the N64 controller and what appears to be Star Trek with Leonard Nimoy.

09:59: One minute to go, and my teammates are considering live-blogging my live blogging. "He looks serious and stressed. Oh wait, he just furrowed his brow." Yes, I'm in game mode.

09:57: Predictions for the first thing announced at Google IO 2019: lots of figures about the company's success, and about how many developers are coding for Android.

09:50: We're 10 minutes from the Google IO keynote, and here's your team on the ground (from left to right): Matt Swider, David Lumb and Nick Pino. We're ready to own this live blog with real-time updates on everything announced.

09:29: I remember the two human DJs at the last Google IO I attended (left). With Google's AI DJ (right), I wonder whether human DJs will keep performing in this increasingly autonomous economy.

09:27: Google's AI DJ had to be reset – although the future is autonomous, it's not yet perfect. It's back up and spinning what the kids would call sick beats.

09:22: On the Google IO keynote stage, Google has an AI DJ playing music, assisted by a human DJ (mainly to put the record on the turntable). It adjusts the tempo automatically.

09:00: We're in our seats for Google IO 2019, just an hour out from the developer conference. We're ready to tackle the keynote.

08:30: Here's the new Google IO signage for 2019.

May 7, 1:09 am: We've gone over our Google IO 2019 schedule and updated the live blog one last time. Expect early-morning updates.

Yesterday, 3pm PT: I have my Google IO badge. David Lumb and Nick Pino are joining me to deliver live blog commentary. Only a matter of hours before the keynote starts.

Image credit: TechRadar

10:30 am PT: Why are Google Maps' time estimates for ridesharing apps always off (especially for Lyft)? Maybe that's something the company can fix at Google IO (along with faster Android updates and messaging).

Image credit: TechRadar

10:00 am PT: We're 24 hours away from Google IO and have successfully traveled from New York City to Mountain View, California (via a flight to nearby San Francisco).

You can really see the difference in weather here in the Bay Area:

Image credit: TechRadar

Refresh for more Google IO 2019 news – the event starts at 10am PT.
