

What to Expect at Apple’s WWDC 2024: M4 Macs, App Updates, and More

When Apple launched its new iPad Pro with an M4 chip, it was the first time an M-series chip debuted somewhere other than a Mac. I’m not going to debate what a computer is, but I’d wager we’ll see M4 chips in Macs soon. Apple’s MacBooks and iMac are offered with M3 chips, while the Mac Studio and Mac Pro are stuck with M2 chips.

As part of operating system updates for the Mac, iPhone, iPad, and beyond, Apple is expected to debut new and enhanced versions of its built-in apps. The Calculator app is rumored to receive a refresh, a standalone password manager may be on the way, and redesigns are reportedly coming to Apple’s messy Settings app and Control Center.

At long last, Apple may finally let users arrange apps to their liking. MacRumors reports that you’ll finally be able to leave blank spaces between apps in iOS 18. And Bloomberg has reported that Apple will integrate a theming system into the OS, letting you recolor icons to match each other.

Source: What to expect at Apple’s WWDC 2024

What Apple’s AI Push Means for a Personal Assistant: A Brief History of Siri and Apple’s AI Projects over the Past Decade

Apple had previously ignored Google’s (sometimes cringey) pleas to adopt RCS and let go of the age-old, privacy-lacking SMS fallback. It took some regulatory pressure to get Apple to rethink its stance. As for the green bubbles, they’ll stick around. But they’ll be better green bubbles.

Apple is also said to have explored third-party partnerships to bolster its AI prowess. The company has reportedly held talks with at least two companies but seems to have locked up a deal with OpenAI, which recently showed off a strikingly conversational, assistant-like voice feature. Some kind of chatbot is heading to Apple devices, according to a report.

WWDC comes third in this year’s run of developer conferences, after Google I/O and Microsoft Build, and Apple has rarely had to announce a product in response to its rivals. This time, however, things are different.

We should be skeptical about whatever claims Apple makes for Siri. More than a decade ago, Schiller stood onstage and proclaimed that Apple had built a better voice assistant, and it hadn’t. The same might be true now, as the hype for AI continues to move a lot faster than the actual technology. Humane, Rabbit, Google, and others are all working on similar ideas — “agent” is the buzzword of the summer in the AI world — and no one has demonstrated that it’s ready yet.

When Apple first launched Siri in 2011 alongside the iPhone 4S, the company made a series of very compelling ads showing how you might use this newfangled voice assistant thing. Zooey Deschanel asked her phone about the weather and ordered tomato soup; John Malkovich asked it for life advice; Martin Scorsese managed his schedule from the back of a New York City cab. The ads showed reminders, weather, alarms, and more. The point was that Siri was a useful companion you could rely on, even if just for a moment. No need for apps or taps. Just ask.

There are really two reasons Siri never lived up to its potential in this way. The first is the simple one: the underlying technology wasn’t good enough. If you’ve ever used Siri, you know how often it falls back to “Here’s some stuff I found on the internet.” This is where large language models are so exciting, since we’ve seen how much better they are at understanding natural language and responding in kind. Even though they aren’t flawless, they’re a huge improvement over what we’ve had before.

If Apple has cracked something here, this could be the first time we see the real Siri: tomato soup actually showing up at Deschanel’s house, the Headspace app actually bringing John Malkovich some inner peace. Maybe, finally, we’re going to get the Siri Apple always wanted to make.

AI might also give Apple a chance to do an end run around the whole problem. Its researchers published a paper earlier this year detailing a system called Ferret-UI, which uses an AI model to understand the small details of an onscreen image. The researchers describe how such a model could work alongside an assistant like Siri: one system identifies the app as a whole, while Ferret understands the small regions and details within it. In practice, that might mean one system says, “This is the Ticketmaster app!” and the other says, “That right there is the buy button.”
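To make the division of labor concrete, here is a minimal Python sketch of that kind of two-stage pipeline. This is purely illustrative: the function names, the screenshot dictionary, and the string-matching “model” are all invented stand-ins, not Apple’s Ferret-UI API (the real system is a multimodal model operating on pixels, not labeled dictionaries).

```python
# Hypothetical two-stage UI-understanding pipeline (illustration only).
# Stage 1 names the app on screen; stage 2 plays Ferret-UI's role of
# grounding an intent to a specific on-screen region.

def identify_app(screenshot: dict) -> str:
    """Stage 1: a coarse model names the app being shown."""
    return screenshot.get("app_name", "unknown")

def locate_element(screenshot: dict, intent: str):
    """Stage 2: a region-level model finds the element matching the
    user's intent, returning its bounding box (or None)."""
    for element in screenshot.get("elements", []):
        if intent in element["label"].lower():
            return element["bounds"]
    return None

# A toy "screenshot" with pre-labeled elements standing in for
# what a vision model would extract from raw pixels.
screen = {
    "app_name": "Ticketmaster",
    "elements": [
        {"label": "Search events", "bounds": (0, 80, 390, 120)},
        {"label": "Buy tickets", "bounds": (40, 700, 350, 760)},
    ],
}

app = identify_app(screen)              # "This is the Ticketmaster app!"
target = locate_element(screen, "buy")  # "That right there is the buy button."
```

The design point is the split itself: a cheap global classifier narrows the context, so the more expensive region-level model only has to answer a grounded question about one screen.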