While AI won’t come to Apple’s ‘Shortcuts’ app at WWDC25, an updated SwiftUI is set to solve major pain points by adding a new text editor & more
According to Bloomberg, Apple is adding Apple Intelligence to its Shortcuts app, allowing users to create automations with the help of AI models. However, the new version of the app will not be available until at least next year and is not expected to debut at WWDC 2025 this month.
Reportedly, users of the new version of Shortcuts will no longer need to configure shortcuts by hand. They will simply make requests in natural language, and the AI model embedded in the app will generate the automations itself, lowering the barrier to entry.
On another front, the SwiftUI framework has long been positioned as the future of Apple development: a modern way to build user interfaces that run across all Apple platforms. Yet despite its strengths, developers still face limitations in practice, such as insufficient support for rich text input and web view embedding. According to the latest reports, these pain points are about to be resolved.
According to Bloomberg's "Power On" reporter Mark Gurman, SwiftUI will have a much-anticipated improvement: a built-in rich text editor . Rich text has always been a clear shortcoming of SwiftUI. Although it can display rich text content well, it lacks corresponding support in terms of input. This forces developers to mix code with classic UIKit, use third-party libraries, or adopt some workaround solutions that destroy the native experience of SwiftUI.
Gurman noted: "For many developers, this will be a gratifying development. SwiftUI, Apple's series of frameworks and tools for creating application user interfaces, will finally be equipped with a built-in rich text editor." Although it seems to be a small change, it may bring many conveniences: for example, providing better input fields for notes, messaging apps, and documents without giving up SwiftUI's declarative development process.
Gurman further added that since Apple Intelligence was first introduced at WWDC 2024, Apple's AI strategy has been slow and difficult. Although the 2025 conference is an opportunity for Apple to turn the tide, Gurman claimed that this year's WWDC might disappoint on the AI front.
Still, there are reports that Apple is already testing larger models internally that are far more powerful than those currently shipping on its devices, but executives are divided over when to release these technologies to the public.
It is reported that Apple is currently testing multiple AI systems that are more complex than the existing on-device Apple Intelligence. Specifically, the models Apple is testing range across 3 billion, 7 billion, 33 billion, and 150 billion parameters.
For comparison, the base language model of Apple Intelligence announced by Apple last year has about 3 billion parameters. That version is said to have been "deliberately kept small" so it can run directly on the device without sending every request to the cloud for processing. The future larger models are said to be cloud-based, with the 150-billion-parameter model "already close to the level of the latest version of ChatGPT."
Lastly, while I had never heard the rumor that Apple was considering renaming its upcoming smartphone the "iPhone 2025," Gurman confirms that's not on the table.