
Apple researchers have shown that a large language model (LLM) can teach itself to generate working, well-designed user interface code in SwiftUI. Rather than relying on large curated datasets of UI code, which barely exist, the model improves through automated feedback on its own outputs, and it points to how Apple is weaving AI into the app development process.
In a new study, a group of Apple researchers describes a clever approach: getting an open-source model to teach itself to write good user interface code in SwiftUI. Here's how they did it.
In the paper UICoder: Finetuning Large Language Models to Generate User Interface Code through Automated Feedback, the researchers explain that while LLMs have gotten better at multiple writing tasks, including creative writing and coding, they still struggle to “reliably generate syntactically-correct, well-designed code for UIs.” They also have a good idea why:
Even in curated or manually authored finetuning datasets, examples of UI code are extremely rare, in some cases making up less than one percent of the overall examples in code datasets.
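Since good UI examples are too rare to learn from directly, the researchers instead had the model generate its own. To make the idea concrete, here is a minimal sketch in Python of what an automated-feedback loop like this could look like. It is an illustration under assumptions, not the paper's implementation: the model object, its sample and finetune methods, the sample count, and the score threshold are all hypothetical, and the relevance scorer is stubbed out.

```python
# Hypothetical sketch of a self-improvement loop driven by automated
# feedback. Names like model.sample and model.finetune are illustrative
# stand-ins, not the paper's actual code.
import subprocess
import tempfile
from pathlib import Path


def swift_compiles(source: str) -> bool:
    """Feedback signal #1: does the Swift compiler accept the code?"""
    with tempfile.NamedTemporaryFile("w", suffix=".swift", delete=False) as f:
        f.write(source)
        path = Path(f.name)
    try:
        result = subprocess.run(
            ["swiftc", "-typecheck", str(path)], capture_output=True
        )
        return result.returncode == 0
    finally:
        path.unlink()


def relevance_score(description: str, source: str) -> float:
    """Feedback signal #2: how well does the generated UI match the
    description? A real pipeline might render the UI and score it with a
    vision model; this placeholder simply accepts everything."""
    return 1.0


def self_improvement_round(model, descriptions, n_samples=8, threshold=0.8):
    """One iteration: sample candidate programs, keep only those that
    compile and score well, then finetune the model on the survivors."""
    kept = []
    for description in descriptions:
        for _ in range(n_samples):
            candidate = model.sample(description)  # hypothetical API
            ok = swift_compiles(candidate)
            if ok and relevance_score(description, candidate) >= threshold:
                kept.append((description, candidate))
    model.finetune(kept)  # hypothetical API: train on the model's own best outputs
    return model
```

The appeal of a setup like this is that the compiler is a free and perfectly reliable judge of syntactic correctness, so the model can bootstrap its own training data without anyone hand-labeling UI code.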
MacDailyNews Take: As always, Ferris Bueller said it best:
Life moves pretty fast. If you don’t stop and look around once in a while, you could miss it.
Thanks to AI, a lot of coders are going to have plenty of time to stop and look around.