AI development costs are expected to fall in 2024 with the emergence of local, on-device machine learning solutions.
**Shift Towards Local Machine Learning Development and Training Predicted for 2024**
In the ever-evolving world of technology, tech giants such as Google, Apple, and Qualcomm are focusing on local, on-device machine learning inference, marking a significant shift in AI development. The trend was recently highlighted by industry figures including Julien Chaumond, CTO of Hugging Face, and Ben Wood, chief analyst and CMO at CCS Insight, in year-end predictions posted on LinkedIn.
One of the key drivers of this shift is the cost of deploying cloud-based AI. By moving processing onto devices, companies can reduce those costs, which is particularly relevant for smaller organizations and startups that lack the resources to maintain expensive cloud infrastructure.
Apple's MLX, a machine learning framework built for Apple silicon, is one example of this shift: it lets developers run inference directly on Apple devices. Qualcomm, similarly, has been building on-device AI support into its latest Snapdragon platform generation, and Google has emerged as a leading champion of on-device AI with the announcement of its Gemini Nano model for on-device tasks.
Samsung is rumored to be exploring on-device AI capabilities on its upcoming Galaxy S24 Ultra smartphone, which is expected to be powered by Qualcomm silicon. This could mean that the Galaxy S24 Ultra will have numerous on-device AI functions, further solidifying the trend towards local machine learning.
Another significant aspect of this shift is its potential to help developers avoid the high costs of AI training and development. By building smaller models tailored to specific regions or tasks, companies can reduce the need for the expansive, costly infrastructure traditionally associated with large models.
This trend towards localized AI models is also expected to lead to more tailored solutions for specific regions. For instance, companies in Japan are focusing on AI models for manufacturing, while in Vietnam, there is a push for Vietnamese language models and AI-powered chip development.
Beyond cost efficiency, lingering privacy concerns around the development and deployment of AI tools and services could be another key factor driving adoption of local inference. With a growing emphasis on data privacy, apps that clearly communicate their privacy policies are gaining user trust, suggesting a broader shift towards transparent data-usage practices.
A preference for models that can run on-premise, such as PLaMo, could help mitigate privacy risks by keeping data local, reducing reliance on cloud storage and minimizing exposure to potential data breaches.
In conclusion, the predicted trends for 2024 suggest a more efficient, cost-effective, and privacy-conscious approach to AI development, with a focus on tailored solutions for specific regions or tasks. As the technology continues to evolve, it is expected that these trends will shape the future of AI development and deployment.
The shift towards local machine learning development and training predicted for 2024 is also expected to reshape cloud computing, as major players such as Google, Apple, and Qualcomm move inference onto devices to cut costs and address privacy concerns. Advances such as Apple's MLX and Qualcomm's Snapdragon platform are building out this localized AI infrastructure, and more localized models could in turn spur region-specific solutions, including in cybersecurity, alongside more transparent data-usage practices and stronger privacy protections.