Tag: Enshittification

  • The Enshittification of AI: Understanding the Trend


    Introduction to Enshittification

    Enshittification, a term coined by Cory Doctorow, describes the inevitable decline in quality of two-sided online products and services over time. This phenomenon is characterized by three distinct stages: being good to users, exploiting user dependence to benefit business customers, and finally, squeezing both users and businesses to extract maximum profit, leading to a terrible service for everyone.

    Stage 1: Good to Users

    In the initial stage, platforms attract users with a genuinely good product, and the resulting network effects and switching costs lock those users in. This was evident in the early days of social media platforms and dating apps, where the primary focus was on providing a seamless and enjoyable user experience.

    Stage 2: Good to Businesses

    As platforms grow in popularity, they start to exploit user dependence to benefit business customers. This is achieved through the introduction of ads, fees, and other revenue-generating strategies. While this stage may seem beneficial for businesses, it marks the beginning of the end for users.

    Stage 3: Good to Shareholders/Platform

    The final stage is where platforms prioritize their shareholders’ interests over users and businesses. This leads to a decline in service quality, as companies focus on extracting maximum profit. The consequences of enshittification can be seen in the examples of Google Search, Facebook, and other platforms that have prioritized profit over user experience.

    The Enshittification of AI

    As AI technology advances, it’s essential to ask whether it will follow the same path as other digital platforms. According to Cory Doctorow, enshittification is the predictable decline that sets in as digital platforms and services go from dazzling to dreadful, and AI services are unlikely to be exempt. The early signs are already visible in AI-powered platforms, with the introduction of ads and price hikes.

    Practical Takeaways

    To avoid the pitfalls of enshittification, it’s crucial for companies to prioritize user experience and transparency. This can be achieved by implementing fair pricing models, providing clear guidelines on data usage, and ensuring that AI-powered services are designed with users’ best interests in mind.

  • Ollama’s Enshittification: The Rise of Llama.cpp


    Introduction to Ollama and Llama.cpp

    Ollama, a popular tool for running large language models (LLMs) locally, has been making headlines with its recent changes. The project, which made its name as an open-source tool, has started shifting its focus towards becoming a profitable business, backed by Y Combinator (YC). This has led to concerns among users and developers about the potential enshittification of Ollama. Meanwhile, llama.cpp, the open-source framework for running LLMs locally on which Ollama itself is built, has been gaining popularity as a free and easier-to-use alternative.

    The Early Signs of Enshittification

    According to Rost Glukhov’s article on Medium, the early signs of Ollama’s enshittification are already visible. Recent updates have introduced a sign-in requirement tied to Turbo, Ollama’s new cloud-hosted inference offering, and some key features in the Mac app now depend on Ollama’s servers, raising concerns about the platform’s commitment to a local-first experience.

    Llama.cpp: The Open-Source Alternative

    Llama.cpp, on the other hand, remains a free and open-source project. As noted by XDA Developers, llama.cpp is the foundation for several popular GUIs, including LM Studio. By switching to llama.cpp, developers can integrate the framework directly into their scripts or use it as a backend for apps like chatbots, as in the sketch below.
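
    One concrete way to do this is through llama-server, the HTTP server that ships with llama.cpp and exposes an OpenAI-compatible chat endpoint. The following is a minimal sketch, not a definitive integration: it assumes llama-server is already running on its default local port, and the model path shown in the comment is a placeholder.

    ```python
    # Minimal sketch: calling a locally running llama.cpp server from Python.
    # Assumes llama-server is already running, e.g.:
    #   llama-server -m ./models/some-model.gguf --port 8080
    # The model path and port above are placeholders, not recommendations.
    import json
    import urllib.request

    def ask(prompt: str, base_url: str = "http://127.0.0.1:8080") -> str:
        # llama-server exposes an OpenAI-compatible /v1/chat/completions endpoint.
        payload = json.dumps({
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.7,
        }).encode("utf-8")
        req = urllib.request.Request(
            f"{base_url}/v1/chat/completions",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(ask("Explain enshittification in one sentence."))
    ```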

    Comparison of Ollama and Llama.cpp

    A comparison of Ollama and llama.cpp by Picovoice.ai highlights the key differences between the two. While Ollama builds on llama.cpp and aims to further optimize its performance and efficiency, llama.cpp itself remains the more straightforward, fully open-source solution. Ollama’s compatibility with the underlying llama.cpp project also allows users to switch between the two implementations or integrate llama.cpp into their existing projects with relatively little friction.
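
    To make that switching concrete: both Ollama and llama.cpp’s llama-server can speak the OpenAI-compatible chat API on their default local ports (11434 and 8080 respectively), so in many client setups moving between them is largely a configuration change. The sketch below is a rough illustration under that assumption; the model names in the config are purely illustrative.

    ```python
    # Rough sketch: swapping between an Ollama backend and a llama.cpp backend.
    # Both expose OpenAI-compatible /v1/chat/completions endpoints locally,
    # so a client can switch by changing its configuration, not its code.
    # Ports are the defaults; the model names below are illustrative only.
    import json
    import urllib.request

    BACKENDS = {
        "ollama":    {"base_url": "http://127.0.0.1:11434", "model": "llama3"},
        "llama_cpp": {"base_url": "http://127.0.0.1:8080",  "model": "local-gguf"},
    }

    def chat(backend: str, prompt: str) -> str:
        cfg = BACKENDS[backend]
        payload = json.dumps({
            "model": cfg["model"],
            "messages": [{"role": "user", "content": prompt}],
        }).encode("utf-8")
        req = urllib.request.Request(
            f"{cfg['base_url']}/v1/chat/completions",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["choices"][0]["message"]["content"]

    # Same call, different backend:
    #   chat("ollama", "Hello")
    #   chat("llama_cpp", "Hello")
    ```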

    Conclusion and Future Implications

    The rise of llama.cpp as a free and open-source alternative to Ollama has significant implications for how LLMs are run locally. As Ollama continues to prioritize profitability over open-source principles, users and developers may increasingly turn to llama.cpp for their local LLM needs. This shift could lead to a more decentralized and community-driven approach to AI development, with llama.cpp at the forefront.
