human garbage

  • 5 Posts
  • 56 Comments
Joined 1 year ago
Cake day: September 12th, 2023

  • There’d probably arise a need for a default instance with guest-only access, so newcomers can take a test drive before picking their own instance, with some pop-ups explaining that the name [email protected] means he’s part of some meta-subreddit on lemmy.ml, which doesn’t mean shit since he just helped [email protected] with a link to the source. Their likes are collected but never shown. When they want to stop lurking and finally press the login button, it should instead first invite them to look at the instances of the people they’ve liked, then at the others, with tips on what pushed some instance so high in their list. After signup is confirmed, their likes may or may not be carried over, but their temporary profile is deleted.

    I see the natural flow as something akin to this: we start with a showcase of general content from different nearly-default instances, and then give them recommendations about the people they enjoyed reading.





  • I perceive my advanced tools akin to a broom.

    I can mop floors alright, but I also don’t want to sit down with a cloth to do it.

    If I can’t do that myself, and it does that instead of me, that’s not just my tool, that’s my employee, and the one I now depend on.

    ‘AI’ companies sell us billions of hours of other people’s labor to replace the need to apply our own experience, and to ingrain themselves into our routine. Like the coming of ads, it’s already normalized. But this time, critical parts of our lives have this black-box dependency and subscription.


  • Also, an LLM doesn’t usually have memory or experience. It’s the first page of a Google search every time you put in your tokens. A forever trainee that never leaves that stage of their career.

    Human abilities like pattern recognition, intuition, and the accumulation of proven knowledge combine to make us more and more effective at finding the right solution to anything.

    The LLM bubble can’t replace that, and it actively hurts it, as people get distanced from actual knowledge by the closed door of the LLM. They learn how to formulate their requests instead of learning how to do the things they actually need. This outsourcing makes sense when you need a cookie recipe once a year; it doesn’t when you work in a bakery. What makes the dough behave each way? You don’t need to ask, so you’ll never know.

    And the difference between asking on Lemmy and asking a chatbot is the ultimately convincing manner in which the chatbot tells you things, while forums, Q&A boards, and blogs run by people usually have some of those humane qualities behind the replies, plus the option for someone else to throw a bag of dicks at a suggestion to format your system partition or turn it off and on again.