i should be gripping rat

  • 67 Posts
  • 226 Comments
Joined 2 years ago
Cake day: June 16th, 2023

  • Where the hell are you shopping that you have to make small talk?

    Basically anywhere in the US besides Aldi. And Aldi works fine because the cashiers are trained to skip small talk, scan everything quickly, drop it directly into the cart, and then leave me to go bag my groceries in peace without having to rush.

    If people can’t even interact with each other on a surface level like that, it’s no wonder we’re all so lonely and depressed.

    I’m happy for you extroverts to go use the cashier checkout lanes if you are that desperate for small talk. I can make small talk, but I don’t like it, especially in scenarios like this where I’m focused on other tasks. The interaction is not fun, and it does not liven up my day. It’s just draining.

    Now, social interaction with my friends? Social interaction at a party? That shit fills me up, but I’m not going to the grocery store to get my social fix.


  • From the article:

    Instead, the future of hiring may require abandoning the résumé altogether in favor of methods that AI can’t easily replicate—live problem-solving sessions, portfolio reviews, or trial work periods, just to name a few ideas.

    Are those the best solutions? I don’t exactly know; the problem is bigger than any one person can solve. But any of those would probably be better than what we’ve been doing for the past 20 years.

    In my ideal world, people wouldn’t have to go through any of this bs to get a job. People wouldn’t have to become their own salesperson just to land a job with a living wage. Maybe this is too communist for some people, but it would be nice if some government body just matched me with a job that fit my skill set and education, and then guaranteed a living wage. If I worked the job and didn’t like it, they’d let me pick one of my secondary matches. I don’t want to have to think about this shit; I’m not entrepreneurial, and I don’t want to be. In this scenario, I would think employers would also save a mint on recruiting costs.


  • Midjourney is a product that is being sold for money. Midjourney is making money off of providing users with unauthorized images of Disney and Universal characters. Midjourney is not making up original characters that happen to look like the licensed characters; they are just producing the characters themselves:

    For example, if a Midjourney subscriber prompts the AI tool to generate an image of Darth Vader, it immediately obliges, according to the plaintiffs, and the same occurs for images of Minions.

    Furthermore, we know that Midjourney obtained the ability to generate these images by training on Disney’s and Universal’s copyrighted properties. This is why Midjourney knows these characters by name.

    To your example, I think one big difference is that if you make a digital drawing of Mickey Mouse and then print it out, you are not going on to share that image with a global marketplace of other Epson users. Additionally, you need an uncommon level of drawing skill to produce a drawing so convincing that people might mistake it for Disney’s own work. Midjourney has a social page where users share their creations, and those pages are littered with people’s low-effort generations of licensed characters.

    With Midjourney, any doofus can generate an image of Mickey Mouse flipping off Goofy, and it will look good enough that most people will think Disney made it. If the internet is littered with images like this, it reduces the value of Disney’s properties.


  • Long article, but a great read! Some key excerpts:

    This isn’t simply the norm of a digital world. It’s unique to AI, and a marked departure from Big Tech’s electricity appetite in the recent past. From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers. Given the direction AI is headed—more personalized, able to reason and solve complex problems on our behalf, and everywhere we look—it’s likely that our AI footprint today is the smallest it will ever be. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households.

    Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise. Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram. You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.
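
    The device comparisons there check out roughly, though the excerpt doesn’t give the underlying figures. In the sketch below, the e-bike efficiency (~25 Wh/mile), average EV efficiency (~0.29 kWh/mile), and microwave draw (~800 W) are my own assumptions:

    ```python
    # Rough check of the 2.9 kWh comparisons above. The device figures are my
    # assumptions, not the article's.
    TOTAL_KWH = 2.9            # article's estimate for the whole scenario

    EBIKE_WH_PER_MILE = 25.0   # assumed e-bike efficiency
    EV_KWH_PER_MILE = 0.29     # assumed average EV efficiency
    MICROWAVE_KW = 0.8         # assumed microwave power draw

    print(f"e-bike miles:    {TOTAL_KWH * 1000 / EBIKE_WH_PER_MILE:.0f}")  # 116
    print(f"EV miles:        {TOTAL_KWH / EV_KWH_PER_MILE:.0f}")           # 10
    print(f"microwave hours: {TOTAL_KWH / MICROWAVE_KW:.1f}")              # 3.6
    ```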

    One can do some very rough math to estimate the energy impact. In February the AI research firm Epoch AI published an estimate of how much energy is used for a single ChatGPT query—an estimate that, as discussed, makes lots of assumptions that can’t be verified. Still, they calculated about 0.3 watt-hours, or 1,080 joules, per message. This falls in between our estimates for the smallest and largest Meta Llama models (and experts we consulted say that if anything, the real number is likely higher, not lower).

    One billion of these every day for a year would mean over 109 gigawatt-hours of electricity, enough to power 10,400 US homes for a year. If we add images and imagine that generating each one requires as much energy as it does with our high-quality image models, it’d mean an additional 35 gigawatt-hours, enough to power another 3,300 homes for a year. This is on top of the energy demands of OpenAI’s other products, like video generators, and that for all the other AI companies and startups.
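
    That arithmetic is easy to reproduce. In this sketch, the only number the excerpt doesn’t supply is the average US household’s annual electricity use, which I’m assuming is roughly 10,500 kWh:

    ```python
    # Reproducing the excerpt's per-query and fleet-level arithmetic. Only the
    # household figure is my own assumption.
    WH_PER_QUERY = 0.3              # Epoch AI's per-message estimate
    QUERIES_PER_DAY = 1_000_000_000
    HOME_KWH_PER_YEAR = 10_500      # assumed average US household usage

    joules_per_query = WH_PER_QUERY * 3600                   # Wh -> J
    annual_gwh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e9  # Wh -> GWh
    homes_powered = annual_gwh * 1e6 / HOME_KWH_PER_YEAR     # GWh -> kWh

    print(f"{joules_per_query:.0f} J per query")      # 1080 J
    print(f"{annual_gwh:.1f} GWh per year")           # 109.5 GWh
    print(f"~{homes_powered:,.0f} homes for a year")  # ~10,429 homes
    ```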

    But here’s the problem: These estimates don’t capture the near future of how we’ll use AI. In that future, we won’t simply ping AI models with a question or two throughout the day, or have them generate a photo. Instead, leading labs are racing us toward a world where AI “agents” perform tasks for us without our supervising their every move. We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode. We will give complex tasks to so-called “reasoning models” that work through tasks logically but have been found to require 43 times more energy for simple problems, or “deep research” models that spend hours creating reports for us. We will have AI models that are “personalized” by training on our data and preferences.

    By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth.
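
    Those comparisons also hold up to a rough check. The constants below (about 131 million US households, ~10,500 kWh per household per year, and ~93 million miles from Earth to the sun) are my assumptions, not the article’s; the resulting household share lands near the article’s 22% figure:

    ```python
    # Sanity-checking the 2028 comparisons; the three constants below are my
    # assumptions, not the article's.
    US_HOUSEHOLDS = 131e6
    HOME_KWH_PER_YEAR = 10_500
    EARTH_SUN_MILES = 93e6

    AI_TWH_HIGH = 326      # top of the projected 165-326 TWh range
    DRIVING_MILES = 300e9  # the article's driving-emissions comparison

    household_share = AI_TWH_HIGH * 1e9 / (US_HOUSEHOLDS * HOME_KWH_PER_YEAR)
    round_trips = DRIVING_MILES / (2 * EARTH_SUN_MILES)

    print(f"share of US household electricity: {household_share:.0%}")  # ~24%
    print(f"round trips to the sun: {round_trips:,.0f}")                # 1,613
    ```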