25+ yr Java/JS dev
Linux novice - running Ubuntu (no windows/mac)

  • 0 Posts
  • 78 Comments
Joined 8 months ago
Cake day: October 14th, 2024



  • Our purpose with this column isn’t to be alarmist

    [x] Doubt

    The amount of math that goes into training an AI and generating output exceeds human capacity to calculate. So does the Big Bang, but we have some pretty good ideas how that went.

    when given access to fictional emails during safety testing, threatened to blackmail an engineer over a supposed extramarital affair. This was part of responsible safety testing — but Anthropic can’t fully explain the irresponsible action.

    Because human writing, both fiction and non-fiction, is full of this sort of thing, and all any LLM is doing is writing. Why wouldn’t it take a dark turn sometimes? It’s not like it has any inherent sense of ethics or morality.

    Anthropic CEO Dario Amodei, in an essay in April called “The Urgency of Interpretability,” warned: “People outside the field are often surprised and alarmed to learn that we do not understand how our own AI creations work. They are right to be concerned: this lack of understanding is essentially unprecedented in the history of technology.” Amodei called this a serious risk to humanity — yet his company keeps boasting of more powerful models nearing superhuman capabilities.

    Is this true? Don’t we have drugs where we don’t fully understand how they do what they do? I’m reading that we don’t fully understand all the mechanisms of aspirin.

    I get that this is a quote and not the author of the article, but this quote is just included without deeper analysis. Also, a car has superhuman capabilities; a fish has superhuman capabilities. LLMs are not superhuman in any way that matters. They are not even superhuman in ways different from computers of 40 years ago.

    But researchers at all these companies worry LLMs, because we don’t fully understand them, could outsmart their human creators and go rogue.

    This is 100% alarmism. AI might at some point outsmart humans, but it won’t be LLMs.


    None of this is to say there are absolutely no concerns about LLMs. Obviously there are. But there is no reason to suspect LLMs are going to end humanity unless some moron hooks one up to nuclear weapons.


  • You probably could train an AI to play chess and win, but it wouldn’t be an LLM.

    In fact, let’s go see…

    • Stockfish: Open-source and regularly ranks at the top of computer chess tournaments. It uses advanced alpha-beta search and a neural network evaluation (NNUE).

    • Leela Chess Zero (Lc0): Inspired by DeepMind’s AlphaZero, it uses deep reinforcement learning and plays via a neural network with Monte Carlo tree search.

    • AlphaZero: Developed by DeepMind, it reached superhuman levels using reinforcement learning and defeated Stockfish in high-profile matches (though not under perfectly fair conditions).

    Hmm. Neural networks and reinforcement learning. So: non-LLM AI.
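    For contrast, driving one of these conventional engines takes only a few lines and no LLM anywhere in the loop. A minimal sketch using the python-chess library and a local Stockfish binary (the path is an assumption; it’s where Ubuntu’s stockfish package puts it):

    ```python
    # Minimal sketch: querying a conventional chess engine (Stockfish) over UCI
    # with the python-chess library. The engine searches positions and evaluates
    # them with its NNUE network -- no LLM involved.
    import chess
    import chess.engine

    # Path is an assumption; on Ubuntu, `apt install stockfish` puts it here.
    engine = chess.engine.SimpleEngine.popen_uci("/usr/games/stockfish")

    board = chess.Board()
    while not board.is_game_over():
        result = engine.play(board, chess.engine.Limit(time=0.1))  # 100 ms per move
        board.push(result.move)

    print(board.result())  # e.g. "1-0", "0-1", or "1/2-1/2"
    engine.quit()
    ```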

    you can play chess against something based on chatgpt, and if you’re any good at chess you can win

    You don’t even have to be good. You can just flat out lie to ChatGPT because fiction and fact are intertwined in language.

    “You can’t put me in check because your queen can only move 1d6 squares in a single turn.”



  • Most tedious part I’ve seen so far is there is so fucking much to upgrade. At least 20 buildings and probably a fair bit more with construction, and they all go C, B, A, S and every upgrade takes time and can only be done one at a time, so getting your town built up is pretty tedious. Time will tell if the rewards are worth it. At least you can get your rewards from all settlements (max 4 I think) just stopping at one. So once you’re built up maybe it’s fine?


  • I signed up with Matrix and it was not seamless, but maybe a private server would be great and they could go from there (though that feels like a long-term commitment to supporting those users). I haven’t really played much with it. Tried getting the folks in my Discord server to give it a try, but they haven’t, and they are tech folks. I would say it’s not ready for normies, but I really wish it was.

    I still have it installed on my phone, but I don’t really have anywhere interesting to go. Same with Signal TBH—it’s installed but no one I know uses it. Still waiting on my invite from the Secretary of Defense.




  • I think it’s fair to discuss the energy. I’m not sure where the math comes from that 100 words takes 0.14 kWh. My video card uses 120 W pegged and can generate 100 words in, let’s say, a nice round 2 minutes. That works out to 4 Wh, or 0.004 kWh. But of course they are running much more advanced and hungry models, and this is probably generating the text and then generating the voice, and I don’t know what that adds. I do know that an AI tool I use added a voice feature at no extra cost, so it was small enough for them to eat, but also the voices are eh and there are much better voice models out there.
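    Spelling that out (these are my own rough numbers, not measurements):

    ```python
    # Back-of-envelope: energy to generate ~100 words locally on my card.
    # Assumptions (mine): 120 W draw while pegged, ~2 minutes of generation.
    gpu_watts = 120
    minutes = 2

    kwh = gpu_watts * (minutes / 60) / 1000   # W * hours / 1000 = kWh
    print(f"{kwh:.3f} kWh per ~100 words")    # 0.004 kWh, vs the claimed 0.14 kWh (~35x more)
    ```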

    So that’s fine, I can pretty well define the lower bounds of what a line of text could cost, energy-wise. But this strategy doesn’t get us closer to an actual number. What might be helpful… is understanding it from EA’s perspective. They are doing this to increase their bottom line through driving customer engagement and excitement, because I haven’t heard anything about this costing the customer anything.

    So whatever they spend on all the AI they’re using has to be small enough to simply absorb in the name of increased player engagement leading to more purchases. The number I just found is $1.2 billion in profit annually. Fuck, that’s a lot of money. What do you think they might spend on this? Do you think it would be as high as 2%? I’ll be honest, I really don’t know. So let’s say they are going to spend $24 million on generative AI, and let’s just assume for a second that all of it goes to power.

    I just checked and the national average is $0.1644 per kWh, but let’s cut that in half assuming they cut some good deals? (I’m trying to be completely fair with these numbers, so disagree if you like. I’m writing this before doing all the math, so I don’t even know where this is going.) That looks like about 292 million kWh (or… that’s just 292 GWh, right?)

    I read global electricity usage is estimated at around 25,500 TWh, and (check my math) that works out to about 1/87,000th of the world’s annual electricity consumption. Kind of a lot for a single game, but it’s a pretty popular one.
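    Writing the chain out so the guesses are easy to poke at (the 2% of profit, the halved electricity rate, and “all spend goes to power” are all assumptions from above):

    ```python
    # Back-of-envelope: how much electricity $24M could buy, and how that compares
    # to global electricity consumption. Every input is a guess stated in the text.
    annual_profit = 1.2e9          # EA profit, $/yr
    ai_spend_fraction = 0.02       # guess: 2% of profit on generative AI
    price_per_kwh = 0.1644 / 2     # national average, halved assuming bulk deals

    ai_spend = annual_profit * ai_spend_fraction   # $24M
    ai_kwh = ai_spend / price_per_kwh              # ~292 million kWh
    ai_gwh = ai_kwh / 1e6                          # ~292 GWh

    global_twh = 25_500                            # rough global electricity use
    fraction = ai_gwh / (global_twh * 1e3)         # GWh / GWh
    print(f"{ai_gwh:.0f} GWh, about 1/{1/fraction:,.0f} of global electricity")
    ```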

    But the ask is how that compares to video cards, and… let’s be honest, this is going to be a very slippery, fudge-y number. I was quoted 1.5 million daily players (and I see other sources report up to 30 million, which is a really wide range, but let’s go with the lower number). So the question is, how long do they play on average, and how much power do their video cards use? I see estimates of 6-10 hours per week and 8-10 hours per week. Let’s make it really easy and assume 7 hours per week, or 1 hour per day.

    I have a pretty low-end video card, but it’s probably still comparable to or better than some of the devices connecting to Fortnite. I don’t have a better number to use, so I’m going to use 120 W. There should be a lot of players higher than that, but also probably a lot of Switches and whatnot that are lower power. Feel free to disagree.

    So 1.5M players × 120 W × 1 hour per day = 180 MWh per day, × 365 = 65.7 GWh per year.

    By these numbers the AI uses about 4.5x the power of the GPUs. So there is that. But also I think I have been extremely generous with these numbers everywhere except maybe the video card wattage, which I really don’t have any idea how to estimate. Would EA spend 2% expecting to recoup that in revenue? What if it’s 1%? What if it’s 0.5%? At 0.5% they are getting pretty close.

    Or if the number of daily players is 15 million instead of 1.5, that alone is enough to tip the scale the other way.

    And device power is honestly a wild-ass guess. You could tell me the average is 40W or 250W and I’d have no real basis to argue.
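    Here’s the player-side arithmetic with the two shakiest knobs (player count and device wattage) exposed, so it’s easy to see where the comparison flips:

    ```python
    # Back-of-envelope: annual energy of players' GPUs vs the ~292 GWh AI estimate.
    # Player count and device wattage are the shakiest inputs, so try a few.
    AI_GWH = 292

    def player_gwh(players, watts, hours_per_day=1):
        wh_per_day = players * watts * hours_per_day
        return wh_per_day * 365 / 1e9              # Wh -> GWh

    for players, watts in [(1.5e6, 120), (1.5e6, 250), (15e6, 120), (1.5e6, 40)]:
        gwh = player_gwh(players, watts)
        print(f"{players/1e6:>4.1f}M players @ {watts:>3.0f} W: {gwh:6.1f} GWh "
              f"(AI/GPU ratio {AI_GWH/gwh:.1f}x)")
    # 1.5M @ 120 W -> ~65.7 GWh, so the AI estimate is roughly 4.5x the GPUs.
    # 15M  @ 120 W -> ~657 GWh, and the GPUs win by a wide margin.
    ```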

    If you have any numbers or suggestions to make any of this more accurate, I’m all ears. The current range of numbers would lean toward me being wrong, but my confidence in any of this is low enough that I consider the matter unresolved. I also didn’t dive into how much of AI cost is power vs. infrastructure. If only half the cost of AI is power (and it’s probably lower than that) it changes things significantly.

    I’m going to stick with my assertion, but my confidence is lower than it was.



  • Okay. So, your position is that 6-year-olds are going to join Fortnite to spam the funny-man-speak button, and because of that AI energy usage will be higher? Okay. Maybe. I’d argue the novelty of AI wears thin really quickly once you interact with it a lot, but I’ll grant you some folks might remain excited by AI beyond reason.

    So now they are logging into Fortnite and, rather than playing the actual game, they are just going to talk to characters? It doesn’t make a lot of sense to me. But once we throw out the other commenter’s numbers and suppose it’s not 7 generations to equal 30 minutes of play, maybe it’s 20. Maybe it’s 40. Maybe it’s 100. I honestly don’t know. But we’re definitely in the realm where I think betting that one player’s video card uses more energy than the AI responding to them (and that all players’ video cards combined use more than the AI serving all of them) is a perfectly reasonable position to take.

    I bet that is the case. I don’t know it. I can’t prove it right or wrong without actual numbers. But based on my ability to generate images and text locally on a shit video card, I am sticking with my bet.



  • What I said was I’ll bet one person uses more power running the game than the AI uses to respond to them. Just that.

    Then you started inventing scenarios and moving the goalposts to comparing one single video card to an entire data center. I guess because you didn’t want to let my statement go unchallenged, but you had nothing solid to back you up. You’re the one who posted 6500 joules, which you did support, and I appreciate that, but after that it’s all just supposition and guesses.

    You’re right that it’s almost certainly higher than that. But I can generate text and images on my home PC. Not at the quality and speed of OpenAI or whatever they have on the back-end, but it can be done on my 1660. So my suggestion that running a 3D game consumes more power than generating a few lines seems pretty reasonable.

    But I know someone who works for a company that has an A100 used for serving AI. I’ll ask and see if he has more information or even a better-educated guess than I do, and if I find out I’m wrong, I won’t suggest otherwise in the future.


  • We know that most of the closed source models are way more complicated, so let’s say they take 3 times the cost to generate a response.

    This is completely arbitrary supposition. Is it 3x a “regular” response? I have no idea. How do you even arrive at that guess? Is a more complex prompt exponentially more expensive? Linearly? Logarithmically? And how complex are we talking, when system prompts themselves can be 10k tokens?

    Generating an AI voice to speak the lines increases that energy cost exponentially. MIT found that generating a grainy, five-second video at 8 frames per second on an open source model took about 109,000 joules

    Why did you go from voice gen to video gen? I mean I don’t know whether video gen takes more joules or not but there’s no actual connection here. You just decided that a line of audio gen is equivalent to 40 frames of video. What if they generate the text and then use conventional voice synthesizers? And what does that have to do with video gen?

    If these estimates are close

    Who even knows, mate? You’ve been completely fucking arbitrary and, shocker, your analysis supports your supposition, kinda. How many Vader lines are you going to get in 30 minutes? When it’s brand new probably a lot, but after the luster wears off?

    I’m not even telling you you’re wrong, just that your methodology here is complete fucking bullshit.

    It could be as low as 6500 joules (based on your link), which changes the calculus to 60 lines per half hour. Is it that low? Probably not, but that is every bit as valid as your math, and I’m even using your numbers without double-checking.
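    For what it’s worth, the lines-per-half-hour figure swings entirely on the device wattage (and joules-per-line) you assume; a quick sketch with a few guesses:

    ```python
    # How many generated lines "cost" the same as 30 minutes of play?
    # Entirely dependent on the assumed device wattage and joules per line.
    joules_per_line = 6_500                    # the figure from the linked source
    for gpu_watts in (120, 200, 300):          # my card, a mid-range guess, a beefy one
        play_joules = gpu_watts * 30 * 60      # W * seconds = joules for 30 minutes
        print(f"{gpu_watts} W: ~{play_joules / joules_per_line:.0f} lines per half hour")
    # 120 W -> ~33 lines, 200 W -> ~55, 300 W -> ~83
    ```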

    At the end of the day, maybe I lose the bet. Fair. I’ve been wondering for a bit how they actually stack up, and I’m willing to be shown. I suspect, though, that using it for piddly shit day to day is a drop in the bucket compared to all the mass corporate spam. I’m aware that’s nothing but a hypothesis and I’m willing to be proven wrong, just not based on this.



  • I worked with a guy who ran PopOS and loved it. He said the UI was really good. I’ve seen it get some love in social places. Figured I’d give it a shot some time.

    I’m pretty happy with Mint. It’s comfortable and the conventions feel more familiar than even my work MacBook—like I don’t even know what the desktop is for except my screenshots show up there for some reason. I don’t think corporate would let me run Linux, but if they would I’d be happy with Mint or Ubuntu. They probably don’t want to support a million flavors of Linux desktop.