• 0 Posts
  • 53 Comments
Joined 1 year ago
Cake day: July 15th, 2023

  • I’m surprised at the amount of disagreement your comment is getting.

    I don’t want to downplay Jackson’s displacement of American Indians, but there was one real FUBAR thing Jackson did that no one remembers:

    He paid off the national debt.

    Completely and entirely. The federal government existed debt free for some months (I forget exactly how long; I want to say it was a year or so before borrowing exceeded income again).

    On the face of it, this probably sounds like a good thing, but it crashed the economy hard. Obviously I wasn’t alive at the time, but it’s my understanding that it was the worst economic disaster until the Great Depression (and the Great Depression was only worse because the country and world were far more connected than the world of Jackson’s day).

    That said, I hear the inauguration party was a real rager.


  • Liquor Bottle by Herbal T. It has a nice faux-upbeat rhythm with jazzy kinda beats, but the lyrics are dark. Definitely helps me keep a sane face on the dark days:

    And that’s why / I keep a

    A liquor bottle in the freezer ♪

    In case I gotta take it out ♫

    Mix me a drink

    To help me

    Forget all the things

    In my life that I worry about ♪ ♫


  • Right.

    I don’t mean to say that the mechanism by which human brains learn and the mechanism by which AI is trained are 1:1 directly comparable.

    I do mean to say that the process looks pretty similar.

    My knee-jerk reaction is to analogize it as comparing a fish swimming to a bird flying. Sure, there are some important distinctions (e.g. birds need to generate lift while fish can rely on buoyancy) but in general, the two do look pretty similar (i.e. they both take a fluid medium and push it to generate thrust).

    And so with that, it feels fair to say that learning — that the storage and retrieval of memories/experiences, and the way that stored information shapes our subconscious (and probably conscious too) reactions to the world around us — seems largely comparable to the processes that underlie the training of “AI” and LLMs.


  • That’s not what I intended to communicate.

    I feel the Turing machine portion is not particularly relevant to the larger point. Not to belabor it, but to be as clear as I can be: I don’t think, nor intend to communicate, that humans operate in the same way as a computer. I don’t mean to say that we have a CPU that handles instructions in a (more or less) one-at-a-time fashion with specific arguments that determine the flow of data, as a computer would do with assembly instructions. I agree that anyone arguing human brains work like that is missing a lot in both neuroscience and computer science.

    The part I mean to focus on is the models of how AIs learn, specifically in neural networks. There might be some merit in likening a cell to a transistor/switch/logic gate for some analogies, but for the purposes of talking about AI, I think comparing a brain cell to a node in a neural network is most useful.

    The individual nodes in a neural network each have minimal impact on converting input to output, yet each one influences the processing from one to the other. And with the way we train AI, how each node tweaks the result depends solely on the past input that has been given to it.

    In the same way, when met with a situation, our brains will process information in a comparable way: that is, any given input will be processed by a practically uncountable number of neurons, each influencing our reactions (emotional, physical, chemical, etc.) in minuscule ways based on how our past experiences have “treated” those individual neurons.

    In that way, I would argue that the processes by which AI are trained and operated are comparable to that of the human mind, though they do seem to lack complexity.
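    To make the node analogy above concrete, here’s a minimal sketch (my own illustration, not from any particular library) of a single node whose weight and bias are nudged by each training example — so its future behavior depends solely on the inputs it has seen:

    ```python
    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    class Node:
        """One node: a weighted sum squashed through an activation."""
        def __init__(self):
            self.weight = 0.0
            self.bias = 0.0

        def forward(self, x):
            return sigmoid(self.weight * x + self.bias)

        def train(self, x, target, lr=0.5):
            # Nudge weight/bias slightly toward the target output;
            # each individual update is a minuscule tweak.
            out = self.forward(x)
            grad = (out - target) * out * (1.0 - out)
            self.weight -= lr * grad * x
            self.bias -= lr * grad

    node = Node()
    for _ in range(1000):
        node.train(1.0, 1.0)  # repeatedly show it input 1.0 -> target 1.0
    print(node.forward(1.0))  # output drifts toward 1.0 as past inputs accumulate
    ```

    A real network chains thousands of these nodes together, which is where the “practically uncountable” influence on the final output comes from.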

    Ninjaedit: I should proofread my post before submitting it.