
Memory oddness

So we have these things called artificial neural networks, which learn in a supposedly neurologically-inspired manner. But AIUI, they typically take many repetitions to really learn something. Many many. And sometimes human learning seems like that, like rote memorization.

But other times, we learn with single-instance burn in. And not always because of some great emotional association, or repeated reflection. I had two instances of that yesterday.

First, I was up in Lowell for a job interview, and took the train. Now, I did that a few years ago, when a bunch of us went together to museums there. So as I emerged from the station this time, and looked across the street, suddenly I remembered exploring the park last time, before I'd gone to meet the others (who biked up). It's not like it's a particularly exciting or distinctive park, and I doubt I've thought of it since... but the impressions were there to be recalled.

As were the memories of being daunted and confused by crossing the nigh-freeway streets to get downtown, but that actually was mildly traumatic.

Second, I've been reading someone's Where I Read thread of the Robotech novels, based on the Robotech animated series, much of which I saw as a child. Last night the reader described a late scene where Minmei is seated outside somewhere, and her douchebag cousin-lover-manager Kyle is chugging a liquor bottle, before he finishes it and smashes it in mid air with a dropkick.

And I remember all that! Not well enough to guess who was on which side of the screen, but all of that suddenly seemed vividly familiar, in a way that other described scenes recently haven't. And unlike other remembered scenes -- the firing of the main gun, Roy dying, Max and Miriya fighting/courting, the SDF-1 punching a Zentraedi ship in Operation Daedalus -- I don't think I've thought about or reflected on this scene in the intervening nigh thirty years; it doesn't seem that iconic (though true, raging alcoholics are rare on Saturday morning cartoons, along with many other things distinctive about Robotech.) It's just some scene... that suddenly feels very fresh, after all this time.

I can't prove it's not some false memory constructed in response to the text. But I see no reason it has to be.

See the DW comments at http://mindstalk.dreamwidth.org/460395.html#comments


( 2 comments — Leave a comment )
Dec. 1st, 2016 01:02 am (UTC)
good luck on your job search... hope you find a good place that appreciates you..
Dec. 3rd, 2016 04:40 am (UTC)
long time no type, friend!

Remember that ANNs are an outgrowth of perceptrons and were originally intended to serve as recognizers and classifiers. The gradient-descent learning method used with backpropagation does indeed take arbitrarily many repetitions to train to a given permissible level of error on a given learning set.
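To make the "arbitrarily many repetitions" point concrete, here is a minimal sketch (pure Python, hypothetical toy data): a single sigmoid neuron trained by gradient descent to learn logical AND, counting how many full passes over the same four examples it takes to drive the squared error below a chosen threshold.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# toy learning set: logical AND over two inputs
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w0, w1, b = 0.0, 0.0, 0.0   # weights and bias
lr = 0.5                     # learning rate
epochs = 0
while True:
    epochs += 1
    total_err = 0.0
    for (x0, x1), t in data:
        y = sigmoid(w0 * x0 + w1 * x1 + b)
        err = y - t
        total_err += err * err
        # gradient of squared error back through the sigmoid
        g = 2 * err * y * (1 - y)
        w0 -= lr * g * x0
        w1 -= lr * g * x1
        b  -= lr * g
    if total_err < 0.01 or epochs >= 100000:
        break

print(epochs)  # many repetitions of just four examples, not one-shot
```

Even on this trivially small, linearly separable problem, the loop runs the same four examples through the network over and over before the error threshold is met, which is the contrast with single-instance "burn in" the post is about.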


The neocortex's massive parallelism allows more diverse exploration of the solution space, and neural columns may follow their peers who have found useful shortcuts.


Gradient descent is already kind of passé; my UG advisor has published on other kinds of learning algorithms with lighter, more efficient math, by treating the layers of input weights as matrices under linear algebra.
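The "layers of weights as matrices" view can be sketched in a few lines; this is only the shared forward-pass formulation (each layer is f(Wx + b) for a weight matrix W), not any of the specific alternative training algorithms the commenter alludes to, and the network shape and weights below are made up for illustration.

```python
import math

def matvec(W, x):
    """Multiply matrix W (a list of rows) by vector x."""
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]

def forward(x, layers):
    """Run x through a stack of (W, b) layers with sigmoid activations."""
    for W, b in layers:
        z = [s + bi for s, bi in zip(matvec(W, x), b)]
        x = [1.0 / (1.0 + math.exp(-zi)) for zi in z]
    return x

# a hypothetical 2-3-1 network with hand-picked weights
layers = [
    ([[1.0, -1.0], [0.5, 0.5], [-1.0, 1.0]], [0.0, 0.0, 0.0]),  # 2 -> 3
    ([[1.0, 1.0, 1.0]], [0.0]),                                  # 3 -> 1
]
print(forward([1.0, 0.0], layers))
```

Once a layer is a matrix, the whole toolkit of linear algebra (factorizations, pseudo-inverses, and so on) applies to its weights, which is what makes lighter-math training schemes possible to formulate.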


Deep learning suggests that subnets pre-trained on relevant primitives can be rapidly connected, and the assembly overall rapidly trained to recognize and select for patterns of the primitives associated with the overall goal. If you are an NLP or computer vision person, this is an adaptation of the Bag of Words method.
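A toy sketch of that idea (hypothetical features and data): treat a frozen, "pre-trained" subnet as a fixed feature extractor and train only a thin perceptron layer on top. Because the primitives already make the task linearly separable in feature space, the top layer converges in a handful of passes rather than the many repetitions needed from scratch.

```python
def features(x):
    # stand-in for a frozen, pre-trained subnet: fixed primitive detectors
    return [float(x[0] > 0.5), float(x[1] > 0.5), float(x[0] + x[1] > 1.0)]

# same toy task as before: logical AND
data = [((0.0, 0.0), 0), ((0.0, 1.0), 0), ((1.0, 0.0), 0), ((1.0, 1.0), 1)]

w = [0.0, 0.0, 0.0]
b = 0.0
epochs = 0
while True:
    epochs += 1
    mistakes = 0
    for x, t in data:
        f = features(x)
        y = 1 if sum(wi * fi for wi, fi in zip(w, f)) + b > 0 else 0
        if y != t:  # classic perceptron rule: update only on errors
            mistakes += 1
            sign = t - y  # +1 or -1
            w = [wi + sign * fi for wi, fi in zip(w, f)]
            b += sign
    if mistakes == 0 or epochs > 100:
        break

print(epochs)  # only a few passes, versus many for from-scratch training
```

This is the cheap-assembly intuition in miniature: the expensive part (learning the primitives) is assumed done, so the remaining training problem is small and fast.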


It may be that simple recording of perceptions, without need of classification, can be done in some entirely different way.


Damien Sullivan
