Association method: 5 secrets of memory for visual images




Human memory is associative: one memory can evoke a whole region of related memories. One object reminds us of another, and that other of a third. If we let our thoughts wander, they move from object to object along a chain of mental associations. A few bars of music, for example, can evoke a whole range of sensory memories, including sights, sounds, and smells. Ordinary computer memory, by contrast, is location-addressable: you present an address and retrieve the information stored at that address.

Associative Memory and Artificial Intelligence

An artificial neural network with feedback forms an associative memory. As in human memory, presenting a fragment of the desired information retrieves the complete information from "memory."

Autoassociative memory

is memory that can complete or correct an image but cannot associate the resulting image with another image. This is a consequence of the single-level structure of associative memory, in which the output vector appears on the same neurons that receive the input vector. Such networks can be unstable. In a stable network, successive iterations produce smaller and smaller changes in the output until the output eventually becomes constant; for many networks the process never ends. Unstable networks have interesting properties and have been studied as examples of chaotic systems. In a certain sense, the same effect can be achieved without feedback, for example with a perceptron, in cases where stability matters more than the study of chaotic systems.

Heteroassociative memory

is memory in which a stimulus arriving at one set of neurons produces a response on a different set of neurons.

The first model of autoassociative memory, the Hopfield neural network, was developed by Hopfield. To achieve stability, the weight coefficients had to be chosen so as to form energy minima at the required vertices of the unit hypercube.
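The idea can be sketched in a few lines (a minimal illustrative model, not Hopfield's paper in every detail): bipolar patterns, i.e. hypercube vertices, are stored with a Hebbian outer-product rule, and recall iterates updates until the state settles into an energy minimum near a stored pattern.

```python
import numpy as np

def train(patterns):
    """Hebbian outer-product rule with zero diagonal."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)
    return w / n

def recall(w, x, steps=10):
    """Iterate synchronous updates until the state stops changing."""
    for _ in range(steps):
        nxt = np.where(w @ x >= 0, 1, -1)
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
w = train(patterns)
noisy = np.array([1, -1, 1, -1, 1, 1])   # first stored pattern with one bit flipped
print(recall(w, noisy))                  # converges back to the first stored pattern
```

The network completes a corrupted input to the nearest stored pattern, which is exactly the autoassociative behavior described above.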

Kosko subsequently extended Hopfield's ideas into a model of heteroassociative memory: bidirectional associative memory (BAM).
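A minimal sketch of the BAM idea, with invented toy patterns: pairs (a, b) are stored in a single weight matrix, and recall bounces between the two layers until the pair stabilizes.

```python
import numpy as np

def bam_train(pairs):
    """Store each (a, b) pair as an outer product summed into one matrix."""
    return sum(np.outer(a, b) for a, b in pairs)

def bam_recall(w, a, steps=10):
    """Alternate between the two layers until the pair stops changing."""
    for _ in range(steps):
        b = np.where(a @ w >= 0, 1, -1)       # forward pass: a -> b
        a2 = np.where(w @ b >= 0, 1, -1)      # backward pass: b -> a
        if np.array_equal(a2, a):
            break
        a = a2
    return a, b

pairs = [(np.array([1, -1, 1, -1]), np.array([1, 1, -1])),
         (np.array([-1, -1, 1, 1]), np.array([-1, 1, 1]))]
w = bam_train(pairs)
a, b = bam_recall(w, np.array([1, -1, 1, -1]))
print(b)   # the pattern associated with the first input
```

Unlike the autoassociative case, the stimulus on one set of units produces the associated response on a different set, which is the defining property of heteroassociative memory.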

But the same result can be achieved with a wide class of recurrent neural networks, a classic example being the Elman network. The stability problem then disappears, and no such strict conditions are imposed on the weight coefficients, which gives the network greater capacity. In addition, recurrent neural networks can describe a finite state machine without losing any of the advantages of artificial neural networks.
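The distinguishing feature of the Elman architecture is its context units, which copy the previous hidden state back into the next step. A toy forward pass (with random placeholder weights, not a trained model) shows how the response comes to depend on the whole sequence seen so far, not just the current symbol:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
w_in = rng.normal(size=(n_hidden, n_in))     # input -> hidden weights
w_ctx = rng.normal(size=(n_hidden, n_hidden))  # context -> hidden weights

def step(x, h):
    """One Elman step: new hidden state from current input plus context."""
    return np.tanh(w_in @ x + w_ctx @ h)

h = np.zeros(n_hidden)                       # context starts empty
for x in np.eye(3)[[0, 1, 2, 1]]:            # a short one-hot symbol sequence
    h = step(x, h)
print(h.shape)                               # final state encodes the sequence
```

Because the hidden state threads through every step, such a network can in principle track the state of a finite automaton, as noted above.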

Industry Standards for Content-Addressable Memory

The primary interface to content-addressable memories and other network search elements (NSEs) was specified in an interoperability agreement called the Look-Aside Interface (LA-1 and LA-1B), developed by the Network Processing Forum, which later merged with the Optical Internetworking Forum (OIF). Numerous devices have been produced by Integrated Device Technology, Cypress Semiconductor, IBM, NetLogic Microsystems, and others under these LA agreements. On December 11, 2007, the OIF published the Serial Lookaside (SLA) interface agreement.

Associative Memory and Programming

A number of works have examined the concept of associative memory as applied to programming languages and to the hardware implementation of processors. The following working definition was used:

Associative memory is usually understood as a set or collection of elements that can store information. These elements are accessed simultaneously and in parallel according to the content of the data stored in them, rather than by specifying an element's address or location.

But this understanding of associative memory reflects only the existence of relationships between data and says nothing about the mechanism of information storage itself. The term "content-addressable memory" (CAM) is therefore used for that storage mechanism.
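The contrast between the two addressing modes can be made concrete (an illustrative sketch only; real CAM hardware compares all cells in parallel, which the search below merely imitates sequentially):

```python
memory = ["cat", "dog", "fox", "owl"]

# Location-addressed: give an address, get the contents.
assert memory[2] == "fox"

# Content-addressed: give the contents, get back the matching address(es).
def cam_search(mem, key):
    return [addr for addr, word in enumerate(mem) if word == key]

print(cam_search(memory, "fox"))   # [2]
```

In a CAM the query is the data itself, and the "answer" is where (or whether) that data is stored.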

As soon as the emphasis shifted to the design of "content-addressable memory," it became possible to relax the requirements on the very notion of associativity and to build devices that are associative only in a certain sense.

The first simplification

is the assumption that parallelism in search operations is not a fundamental functional characteristic.

Second simplification

is the rejection of the need for distributed memory: associativity in the sense of content addressing can formally be achieved without distributing information among memory elements. Instead, a unit of information can be stored whole in a specific cell that holds only information about that cell's direct connections to others; this leads to the notion of semantic networks. The same principles are used for indexing and search in modern databases. In this sense, of course, the simplification contradicts the ideas of connectionism (which rest on artificial neural networks) and shades into the ideas of symbolism.

The main thing lost with this simplification is one of the remarkable properties of biological memory. Various kinds of damage to brain tissue are known to impair the functional characteristics of memory; nevertheless, it has proved extremely difficult to tie memory functions to the operation of particular neural structures. The accepted explanation is that memory traces in the brain are represented as spatially distributed structures formed by some transformation of primary perceptions.

Although this simplification sacrifices a number of biologically plausible properties, which matters when modeling the brain, in a technical sense it made clear how to implement content-addressable memory. This gave rise to the idea of hashing, which was then implemented both in programming languages and in the hardware of some processors.
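Hashing realizes content addressing practically: the key's content is mapped to a storage location, so lookup by content examines one bucket rather than all cells. A sketch with a deliberately simple, invented hash function (Python's built-in dict is a production-quality version of the same idea):

```python
def toy_hash(key, n_buckets=8):
    """Map a string's content to a bucket index (illustrative only)."""
    return sum(key.encode()) % n_buckets

table = [[] for _ in range(8)]
for word in ["cat", "dog", "fox"]:
    table[toy_hash(word)].append(word)   # content determines the location

def lookup(word):
    return word in table[toy_hash(word)]  # inspect one bucket, not all cells

print(lookup("fox"), lookup("owl"))
```

The location is computed from the content itself, which is how "addressing by content" is obtained without distributed storage or parallel comparison.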

Third simplification

is associated with the exactness of the match with the required information. Retrieving data by content always involves some form of comparison between an externally supplied search key and some or all of the information stored in the memory cells. The goal of the comparison need not be an exact match with the key: for example, one may search for values that lie within a given interval, which is the classic use of SQL when selecting from a database. Another kind of search is also possible, in which one must find, among all the stored data, the items that best correspond (in the sense of some given measure) to the key information.

In this formulation, the associative-retrieval problem is very close to the pattern recognition problem. What is decisive is the methods used: if the meaning of associativity is not subjected to the simplifications described here, we are dealing with pattern recognition by artificial neural networks; otherwise we are dealing with database optimization (and hardware processor caches) or with associative data representations such as semantic networks. It should be clear from this that an associative representation of data, together with some techniques for working with content-addressed memory, is not sufficient to qualify as associative memory in the full sense of the word.
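Best-match retrieval can be sketched directly: instead of demanding an exact key, return the stored item that minimizes some distance measure, here the Hamming distance between bit strings (the data and measure are invented for illustration):

```python
def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

store = ["10110", "00111", "11000"]
key = "10111"   # matches no stored item exactly

best = min(store, key=lambda item: hamming(item, key))
print(best)   # "10110" (differs from the key in one bit; ties go to the first item)
```

Swap the distance measure and this is precisely nearest-neighbor pattern recognition, which is why the two problems are so close.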

Fourth simplification

may be related to so-called temporal association, which from a programming point of view belongs to automata theory. The problems here concern methods for storing time-ordered sequences in memory and retrieving them. Such sequences can branch into secondary, alternative sequences, with the transition to one of them determined by the content of some background or contextual information; the sequences may also contain closed loops.
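Viewed through automata theory, a set of stored sequences with branch points is just a transition table: the next element depends on the current state and an incoming (contextual) symbol. A sketch with invented states and symbols:

```python
# Transition table for two stored sequences sharing a prefix:
# main sequence A, B, C and an alternative branch A, X, Y chosen by context.
transitions = {
    ("start", "A"): "s1",
    ("s1", "B"): "s2",
    ("s2", "C"): "end",
    ("s1", "X"): "alt",   # contextual symbol X diverts to the alternative branch
    ("alt", "Y"): "end",
}

def run(symbols):
    """Replay a stored sequence by following the transition table."""
    state = "start"
    for s in symbols:
        state = transitions[(state, s)]
    return state

print(run(["A", "B", "C"]), run(["A", "X", "Y"]))
```

Closed loops are represented the same way, by a transition leading back to an earlier state.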

Thus, from the standpoint of programming, or symbolism, associative memory poses the same problems and tasks as artificial intelligence in general. The difference is that programming admits simplifications and constructs methods that satisfy the notion of associative memory only partially, whereas connectionism attempts to solve the associative memory problem with methods that involve none of the simplifications described here. Such methods carry some stochasticity and unpredictability, but they ultimately give meaningful results in pattern recognition and adaptive control.

Psychoanalytic theory

Representatives of the psychoanalytic theory of memory, founded by Sigmund Freud, pay special attention to the unconscious level of the psyche when considering how information is retained and memorized. Psychoanalytic theory highlights the significant role of early emotional experiences, which can influence the rest of a person's life.

Representatives of this theory pay special attention to the repression of negative information from consciousness and to its manifestation through humor, dreams, slips of the tongue, and other expressions of the unconscious.


Thanks to psychoanalysis, many interesting psychological mechanisms of subconscious forgetting related to the functioning of motivation have been discovered and described.

Memory training

Memory, like any other human skill, needs training. The brain is often said to hold far more information than a computer, but unlocking that potential takes persistent practice.

There are many ways to train your memory, but before you start, answer a few questions:


How much do you need to memorize?

Memorizing 1-3 people is different from memorizing a group of 30, and a short passage of text is different from an entire book.

For how long will you need this information?

If you need to hold a lot of information at once for a short period, fast short-term memory is at work, and it stores very little. If instead you need to remember information for a month, a year, or the rest of your life, long-term memory takes over; it can retain information for a long time without keeping it active in fast memory.

Diagnosis of memory disorders

Significant memory impairment should be diagnosed by a doctor so as not to miss a serious concomitant disease (tumors, dementia, diabetes). Standard diagnostics include a comprehensive examination:

  • blood tests (general, biochemistry, hormones);
  • magnetic resonance imaging (MRI);
  • computed tomography (CT);
  • positron emission tomography (PET).

Psychodiagnosis of memory disorders is based on the methods of A.R. Luria:

  1. Learning 10 words. A test of mechanical memory. The psychologist or psychiatrist slowly names 10 words and asks the patient to repeat them in any order. The procedure is repeated 5 times, and each time the doctor notes how many of the 10 words were named correctly. Normally all words are recalled by the 3rd repetition. After an hour, the patient is asked to repeat the 10 words (normally 8-10 words should be reproduced).
  2. Associative series "words + pictures". A test of logical memory. The therapist names words and asks the patient to choose a picture for each one, for example: cow - milk, tree - forest. An hour later the patient is shown the pictures and asked to name the corresponding words. The number of words recalled and the complexity or primitiveness of the associative series are assessed.


The essence

Memory is the basis of mental phenomena.
Without it, a person would have to perceive every repeated experience as new, making knowledge of the world impossible. Memory is an actively researched area, and there is still no single theory explaining the essence of memorization. Memorization is linked to the work of thinking and of the nervous system: the changes occurring there ensure that images are imprinted and reproduced. Each theory approaches memorization from the standpoint of its own discipline: biological, chemical, and physical processes, psychoanalysis, or semantic associations. Not all of these theories are accepted in scientific psychology, but some are considered fundamental.

Exercises for memory development

Many of the exercises presented on this site can be done not only online but also in everyday life. You can practice remembering routes and people while moving around the city during the day, or memorizing who sits where and does what as you enter or leave the office. You can memorize your to-do list for the day, week, or even month. It also helps to recall the events of the whole day, starting from the morning: how you woke up, what your mood was, what happened at home, on the street, at work, and so on. At first only a few events may come back, but over time there will be more and more.

Also note a very effective technique: remembering dreams. Not everyone remembers in the morning what they dreamed, or even 5 minutes after waking.

You can keep a dream diary. Over time your dreams will not only be remembered better; their quality will improve as well. In one night you may recall not just one but 3, 4, or more dreams, each in more detail than before the training began. Such attention to dreams often turns them from hazy memories into vivid, detailed worlds, and it can bring awareness: lucid dreaming may develop, especially if in the evening you reread, selectively or in full, the dreams you have written down.

Internet course “Super memory in 30 days”

As soon as you sign up for this course, you will begin a powerful 30-day program for developing super-memory and training the brain.

Within 30 days after subscribing, you will receive interesting exercises and educational games in your email that you can apply in your life.

We will learn to remember everything that may be required in work or personal life: learn to remember texts, sequences of words, numbers, images, events that happened during the day, week, month, and even road maps.

Implementation in semiconductors

Because a CAM is designed to search its entire memory in a single operation, it is much faster than RAM for virtually all search applications. The disadvantage is the higher cost of CAM. Unlike a RAM chip, which has simple storage cells, each individual memory bit in a fully parallel CAM must have its own comparison circuit to detect a match between the stored bit and the input bit. In addition, the comparison outputs of every cell in a data word must be combined to yield the match result for the whole word. The extra circuitry increases the physical size of the CAM chip, raising manufacturing cost. It also increases power dissipation, since every comparison circuit is active on every clock cycle. As a consequence, CAM is used only in specialized applications where the required search speed cannot be achieved by cheaper, slower methods.
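The per-bit comparator structure described above can be mirrored in software (a sketch only; in hardware all of these comparisons happen in the same clock cycle): each stored word gets one comparator per bit, and a word's match line asserts only if every bit comparator agrees.

```python
words = [0b1011, 0b0110, 0b1110]   # stored 4-bit words
key, width = 0b1110, 4             # search key presented to every word at once

match_lines = []
for w in words:
    # one comparator per bit; the word matches only if all bits agree
    bits_equal = [((w >> i) & 1) == ((key >> i) & 1) for i in range(width)]
    match_lines.append(all(bits_equal))

print(match_lines)   # [False, False, True]
```

Counting the comparisons (words x bits per search) makes the area and power cost of full parallelism evident.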

Patterns derived by H. Ebbinghaus

At the end of the 19th century, H. Ebbinghaus derived and systematized a number of principles of memory. He managed this thanks to the associative theory of memory in psychology: he worked on establishing the laws of memorization, studying them with nonsense syllables and other material poorly organized in meaning.


He found that a person immediately and permanently remembers even the simplest events in life if they made a particularly strong impression, while less interesting events may not be remembered even after dozens of repetitions. With sufficient concentration of attention, a person can easily reproduce from memory all the main points of an event experienced only once.

When memorizing a long series, remember that its beginning and end are the easiest to reproduce. When a series is too long (that is, when the number of elements exceeds the capacity of short-term memory), fewer of its elements are reproduced correctly than when the length of the series equals that capacity.

The story of how Mikhail developed his memory

Since my school days I have had a friend, Mikhail, whose memory has always amazed me. He could easily memorize a poem or a mathematical formula. Years later, after he entered university, his abilities were not lost but only grew.

I wondered how he was able to achieve such results. The secret turned out to be simple - he uses various exercises to develop his brain abilities. It turned out that daily training, even the shortest, can lead to excellent results.

Since then, I have also become interested in developing my own capabilities, which has resulted in some good success. This article will help everyone who wants to develop their associative memory.
