BrainChip: Key differences between transfer learning and incremental learning

via telecomkh

BrainChip offers insight into two widely accepted forms of deep learning

The massive computing resources required to train neural networks for AI/ML tasks have driven interest in two forms of learning presumed to be more efficient: transfer learning and incremental learning. Experts at BrainChip Holdings Ltd., a leading provider of ultra-low-power, high-performance artificial intelligence technology, offered the following insight and considerations for their use in edge AI/IoT environments.

In transfer learning, applicable knowledge established in a previously trained AI model is “imported” and used as the basis of a new model. After taking this shortcut of starting from a pretrained model, such as one trained on an open-source image or NLP dataset, new objects can be added to customize the result for the particular scenario…
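As a rough illustration of the pattern described above, the sketch below reuses a model pretrained on an open-source image dataset, freezes its backbone, and trains only a new classification head for the target classes. This is a minimal, generic example assuming PyTorch/torchvision; it is not BrainChip's Akida/MetaTF workflow, and the class count and dummy batch are placeholders.

```python
# Minimal transfer-learning sketch (PyTorch/torchvision assumed; generic
# illustration only, not BrainChip's tooling).
import torch
import torch.nn as nn
from torchvision import models

# "Import" knowledge from a previously trained model (ImageNet weights).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone so its learned features are reused as-is.
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer to add new objects/classes for the target scenario
# (num_classes = 5 is a placeholder for the edge application's label set).
num_classes = 5
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Only the new head's parameters are updated during fine-tuning.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch (stand-in for real data).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Because only the small new head is trained, this approach needs far less compute and data than training the full network from scratch, which is the efficiency argument the article makes for edge AI/IoT deployments.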