Intellegens White Paper
Traditional machine learning is well known for needing large amounts of training data. However, real-life data derived from experiments is expensive and time-consuming to collect, so there is rarely enough of it. How can we respond?
In this white paper we review four strategies for applying deep learning to maximise the value of ‘small data’:
- Property-property correlations
- Uncertainty estimates
- Design of experiments
- Data acquisition and validation
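Alchemite™ itself is proprietary, but the first two strategies can be illustrated in miniature. The hedged sketch below (all data and function names are hypothetical, not from Alchemite™) fits a simple linear correlation between two properties, then uses bootstrap resampling to attach an uncertainty estimate to a prediction for a sample whose second property has not yet been measured:

```python
import random
import statistics

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def predict_with_uncertainty(xs, ys, x_new, n_boot=500, seed=0):
    """Predict the missing property at x_new from its correlation with
    a measured property, with a bootstrap estimate of uncertainty."""
    rng = random.Random(seed)
    indices = list(range(len(xs)))
    preds = []
    for _ in range(n_boot):
        # Resample the training set with replacement and refit.
        sample = [rng.choice(indices) for _ in indices]
        bx = [xs[i] for i in sample]
        by = [ys[i] for i in sample]
        if len(set(bx)) < 2:
            continue  # degenerate resample: all x identical, cannot fit
        m, c = fit_line(bx, by)
        preds.append(m * x_new + c)
    # Spread of the bootstrap predictions serves as the uncertainty.
    return statistics.mean(preds), statistics.stdev(preds)

# Hypothetical example: five alloys with a measured property (x) and a
# strongly correlated second property (y); predict y for a sixth alloy.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]
mean, std = predict_with_uncertainty(xs, ys, x_new=6.0)
```

The same uncertainty estimate also hints at the remaining strategies: a design-of-experiments loop would propose the candidate whose prediction is least certain as the next measurement to make.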
We discuss how the Alchemite™ machine learning software enables these strategies and can be applied to systems with seemingly impossibly small amounts of data. We review real-world examples from materials and drug discovery and development that show how each approach has been tried and tested.
This topic was also the focus of an Intellegens webinar with Dr Gareth Conduit – a recording is available.