- [Sacred](https://github.com/IDSIA/sacred) is a fantastic open-source tool for pipelining the experiment process. As explained [here](https://towardsdatascience.com/managing-machine-learning-projects-226a37fc4bfa), it can help log every run you do with your model.
- Use [CatalyzeX](https://chrome.google.com/webstore/detail/aiml-papers-with-code-eve/aikkeehnlfpamidigaffhfmgbkdeheil?hl=en) to find code for ML papers.
- Python [wrapper](https://github.com/nottheswimmer/dalle) for the DALL-E API
- PyTorch package to train and audit ML models for [Individual Fairness](https://github.com/IBM/inFairness)
- [Truss](https://www.baseten.co/) serves any model without boilerplate code
- [WEKA](obsidian://open?vault=Coding%20Tips&file=Computers%2FPython%2FProjects%2FMachine%20Learning%2FWEKA) is a good resource for data mining processes and machine learning testing
- Collection of [wolfram](https://resources.wolframcloud.com/NeuralNetRepository/?source=nav) neural nets
- For a list of a bunch of projects go to [ProjectPro](https://www.projectpro.io/project/project-demo?source=start&uri=www.projectpro.io/project-use-case/forecast-customer-churn-by-building-a-neural-network-in-r)
- [Deepnote](https://deepnote.com/workspace/windtelligent-e87f4ef4-a5f5-4f9b-8def-624a9e35da51/project/Welcome-2ef6e214-0da3-4ac5-9287-5e0d8ca5839f/%2Fnotebook.ipynb) is being used along with [Hugging Face](https://huggingface.co/) to document an in-depth analysis of ML Python tools
- [BLOOM](https://huggingface.co/bigscience/bloom) is an open-source 176-billion-parameter LLM created by the BigScience research collaboration
	- Here are some [examples](https://github.com/Sentdex/BLOOM_Examples) of using it
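
Since BLOOM is far too large to load locally, one common route is to query the hosted model over Hugging Face's Inference API. As a minimal sketch (the endpoint URL and parameter names are assumptions based on that API's conventions, not taken from the links above — check the current Hugging Face docs before relying on them), the request could be assembled like this:

```python
import json

# Assumed endpoint for the hosted BLOOM model on the Hugging Face
# Inference API (an assumption; verify against the current docs).
API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom"

def build_request(prompt, max_new_tokens=50):
    """Assemble the URL and JSON body for a text-generation request.

    Only builds the payload; actually sending it requires an HTTP
    client and an Authorization: Bearer <token> header.
    """
    payload = {
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }
    return API_URL, json.dumps(payload)

url, body = build_request("The quick brown fox")
# The prompt round-trips under the "inputs" key of the JSON body.
assert json.loads(body)["inputs"] == "The quick brown fox"
```

To send it for real, POST `body` to `url` (e.g. with `requests`) along with a Hugging Face API token in the `Authorization` header; the Sentdex examples linked above walk through full end-to-end usage.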
---
#### Further reading and tutorials:
[Animated](https://nnfs.io/neural_network_animations) tutorials on Neural Networks
Using [fast.ai](https://www.fast.ai/posts/2020-02-13-fastai-A-Layered-API-for-Deep-Learning.html)