2023-07-05 18:29:11 +00:00
# ML Management
Keep a log of all the experiment runs you'll want to make as you begin the EDA (exploratory data analysis) process.
**Sacred**
- [Sacred](https://github.com/IDSIA/sacred) is a fantastic open-source tool for pipelining the experiment process. As explained [here](https://towardsdatascience.com/managing-machine-learning-projects-226a37fc4bfa), it helps log every run you make with your model.
- Use [CatalyzeX](https://chrome.google.com/webstore/detail/aiml-papers-with-code-eve/aikkeehnlfpamidigaffhfmgbkdeheil?hl=en) to find code implementations for ML papers.
- A Python [wrapper](https://github.com/nottheswimmer/dalle) for the DALL-E API
- A PyTorch package to train and audit ML models for [Individual Fairness](https://github.com/IBM/inFairness)
- [Truss](https://www.baseten.co/) serves any model without boilerplate code
- [WEKA](obsidian://open?vault=Coding%20Tips&file=Computers%2FPython%2FProjects%2FMachine%20Learning%2FWEKA) is a good resource for data mining processes and machine learning testing
- A collection of [Wolfram](https://resources.wolframcloud.com/NeuralNetRepository/?source=nav) neural nets
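The run-logging idea behind Sacred can be sketched in plain Python. This is a stdlib-only illustration of the bookkeeping Sacred automates (config capture, timestamps, metrics per run); the `log_run` helper and the `runs.jsonl` file name are hypothetical, not part of Sacred's API:

```python
import json
import time

def log_run(config: dict, metrics: dict, log_file: str = "runs.jsonl") -> dict:
    """Append one experiment run (config + results + timestamp) to a JSONL log.

    Sacred automates this kind of record-keeping and also captures
    source code, dependencies, and random seeds for reproducibility.
    """
    record = {
        "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
        "config": config,
        "metrics": metrics,
    }
    with open(log_file, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

# Example: log one run with its hyperparameters and resulting score
run = log_run({"learning_rate": 0.01, "epochs": 10}, {"accuracy": 0.92})
```

One JSON object per line means the log stays append-only and easy to load back into a DataFrame when comparing runs during EDA.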
---
**Deep Note & more ML repos**
- [Deep Note](https://deepnote.com/workspace/windtelligent-e87f4ef4-a5f5-4f9b-8def-624a9e35da51/project/Welcome-2ef6e214-0da3-4ac5-9287-5e0d8ca5839f/%2Fnotebook.ipynb) is being used along with [Hugging Face](https://huggingface.co/) to document an in-depth analysis of ML Python tools
- [BLOOM](https://huggingface.co/bigscience/bloom) is a 176-billion-parameter open LLM created by researchers and the FOSS community
- Here are some [examples](https://github.com/Sentdex/BLOOM_Examples)
---
#### Further reading and tutorials:
- [Animated](https://nnfs.io/neural_network_animations) tutorials of neural networks
- Using [fast.ai](https://www.fast.ai/posts/2020-02-13-fastai-A-Layered-API-for-Deep-Learning.html), a layered API for deep learning
- As a historical reference, automated text generation has been around since at least 2005 with [SCIGen](https://pdos.csail.mit.edu/archive/scigen/#talks), for instance.
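SCIGen works by recursively expanding a hand-written context-free grammar into paper-like text. A toy sketch of that expansion idea (the grammar rules below are invented for illustration, not SCIGen's actual rules):

```python
import random

# Toy context-free grammar in the spirit of SCIGen's template expansion.
# Keys are non-terminals; each value is a list of possible productions.
GRAMMAR = {
    "S": [["we present", "NP", ", a", "ADJ", "method for", "NP", "."]],
    "NP": [["unification"], ["neural networks"], ["write-back caches"]],
    "ADJ": [["robust"], ["scalable"], ["pseudorandom"]],
}

def expand(symbol: str, rng: random.Random) -> str:
    """Recursively expand a non-terminal; terminals are returned as-is."""
    if symbol not in GRAMMAR:
        return symbol
    tokens = rng.choice(GRAMMAR[symbol])
    return " ".join(expand(t, rng) for t in tokens)

# Seeded for reproducibility; different seeds yield different "sentences"
sentence = expand("S", random.Random(42))
```

The contrast with BLOOM above is the point: SCIGen-era generation recombines fixed templates, while modern LLMs learn the distribution of text itself.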