# ML Management

It is important to keep a log of all of the tests you'll want to run as you begin the EDA process.

---

Quick links to chat apps:

- ChatGPT
- [Perplexity AI](https://www.perplexity.ai/)
- V0 - Vercel's chat app - for how to use it, [watch this video](https://www.youtube.com/watch?v=zA-eCGFBXjM)
- [Gemini](https://gemini.google.com/?hl=en) - Google's chat app
- [Bing Chat](https://www.bing.com/chat) - Microsoft's OpenAI app
- [DALL-E 2](https://labs.openai.com/) - OpenAI's image generator
- [Stable Diffusion](https://stabledifffusion.com/generate)

![[Pasted image 20240905174215.png]]

For the privacy-conscious:

- [KubeAI](https://www.kubeai.org/) - private OpenAI on Kubernetes

---

**Sacred**

- [Sacred](https://github.com/IDSIA/sacred) is a fantastic open-source tool for pipelining the test process. As explained [here](https://towardsdatascience.com/managing-machine-learning-projects-226a37fc4bfa), it can really help to log all of the runs you do with your model.
- Use [CatalyzeX](https://chrome.google.com/webstore/detail/aiml-papers-with-code-eve/aikkeehnlfpamidigaffhfmgbkdeheil?hl=en) to find code for ML papers.
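The core idea Sacred automates - recording each run's configuration and results so experiments stay reproducible - can be sketched in plain Python. This is not Sacred's actual API (which uses `Experiment` objects, decorators, and observers); the `log_run` helper below is a hypothetical stand-in to show what gets captured per run:

```python
import json
import time
import uuid
from pathlib import Path

def log_run(log_dir, config, metrics):
    """Append one experiment run (config + results) to a JSON-lines log.

    Hypothetical helper illustrating the run-logging idea behind Sacred;
    real Sacred handles this via Experiment decorators and observers.
    """
    run = {
        "id": uuid.uuid4().hex[:8],                      # short unique run id
        "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),  # when the run happened
        "config": config,                                 # hyperparameters used
        "metrics": metrics,                               # results of the run
    }
    log_path = Path(log_dir) / "runs.jsonl"
    log_path.parent.mkdir(parents=True, exist_ok=True)
    with log_path.open("a") as f:                         # append, never overwrite
        f.write(json.dumps(run) + "\n")
    return run

# Usage: record one hypothetical training run
run = log_run("logs", {"lr": 0.01, "epochs": 10}, {"accuracy": 0.92})
```

Appending to a JSON-lines file keeps every run's config next to its results, which is the same habit Sacred enforces automatically.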
- Python [wrapper](https://github.com/nottheswimmer/dalle) for the DALL-E API
- PyTorch package to train and audit ML models for [Individual Fairness](https://github.com/IBM/inFairness)
- [Truss](https://www.baseten.co/) serves any model without boilerplate code
- [WEKA](obsidian://open?vault=Coding%20Tips&file=Computers%2FPython%2FProjects%2FMachine%20Learning%2FWEKA) is a good resource for data mining processes and machine learning testing
- Collection of [Wolfram](https://resources.wolframcloud.com/NeuralNetRepository/?source=nav) neural nets
- [Mini-omni](https://github.com/gpt-omni/mini-omni) - a model that can listen and produce output while still hearing input
- [Kotaemon](https://huggingface.co/spaces/cin-model/kotaemon-demo)
- For a list of many more projects, go to [ProjectPro](https://www.projectpro.io/project/project-demo?source=start&uri=www.projectpro.io/project-use-case/forecast-customer-churn-by-building-a-neural-network-in-r)

---

**Deepnote & more ML repos**

- [Deepnote](https://deepnote.com/workspace/windtelligent-e87f4ef4-a5f5-4f9b-8def-624a9e35da51/project/Welcome-2ef6e214-0da3-4ac5-9287-5e0d8ca5839f/%2Fnotebook.ipynb) is being used along with [Hugging Face](https://huggingface.co/) to document an in-depth analysis of ML Python tools
- [BLOOM](https://huggingface.co/bigscience/bloom) - a 176-billion-parameter LLM created by researchers and the FOSS community - here are some [examples](https://github.com/Sentdex/BLOOM_Examples)

---

#### Further reading and tutorials:

- [Animated](https://nnfs.io/neural_network_animations) tutorials of neural networks
- Using [fast.ai](https://www.fast.ai/posts/2020-02-13-fastai-A-Layered-API-for-Deep-Learning.html)
- [NLP resources for beginners](https://github.com/JUSTSUJAY/nlp-zero-to-hero)
- For reference, text generation has been happening since 2005, with [SCIGen](https://pdos.csail.mit.edu/archive/scigen/#talks) for instance.
## Interesting Papers & Experiments

---

- [Classifying a massive dataset of PDFs](https://snats.xyz/pages/articles/classifying_a_bunch_of_pdfs.html)