Compare commits

No commits in common. "4e5db77998c85a5db8d6dd6f61c2ccba91507ba4" and "422a0f2f94fc8ff1b923e4ec6534f4a285f1ef24" have entirely different histories.

5 changed files with 20 additions and 24 deletions

.obsidian/plugins/recent-files-obsidian/data.json

@@ -1,13 +1,13 @@
 {
   "recentFiles": [
-    {
-      "basename": "Robots.txt Files",
-      "path": "Coding Tips (Classical)/Terminal Tips/GUIs/Internet/Websites/Robots.txt Files.md"
-    },
+    {
+      "basename": "Webscraping",
+      "path": "Coding Tips (Classical)/Terminal Tips/GUIs/Tools/Webscraping.md"
+    },
     {
       "basename": "Robots.txt Files",
       "path": "Robots.txt Files.md"
     },
     {
       "basename": "Potentiometers & Analog SerialReader",
       "path": "Machine Tips (Quantum)/Physics/Hardware/Potentiometers & Analog SerialReader.md"

.obsidian/workspace.json

@@ -25,7 +25,7 @@
   "state": {
     "type": "markdown",
     "state": {
-      "file": "Coding Tips (Classical)/Terminal Tips/GUIs/Internet/Websites/Robots.txt Files.md",
+      "file": "Coding Tips (Classical)/Terminal Tips/GUIs/Tools/Webscraping.md",
       "mode": "source",
       "source": false
     }
@@ -107,7 +107,7 @@
   "state": {
     "type": "backlink",
     "state": {
-      "file": "Coding Tips (Classical)/Terminal Tips/GUIs/Internet/Websites/Robots.txt Files.md",
+      "file": "Coding Tips (Classical)/Terminal Tips/GUIs/Tools/Webscraping.md",
       "collapseAll": false,
       "extraContext": false,
       "sortOrder": "alphabetical",
@@ -124,7 +124,7 @@
   "state": {
     "type": "outgoing-link",
     "state": {
-      "file": "Coding Tips (Classical)/Terminal Tips/GUIs/Internet/Websites/Robots.txt Files.md",
+      "file": "Coding Tips (Classical)/Terminal Tips/GUIs/Tools/Webscraping.md",
       "linksCollapsed": false,
       "unlinkedCollapsed": true
     }
@@ -147,7 +147,7 @@
   "state": {
     "type": "outline",
     "state": {
-      "file": "Coding Tips (Classical)/Terminal Tips/GUIs/Internet/Websites/Robots.txt Files.md"
+      "file": "Coding Tips (Classical)/Terminal Tips/GUIs/Tools/Webscraping.md"
     }
   }
 }
@@ -174,10 +174,9 @@
       "obsidian-excalidraw-plugin:Create new drawing": false
     }
   },
-  "active": "0a0de85a51848b9d",
+  "active": "dbad7b010371d947",
   "lastOpenFiles": [
-    "Coding Tips (Classical)/Terminal Tips/GUIs/Tools/Webscraping.md",
-    "Coding Tips (Classical)/Terminal Tips/GUIs/Internet/Websites/Robots.txt Files.md",
+    "Robots.txt Files.md",
     "Excalidraw/Drawing 2023-10-16 12.13.42.excalidraw.md",
     "Machine Tips (Quantum)/Physics/Hardware/Potentiometers & Analog SerialReader.md",
     "Excalidraw",
@@ -207,6 +206,7 @@
     "Untitled.canvas",
     "Coding Tips (Classical)/Project Vault/Current Occupations/Manhattan Youth",
     "Coding Tips (Classical)/Project Vault/Current Occupations/Website Projects/My Domain Names.md",
+    "Coding Tips (Classical)/Project Vault/Current Occupations/Potential and Future/Career Tips.md",
     "Coding Tips (Classical)/Project Vault/About Obsidian/imgFiles/Pasted image 20231011091043.png",
     "Coding Tips (Classical)/Project Vault/About Obsidian/Slides & Tools/export/Slides/plugin/chalkboard/_style.css",
     "Coding Tips (Classical)/Project Vault/About Obsidian/Slides & Tools/export/Slides/plugin/chalkboard/img/blackboard.png",

Coding Tips (Classical)/Terminal Tips/GUIs/Internet/Websites/Robots.txt Files.md (deleted)

@@ -1,9 +0,0 @@
-Robots.txt is an increasingly important file found on websites that determines whether you permit a website crawler to index your page for search engine optimization. As web-scraping is entirely legal in the US, this is the wild west of scraping, and thus I want to keep my brain and information safe from scraping.
-Fun Fact: Google [open-sourced](https://opensource.googleblog.com/2019/07/googles-robotstxt-parser-is-now-open.html) their [robots.txt parser](https://github.com/google/robotstxt) in 2019, if you want to see how the robots.txt file is parsed for search indexing.
-*Resources*:
-- [Robots.txt file examples](https://blog.hubspot.com/marketing/robots-txt-file)
-- Robots.txt [generator tool](https://www.internetmarketingninjas.com/tools/robots-txt-generator/)
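Since the deleted note describes robots.txt as a crawl-permission file, here is a minimal sketch (not from the vault) of what such a policy looks like and how a well-behaved crawler checks it, using Python's standard-library `urllib.robotparser`; the sample rules and the `example.com` URLs are placeholders:

```python
from urllib import robotparser

# A hypothetical policy: block all crawlers from /private/, allow everything else.
SAMPLE_RULES = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(SAMPLE_RULES.splitlines())  # parse() accepts an iterable of lines

# can_fetch(useragent, url) reports whether the rules permit crawling that URL
print(rp.can_fetch("*", "https://example.com/private/notes.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))          # True
```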

Coding Tips (Classical)/Terminal Tips/GUIs/Tools/Webscraping.md

@@ -1,10 +1,9 @@
-# Web-scraping
+# Webscraping
-Web-scraping is a common task in the CS world that makes it easy and efficient to extract large amounts of data. It is part of the larger topic of data mining, which allows for human-understandable analysis of all the data that is out there.
+Webscraping is a common task in the CS world that makes it easy and efficient to extract large amounts of data. It is part of the larger topic of data mining, which allows for human-understandable analysis of all the data that is out there.
-You will often use the requests and `beautifulsoup` libraries.
-To prevent web-scraping on your own sites, refer to the [robots.txt](obsidian://open?vault=enter&file=Robots.txt%20Files) information.
+You will often use the requests and beautifulsoup libraries. To prevent webscraping on your own sites, refer to the [robots.txt](obsidian://open?vault=enter&file=Robots.txt%20Files) information.
 ---
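The note names the `requests` and `beautifulsoup` libraries; a minimal sketch of that pattern (not from the vault), assuming the target site permits scraping, with `example.com` as a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

# Fetch a page; a timeout avoids hanging on a slow or unreachable host.
response = requests.get("https://example.com", timeout=10)
response.raise_for_status()  # raise on 4xx/5xx instead of parsing an error page

# Parse the HTML and pull out every link's text and destination.
soup = BeautifulSoup(response.text, "html.parser")
for link in soup.find_all("a"):
    print(link.get_text(strip=True), link.get("href"))
```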

Robots.txt Files.md (new file)

@@ -0,0 +1,6 @@
+Robots.txt is an increasingly important file found on websites that determines whether you permit a website crawler to index your page for search engine optimization. As webscraping is entirely legal in the US, this is the wild west of scraping, and thus I want to keep my brain and information safe from scraping.
+*Resources*:
+- [Robots.txt file examples](https://blog.hubspot.com/marketing/robots-txt-file)