
Google Hacking Explanation & Effective Protection Measures

What is Google hacking?

Hackers use a search engine such as Google to find websites that are vulnerable to standard attacks or that should remain hidden from the public.

Google hacking does not mean hacking the Google search engine itself; it describes how hackers can use the Google search engine to help with a hack.

Penetration testers can use Google to collect information about the target, mainly in stages 2 and 3 of a typical test:

  1. Information gathering
  2. Reconnaissance: Google can reveal a lot of information about the target in preparation for the hack.
  3. Discovery and scanning: Google can help find “forbidden” and “secret” pages.
  4. Vulnerability assessment
  5. Vulnerability exploitation
  6. Analysis and reporting
  7. Implementation of the test results

How does a search engine work?

A search engine is a web application that, for a given search term, suggests websites matching that term. The search engine scans the Internet with crawlers.

The search engine stores the text (HTML, PDF, TXT) and media content (images, videos) of a website and ranks that content against possible matching search terms.

What is a crawler?

A crawler is a computer program that automatically visits websites on the Internet, saves them, and extracts the links from each page. The links lead to new websites with new content. From the websites it finds, the crawler builds a huge database.

The crawler avoids indexing the same page twice.
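
To make this concrete, here is a toy crawler sketch in Python. It is only an illustration, not how Google’s crawler actually works: the third-party requests library is assumed to be installed, and the page limit and all names are made up. The crawler saves every page it fetches, skips pages it has already seen, and follows the links it extracts.

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    import requests  # third-party library, assumed installed

    class LinkExtractor(HTMLParser):
        """Collects the href targets of all <a> tags on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=10):
        seen, queue, store = set(), deque([start_url]), {}
        while queue and len(store) < max_pages:
            url = queue.popleft()
            if url in seen:  # avoid indexing the same page twice
                continue
            seen.add(url)
            try:
                html = requests.get(url, timeout=5).text
            except requests.RequestException:
                continue  # unreachable page, move on
            store[url] = html  # "save" the page for later ranking
            parser = LinkExtractor()
            parser.feed(html)
            for link in parser.links:
                queue.append(urljoin(url, link))  # newly discovered pages
        return store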

Ranking: Larry Page’s “PageRank” is an algorithm that determines the order of web pages for a search term. Websites with “good” content attract many links from other domains. The more backlinks a website has, the higher it appears in the ranking.
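
The idea behind PageRank can be sketched in a few lines of Python. This is a simplified illustration, not Google’s actual implementation; the tiny link graph is made up, and 0.85 is the classic damping factor from the original paper. Every page distributes its rank evenly across its outgoing links, so pages with many backlinks accumulate rank.

    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {page: 1.0 / n for page in pages}
        for _ in range(iterations):
            new_rank = {page: (1 - damping) / n for page in pages}
            for page, outgoing in links.items():
                if not outgoing:
                    continue  # a page without links passes nothing on
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share  # each backlink passes on rank
            rank = new_rank
        return rank

    # Example graph: "a" and "b" both link to "c", so "c" ranks highest.
    print(pagerank({"a": ["c"], "b": ["c"], "c": ["a"]}))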

Tutorial: Vulnerability Search / Google Hacking

How can hackers use search engines for their purposes?
Some website operators are careless and leave websites on their servers without effective protection against indexing.

Google uses rule-compliant crawlers. Rule-compliant crawlers only visit websites that the website operators have not “excluded”.

The big downside is that Google assumes “by default” that every web page should be indexed. The crawler only skips a page if indexing is explicitly forbidden.

Google Dorks – a few examples

#1 Google can be used, for example, to find PHP parameters:

https://domain.com/web.php?parameter=oeffentlich
https://domain.com/web.php?parameter=geheim
https://domain.com/web.php?parameter=jetztloeschen

#2 With a SQL injection, the hacker can extract or delete entire tables:

https://domain.com/web.php?parameter=asdfa;SELECT….
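
A minimal sketch of why such unsanitized parameters are dangerous, using Python with an in-memory SQLite database (the table, the data and the attack string are made up for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

    parameter = "x' OR '1'='1"  # attacker-controlled value from the URL

    # Vulnerable: the parameter is pasted directly into the SQL string,
    # so the injected OR clause makes the WHERE condition always true.
    rows = conn.execute(
        "SELECT * FROM users WHERE name = '%s'" % parameter
    ).fetchall()
    print(rows)  # leaks every row

    # Safe: a placeholder makes the driver treat the value as data, not SQL.
    rows = conn.execute(
        "SELECT * FROM users WHERE name = ?", (parameter,)
    ).fetchall()
    print(rows)  # empty list: no user has that literal name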

#3 Search for

mysqldump filetype:sql

and you get some nice SQL commands that reveal a lot (everything) about the database.

#4 Log files can be particularly interesting for hackers because they reveal a lot of information about the target:

allintext:username filetype:log

#4-1 Find public (Apache) folders that you can access via browser with the following statement:

inurl:/proc/self/cwd

#4-2 Display various FTP servers (file servers):

intitle:"index of" inurl:ftp

#5 Log files from PuTTY (an SSH client) may contain passwords. You can then use the compromised server as a DDoS bot or to mine Bitcoins.

filetype:log username putty

#5-1 Or jump into someone else’s Zoom call (Zoom bombing):

inurl:zoom.us/j and intext:scheduled for

#5-2 Spy on a secret API. Ideally, the Swagger documentation should not be publicly available:

intitle:"Swagger UI - " + "Show/Hide"

#6 Download an entire database that is exposed on the Internet:

site:nomedosite.com intitle:index.of "database.db"

#7 Check out a Google Drive and watch the videos:

site:drive.google.com /preview intext:movie inurl:flv | wmv | mp4 -pdf -edit -view

#8 Inspect PHP errors:

intext:"PHP error"

Google tools

Google provides several operators:

  1. site: lists all pages of a domain that Google has indexed.
  2. filetype:pdf searches only for PDF documents.
  3. cache: searches the Google cache. Google stores a copy of the page for a while, even after the administrator has deleted the website, so you can still visit it.
  4. intitle: searches for a term that must appear in the title of the page.
  5. inurl: searches for terms (e.g. file names) in URLs.
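
These operators can also be combined in a single query. For example, the following search (example.com is a placeholder domain) lists only PDF documents from one site whose title contains a specific word:

site:example.com filetype:pdf intitle:"confidential"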

Why doesn’t Google ban these features?

Politicians quickly come up with the idea of banning these tools…

…then the government would also have to ban kitchen knives. Anyone can buy a sharp knife with a 26 cm blade in the supermarket, either to cook with or to stab someone with.

On the one hand, you can use a search engine to research your thesis. On the other hand, a hacker can misuse the same tool to gather information about a victim’s website and use it to break into its systems.

The same search features can also help companies by uncovering bugs, data leaks or privacy violations. Google indexes all websites that are not excluded because the search engine wants to deliver the best possible results for a search term.

The software assumes that website operators always want their websites to appear in the Google rankings as quickly as possible.

Protection against indexing / Google hacking

The best protection is not to get indexed. These 6 tips will help you:

  • Robots.txt – Legacy and standard
    The robots.txt file is located in the root directory of the website (next to index.html / index.php). It is intended to guide the “benign” bots: they read robots.txt to learn which pages to index and which to skip. A sample robots.txt follows this list.
  • Meta tag – Modern and simple
    The robots.txt file is no longer the most current standard. Many bots still honor it, but the noindex meta tag (shown after this list) tells a crawler, page by page, not to include a page in its collection.
  • Passwords – Classic and logical
    Any form of authentication prevents crawlers from reaching certain pages. Legitimate users can log in via
    • username and password,
    • OAuth,
    • Bearer token,
    • SAML.
  • Encapsulated systems – The trump card
    Servers behind a firewall that can only be reached via VPN prevent a bot from ever becoming aware of the website. Strictly separate public and internal content using different (sub)domains and content management systems.
  • No links – Your secret
    Crawlers can only visit websites they know about: they discover new pages by extracting the links from a set of starting websites. Keep a website or subdomain secret by making sure nothing links to it.
  • Spider traps – Last resort
    Besides the good bots there are also bad bots, the pests of the web. They snoop around on every website they can somehow find (via a link or manual entry). Website operators keep malicious bots busy with spider traps; a sketch follows below.

A pre-processor can create an infinite number of Lorem Ipsum web pages. The bots get into the spider trap via a hidden link and index an infinite amount of nonsense content.
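
Here is a minimal spider-trap sketch in Python using Flask (the Flask library is assumed to be installed; the route, word list and link count are made up). Every trap page is generated on the fly and links only to further trap pages, so a rule-ignoring bot can follow links forever without reaching real content:

    import random

    from flask import Flask  # third-party library, assumed installed

    app = Flask(__name__)
    WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "consectetur"]

    @app.route("/trap/<int:page>")
    def trap(page):
        # Generate nonsense text plus links to two further trap pages.
        text = " ".join(random.choices(WORDS, k=200))
        links = " ".join(
            '<a href="/trap/{}">next</a>'.format(random.randrange(10**9))
            for _ in range(2)
        )
        return "<html><body><h1>Page {}</h1><p>{}</p>{}</body></html>".format(
            page, text, links
        )

    if __name__ == "__main__":
        app.run()

In practice, the entry link into the trap is hidden from human visitors (for example via CSS) and excluded in robots.txt, so well-behaved crawlers never fall in.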
