
How to Find Disallowed Parts of Sites???

edited November 2008 in Other
How to find secret sites and articles

On the internet, many site owners hide some of a site's pages, or even the entire site, from search engines. You can find those pages with robots.txt. Robots.txt is a text file in the root directory of a site that tells search-engine robots which pages they may index. A 'Disallow' line asks robots not to crawl the listed paths, so those parts of the site stay out of search results.
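For example, a robots.txt file at a site's root might look like this (the paths here are made up for illustration):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Disallow: /drafts/secret-page.html
```

A robot that honors the file will skip those three paths, but the file itself is readable by anyone, which is exactly what the trick below exploits.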

1. Go to Google and search for the query:


"robots.txt" "disallow:" filetype:txt

2. You will find robots.txt files from sites that use the Disallow directive.

3. Let's open, for example, the first result: WhiteHouse. We can see that a lot of pages were made invisible to search engines.

4. To open a 'forbidden' page, copy the path from the Disallow line you are interested in (just the path, without the "Disallow:" text in front of it).

5. Now, in the browser's address bar, replace /robots.txt with the copied path and press Enter. The page will open.
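Steps 2-5 can also be automated. The sketch below, assuming plain HTTP(S) access to a site's robots.txt (the example.com domain is a placeholder), fetches the file, extracts the Disallow paths, and turns each one into a full URL you could paste into the browser:

```python
# A minimal sketch of steps 2-5: fetch robots.txt, collect the Disallow
# paths, and build full URLs from them. example.com is a placeholder.
from urllib.parse import urljoin
from urllib.request import urlopen

def parse_disallow(robots_txt: str) -> list[str]:
    """Return the paths listed on Disallow lines of a robots.txt file."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # drop trailing comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                           # an empty Disallow means "allow all"
                paths.append(path)
    return paths

def hidden_urls(base_url: str) -> list[str]:
    """Fetch a site's robots.txt and turn each Disallow path into a URL."""
    with urlopen(urljoin(base_url, "/robots.txt")) as resp:
        robots_txt = resp.read().decode("utf-8", errors="replace")
    return [urljoin(base_url, path) for path in parse_disallow(robots_txt)]

if __name__ == "__main__":
    sample = "User-agent: *\nDisallow: /secret/\nDisallow: /private/archive/\n"
    print(parse_disallow(sample))
```

Note that this only lists the paths; whether a 'forbidden' page actually opens still depends on the server, since robots.txt is a request to crawlers, not an access control.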

This is the hidden page from WhiteHouse.

Of course you can find more interesting pages; this was just an example.

