Category Archives: Linux

How to check whether an RTMP source stream is live
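A minimal sketch, assuming FFmpeg's ffprobe is installed; the RTMP URL is a placeholder, not a real stream:

```shell
#!/bin/sh
# Hedged sketch: ffprobe (from FFmpeg) exits 0 only if it can open and probe
# the stream, so its exit status tells you whether the source is live.
# The URL below is a placeholder; timeout(1) caps the probe at 10 seconds.
URL="rtmp://example.com/live/stream"
if timeout 10 ffprobe -v error "$URL" >/dev/null 2>&1; then
  STATUS="live"
else
  STATUS="not live"
fi
echo "$STATUS"
```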


How to Setup Elastic Load Balancing on AWS


GIT – Cheat Sheet



How to clean malware from a website?

Malware, short for malicious software, is software designed to secretly access a computer system without the owner's informed consent. The term is a general one used by computer professionals for a variety of hostile, intrusive, or annoying software or program code.

You may have seen warnings like this many times when browsing. They come from search engines such as Google when a website has been flagged for malware or viruses. If you visit the site anyway, it can compromise your system or its resources.

Most of the time, websites are compromised through unauthorized access by hackers, cross-site scripting (XSS), or cross-site request forgery (CSRF).

There can be a lot of "holes" in website security that invite hackers to play their game.

The possible HOLES may be:
1. File/Folder permissions
2. Poor authentication for application
3. Cross-Site Scripting
4. Cross-Site Request Forgeries
5. Anti-Virus Software
6. File formats
7. Network “Firewalls/Filters”
8. Shell access & Logs

Please check the links below to make your web application secure and safe 😉

You can use an online virus & threat scanner to clean malware & viruses. These tools are designed to run against your web server and scan your public web files for malicious code.

Google Safe Browsing Tool

Norton Safe Web

You can search for more tools like…
Security Pro | SiteMonitor | IP trap | htaccess | AntiXSS | Check Permissions | KISS FileSafe

If you are running a PHP website under Apache & MySQL, make sure files and folders are not publicly accessible. You should also review the security of the PHP functions you use.
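For hole #1 above (file/folder permissions), a minimal sketch of tightening a web root so nothing is world-writable. The demo tree under /tmp is an assumption so the script is self-contained; point WEBROOT at your real web root (e.g. /var/www) in practice:

```shell
#!/bin/sh
# Hedged sketch: normalize permissions under a web root.
WEBROOT="/tmp/perm-demo"                            # assumption: demo tree
mkdir -p "$WEBROOT/uploads"
touch "$WEBROOT/index.php"
chmod 777 "$WEBROOT/uploads" "$WEBROOT/index.php"   # deliberately too open

find "$WEBROOT" -type d -exec chmod 755 {} \;       # directories: rwxr-xr-x
find "$WEBROOT" -type f -exec chmod 644 {} \;       # files: rw-r--r--

# Anything still world-writable? (prints nothing when the tree is clean)
find "$WEBROOT" -perm -0002
echo "permissions tightened"
```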

PHP functions that may be used in hacking:
1. file_get_contents()
2. base64_decode()
3. eval()
4. exec()
5. preg_match()
6. gzuncompress()
7. urldecode()
8. error_reporting()
9. shell_exec()
10. setcookie()
11. chmod()
12. is_writable()
13. move_uploaded_file() and copy()

The above functions can be used by hackers to write malicious code into your files. Malicious code injected via eval() will execute on every run of the website. So, disable eval(), file_put_contents(), file_get_contents(), exec(), etc. where you can. You can also check safe_mode in php.ini for disabling shell access 😉
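On current PHP versions (safe_mode was removed in PHP 5.4), the closest switch is disable_functions in php.ini; note that eval() is a language construct, not a function, so it cannot be disabled there. A hedged php.ini sketch:

```ini
; Sketch: block shell/command helpers that injected code commonly abuses.
; eval() cannot be listed here -- it is a language construct, not a function.
disable_functions = exec,shell_exec,system,passthru,popen,proc_open
```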

Most of the time, websites are hacked through file_get_contents(), eval(base64_decode()), urldecode(), include(), or iframes.

You can search for infected files on the web server (web root, e.g. "/var/www/") using the commands below:

# grep -iR 'eval(base64_decode(' /web-root
# grep -iR 'urldecode(' /web-root
# grep -iR 'file_get_contents(' /web-root
# grep -iR 'exec(' /web-root

As soon as an infection is found, back up all applications running on the web server, then remove the infected files manually or with a scanner.
Now it is all up to you how securely you manage your web server…

I’ve found that luck is quite predictable. If you want more luck, take more chances. Be more active. Show up more often. 😀



Recursive Replace in Files and Folders

Hello Friends,
Sometimes we want to change the branding of web-based software. Open-source web-based software comes with a GNU license, so we can modify the code and launch our own versions.
We can use Perl (a highly capable, feature-rich programming language with over 22 years of development).
Here are some basic steps to recursively replace/rename files, variables, folder names, etc.
Say you have a project in the folder "/root/svnlabs".
To replace a search string in all file names and folder names recursively:

# chmod a+x ./

# perl -m g 'search_text' 'replace_text'

# chmod a+x ./

# perl -m g 'Search' 'Replace'

# grep -iRl 'Search' /root/svnlabs  (Now search in your project)
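The same recursive replace can be sketched with standard find and sed instead of the perl helper. The demo tree under /tmp is an assumption so the script is self-contained; SEARCH and REPLACE are placeholders and must not contain sed metacharacters:

```shell
#!/bin/sh
# Hedged sketch: replace a string in file contents, file names, and folder names.
ROOT="/tmp/svnlabs-demo"            # assumption: demo project tree
SEARCH="Search"
REPLACE="Replace"

# Build a small demo project so the sketch is runnable as-is.
rm -rf "$ROOT"
mkdir -p "$ROOT/SearchDir"
echo "hello Search world" > "$ROOT/SearchDir/Search.txt"

# 1. Replace the string inside every file that contains it.
grep -rl "$SEARCH" "$ROOT" | while read -r f; do
  sed -i "s/$SEARCH/$REPLACE/g" "$f"
done

# 2. Rename files and directories containing the string. -depth walks
#    deepest-first, so children are renamed before their parent folders.
find "$ROOT" -depth -name "*$SEARCH*" | while read -r p; do
  base=$(basename "$p" | sed "s/$SEARCH/$REPLACE/g")
  mv "$p" "$(dirname "$p")/$base"
done

cat "$ROOT/ReplaceDir/Replace.txt"   # prints: hello Replace world
```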

If you would hit the mark, you must aim a little above it 😉

Posted by on December 11, 2010 in CentOS, Linux, php, Tips, Tricks, Web Services



Block badAgents on site

PHP is a very powerful language for blocking bad agents. Below is code to block web spiders using PHP's in_array().

$badAgents = array('Acunetix Web Vulnerability Scanner', 'Bot\ ', 'ChinaClaw', 'Custo', 'DISCo', 'Download\ Demon', 'eCatch', 'EirGrabber', 'EmailSiphon', 'EmailWolf', 'Express\ WebPictures', 'ExtractorPro', 'EyeNetIE', 'FlashGet', 'GetRight', 'GetWeb!', 'Go!Zilla', 'Go-Ahead-Got-It', 'GrabNet', 'Grafula', 'HMView', 'HTTrack', 'Image\ Stripper', 'Image\ Sucker', 'Indy\ Library', 'InterGET', 'Internet\ Ninja', 'JetCar', 'JOC\ Web\ Spider', 'larbin', 'LeechFTP', 'Mass\ Downloader', 'MIDown\ tool', 'Mister\ PiX', 'Navroad', 'NearSite', 'NetAnts', 'NetSpider', 'Net\ Vampire', 'NetZIP', 'Octopus', 'Offline\ Explorer', 'Offline\ Navigator', 'PageGrabber', 'Papa\ Foto', 'pavuk', 'pcBrowser', 'RealDownload', 'ReGet', 'SiteSnagger', 'SmartDownload', 'SuperBot', 'SuperHTTP', 'Surfbot', 'tAkeOut', 'Teleport\ Pro', 'VoidEYE', 'Web\ Image\ Collector', 'Web\ Sucker', 'WebAuto', 'WebCopier', 'WebFetch', 'WebGo\ IS', 'WebLeacher', 'WebReaper', 'WebSauger', 'Website\ eXtractor', 'Website\ Quester', 'WebStripper', 'WebWhacker', 'WebZIP', 'Wget', 'Widow', 'WWWOFFLE', 'Xaldon\ WebSpider', 'Zeus');
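A shell analogue of the same substring check, e.g. for a CGI script (the short agent list and the demo User-Agent value are assumptions; a web server would set HTTP_USER_AGENT for you):

```shell
#!/bin/sh
# Hedged sketch: refuse a request whose User-Agent contains a bad-agent string.
BAD_AGENTS="HTTrack WebZIP FlashGet Wget larbin EmailSiphon"
UA="${HTTP_USER_AGENT:-Wget/1.21}"   # demo value when run outside a web server

blocked=no
for agent in $BAD_AGENTS; do
  case "$UA" in
    *"$agent"*) blocked=yes ;;
  esac
done

if [ "$blocked" = "yes" ]; then
  echo "403 Forbidden"
else
  echo "200 OK"
fi
```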

Posted by on December 4, 2010 in CURL, Linux, Open Source, php, Tips

