
10 Free Scripts to Create Your Own Url Shortening Service

Since Twitter's popularity started to rise and the 140-character mania began, there has been an outburst of URL shortening services which let you squeeze a long URL into the limited space available. Many of these services are both free and easy to use, and there is no doubt that as long as services like Twitter are around, URL shorteners are here to stay.

Moreover, mechanisms like these turn ugly long URLs into shorter ones which are easier to manage and remember. There are also several scripts available which let you shorten URLs on a server that you manage, so you own the service and can use it for personal, public or commercial purposes. Many popular websites like FriendFeed, GigaOM and Digg already use such methods to distribute their content easily in digital as well as print format.

If you are looking to create a similar service for yourself, check out these scripts that we have compiled together.

Kissa.Be

Kissa.be is a relatively new URL shortening script based on PHP and MySQL. Once you install it on your server and get it running, you can not only shorten URLs but also create text notes, upload images and create shortened email links that redirect to email addresses when opened. You also get an API to work with, so developers can create third-party applications for it. A ‘-’ after any shortened URL gets you its stats and number of page views. You can check out a demo here. Download Kissa.be here.

Shorty

Shorty needs PHP 4+, MySQL 3.23+, and Apache 1.2+ to run on a server. With Shorty, you’ll get a full-blown admin panel from which you can control all your shortened links. When shortening links, you can either create a random URL or provide custom keywords for it. Within the admin panel, you can organize and edit short URLs and also get an RSS feed for them. You can test a demo of Shorty here and download Shorty here.

Linx

Linx is a tiny (5 KB) script based on PHP. You’ll also need a MySQL database to run it. It was created by @harry_jerry, who uses it on his site to create brandable shortened URLs. You can see how it works on the Linx site and download Linx here.

Phurl

Phurl runs on PHP and is very flexible when it comes to extensibility. It comes with a very easy installation process and a simple administration panel. Unlike many on this list, it can handle URLs without the http:// prefix, which makes shortening even quicker. It is CAPTCHA-enabled, so users have to pass a test before creating URLs, but this is optional. You can check out a demo here. Download Phurl.
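Accepting URLs without the http:// prefix usually just means normalizing the input before storing it. A minimal sketch of the idea in shell (illustrative only, not Phurl's actual code):

```shell
normalize_url() {
  # Prepend http:// only when no scheme is already present
  case "$1" in
    http://*|https://*) printf '%s\n' "$1" ;;
    *)                  printf 'http://%s\n' "$1" ;;
  esac
}

normalize_url "example.com/page"      # -> http://example.com/page
normalize_url "https://example.com"   # -> https://example.com (unchanged)
```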

PhpUrl

As the name implies, PhpUrl is built on PHP and needs a MySQL database to run. PhpUrl includes a simple admin backend that lets you monitor hits and IPs. Custom keywords may be affixed to the URL. You can check out PhpUrl live on this demo. Download PhpUrl.

Url Management

The Url Management script is a lot more than just URL shortening. You can create a site with features like domain whois, IP pinging, video download options and many more webmaster tools. Easy ad placements are included and the installation process is not difficult. Have a look at the demo here. Download Url Management.

BrokenScript

BrokenScript, again, is based on PHP and MySQL. Unlike other scripts on the list, the shortened URL is prefixed not with random characters but with a user-defined custom keyword (seed). The script is small, only 45 KB. You can check out a site using this script here and download BrokenScript here.

Tiny Tiny Url

Tiny Tiny Url is a script built in Ruby by Leah Culver. If you are familiar with Ruby, you can customize it as you wish. Tiny Tiny Url is available for download from GitHub; download it here, and also read the release post here.

Url Shrink

Url Shrink is a simple PHP script for shortening links that is easy to use and deploy. It comes with several design templates, and you can create one of your own too. One shortcoming of Url Shrink is that the user has to enter an email address before shortening links, but for people who are OK with this, Url Shrink should serve well. Url Shrink demo. Download Url Shrink.

URL Shortening Script by Kiviniar

Kiviniar.com has released a URL shortening script which the developer is giving away for free. It’s built on PHP and needs a MySQL database to run. The installation is very easy to follow, and you can easily edit the site’s design with CSS. Download it here.

And that’s all! As previously stated, there are many URL shortening scripts, but a majority of them are paid; finding free ones is a difficult task. So if you liked this post, please do give it a digg or stumble, we really appreciate it!


How to install Cacti on Debian or Ubuntu

Cacti is a web-based PHP/MySQL graphing solution using the RRDtool engine. Classically, it graphs network bandwidth with SNMP, but in fact many different graphs can be built from SNMP, shell or Perl scripts.

Cacti’s strength lies in the fact that it can be installed and used incredibly easily. You don’t need to be a guru or spend hours configuring the tool; even a beginner can use it very quickly. On the very active Cacti forum, you can share “Cacti templates” with other users, which can save you a lot of time. You can also easily add plugins to Cacti, enabling the integration of other free tools like ntop or PHP Weathermap. In our opinion, this is by far the best RRDtool frontend.

For details about how to use Cacti, see the very good Cacti Manual.
RRDtool is a program developed by the Swiss developer Tobi Oetiker, who was already the creator of the famous MRTG. RRDtool is written in C and stores the collected data in “.rrd” files.

The number of records in a “.rrd” file never increases, meaning that old records are progressively consolidated and eventually overwritten. This implies that you get precise figures for recently logged data, whereas figures based on very old data are mean-value approximations. By default, you get daily, weekly, monthly and yearly graphs.
Some of the advantages of RRDtool over MRTG are the following:

  • it is much quicker
  • it can use negative values
  • it can use more than one data source in a graph
  • the generated graphs are very customizable
  • it can be used by a wide variety of front-ends such as Cacti
  • the RRDtool records stored in .rrd files keep the same size and do not grow
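The fixed file size comes from the round-robin archives (RRAs) declared when a .rrd file is created. As a sketch, a classic MRTG-style layout (daily/weekly/monthly/yearly averages for one counter; the file and DS names here are just examples) would be created like this:

```shell
# One 5-minute COUNTER data source, four averaging archives of fixed row counts
rrdtool create net.rrd --step 300 \
  DS:traffic_in:COUNTER:600:0:U \
  RRA:AVERAGE:0.5:1:600 \
  RRA:AVERAGE:0.5:6:700 \
  RRA:AVERAGE:0.5:24:775 \
  RRA:AVERAGE:0.5:288:797
```

Each RRA keeps a fixed number of rows (600, 700, ...), which is why old data is averaged away instead of growing the file.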

The following programs are needed to run cacti:

  • apache2 for the web server
  • mysql-server for the database
  • php5 for the server-based script
  • php5-common
  • php5-cgi
  • php5-cli
  • php5-mysql
  • snmp – SNMP tools used to collect data from the remote hosts
  • rrdtool – the round-robin database tool that stores the collected data and renders the graphs
  • php5-gd – the graphical library used by a Cacti plugin named php weathermap

INSTALL PROGRAMS
Use apt-get to install the programs:
# apt-get install apache2 mysql-server php5 php5-common php5-cgi php5-cli php5-mysql snmp rrdtool

INSTALL CACTI WITH APT-GET (recommended)

#apt-get install cacti

You will have to configure the MySQL settings through a little wizard.

At the end of the installation, a MySQL database and a user named cacti are created automatically.
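You can confirm the database exists by connecting as the cacti user (the password is whatever you entered in the wizard):

```shell
# Lists the tables created in the cacti database
mysql -u cacti -p cacti -e 'SHOW TABLES;'
```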

Cacti is now ready to be used via http://localhost/cacti. The default login and password are both admin.
Cacti will check if all the required tools are correctly installed.

Initial Cacti Configuration
Select “New Install”

Verify that the required tools are correctly detected by Cacti.

Note that the poller.php script, which sends the requests to the remote hosts, is launched as the Apache user, i.e. www-data.

To reconfigure cacti, use the following command:

#dpkg-reconfigure cacti

If you want to run the poller manually:

#php5 /usr/share/cacti/site/poller.php

Sometimes you need to run it manually the first time; afterwards it runs automatically every 5 minutes by default.
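On Debian the package normally schedules the poller through a cron fragment similar to the following (the exact contents may differ between releases):

```shell
# /etc/cron.d/cacti -- runs the poller as www-data every 5 minutes
*/5 * * * * www-data php /usr/share/cacti/site/poller.php >/dev/null 2>&1
```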

See also rrdtool install on Debian, and Multi-CPU Utilization Graphing in Cacti.


Require SSL mod_rewrite apache

Require SSL using mod_rewrite under Apache on Linux:

RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://host.domain.tld/$1 [R,L]
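A variant that catches any non-SSL request regardless of the port number uses the HTTPS variable instead of SERVER_PORT (host.domain.tld is the same placeholder as above):

```shell
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://host.domain.tld/$1 [R,L]
```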


Securing Linux & PHP

MOD_REWRITE OVERVIEW
http://www.sitepoint.com/article/guide-url-rewriting
http://www.jeffdarlington.com/tag/mod_rewrite/

LINUX SECURE CONFIG
http://aymanh.com/tips-to-secure-linux-workstation

PHP SECURE CONFIG
http://aymanh.com/checklist-for-securing-php-configuration

MOD_REWRITE SCRIPTS FOR APACHE
SIMPLEST SET OF RULES
==================================================================

#Turn on mod_rewrite
RewriteEngine On
RewriteOptions inherit
RewriteLog "/var/log/httpd/rewrite_log"

# Prevent harmful binary execution through injection
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)chmod(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)chown(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)wget(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)cmd(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)cd%20(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)scp(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)curl(.*) [OR]

# Disable TRACE & TRACK methods
RewriteCond %{REQUEST_METHOD} TRACE [OR]
RewriteCond %{REQUEST_METHOD} TRACK

# Redirect objectional persons to the bit bucket
RewriteRule ^.* - [F,L]

EXTENDED SET OF RULES
==================================================================

#Turn on mod_rewrite
RewriteEngine On
RewriteOptions inherit
RewriteLog w3g_rewrite_log

#Disable command line hacks via XSS scripting w/ vulnerable PHP options & includes
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)chmod(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)chown(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)wget(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)cmd(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)cd%20(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)scp(.*) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)curl(.*) [OR]

#Disable TRACE & TRACK methods
RewriteCond %{REQUEST_METHOD} TRACE [OR]
RewriteCond %{REQUEST_METHOD} TRACK [OR]

#Other hack prevention, mostly windows-based
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/winnt/system32/(.*) [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/winnt/system/(.*) [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/windows/system32/(.*) [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/windows/system/(.*) [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/cmd.exe[$|?(.*)] [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/scripts/root.exe[$|?(.*)] [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/msadc/root.exe[$|?(.*)] [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)\.\.(.*) [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/admin.dll[$|?(.*)] [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/msadcs.dll[$|?(.*)] [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/ext.dll[$|?(.*)] [NC,OR]
RewriteCond %{REQUEST_URI} (.*)/\.(.*) [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)/php.exe[$|?(.*)] [NC,OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} (.*)\|(.*) [OR]
RewriteCond %{REQUEST_URI} (.{255,}) [OR]
RewriteCond %{QUERY_STRING} (.{127,}) [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} [\x00-\x1f]+ [OR]
RewriteCond %{REQUEST_URI}?%{QUERY_STRING} [\x7f-\xff]+

#Rewrite offending persons to forbidden page
RewriteRule .* - [F]
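Before deploying, the query-string patterns can be sanity-checked offline; here grep -E stands in for Apache's regex engine (the two engines differ in details such as %-decoding, so this is only a rough check):

```shell
pattern='(.*)wget(.*)'   # one of the conditions above

# A request smuggling a wget call should match the pattern...
printf '%s\n' '/index.php?cmd=wget%20http://evil/x' | grep -Eq "$pattern" && echo blocked

# ...while an ordinary request should not
printf '%s\n' '/index.php?page=home' | grep -Eq "$pattern" || echo allowed
```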

# Stop bad bots/spiders
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]


PCI Audit Remediation for TRACE and TRACK issues on apache

PCI Audits often reveal TRACE & TRACK as issues that must be handled before the website can be considered PCI compliant.

If you are running apache 2.x, the following directives will disable TRACE & TRACK functionality.

This change needs to be made in /etc/httpd/conf/httpd.conf:
ServerTokens OS
TraceEnable Off

The mod_rewrite directives below need to be added to every <VirtualHost> section in both of the following files:
/etc/httpd/conf/httpd.conf
/etc/httpd/conf.d/ssl.conf

Insert this code right before the closing </VirtualHost> tag of each virtual host:

RewriteEngine on
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK)
RewriteRule .* – [F]
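After restarting Apache, the remediation can be verified from a shell; a hardened server should return 405 (Method Not Allowed) or 403 rather than echoing the request back (replace host.domain.tld with your own server):

```shell
# Prints only the HTTP status code of a TRACE request
curl -s -o /dev/null -w '%{http_code}\n' -X TRACE http://host.domain.tld/
```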


Linux shell script backup system configuration httpd mysql apache files

#!/bin/bash
# THE FOLLOWING DIRECTORIES MUST EXIST
# /var/backup
# /var/backup/tmp
# /var/backup/conf
# /var/backup/tmp/conf
mkdir -p /var/backup/conf /var/backup/tmp/conf

# GATHER SYSTEM INFORMATION
cp /etc/php.ini /var/backup/tmp/conf/php.ini
cp /etc/my.cnf /var/backup/tmp/conf/my.cnf
cp /etc/hosts /var/backup/tmp/conf/hosts
rpm -qa > /var/backup/tmp/conf/rpms

# GATHER HTTPD INFORMATION
tar -cvf /var/backup/tmp/conf/etc-http-conf.tar /etc/httpd/conf/

# TAR & COMPRESS ALL INFO
tar -cvf /var/backup/conf/confbak.tar /var/backup/tmp/conf/
gzip -f /var/backup/conf/confbak.tar

# COPY TO DAILY CRONTAB (without # sign) TO RUN EVERY DAY
# cp /root/bin/confbak.sh /etc/cron.daily/

# CLEANUP
rm -rf /var/backup/tmp/conf/*


Linux Shell Script backup web apache httpd content

#!/bin/bash
# THE FOLLOWING DIRECTORIES MUST EXIST
# /var/backup
# /var/backup/www
mkdir -p /var/backup/www

# GATHER WEB FILES
tar -cvf /var/backup/www/html.tar /var/www/html/

# COMPRESS ALL INFO
gzip -f /var/backup/www/html.tar

# TO AUTOMATE, COPY TO WEEKLY CRONTAB (without # sign)
# cp /root/bin/wwwbak.sh /etc/cron.weekly/


Set Apache Password Protected Directories With .htaccess File

There are many ways to password-protect directories under the Apache web server. This is important for keeping your files private from both unauthorized users and search engines (when you do not want your data indexed). Here you will see the basics of password-protecting a directory on your server. You can use either of the following methods:
  1. Putting authentication directives in a <Directory> section of your main server configuration file (httpd.conf) is the preferred way to implement this kind of authentication.
  2. If you do not have access to the httpd.conf file (for example, on shared hosting), you can password-protect directories with the help of a file called .htaccess. The .htaccess file provides a way to make configuration changes on a per-directory basis.

In order to create Apache password-protected directories you need:

  • A password file
  • The name of the directory you would like to protect (/var/www/docs)

Step # 1: Make sure Apache is configured to use .htaccess file

You need the AllowOverride AuthConfig directive in httpd.conf for these directives to have any effect. Look for the DocumentRoot <Directory> entry. In this example, our DocumentRoot directory is set to /var/www, so the entry in httpd.conf looks as follows:

<Directory /var/www>
Options Indexes Includes FollowSymLinks MultiViews
AllowOverride AuthConfig
Order allow,deny
Allow from all
</Directory>

Save the file and restart Apache.
If you are using Red Hat/Fedora Linux:

# service httpd restart

If you are using Debian Linux:

# /etc/init.d/apache2 restart

Step # 2: Create a password file with htpasswd

The htpasswd command is used to create and update the flat files (text files) that store usernames and passwords for basic authentication of Apache users. General syntax:
htpasswd -c password-file username
Where,

  • -c : Create the password-file. If password-file already exists, it is rewritten and truncated.
  • username : The username to create or update in password-file. If username does not exist in this file, an entry is added. If it does exist, the password is changed.

Create the directory outside the Apache document root, so that only Apache can access the password file. The password file should be placed somewhere not accessible from the web, so that people cannot download it:

# mkdir -p /home/secure/

Add new user called sysadmin

# htpasswd -c /home/secure/apasswords sysadmin
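If htpasswd is not available on the machine where you manage the file, an equivalent entry can be generated with openssl, whose -apr1 option produces the Apache-specific MD5 format (the username, password and path here are just examples):

```shell
# Write a sysadmin entry in Apache's MD5 (apr1) format
PASSFILE=/tmp/apasswords.example   # illustrative path; use /home/secure/apasswords in practice
printf 'sysadmin:%s\n' "$(openssl passwd -apr1 'S3cret!')" > "$PASSFILE"
cat "$PASSFILE"
```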

Make sure the /home/secure/apasswords file is readable by the Apache web server. If Apache cannot read your password file, it will not authenticate you. You need to set correct permissions using the chown command. Usually Apache runs as the www-data user; use the following command to find out the Apache username. If you are using Debian Linux (apache2.conf), type:
# grep -e '^User' /etc/apache2/apache2.conf

Output:

www-data

Now allow apache user www-data to read our password file:
# chown www-data:www-data /home/secure/apasswords
# chmod 0660 /home/secure/apasswords

If you are using Red Hat or Fedora Core, type the following command:
# grep -e '^User' /etc/httpd/conf/httpd.conf

Output:

apache

Now allow apache user apache to read our password file:
# chown apache:apache /home/secure/apasswords
# chmod 0660 /home/secure/apasswords

Now our user sysadmin is added, but you still need to configure the Apache web server to request a password and tell it which users are allowed access. Let us assume you have a directory called /var/www/docs that you would like to protect with a password.

Create a directory /var/www/docs if it does not exist:
# mkdir -p /var/www/docs

Create the .htaccess file using a text editor:
# cd /var/www/docs
# vi .htaccess

Add the following text:

AuthType Basic
AuthName "Restricted Access"
AuthUserFile /home/secure/apasswords
Require user sysadmin

Save file and exit to shell prompt.

Step # 3: Test your configuration

Open your browser and go to http://yourdomain.com/docs/, http://localhost/docs/ or http://ip-address/docs/.

When prompted for a username and password, supply the username sysadmin and its password. You can also add the following lines to any <Directory> entry in the httpd.conf file:

AuthType Basic
AuthName "Restricted Access"
AuthUserFile /home/secure/apasswords
Require user sysadmin
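If you later add more users to the password file and want all of them admitted, Require valid-user accepts any username present in AuthUserFile instead of a single fixed account:

```shell
AuthType Basic
AuthName "Restricted Access"
AuthUserFile /home/secure/apasswords
Require valid-user
```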

To change a password or set up a new user, use the htpasswd command again.

Troubleshooting

If the password is not accepted, or if you want to troubleshoot authentication-related problems, check the Apache access.log/error.log files:

Fedora Core/CentOS/RHEL Linux log file location:
# tail -f /var/log/httpd/access_log
# tail -f /var/log/httpd/error_log

Debian Linux Apache 2 log file location:
# tail -f /var/log/apache2/access.log
# tail -f /var/log/apache2/error.log
