Squid as Transparent Proxy on CentOS 6.4

In this tutorial I am going to configure Squid as a transparent proxy. What does that mean? It means there is no configuration at all on the client end: Squid is simply set up in transparent mode so that it sits between the clients and the Internet and redirects their port 80 requests to port 3128, Squid's default port. Here are the simple steps you need to perform on the Squid server.

Lab Environment: 

  • CentOS 6.4 (as the Squid transparent proxy server), Hostname = pxy.lintut.com
  • eth0 (connected to the Internet): IP = 192.168.1.211/24, Gateway = 192.168.1.1, DNS = 8.8.8.8
  • eth1 (connected to the LAN): IP = 10.0.0.1/24, DNS = 172.0.0.1
  • Windows XP Pro SP3 (client PC for testing): Hostname = xp1.lintut.com, IP = 10.0.0.11/8, Gateway = 10.0.0.1 (the Squid server's IP), DNS = 10.0.0.3

Step-1: Installing the Squid packages.

yum install squid -y

Step-2: Edit the Squid configuration file ‘/etc/squid/squid.conf’.
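
The excerpt ends before the configuration itself, so here is a minimal sketch of what the transparent-proxy setup described above typically looks like with Squid 3.1 (the version shipped with CentOS 6), using the lab addresses from this tutorial; the ACL name is illustrative:

# /etc/squid/squid.conf - transparent-proxy sketch
# ("intercept" replaced the older "transparent" keyword in Squid 3.1)
http_port 3128 intercept
acl localnet src 10.0.0.0/24
http_access allow localnet
http_access deny all

On the same box, port 80 traffic arriving from the LAN is redirected to Squid, the remaining traffic is NATed out through eth0, and packet forwarding is enabled:

iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-port 3128
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
echo 1 > /proc/sys/net/ipv4/ip_forward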

Perfect Squid with Transparent proxy and SSL log

This will be a transparent Squid proxy for your home or corporate network. It will transparently intercept all HTTP and HTTPS traffic; for HTTPS you will need to push the CA certificate of the Squid server to the clients. It has been tested to work without problems with the latest Internet Explorer, Mozilla Firefox and Chrome browsers.
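
The excerpt below stops after the base install, but the CA certificate mentioned above is usually a self-signed certificate generated for Squid's SSL interception, roughly like this (the file names are illustrative, not taken from the article):

openssl req -new -newkey rsa:2048 -days 3650 -nodes -x509 -keyout squidCA.pem -out squidCA.pem
# export the certificate alone (without the private key) in DER form for importing into client browsers
openssl x509 -in squidCA.pem -outform DER -out squidCA.der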

STEP 1 – Installing the base system, upgrading it and disabling iptables and SELinux

We start by downloading the CentOS 6.5 ISO from the CentOS website (x86 or x64): CentOS 6.5 ISO's. Install the base system; partitioning and software or hardware RAID are up to the user. In this example the hostname is proxy.home.lan and the IP address is 192.168.201.250.
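
The commands themselves are not shown in this excerpt, but disabling iptables and SELinux on CentOS 6 is normally done along these lines:

service iptables stop
chkconfig iptables off
setenforce 0
# make the SELinux change permanent across reboots
sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /etc/selinux/config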

Convert squid timestamps

When you work with a Squid access log file, you sometimes want to know when a site or resource was accessed. Squid does not store the date and time information in a human-readable format.

It is stored as <unix timestamp>.<milliseconds>, so you can use a command like the following to post-process the log and make it more readable:

cat access.log | perl -p -e 's/^([0-9]*)/"[".localtime($1)."]"/e'
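
If Perl is not at hand, a roughly equivalent one-liner with GNU awk (a sketch relying on gawk's strftime extension) is:

gawk '{ $1 = strftime("[%d/%b/%Y:%H:%M:%S]", int($1)); print }' access.log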

Stunnel on Debian/Ubuntu with Squid

What is Stunnel?

The Stunnel program is designed to work as an SSL encryption wrapper between a remote client and a local (inetd-startable) or remote server. It can be used to add SSL functionality to commonly used inetd daemons like POP2, POP3 and IMAP servers without any changes to the programs' code.

What Stunnel basically does is turn any insecure TCP port into a secure, encrypted port using the OpenSSL package for cryptography. It is somewhat like a small secure VPN that runs on specific ports.
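
To make that concrete before the installation steps, a minimal stunnel service definition that wraps Squid's port might look like the sketch below; the service name, ports and certificate path are assumptions, not values taken from this article:

; /etc/stunnel/stunnel.conf - accept SSL connections and forward them in plaintext to the local Squid
cert = /etc/stunnel/stunnel.pem
[squid]
accept = 8888
connect = 127.0.0.1:3128

On Debian/Ubuntu the stunnel4 package also needs ENABLED=1 set in /etc/default/stunnel4 before its init script will start the tunnel.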

Step 1: Create an Ubuntu Droplet

So far I have tested it on Ubuntu 12.04 x32/x64, Ubuntu 12.10 x32/x64, Ubuntu 13.04 x32/x64.

Step 2: Update and Upgrade Ubuntu

Use these commands to update Ubuntu's package list and upgrade the existing packages to their latest versions:

apt-get update
apt-get upgrade
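
The excerpt is cut off here; on the Ubuntu releases listed above the next step would presumably be to install the two packages, for example:

apt-get install stunnel4 squid3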

How To Control Access To Unwanted Websites Using URL Blacklist With SafeSquid Proxy Server

SafeSquid, a content filtering Internet proxy, has many content filtering features that can be used to decide who is allowed what, when and how much on the net. In this tutorial I will describe how to control access to unwanted categories of websites by using the URL Blacklist database with SafeSquid Proxy Server.

Note: Also see the following articles:
  • Deploying A Content Filtering Proxy Server To Distribute Controlled Internet Access With SafeSquid
  • Set Up Gateway Level Virus Security With ClamAV And SafeSquid Proxy
  • How To Set Up Internet Access Control And Internet Filtering With SafeSquid Proxy Server

SafeSquid allows administrators to use plain-text URL blacklists very easily and with the desired level of sophistication. The sites http://www.shallalist.de/ and http://www.urlblacklist.com maintain well-categorized lists of websites and pages, with categories like porn, adult, webmail, jobsearch, entertainment, etc. These are excellent resources for an administrator seeking to granularly enforce a corporate policy that allows or disallows only certain kinds of websites for specific users, groups or networks.

Note: cProfiles offers the flexibility of many more actions than URL Blacklist, instead of just allowing or blocking categories. For example, you can add a profile to a specific category and then use that profile in any of SafeSquid's filtering sections, for actions on the category such as blocking cookies, ads and banners, ActiveX or JavaScript, throttling bandwidth (QoS), or simply analyzing which category is visited most, without blocking access.
For details, see http://www.safesquid.com/html/portal.php?page=132

While Shalla Secure Services offers free downloads and updates for home users, urlblacklist.com requires you to subscribe to receive updates. You can download the URL blacklist by Shalla from HERE, and the trial database from urlblacklist.com from HERE.
Please note that you will be able to download this trial database only once; you need to subscribe to urlblacklist.com to receive regular updates.

Copy the downloaded trial database to the /usr/local/src directory on the SafeSquid server, and untar the file:

cd /usr/local/src
tar -zxvf bigblacklist.tar.gz

This will create a directory 'blacklist'. Create a directory 'urlbl' in /opt/safesquid and copy the contents of blacklist into it:

mkdir /opt/safesquid/urlbl
cd blacklist
cp -rf . /opt/safesquid/urlbl

Next, restart SafeSquid

/etc/init.d/safesquid restart

In the SafeSquid GUI interface, click on URL blacklist in the top menu. It should display a list of all the categories copied to the urlbl directory. Here you can query the database to find out whether a website is listed under any category. For example, to find out which category hackerstuff.com belongs to, type hackerstuff.com in the Domain field and click Submit below. You should get a screen similar to this –

SafeSquid Interface – URL Blacklist Database Query

Note: This section only allows you to query the database. Selecting or unselecting a category does not enable or disable it. 

URL Filtering using cProfiles

cProfiles: Real-Time Website Profiler

 
cProfiles provides SafeSquid users with a much-needed mechanism for classifying websites into one or more categories. Usage is very simple, yet it allows security managers plenty of room to handle challenges rather inventively.
Over 3 million websites have been classified into a variety of categories like news, webmail, adult, porn, arts, etc. Policy makers can create rules that determine whether a website belongs to one or more categories, "ADD PROFILE" of their choice, say "NOT_BUSINESS", and then use this profile in any of SafeSquid's other sections, such as the URL Filter, MIME Filter or Cookie Filter, to allow or deny the transaction as per enterprise policy.
 
Categories:
  • Ads
  • Adult
  • Adult Education
  • Arts
  • Chat
  • Drugs
  • Education
  • Fileshare
  • Finance
  • Gambling
  • Games
  • Government
  • Hacking
  • Hate
  • Highrisk
  • Housekeeping
  • Instant Messaging
  • Jobs
  • Leisure
  • Mail
  • Multimedia
  • News
  • Porn
  • Proxy
  • Search Engines
  • Shopping
  • Social
  • Sports
  • System Utilities
  • Travel
  • Business

How cProfiles works

Policy makers can configure cProfiles to "add a profile" to a request for any website that is listed under one or more categories. Whenever a user requests a website, the cProfiles module checks whether the website is listed under the specified categories. It first checks its cache for an entry. If the entry is found in the cache, cProfiles adds the profile to the request instantly. If the entry is not found in the cache, the cProfiles module sends a query to SafeSquid's Content Categorization Service (CCS). cProfiles uses DNS technology to query the CCS, which naturally updates all the caching nameservers en route, so even if you restart SafeSquid, the resolutions are quickly retrieved from the nearest DNS server.
Unlike legacy technologies that force users to store huge databases, cProfiles caches only the websites that are actually visited and therefore uses very few system resources. Since categorization happens in real time, users do not have to download regular updates to keep their database current.
The CCS was initially seeded with a little over 3 million websites. CCS was built with a self-learning technology that lets it build a list of websites that should be categorized for the benefit of its users; CCS then automatically creates "suggested classifications" for these websites in real time. The results are validated by human editors on an hourly basis, making the data instantly usable by real users.
To learn how to use cProfiles with SafeSquid, see the cProfiles Documentation.
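
Conceptually, a DNS-based categorization lookup works much like a DNSBL query: the requested hostname is prepended to the service's zone and the DNS answer encodes the categories. The zone name below is purely hypothetical and only illustrates the mechanism; the real CCS zone is configured by SafeSquid itself.

dig +short TXT hackerstuff.com.ccs.example.net   # hypothetical zone, for illustration only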

URLfilterDB and ufdbGuard

Together, URLfilterDB and ufdbGuard offer a unique set of features, all aimed at protecting your private network and reducing network bandwidth usage. These features include:

  • URL filtering
  • Advertisements blocking
  • HTTPS proxy tunnel protection
  • Blocking adult images produced by search engines
  • Controlling HTTPS usage

 

URL filtering
 
There are three methods available to block unwanted web content:

 

  • Content scanning: this method blocks access to web pages based on the occurrence of “bad” words in the content.
  • Artificial intelligence: a variant on content scanning, intended to render more accurate results.
  • Blacklists: this method blocks access to web pages when they are listed in a website category that has been configured to be blocked.

URLfilterDB uses the last method, since it is the fastest and most accurate method of URL filtering available. Navigate to our product comparison page to find out why and how the blacklist filtering method makes URLfilterDB superior to competitors' products.
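
On the Squid side the blacklist is enforced by ufdbGuard, which hooks into Squid as a URL rewriter. A minimal sketch of the squid.conf hook, assuming a default ufdbGuard install path, might look like this:

# hand each request to the ufdbGuard client for a category lookup
# (the install path is an assumption; adjust it to your ufdbGuard installation)
url_rewrite_program /usr/local/ufdbguard/bin/ufdbgclient
url_rewrite_children 16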
