Use Putty as SOCKS proxy

You’ll need two programs to achieve this, assuming you already have access to a Linux box and can connect to it using the PuTTY client. The needed software packages are:

  1. PuTTY, and
  2. SeaMonkey Browser

Follow these steps:

  • Run PuTTY
  • Go to Connection > SSH > Tunnels and configure the tunnel:
    • Tick "Local ports accept connections from other hosts"
    • Tick "Remote ports do the same (SSH-2 only)"
    • Select "Dynamic" for the destination type — this is the key point
    • Type "8443" (I'll be using port 8443 for the tunnel) into "Source port" and click "Add". After clicking "Add", the middle text box shows "D8443"
  • Now connect to the SSH server
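The same dynamic (SOCKS) tunnel can be sketched from the command line with plink, PuTTY's command-line client, or with OpenSSH; the user and host names below are placeholders for your own Linux box:

```shell
# Dynamic (SOCKS) forward on local port 8443 using PuTTY's plink
# "user@ssh-server.example.com" is a placeholder
plink -N -D 8443 user@ssh-server.example.com

# Equivalent with OpenSSH
ssh -N -D 8443 user@ssh-server.example.com
```

Then point the browser at the tunnel as a SOCKS proxy on localhost port 8443 (in SeaMonkey this is typically under Preferences > Advanced > Proxies > Manual proxy configuration).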


How to enable Proxy Settings for Yum Command on RHEL / CentOS Servers

This can easily be achieved with the yum config file /etc/yum.conf. Under the [main] section, define the proxy settings like below:

[main]
………………
proxy=http://<Proxy-Server-IP-Address>:<Proxy_Port>
proxy_username=<Proxy-User-Name>
proxy_password=<Proxy-Password>
………………

Save and exit the file, then start using the yum command. A sample yum config file with proxy settings is shown below:
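As an illustration, a complete /etc/yum.conf with proxy settings might look like this; the non-proxy lines are stock CentOS defaults, and the proxy address and credentials are placeholders:

```ini
[main]
cachedir=/var/cache/yum/$basearch/$releasever
keepcache=0
debuglevel=2
logfile=/var/log/yum.log
exactarch=1
obsoletes=1
gpgcheck=1
plugins=1
# Proxy settings
proxy=http://192.168.1.10:3128
proxy_username=yumuser
proxy_password=yumpass
```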


Wget, Apt and Git behind proxy

For Wget

Add or uncomment the following line(s) in the file ~/.wgetrc (per user) or /etc/wgetrc (system-wide):

http_proxy = http://[Proxy_Server]:[port]
https_proxy = http://[Proxy_Server]:[port]
ftp_proxy = http://[Proxy_Server]:[port]
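These lines can also be appended non-interactively. In this sketch, proxy.example.com:3128 is a placeholder for your proxy server and port, and use_proxy = on is added so wget honors the entries:

```shell
# Append proxy settings to the current user's wget configuration.
# Replace proxy.example.com:3128 with your proxy server and port.
cat >> ~/.wgetrc <<'EOF'
use_proxy = on
http_proxy = http://proxy.example.com:3128
https_proxy = http://proxy.example.com:3128
ftp_proxy = http://proxy.example.com:3128
EOF

# Confirm the settings were written
grep proxy ~/.wgetrc
```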

For Apt

Create a new configuration file named proxy.conf:

sudo touch /etc/apt/apt.conf.d/proxy.conf

Open the proxy.conf file in a text editor:

sudo vi /etc/apt/apt.conf.d/proxy.conf

Paste in the following:

Acquire {
  HTTP::proxy "http://127.0.0.1:8080";
  HTTPS::proxy "http://127.0.0.1:8080";
}
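Alternatively, the file can be created in one step and the result checked. This sketch reuses the 127.0.0.1:8080 address from the example above, written in apt's single-line Acquire::http::Proxy form, which apt treats the same as the block form:

```shell
# Write the proxy configuration non-interactively (requires root)
printf '%s\n' \
  'Acquire::http::Proxy "http://127.0.0.1:8080";' \
  'Acquire::https::Proxy "http://127.0.0.1:8080";' \
  | sudo tee /etc/apt/apt.conf.d/proxy.conf

# Verify that apt sees the proxy settings
apt-config dump | grep -i proxy
```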

For Git:

git config --global http.proxy http://proxyuser:proxypwd@proxy.server.com:8080

  • Change proxyuser to your proxy user
  • Change proxypwd to your proxy password
  • Change proxy.server.com to the URL of your proxy server
  • Change 8080 to the proxy port configured on your proxy server

Note that this works for both http and https repos.
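An HTTPS-specific proxy can also be set separately through git's https.proxy key; the credentials and host below are the same placeholders as above:

```shell
# Optional: a distinct proxy for https:// remotes
git config --global https.proxy http://proxyuser:proxypwd@proxy.server.com:8080

# Remove it again when no longer needed
git config --global --unset https.proxy
```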

If you decide at any time to reset this proxy and work without one, use:

git config --global --unset http.proxy

Finally, to check the currently set proxy:

git config --global --get http.proxy

Perfect Squid with Transparent proxy and SSL log

This will be a transparent Squid proxy for your home or corporate network. It will transparently intercept all HTTP and HTTPS traffic; for HTTPS, you will need to push the CA certificate of the Squid server to the clients. It has been tested to work without problems with the latest Internet Explorer, Mozilla Firefox and Chrome browsers.

STEP 1 – Installing the base system, upgrading it, and disabling iptables and SELinux

We start by downloading the CentOS 6.5 ISO from the CentOS website (x86 or x64): CentOS 6.5 ISO's. Install the base system; partitioning and software or hardware RAID are up to the user. In this example the hostname is proxy.home.lan and the IP address is 192.168.201.250.
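The iptables and SELinux changes mentioned in the step title can be sketched as follows on CentOS 6 (run as root; disabling them is only appropriate if you understand the security trade-off):

```shell
# Stop the firewall now and keep it off across reboots
service iptables stop
chkconfig iptables off

# Put SELinux into permissive mode for the running system...
setenforce 0
# ...and disable it permanently in the config file
sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /etc/selinux/config
```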


Install Memcache

By default PHP loads and saves sessions to disk. Disk storage has a few problems:

1. Slow IO: Reading from disk is one of the most expensive operations an application can perform, aside from reading across a network.
2. Scale: If we add a second server, neither machine will be aware of sessions on the other.

Enter Memcached
I hinted at Memcached before as a content cache that can improve application performance by preventing trips to the database. Memcached is also perfect for storing session data, and has been supported in PHP for quite some time.

Why use memcached rather than file-based sessions? Memcache stores all of its data using key-value pairs in RAM – it does not ever hit the hard drive, which makes it F-A-S-T. In multi-server setups, PHP can grab a persistent connection to the memcache server and share all sessions between multiple nodes.

Installation
Before beginning, you’ll need to have the Memcached server running. I won’t get into the details for building and installing the program as it is different in each environment. On Ubuntu it’s as easy as aptitude install memcached. Most package managers have a memcached installation available.
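Once memcached is running, PHP can be told to store sessions in it via php.ini. The lines below assume the classic memcache PECL extension and memcached listening on its default port 11211; with the newer memcached extension, the handler name is memcached and the save_path drops the tcp:// prefix:

```ini
; Store PHP sessions in memcached instead of on disk
session.save_handler = memcache
session.save_path = "tcp://127.0.0.1:11211"
```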


How To Control Access To Unwanted Websites Using URL Blacklist With SafeSquid Proxy Server

SafeSquid, the Content Filtering Internet Proxy, has many content filtering features that can be used to decide who is allowed what, when, and how much on the net. In this tutorial I will describe how to control access to unwanted categories of websites by using the URL Blacklist database with SafeSquid Proxy Server.

Note: Also see the following articles :
'Deploying A Content Filtering Proxy Server To Distribute Controlled Internet Access With SafeSquid'
Set Up Gateway Level Virus Security With ClamAV And SafeSquid Proxy
How To Set Up Internet Access Control And Internet Filtering With SafeSquid Proxy Server

SafeSquid allows administrators to use a plain-text URL blacklist very easily and with the desired level of sophistication. The sites http://www.shallalist.de/ and http://www.urlblacklist.com maintain well-categorized lists of websites and pages (porn, adult, webmail, jobsearch, entertainment, etc.). This is an excellent resource for an administrator seeking to granularly enforce a corporate policy that allows or disallows only certain kinds of websites for specific users, groups or networks.

Note: cProfiles offers the flexibility of many more actions than URL Blacklist, instead of just allowing / blocking categories. For example, you can add a profile to a specific category, and then use that profile in any of SafeSquid's filtering sections, for actions on the category like blocking cookies, ads and banners, ActiveX, Java Scripts, throttling bandwidth (QoS), or simply analyzing what category is most visited, without blocking access.
For Details, see http://www.safesquid.com/html/portal.php?page=132

While Shalla Secure Services offer free downloads and updates for home users, Urlblacklist requires you to subscribe to receive updates. You can download the URL Blacklist by Shalla from HERE, and the trial database by urlblacklist.com from HERE.
Please note that you will be able to download this trial database only once. You need to subscribe to urlblacklist.com to receive regular updates.

Copy the downloaded trial database to the /usr/local/src directory on the SafeSquid server, and untar the files:

cd /usr/local/src
tar -zxvf bigblacklist.tar.gz

This will create a directory 'blacklist'. Create a directory 'urlbl' in /opt/safesquid and copy the contents of blacklist in this directory.

mkdir /opt/safesquid/urlbl
cd blacklist
cp -rf . /opt/safesquid/urlbl

Next, restart SafeSquid

/etc/init.d/safesquid restart

In the SafeSquid GUI interface, click on URL Blacklist in the top menu. It should display a list of all the categories copied to the urlbl directory. Here, you can query the database to find out whether a website is listed under any category. For example, to find out what category hackerstuff.com belongs to, type hackerstuff.com in the Domain field and click Submit. You should get a screen similar to this:

SafeSquid Interface – URL Blacklist Database Query

Note: This section only allows you to query the database. Selecting or unselecting a category does not enable or disable it. 


URL Filtering using cProfiles

cProfiles: Real-Time Website Profiler

 
cProfiles provides SafeSquid users with a much-needed mechanism for classifying websites into one or more categories. Usage is very simple, yet it allows security managers a lot of room to handle challenges rather inventively.
Over 3 million websites have been classified into a variety of categories like news, webmail, adult, porn, arts, etc. Policy makers can create rules to determine if a website belongs to one or more categories, "ADD PROFILE" of their choice, say "NOT_BUSINESS", and then use this profile in any of SafeSquid's other sections, such as URL Filter, MIME Filter or Cookie Filter, to allow or deny the transaction as per enterprise policy.
Categories:
  • Ads
  • Adult
  • Adult Education
  • Arts
  • Chat
  • Drugs
  • Education
  • Fileshare
  • Finance
  • Gambling
  • Games
  • Government
  • Hacking
  • Hate
  • Highrisk
  • Housekeeping
  • Instant Messaging
  • Jobs
  • Leisure
  • Mail
  • Multimedia
  • News
  • Porn
  • Proxy
  • Search Engines
  • Shopping
  • Social
  • Sports
  • System Utilities
  • Travel
  • Business

How cProfiles works

Policy makers can configure cProfiles to "add a profile" to a request for any website that is listed under one or more categories. Whenever a user requests any website, the cProfiles module verifies whether the website is listed under the specified categories. It first checks its cache for an entry; if one is found, cProfiles adds the profile to the request instantly. If not, the cProfiles module sends a query to SafeSquid's Content Categorization Service (CCS). cProfiles uses DNS technology to query the CCS, which naturally updates all the en-route caching nameservers. So even if you restart SafeSquid, the resolutions will be quickly retrieved from the nearest DNS provider.
Unlike legacy technologies that force users to store huge databases, cProfiles caches only 'really visited' websites and therefore uses very little in the way of system resources. Since the categorization happens in real time, users do not have to regularly download updates to keep their database up to date.
The CCS was initially seeded with a little over 3 million websites. CCS has been built with a unique self-learning technology that allows it to build a list of websites that must be categorized for the benefit of its users; CCS then automatically creates "suggested classifications" for these websites in real time. These results are then validated by human editors on an hourly basis, allowing the data to be instantly usable by real users.
To learn how to use cProfiles with SafeSquid, see the cProfiles Documentation.
