You will need two programs to achieve this, assuming you already have access to a Linux box and can connect to it using the PuTTY client. The required software packages are:
- PuTTY
- SeaMonkey browser
Follow these steps:
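As a rough sketch, assuming the intent is to tunnel the browser's traffic through the Linux box over an SSH dynamic (SOCKS) forward, which is the usual reason for pairing PuTTY with a browser, the setup boils down to:

# Assumption: open a dynamic (SOCKS) forward on local port 8080 through the Linux box.
# In PuTTY this is Connection -> SSH -> Tunnels, source port 8080, type "Dynamic".
# The OpenSSH equivalent of that PuTTY setting is:
ssh -D 8080 -N user@your-linux-box
# Then point SeaMonkey at SOCKS host localhost, port 8080, under
# Edit -> Preferences -> Advanced -> Proxies (manual proxy configuration).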
This can easily be achieved with the yum configuration file “/etc/yum.conf”. Under the [main] section, define the proxy settings as shown below:
proxy=http://<Proxy-Server-IP-Address>:<Proxy_Port>
proxy_username=<Proxy-User-Name>
proxy_password=<Proxy-Password>
Save and exit the file, then start using the yum command. A sample yum configuration file with proxy settings is shown below:
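The sample below is only a sketch: the non-proxy lines are stock CentOS defaults, and the proxy values are placeholders to be replaced with your own details.

[main]
cachedir=/var/cache/yum/$basearch/$releasever
keepcache=0
debuglevel=2
logfile=/var/log/yum.log
exactarch=1
obsoletes=1
gpgcheck=1
plugins=1
# Proxy settings (placeholders)
proxy=http://192.168.1.10:3128
proxy_username=yumuser
proxy_password=yumpassword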
For Wget
Add or uncomment the following line(s) in the file ~/.wgetrc or /etc/wgetrc:
http_proxy = http://[Proxy_Server]:[port]
https_proxy = http://[Proxy_Server]:[port]
ftp_proxy = http://[Proxy_Server]:[port]
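As a quick sanity check (the download URL here is just a placeholder), wget should now use the proxy without any extra flags. The same settings can also be passed per invocation with -e instead of editing wgetrc:

wget http://example.com/somefile.tar.gz
# or, without touching wgetrc:
wget -e use_proxy=on -e http_proxy=http://[Proxy_Server]:[port] http://example.com/somefile.tar.gz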
For Apt
Create a new configuration file named proxy.conf:
sudo touch /etc/apt/apt.conf.d/proxy.conf
Open the proxy.conf file in a text editor:
sudo vi /etc/apt/apt.conf.d/proxy.conf
Paste in the following:
Acquire {
  HTTP::proxy "http://127.0.0.1:8080";
  HTTPS::proxy "http://127.0.0.1:8080";
}
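If your proxy requires authentication (an assumption, this case is not covered above), the credentials can be embedded in the same URLs:

Acquire {
  HTTP::proxy "http://proxyuser:proxypwd@127.0.0.1:8080";
  HTTPS::proxy "http://proxyuser:proxypwd@127.0.0.1:8080";
}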
For Git:
git config --global http.proxy http://proxyuser:proxypwd@proxy.server.com:8080
- Change proxyuser to your proxy username.
- Change proxypwd to your proxy password.
- Change proxy.server.com to the URL of your proxy server.
- Change 8080 to the proxy port configured on your proxy server.
Note that this works for both http and https repos.
If you decide at any time to reset this proxy and work without one, use the following command:
git config --global --unset http.proxy
Finally, to check the currently set proxy:
git config --global --get http.proxy
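If you would rather scope the proxy to a single repository instead of setting it globally (a variation on the steps above, not a replacement for them), run the same commands inside that repository without --global:

cd /path/to/your/repo
git config http.proxy http://proxyuser:proxypwd@proxy.server.com:8080
# and to remove it again for this repository only:
git config --unset http.proxy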
This will be a transparent Squid proxy for your home or corporate network. It transparently intercepts all HTTP and HTTPS traffic; for HTTPS you will need to push the Squid server's CA certificate to the clients. It has been tested and works without problems with the latest Internet Explorer, Mozilla Firefox and Chrome browsers.
We start by downloading the CentOS 6.5 ISO (x86 or x64) from the CentOS website and installing the base system. Partitioning and software or hardware RAID are up to the user. In this example the hostname is proxy.home.lan and the IP address is 192.168.201.250.
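As a rough preview of where the configuration ends up (the ports, paths and network range here are assumptions, and the directive names follow Squid 3.3/3.4; other versions use a different ssl_bump syntax), the interception part of squid.conf typically looks like this:

# Plain HTTP interception
http_port 3128 intercept
# HTTPS interception with SSL bump, signing forged certificates with our own CA
# (a certificate generator helper, configured via sslcrtd_program, is also required)
https_port 3129 intercept ssl-bump generate-host-certificates=on cert=/etc/squid/ssl_cert/squidCA.pem key=/etc/squid/ssl_cert/squidCA.key
ssl_bump server-first all
# Allow clients from the local network
acl localnet src 192.168.201.0/24
http_access allow localnet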
By default PHP loads and saves sessions to disk. Disk storage has a few problems:
1. Slow IO: Reading from disk is one of the most expensive operations an application can perform, aside from reading across a network.
2. Scale: If we add a second server, neither machine will be aware of sessions on the other.
Enter Memcached
I hinted at Memcached before as a content cache that can improve application performance by preventing trips to the database. Memcached is also perfect for storing session data, and has been supported in PHP for quite some time.
Why use memcached rather than file-based sessions? Memcache stores all of its data using key-value pairs in RAM – it does not ever hit the hard drive, which makes it F-A-S-T. In multi-server setups, PHP can grab a persistent connection to the memcache server and share all sessions between multiple nodes.
Installation
Before beginning, you’ll need to have the Memcached server running. I won’t get into the details for building and installing the program as it is different in each environment. On Ubuntu it’s as easy as aptitude install memcached. Most package managers have a memcached installation available.
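Once the memcached daemon is running, pointing PHP's session handler at it is a small php.ini change. The snippet below is a sketch assuming the pecl memcache extension; with the newer memcached extension the handler name is memcached and the tcp:// prefix is dropped.

; in php.ini (the exact path varies by distro)
session.save_handler = memcache
session.save_path = "tcp://127.0.0.1:11211"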
SafeSquid, a content filtering Internet proxy, has many content filtering features that can be used to decide who is allowed what, when and how much on the net. In this tutorial I will describe how to control access to unwanted categories of websites by using the URL Blacklist database with SafeSquid Proxy Server.
Note: Also see the following articles:
- Deploying A Content Filtering Proxy Server To Distribute Controlled Internet Access With SafeSquid
- Set Up Gateway Level Virus Security With ClamAV And SafeSquid Proxy
- How To Set Up Internet Access Control And Internet Filtering With SafeSquid Proxy Server
SafeSquid allows administrators to use a plain-text URL blacklist very easily and with the desired level of sophistication. The sites http://www.shallalist.de/ and http://www.urlblacklist.com maintain well-categorized lists of websites and pages under categories like porn, adult, webmail, jobsearch, entertainment, etc. This is an excellent resource for an administrator seeking to granularly enforce a corporate policy that allows or disallows only certain kinds of websites for specific users, groups or networks.
Note: cProfiles offers the flexibility of many more actions than URL Blacklist, instead of just allowing / blocking categories. For example, you can add a profile to a specific category, and then use that profile in any of SafeSquid's filtering sections, for actions on the category like blocking cookies, ads and banners, ActiveX, Java Scripts, throttling bandwidth (QoS), or simply analyzing what category is most visited, without blocking access.
For Details, see http://www.safesquid.com/html/portal.php?page=132
While Shalla Secure Services offer free downloads and updates for home users, Urlblacklist requires you to subscribe to receive updates. You can download the URL Blacklist by Shalla from HERE, and the trial database by urlblacklist.com from HERE.
Please note that you will be able to download this trial database only once. You need to subscribe to urlblacklist.com to receive regular updates.
Copy the downloaded trial database to the /usr/local/src directory on the SafeSquid server, and untar the files:
cd /usr/local/src
tar -zxvf bigblacklist.tar.gz
This will create a directory 'blacklist'. Create a directory 'urlbl' in /opt/safesquid and copy the contents of blacklist into it.
mkdir /opt/safesquid/urlbl
cd blacklist
cp -rf . /opt/safesquid/urlbl
Next, restart SafeSquid
/etc/init.d/safesquid restart
In the SafeSquid GUI interface, click on URL blacklist in the top menu. It should display a list of all the categories copied to the urlbl directory. Here you can query the database to find out whether a website is listed under any category. For example, to find out what category hackerstuff.com belongs to, type hackerstuff.com in the Domain field and click on Submit below. You should get a screen similar to this:
Note: This section only allows you to query the database. Selecting or unselecting a category does not enable or disable it.
cProfiles: Real-Time Website Profiler
How cProfiles works