Tuesday, September 28, 2010

32-bit Mode Worker Processes

One of the benefits of the x64 platform is the larger virtual address space, which makes more memory available. Even so, we recommend configuring IIS to use 32-bit worker processes on 64-bit Windows: compatibility is better than with native 64-bit worker processes, and performance and memory consumption are often better as well.

Set the Web server to enable this globally so that you do not need to set it for each application pool. Unlike previous versions of IIS, you can now configure specific worker processes to run in 32-bit or 64-bit mode independently of each other on the same server.

To configure 32-bit mode for the server

  • Open a command prompt and run the following:

%windir%\system32\inetsrv\appcmd set config -section:applicationPools -applicationPoolDefaults.enable32BitAppOnWin64:true

Note: This setting applies only to 64-bit servers.

To configure 32-bit mode for the server using the IIS PowerShell Provider

  • Open a PowerShell prompt and run the following:
set-itemproperty iis:\ -name applicationPoolDefaults.enable32BitAppOnWin64 -value true

IIS7 - Running 32-bit and 64-bit ASP.NET versions at the same time on different worker processes

In IIS 6, this was a pain point. On a 64-bit Windows Server 2003 machine, you could not run worker processes in 32-bit mode and 64-bit mode at the same time; only one of the two was possible.


In IIS 6.0 this was controlled by the following metabase key, applied to the W3SVC/AppPools node:

W3SVC/AppPools/enable32BitAppOnWin64 : true | false

(See the IIS 6.0 documentation for details on that approach.) In IIS7, however, you can run 32-bit and 64-bit worker processes simultaneously. Let's see how to do it.

The same enable32BitAppOnWin64 property exists for applicationPools in applicationHost.config, but now you can set it for individual application pools. Below is a sample configuration:

<applicationPools>

  <add name="MyAppPool32bit" autoStart="true" enable32BitAppOnWin64="true" />

  <add name="MyAppPool64bit" autoStart="true" enable32BitAppOnWin64="false" />

  <applicationPoolDefaults>
    <processModel identityType="NetworkService" />
  </applicationPoolDefaults>

</applicationPools>

Below is how you do it from IIS7 manager:

  1. Right-click the application pool and select “Advanced Settings…”, or choose the same command from the Actions pane after selecting the application pool
  2. Change “Enable 32-bit Applications” to True (if you want the application pool's worker process to run in 32-bit mode)
  3. Click OK

Below is how you do it from AppCmd:

appcmd apppool set /apppool.name:MyAppPool32bit /enable32BitAppOnWin64:true

appcmd apppool set /apppool.name:MyAppPool32bit /enable32BitAppOnWin64:false

Note: by default, it is false.

Most of you may already know how to check whether a process is really running in 32-bit mode on a 64-bit OS. The simple way is to open Task Manager and go to the Processes tab: 32-bit processes are listed with a *32 suffix after the image name (for example, w3wp.exe *32).

Now, you may ask how the correct version of each DLL is picked up automatically. Open your applicationHost.config and search for aspnet_filter; you will find one filter entry per bitness.

The preCondition="bitness32" or "bitness64" attribute decides which ISAPI filter is loaded for the corresponding mode. The same applies to any DLL used, for example ISAPI filters, modules, etc.
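For illustration, the relevant applicationHost.config entries typically look like the sketch below. Treat it as an example only: the filter names, .NET version numbers and paths vary with the Framework version installed on your server.

```xml
<isapiFilters>
  <!-- 64-bit worker processes load the Framework64 copy of the filter -->
  <filter name="ASP.Net_2.0.50727-64"
          path="%windir%\Microsoft.NET\Framework64\v2.0.50727\aspnet_filter.dll"
          enableCache="true"
          preCondition="runtimeVersionv2.0,bitness64" />
  <!-- 32-bit worker processes load the Framework (x86) copy instead -->
  <filter name="ASP.Net_2.0.50727.0"
          path="%windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_filter.dll"
          enableCache="true"
          preCondition="runtimeVersionv2.0,bitness32" />
</isapiFilters>
```

IIS evaluates each filter's preCondition against the worker process, so the same configuration serves both pool types side by side.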

Monday, September 27, 2010

How To Change or Rename Your Username in Ubuntu 10.04 Lucid Lynx

This brief tutorial will show you how to quickly change your username in Ubuntu Lucid. Before going any further, you must understand one rule in Ubuntu when it comes to changing usernames: you cannot rename an account while you are logged into it. You must log in with a different account to change it.

Getting started:

After logging in with a different account, go to Applications –> Accessories –> Terminal.

Then run the command below to change the username and the user's real name.

sudo usermod -c "Real Name" -l New_Username Old_Username

Note that -l changes only the login name. If you also want the home directory and primary group to match, follow up with sudo usermod -d /home/New_Username -m New_Username and sudo groupmod -n New_Username Old_Username.


Next, verify that the name change succeeded (for example, run id New_Username).


Remember that the account you log in with must also have the right to run the sudo command; that is, the user must be an administrator or a member of the administrators group.

Apache Web Server Security

The increase in cyber attacks on high-profile online business websites shows that web security still needs to be addressed. Exploits of web server vulnerabilities typically have a more disastrous and visible impact: while web application vulnerabilities may give a malicious user access to a database and other forms of stored data, web server software vulnerabilities can give access to the operating system itself, potentially compromising the whole network.

A brief introduction to Apache web server

The Apache Foundation released Apache version 2 in 2002, following the success of Apache Web Server version 1. Version 2 was almost a total rewrite, focused mostly on further modularization and development of the Apache core. It brought several improvements, including Unix threading, IPv6 support and, most importantly, better support for non-Unix platforms such as Microsoft Windows. These improvements helped Apache become the first web server to host over 100 million websites and web applications on the internet, making it the most widely used web server.

Securing Apache web server

The following suggestions will go a long way toward improving the security of an Apache web server installation. Applying the 'less is better' rule and disabling a number of modules hardens the server, but it is not enough on its own: you should still apply all security patches, in case a disabled module is re-enabled in the future.

Limit server functionality

One must first be aware of which function or functions the web server will be used for. Will it serve HTML pages only, or also execute a number of scripts? Enabling support for PHP, ASP.NET and other similar web technologies only increases the attack surface available to a malicious user, for example through vulnerabilities in a specific web server module. Therefore, one should only enable support for web technologies that are actually going to be used.

Limit access to operating system and its files

The operating system on which the Apache web server runs must also be hardened (see the separate guidance on securing a web server operating system). It is important that the Apache web server process runs under a unique user ID that is not used by any other system process. Apache processes must also have limited access to the operating system files, which is achieved by chrooting: creating a new root directory structure into which all Apache daemon files are moved. As a result of the chrooting process, the Apache web server daemon only has access to the new directory structure. No shell programs (e.g. /bin/sh, /bin/csh) should be present in Apache's chrooted environment. This gives the web server immunity to a large number of existing exploits.
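The unique-user-ID advice can be sketched as two httpd.conf directives. The account name apache below is an assumption: create a dedicated account with no login shell and substitute whatever name fits your conventions.

```apache
# httpd.conf: run Apache worker processes as a dedicated, unprivileged account
User apache
Group apache
```

Combined with chrooting, this limits what a compromised worker process can read or execute.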

Disable unnecessary web server modules

Apache ships with a number of modules pre-enabled to be more user-friendly, a bit of a break from its Unix past of providing only the essentials. Here the 'less is better' rule should be applied with extra diligence, and the choice of which modules to enable is probably the most important step: an administrator avoids potential break-ins simply by having unnecessary modules disabled when new vulnerabilities are found in them. Review every pre-enabled module to confirm whether it is needed, and if not, disable it.

Tighten the Apache configuration

The default Apache configuration contains a large number of directives that are not used in a typical scenario. One can safely switch off or disable the directives listed below if not being used:

  • directory indexes
  • unnecessary default 'Alias' and 'ScriptAlias' entries
  • handlers (leave only the handlers you will be using; remove all others)
  • directory options such as 'FollowSymLinks' (if no symbolic links are used in the web directories)

Friendly, minimal web server error messages should also be configured, so that the least possible information about the Apache installation and configuration is disclosed. If possible, the web server banner should also be obfuscated (security by obscurity).
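The tightening above can be sketched in httpd.conf roughly as follows. The directory path and the exact option set are assumptions; tailor them to your own site layout.

```apache
# Illustrative hardening directives
ServerTokens Prod                      # banner reports only "Apache"
ServerSignature Off                    # no version footer on server-generated pages
<Directory "/var/www/html">
    Options -Indexes -FollowSymLinks   # no directory listings, no symlink traversal
    AllowOverride None
</Directory>
```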

Disable Server Side Includes

SSIs bring a number of potential security risks with them. Most importantly, SSI-enabled web documents can severely increase the load on the server; on high-traffic websites or in a shared web hosting environment, that load can become very significant. Furthermore, server side includes pose the same risks associated with CGI (Common Gateway Interface) scripts in general: SSI-enabled files can execute any CGI script or program under the permissions of the user the Apache web server runs as, posing a huge security risk. A number of web technologies that avoid SSI and move the required processing to the client/browser side are available today; a webmaster should take advantage of them.
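One hedged way to express this in configuration (the directory path is an assumption) is to disable includes entirely, or at least forbid them from executing programs:

```apache
# Per-directory SSI lockdown (illustrative)
<Directory "/var/www/html">
    Options -Includes
    # Alternatively: Options +IncludesNOEXEC
    # allows SSI but forbids #exec and CGI invocation via SSI
</Directory>
```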

Monitor log files

Proper web application access and web server operation logging should be enabled, and such logs should not be placed in the web server root directory. As with any other internet service, it is important to check all of Apache's log files regularly. Analysing past events in web server log files gives the administrator a good idea of the attack trends being followed, and helps identify where the web server needs tightening up.
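As a tiny, hedged illustration of what log review can catch: the file name, log lines and the traversal pattern below are made up for this sketch. Real Apache access logs live wherever your CustomLog directive points, e.g. /var/log/apache2/access.log.

```shell
# Create a two-line sample log in Apache "combined" format for illustration
cat > sample_access.log <<'EOF'
10.0.0.1 - - [28/Sep/2010:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"
10.0.0.2 - - [28/Sep/2010:10:00:01 +0000] "GET /../../etc/passwd HTTP/1.1" 404 208 "-" "BadBot/1.0"
EOF
# Count requests containing a path-traversal attempt
grep -c '\.\./' sample_access.log
```

The same kind of one-line filter, run against a real log, quickly surfaces probes worth investigating.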

Install all Apache web server updates

From time to time, the Apache Foundation releases patches. It is always good practice to keep the Apache web server installation up to date, whether a given release is a security patch or a software update. Subscribe to a mailing list such as 'Apache Server Announcements', hosted by the Apache Foundation, to receive notifications when updates and patches are available.

Stay informed and frequently check the web server configuration

A number of books, online security guidelines, articles and white papers are available to help administrators secure an Apache web server installation and implement the above security suggestions, which are fairly universal. It is suggested to read such articles and follow the guidelines. After applying such changes, it is also important to test the web server's and web applications' functionality, to confirm that the changes did not hinder any of it. Third-party security tools, such as Acunetix WVS, can help confirm that the applied procedures improved your web server security; Acunetix WVS will also launch a number of security checks against an Apache web server and check for weak configurations. Take a product tour of Acunetix Web Vulnerability Scanner or download it today!

IIS Web Server Security

With the sharp increase of hacking attacks over the last couple of years, and the introduction of a number of regulatory compliance guidelines to follow, web application security has become a key concern for many online businesses, and also a common expense in a company’s budget. Although many businesses are focusing on securing their web applications, unfortunately they are not looking at the whole picture. A vital part of securing a business’s whole web infrastructure also includes having a secure web server configuration. Securing a web server’s configuration is as important as securing the web application itself.

A brief introduction to Microsoft IIS

Microsoft Internet Information Services, better known as IIS, is Microsoft's set of internet-based services for servers, which runs on Microsoft Windows operating systems. The internet services provided with IIS are an SMTP (simple mail transport protocol) server, an FTP (file transfer protocol) server, an NNTP (network news transfer protocol) server and a WWW (World Wide Web) server. Until version 5.1 (around the year 2000), IIS was hit by a number of vulnerabilities, which also led to several infamous internet worms, such as the Code Red worm. Most such incidents happened because both the design of the application itself and the permissions it ran with were flawed. An out-of-the-box IIS 5.1 installation published on the internet in 2000 could get hacked in a matter of minutes. From version 6 onwards, when internet hacking was causing huge financial problems for major online businesses, Microsoft re-designed IIS. The way IIS 6 and later versions work and use resources is more secure and helps reduce the attack surface. Microsoft also removed the SMTP server and the NNTP server in more recent versions of IIS. Still, an out-of-the-box IIS installation with a default configuration may lead to a number of problems.

Securing your Microsoft IIS web server

1. To start with, all web-related documents, such as web application files and other files typically shared over the internet, should be stored on a different drive from the operating system drive. Through a vulnerability, a malicious user could gain access to the web root directory and then escalate his permissions to the whole drive on which the web root resides. With the web content on a separate drive, even a full compromise of that drive does not let the attacker tamper with your operating system installation, which also makes his activity easier to trace.

2. When creating a new web root directory, where all the files to be shared on the web will be stored, grant the least possible NTFS permissions to the anonymous user account that IIS uses to access the web content. Ideally, deny the anonymous user write access to every file and directory under the web root. If need be, create a new account to serve as the anonymous user, grant it the appropriate permissions, and disable the built-in IIS anonymous user. Always avoid giving the anonymous user write NTFS permissions.

3. If a database server, such as Microsoft SQL Server, is to be used as a backend database, install it on a separate server. If the budget permits, other network services should also be installed on separate servers: if one of the services is compromised, it is then harder for a malicious user to reach the other servers and compromise the whole web farm infrastructure. Also, if possible, avoid mapping virtual directories between two different servers, or over a network.

4. If the IIS FTP service is needed, do not install the FTP service shipped with the latest Windows editions. Apart from not being a very practical FTP server, its configuration has to be accessed from an IIS 6 management console. Microsoft released a new FTP server that integrates better with IIS 7 and its configuration, and is more practical; it can be downloaded from the IIS official website: http://learn.iis.net/page.aspx/310/what-is-new-for-microsoft-and-ftp-75/. One should also follow good FTP server configuration practices: always isolate each FTP user in his home directory, use secure FTP (over SSL), and grant users the least possible privileges. Even though FTP user permissions can be controlled from the IIS 7 MMC, make sure such permissions are also enforced via NTFS permissions. If port scanning is enabled in Acunetix WVS, it will also check your FTP server configuration and launch a number of security checks against the FTP server.

5. Monitor server, web application and network service activity frequently. By analysing the log files, you can determine whether a web server or network service is being attacked and learn what types of attack your server is undergoing. This helps you understand the attacker, adapt to such attacks, and raise the security level of the whole web farm infrastructure.

6. Like every other software vendor, Microsoft periodically releases software updates and security patches. Apply these patches as early as possible to reduce the risk of someone finding the security hole before it is patched: within a couple of hours of a vulnerability being made public, malicious users already have automated scripts and scanners crawling the internet to identify vulnerable servers. After applying a new security patch or software update, thoroughly test the web applications' functionality to confirm it was not affected.

7. Microsoft provides a number of tools to help you secure your web server. It is worth investing time in learning how to use them, since in the long run they can save you from far more time-consuming activities, such as restoring a server after it was hacked or tracing a malicious user's activity. Such tools include the IIS Security "What If" tool, which helps you troubleshoot security issues with IIS; the IIS Security Planning Tool, which helps you deploy IIS with security appropriate to the server's role; and the IIS Lockdown tool, which provides built-in secure IIS configuration templates. Many other free tools can be downloaded from Microsoft's website, and Microsoft frequently publishes documents and security guidelines on how to secure your web server, which are worth following. After using such tools, or applying changes you've learnt about from Microsoft's documentation, test the web application's functionality to confirm that the changes did not affect or block any of it. Ideally, also use third-party security tools, such as Acunetix WVS, to confirm that such tools are properly securing your web server.

Tuesday, September 7, 2010

Drupal Installation in Ubuntu through the Command Line

Note: The two methods listed below are not compatible. The Ubuntu package (from the repositories) installs Drupal6 in different directories from the manual method. Choose one method or the other, but do not attempt to use both. You cannot switch back and forth between the two methods.

Install Drupal package from the Ubuntu repositories

You can choose whether to install Drupal5 or Drupal6 from the repositories. As always, the repository versions are not the most current, but they can be updated easily.

sudo apt-get install drupal6

or

sudo apt-get install drupal5

Manual installation of Drupal

The following instructions are for version 6.15 of the Drupal6 branch, which was the most current version at the time of writing. (A similar method can be used for the most current version of Drupal5.) This installation does not place Drupal in directories compatible with the repository packages.

wget http://ftp.drupal.org/files/projects/drupal-6.15.tar.gz
tar xvzf drupal-6.15.tar.gz
sudo mkdir /var/www/drupal
sudo mv drupal-6.15/* drupal-6.15/.htaccess /var/www/drupal

You must create a files subdirectory in your Drupal default site installation directory. It will be used for files such as custom logos, user avatars, and other media associated with your new site.

sudo mkdir /var/www/drupal/sites/default/files
sudo chown www-data:www-data /var/www/drupal/sites/default/files

It is also required to create the initial configuration file for the default site.

sudo cp /var/www/drupal/sites/default/default.settings.php /var/www/drupal/sites/default/settings.php
sudo chown www-data:www-data /var/www/drupal/sites/default/settings.php

Manually configure MySQL database

You need to create a MySQL drupal database and then load the database schema into it. You can do this with PhpMyAdmin or via the command line:

mysqladmin -u root -p create drupal

Where drupal is the name you picked for the mysql database that Drupal will use. You can call it anything you want.

mysql -u root -p

mysql> GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER,
       CREATE TEMPORARY TABLES, LOCK TABLES ON drupal.* TO 'drupaluser'@'localhost'
       IDENTIFIED BY 'drupalpass';

You do not want Drupal to use the MySQL root user to access the database. The above command creates a MySQL user (other than root) with just enough privileges to use the drupal database. You should pick values other than drupaluser and drupalpass. If the command succeeded, activate the new permissions:

mysql> FLUSH PRIVILEGES;

Quit the mysql prompt:

mysql> \q

Miscellaneous adjustments for manual installation

(Mike_B_sixosix 02.05.09) Comment: When I used the manual installation method I had to go back and edit the settings.php file with the Drupal DB username, password, and DB name in order for the database portion of the install.php page to recognize that I had made the changes. After manually updating the file and saving, I refreshed the page and it automatically recognized the changes and moved to the next step.

(Mike_B_sixosix 02.05.09) Comment: I also had to run

sudo chown www-data:www-data /var/www/drupal/sites/default/files

on the directory in order for the install.php page to make file changes.

Manually configure PostgreSQL for Drupal

Edit the settings.php file so that Drupal knows which user, password and database to use.

$ sudo nano /var/www/drupal/sites/default/settings.php

Find the database URL line and set it using this format:

$db_url = 'pgsql://username:password@localhost/databasename';

where username = drupaluser, password = drupaluserpass, and databasename = drupaldb (substitute your own values).

Save the file (Ctrl + x, y)

Adjust PHP memory limit

You should increase the default PHP memory limit (the amount of memory dedicated to running scripts), since the default 8 MB is not sufficient. Use 96 MB (or even 160 MB) if you intend to use graphics, although for simple uses 32 MB may be sufficient.

In newer versions of Drupal6, you can simply edit the settings.php file and add the line:

ini_set('memory_limit', '96M');

In older versions of Drupal5, or if using PHP for many different uses, it is best to increase the amount of PHP memory using this method:

Edit the /etc/php5/apache2/php.ini file and increase the memory_limit value to 96M (or another value of your choice).
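That edit can be sketched with sed. The sample file below stands in for the real php.ini so the change is easy to see; on a live system point sed at /etc/php5/apache2/php.ini instead, and back the file up first.

```shell
# Work on a sample php.ini-style file; replace PHP_INI with
# /etc/php5/apache2/php.ini on a real system (after backing it up).
PHP_INI=./php.ini.sample
printf 'memory_limit = 8M\n' > "$PHP_INI"   # stand-in for the real file
sed -i 's/^memory_limit = .*/memory_limit = 96M/' "$PHP_INI"
grep '^memory_limit' "$PHP_INI"
```

After editing the real file, restart Apache so PHP picks up the new limit.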

If using PostgreSQL, you can also edit the /etc/php5/apache2/php.ini file to load the PostgreSQL extension by adding the line:

extension=pgsql.so

Adjust PHP Error Reporting

An adjustment to the error reporting is required for the install.php page to come up properly. Edit your settings.php file (e.g., /etc/drupal/6/sites/default/settings.php) and add the following ini_set line after the existing ini_set lines:

ini_set('error_reporting', 4096);

This is based on recommendations from the drupal.org forums: http://drupal.org/node/514334#comment-1912924

Adjust PHP extensions (Drupal 7 only)

Drupal 7 supports only MySQL 5.0.15 or higher, and requires the PDO database extension for PHP. The PHP extension for connecting to your chosen database must be installed and enabled. Drupal's currently supported database connectors are mysql (the original MySQL extension), mysqli (an improved connector for newer MySQL installations), and pgsql (for PostgreSQL). Note: PHP 5.x no longer enables the mysql extension by default; see the Drupal documentation for installing and enabling your chosen connector. Additionally, Drupal 6.x does not offer the mysql connector if mysqli is enabled in your PHP configuration.

sudo apt-get install php5-gd

For more details see http://drupal.org/requirements

Reload Apache2

Restart Apache2 so the configuration changes take effect.

sudo /etc/init.d/apache2 restart

Complete Drupal Installation through a Browser

Finally, point your browser to http://localhost/drupal/install.php (or perhaps http://localhost/drupal6/install.php if you installed the package from the official repository), create an account, login, and follow the installation instructions to finish loading the database and configuring the site.


If you got a page full of warnings, follow the prompts to resolve the issues (and add the fixes to this wiki page if appropriate).

The first account will automatically become the main administrator account with total control.


Sunday, September 5, 2010

Cloud computing

From Wikipedia, the free encyclopedia

Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid.

Cloud computing is a paradigm shift following the shift from mainframe to client–server in the early 1980s. Details are abstracted from the users, who no longer need expertise in, or control over, the technology infrastructure "in the cloud" that supports them.[1] Cloud computing describes a new supplement, consumption, and delivery model for IT services based on the Internet, and it typically involves over-the-Internet provision of dynamically scalable and often virtualized resources.[2][3] It is a byproduct and consequence of the ease of access to remote computing sites provided by the Internet.[4] This frequently takes the form of web-based tools or applications that users can access and use through a web browser as if they were programs installed locally on their own computers.[5] NIST provides a somewhat more objective and specific definition.[6]

The term "cloud" is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network,[7] and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.[8]

Typical cloud computing providers deliver common business applications online that are accessed from another web service or software like a web browser, while the software and data are stored on servers. A key element of cloud computing is customization and the creation of a user-defined experience.


Cloud computing conceptual diagram

Most cloud computing infrastructures consist of services delivered through common centers and built on servers. Clouds often appear as single points of access for all consumers' computing needs. Commercial offerings are generally expected to meet quality of service (QoS) requirements of customers, and typically include SLAs.[9] The major cloud service providers include Microsoft,[10] Salesforce, Skytap, HP, IBM, Amazon, and Google.[11][12]

Key features

  • Agility improves with users' ability to rapidly and inexpensively re-provision technological infrastructure resources.[34]
  • Cost is claimed to be greatly reduced and capital expenditure is converted to operational expenditure.[35] This ostensibly lowers barriers to entry, as infrastructure is typically provided by a third-party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options and fewer IT skills are required for implementation (in-house).[36]
  • Device and location independence[37] enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.[36]
  • Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
    • Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
    • Peak-load capacity increases (users need not engineer for highest possible load-levels)
    • Utilization and efficiency improvements for systems that are often only 10–20% utilized.[26]
  • Reliability is improved if multiple redundant sites are used, which makes well designed cloud computing suitable for business continuity and disaster recovery.[38] Nonetheless, many major cloud computing services have suffered outages, and IT and business managers can at times do little when they are affected.[39][40]
  • Scalability via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis near real-time, without users having to engineer for peak loads. Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.[36] One of the most important new methods for overcoming performance bottlenecks for a large class of applications is data parallel programming on a distributed data grid.[41]
  • Security could improve due to centralization of data,[42] increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels.[43] Security is often as good as or better than under traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford.[44] Providers typically log accesses, but accessing the audit logs themselves can be difficult or impossible. Furthermore, the complexity of security is greatly increased when data is distributed over a wider area and / or number of devices.
  • Maintenance of cloud computing applications is easier, since they do not have to be installed on each user's computer. They are also easier to support and to improve, since changes reach the clients instantly.
  • Metering: cloud computing resource usage should be measurable and metered per client and application on a daily, weekly, monthly, and annual basis. This enables clients to choose a vendor cloud based on cost and reliability (QoS).
