12 Best Cross Browser Compatibility Testing Tools

Cross browser compatibility testing is one of the biggest pains for software testers. The UI of a web-based application often breaks in one browser or another, and sometimes in the same browser on different operating systems. To minimize the pain of browser compatibility checks, here is a list of cross browser testing tools available online for testing a website in multiple browsers.

Free Cross Browser Testing Tools

Spoon Browser Sandbox


Spoon Browser Sandbox is a free, cloud-based service that runs popular browsers directly from the web without having to install them on your PC. The sandbox is of particular interest to web developers and testers for cross-browser testing in multiple versions of popular browsers.



Browsershots

Browsershots is one of the most popular free cross browser testing tools: a system for capturing screenshots of a web design in different operating systems and browsers. It provides developers a convenient way to test their web pages against all the major browsers.

IE NetRenderer


IE NetRenderer is a free web service, also available as a Mozilla Firefox extension that gives you quick access to it. It allows you to check how a website is rendered by Internet Explorer 7, 6, or 5.5.

IE Tab


IE Tab is another browser extension, for the Mozilla Firefox, Google Chrome, and SeaMonkey web browsers, that allows you to use Internet Explorer to display web pages in a tab.

IE Tester


IE Tester is an application that allows you to test multiple Internet Explorer versions (IE6, IE7, IE8, IE9, and IE10) at the same time in the same application.

Microsoft SuperPreview


SuperPreview is a stand-alone visual debugging tool that speeds up the essential task of checking your websites for cross-browser compatibility.

Paid Cross Browser Testing Tools



An automated website layout testing service: get a free report of your site’s cross-browser problems. It automatically detects cross-browser layout problems and scripting errors on your website.

Adobe BrowserLab


Adobe BrowserLab enables cross-browser testing by producing screenshots of websites from various web browsers across different platforms.



A screenshot and remote access service for cross browser testing, cross browser compatibility testing, web browser testing, and HTML email testing.



BrowserSeal

BrowserSeal is a fast and easy-to-use cross browser testing tool that looks very promising. It allows you to capture images of your website under multiple operating systems and browsers.

Cross Browser Testing



Cross browser compatibility testing across browsers and operating systems, with automated screenshots and live browser testing of AJAX, CSS, and JavaScript.

Cloud Testing


We would love to know which of these cross browser testing tools is your favorite. Please leave a comment and let us know.


15 Most Commonly Used Selenium WebDriver Commands in Java

Selenium 2.0, also known as WebDriver, is one of the best automation tools for the web, with a friendly learning curve and impressive capabilities. WebDriver overcomes the limitations of Selenium RC and does not depend on a standalone server. Here are the basic and most commonly used functions of WebDriver with Java.

Popup Windows and Frames

Popup window handling is one of the must-handle things in automation, and WebDriver does it really well using the window handle commands.

Once the action is performed, switch back to the parent window.
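A minimal sketch of that flow, assuming the Selenium Java bindings are on the classpath; the popupLink id is hypothetical:

```java
import java.util.Set;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

public class PopupExample {
    public static void switchToPopup(WebDriver driver) {
        // Remember the parent window before the popup opens
        String parentHandle = driver.getWindowHandle();

        // Click a link that opens the popup (hypothetical id)
        driver.findElement(By.id("popupLink")).click();

        // Switch to the newly opened window
        Set<String> handles = driver.getWindowHandles();
        for (String handle : handles) {
            if (!handle.equals(parentHandle)) {
                driver.switchTo().window(handle);
            }
        }

        // ... perform actions in the popup ...

        // Close the popup and switch back to the parent window
        driver.close();
        driver.switchTo().window(parentHandle);
    }
}
```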

List Box

Selecting an item in a list box using Selenium WebDriver is quite easy; there are two ways to do it.
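A sketch of both ways, assuming the Selenium Java bindings; the country list box and its option values are hypothetical:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.Select;

public class ListBoxExample {
    public static void selectItem(WebDriver driver) {
        WebElement listBox = driver.findElement(By.id("country")); // hypothetical id

        // Way 1: the Select helper class
        Select select = new Select(listBox);
        select.selectByVisibleText("India");
        // or: select.selectByValue("IN");
        // or: select.selectByIndex(2);

        // Way 2: click the option element directly
        listBox.findElement(By.xpath("./option[text()='India']")).click();
    }
}
```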



Right Click

Excel Reader

Read or write data from Excel using the jxl or Apache POI library; I prefer POI. Create a property reader file, then read the data.
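A minimal sketch of reading one cell with Apache POI (the POI jars must be on the classpath; the file path and cell position are placeholders):

```java
import java.io.FileInputStream;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;

public class ExcelReader {
    public static String readCell(String path, int rowNum, int colNum) throws Exception {
        try (FileInputStream fis = new FileInputStream(path)) {
            Workbook workbook = new HSSFWorkbook(fis); // .xls; use XSSFWorkbook for .xlsx
            Sheet sheet = workbook.getSheetAt(0);      // first sheet
            return sheet.getRow(rowNum).getCell(colNum).getStringCellValue();
        }
    }
}
```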

Database Connection

To connect to a database using WebDriver with Java, we use the JDBC (Java Database Connectivity) API.
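A minimal JDBC sketch; the connection URL, credentials, and users table are placeholders, and a matching JDBC driver (here MySQL) must be on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DbExample {
    public static void printUsers() throws Exception {
        // Placeholder connection details for your own database
        String url = "jdbc:mysql://localhost:3306/testdb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT username FROM users")) {
            while (rs.next()) {
                System.out.println(rs.getString("username"));
            }
        }
    }
}
```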

Time Outs – Wait

Implicit Waits


Explicit Waits

To handle a JavaScript alert, switch to it and call alert.accept(); or alert.dismiss();
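A sketch of both wait styles plus alert handling, assuming the Selenium Java bindings; the someId locator is hypothetical:

```java
import java.util.concurrent.TimeUnit;
import org.openqa.selenium.Alert;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class WaitExamples {
    public static void demo(WebDriver driver) {
        // Implicit wait: poll up to 10 seconds on every element lookup
        driver.manage().timeouts().implicitlyWait(10, TimeUnit.SECONDS);

        // Explicit wait: block until a specific condition is met
        WebDriverWait wait = new WebDriverWait(driver, 10);
        WebElement element = wait.until(
                ExpectedConditions.visibilityOfElementLocated(By.id("someId")));
        element.click();

        // Alerts: switch to the alert, then accept or dismiss it
        Alert alert = driver.switchTo().alert();
        alert.accept(); // or alert.dismiss();
    }
}
```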


Table Columns

Drag And Drop

Mouse Over
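One sketch covering table columns, drag and drop, mouse over, and the right click from earlier, assuming the Selenium Java bindings; all element ids are hypothetical:

```java
import java.util.List;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.interactions.Actions;

public class ActionExamples {
    public static void demo(WebDriver driver) {
        // Table columns: collect every cell of the second column
        List<WebElement> cells =
                driver.findElements(By.xpath("//table[@id='data']//tr/td[2]"));
        for (WebElement cell : cells) {
            System.out.println(cell.getText());
        }

        Actions actions = new Actions(driver);

        // Drag and drop one element onto another
        WebElement source = driver.findElement(By.id("source"));
        WebElement target = driver.findElement(By.id("target"));
        actions.dragAndDrop(source, target).perform();

        // Mouse over a menu item
        WebElement menu = driver.findElement(By.id("menu"));
        actions.moveToElement(menu).perform();

        // Right click (context click) an element
        actions.contextClick(menu).perform();
    }
}
```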


There are various ways to check whether an element is present or not before executing a piece of code.

Option 1

Option 2

Option 3

A sweet and simple one line of code:
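That one-liner sketched out, assuming the Selenium Java bindings:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

public class ElementPresence {
    // findElements() returns an empty list instead of throwing
    // NoSuchElementException, so this check never aborts the test
    public static boolean isElementPresent(WebDriver driver, By locator) {
        return driver.findElements(locator).size() > 0;
    }
}
```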

Web Elements Finding

These are generic ways to do certain common tasks, i.e. locating elements:

Id: WebElement element = driver.findElement(By.id(""));

Name: WebElement element = driver.findElement(By.name(""));

Tag Name: WebElement frame = driver.findElement(By.tagName("iframe"));

Xpath: WebElement element = driver.findElement(By.xpath(""));

CSS: WebElement element = driver.findElement(By.cssSelector(""));

LinkText: WebElement element = driver.findElement(By.linkText(""));

I hope these commands will let you work with Selenium WebDriver like a star. Please leave a comment and share your own experiences and commands with us.

Selenium Is The Best Open Source Automation Testing Tool for Web

Selenium is an open source web browser automation tool originally written at ThoughtWorks. It supports all the major browsers on the market and many languages, including Java, PHP, Python, and Ruby. Java and Ruby are the two most popular languages used with Selenium, along with third-party tools such as TestNG, JUnit, and Ant.

Selenium QA engineers focus on the one or two tools that best meet the needs of their project; however, learning all the tools will give you many different options for approaching different test automation problems. If you are interested in knowing the major Selenium tools, here they are.

Selenium IDE


Selenium IDE is an integrated development environment testing tool implemented as a Firefox extension; it allows you to record, edit, and play back tests. It includes the entire Selenium Core, recording capability, the ability to move commands around, and an option to export scripts to any of the supported languages.

Selenium RC


Selenium Remote Control (RC) is a powerful Selenium tool that lets you write automated web application scripts for functional and UI tests in any of the supported programming languages. It comes in two parts: a Selenium server and client libraries.

Selenium Grid


Selenium Grid allows tests to run in parallel on different machines against different browsers. It is also used to speed up execution and supports running tests against multiple runtime environments.

Selenium-Grid 2.0

The version of Selenium Grid described above is known as Selenium Grid 1.0; it has since been updated with new features and support for Selenium WebDriver. Selenium Grid 2.0 is now merged with the Selenium RC server and is available as one single .jar package.

Selenium WebDriver

Selenium WebDriver, also known as Selenium 2.0, is the most powerful testing tool from Selenium and one of the best available. Selenium 2.0 integrates the WebDriver API to provide a simpler, more accurate programming interface while addressing the limitations of Selenium RC. Selenium WebDriver does not require Selenium Server to launch the browser.

Selenium RC vs. Selenium WebDriver

  • Selenium RC requires you to install the Selenium Server and client libraries, whereas Selenium WebDriver does not require the Selenium RC Server.
  • Selenium RC opens the application in two browser windows, whereas Selenium WebDriver uses only one browser.
  • Selenium RC uses JavaScript to automate web pages, whereas Selenium WebDriver uses each browser's native automation support.
  • Selenium RC had limitations with popups, whereas Selenium WebDriver does not face those limitations.
  • Selenium RC has a somewhat complex API, whereas Selenium WebDriver has a fast and easy-to-use API.

Integrated Development Environment


Eclipse and IntelliJ IDEA are two popular IDEs that provide comprehensive facilities for software development. Through plugins, they offer support for other languages such as Perl, Python, Ruby, and PHP.

Testing Framework


TestNG and JUnit are the most widely used Java testing frameworks; TestNG adds some of the best functionality, making it more powerful and easier to use. For reporting purposes, TestNG and ReportNG generate test reports in HTML and XML formats.

Build Automation


Apache Maven and Apache Ant are software tools for automating build processes and are best suited to building Java projects. Build automation covers day-to-day activities including deployment, running tests, and creating documentation, release notes, and reports.

Selenium With Ruby

If you are a Ruby developer, here are the tools you should look at. These are among the best tools used by the Ruby community.

RSpec – RSpec is a behavior driven development (BDD) framework for the Ruby programming language, allowing you to write application scenarios and test them.

Capybara – Capybara is an integration testing tool for Rack-based web applications. It simulates how a user would interact with a website.

Cucumber – Cucumber is a tool for running automated acceptance tests written in a behavior driven development style.

I hope all the above Selenium tools, used with the appropriate technology and environment, will ease your testing pain. Just give them a try! Already used some or all of them? Please leave a comment and share your experiences with us.

Zephyr 2.0, an Innovative Test Management System

Zephyr 2.0 is a flexible and affordable test management tool, with features like customizable desktops, live automated dashboards, instant collaboration, and real-time reporting. On top of being a QA test planning, management, and reporting tool, Zephyr 2.0 acts as a bridge between the test management system and issue tracking software. By partnering with Atlassian, which has created a niche in the area of automated issue tracking, Zephyr 2.0 now comes tightly integrated with JIRA and also Bugzilla, making it more powerful than the original.

The global economic meltdown has forced several companies to cut costs, including spending on test management systems and tools. But Zephyr 2.0 stands tall, as it is quite inexpensive thanks to its SaaS (Software as a Service) option, where you make a monthly payment of only $65. This means paying only for the services you want. Its flexible licensing allows you to ramp up one month and ramp down the next according to your needs.

Zephyr 2.0 is not only easy to buy but also to deploy and use. You can either opt for the On-premise model or for the Amazon Cloud model. And with the ‘Zephyr KickStart’ program you will not only get a test management system that suits your environment but also have your entire testing team trained in a matter of two days. Zephyr 2.0 provides a simple way for organizations to quickly get a handle on their in-house and outsourced testing activities as it enables IT management and testing teams to collaborate seamlessly across time zones.

Zephyr 2.0 helps organizations gain real-time testing status, so managers can make informed decisions about shipping their products. Its ability to track the amount of effort required for manual testing, and for automation, helps them make relevant business decisions. Zephyr 2.0 allows interoperability with commercial, open-source, or homegrown test automation tools, letting companies leverage their existing investments in testing infrastructure. The 'Zbots' in Zephyr 2.0 allow testers to execute automated test scripts, making it quite easy to track and report on testing. Thus, bug tracking and test management get done more easily than before with Zephyr 2.0.

In short, Zephyr 2.0 with its two-way integration with JIRA and Bugzilla is worth the money spent, as it gets you the benefits of three systems for the price of one. The QA team also benefits from other features like live tracking, a rich UI, SOA, white labeling, and so on. Moreover, this easy-to-use, low-cost test management system is apt for the current economic scenario. Thus, Zephyr 2.0 makes for a good investment, as you can improve productivity by spending less money, time, and effort.

If you’ve got any thoughts, comments, or suggestions for things we could add, leave a comment! Also please subscribe to our RSS feed for the latest tips, tricks, and examples on cutting edge stuff.

Quality Process and Management of Web Applications

Quality itself has been defined as fundamentally relational: ‘Quality is the ongoing process of building and sustaining relationships by assessing, anticipating, and fulfilling stated and implied needs.’

“Quality is the extent to which products, services, processes, and relationships are free from defects, constraints, and items which do not add value for customers.”

You cannot improve what you cannot measure.

In the world of IT and other industries, companies face the challenge of staying within budget, meeting market demand, satisfying customers, and, finally, increasing quality. It might appear that you have to sacrifice one goal to achieve another.

Measure your products or services in this manner:

* Lower defect rates
* Reduced product variability
* Increased first-pass rates
* Improved customer satisfaction
* Improved manufacturing yield
* Reduced scrap, waste, or give-away
* How you develop, produce, or provide your products or services

To Improve the quality of a product or service an organization should have a Quality Management Process.

What is a Quality Management Process?

A Quality Management Process is a set of procedures that are followed to ensure that the deliverables produced by a team are “fit for purpose”. The start of the Quality Management Process involves setting quality targets, which are agreed with the customer. A “Quality Assurance Process” and a “Quality Control Process” are then undertaken to measure and report the actual quality of deliverables. As part of the Quality Management Process, any quality issues are identified and resolved quickly.

Quality Management Process will help you to:

* Set Quality Targets to be met by your team
* Define how those quality targets will be measured
* Take the actions needed to measure quality
* Identify quality issues and improvements
* Report on the overall level of quality achieved

A process flow chart for fulfilling customer satisfaction shows the team's roles and responsibilities for delivering a defect-free product to customers.

Quality management (QM) provides lasting support because the quality manager is supported from a very early stage. QM can promptly identify weaknesses up front and implement a comprehensive quality approach, increasing the quality of both the product and the process of software projects. This avoids errors, reduces the resources needed, and improves quality at the same time. Its numerous features relieve the quality manager in analyzing, planning, and controlling, leaving more time for core tasks.

10 Regression/Functional Web Testing Tools

Testing tools often save time and help cut testing effort, as web applications often suffer from bugs, inconsistent behavior, usability issues, incorrect functionality, security issues, and unmet customer expectations. It is hard to track bugs and fixes while the development phase is in progress; keeping records, drafting a bug sheet, and determining priority and severity is a pain. Why not use automated test tools to test your web application? Following is a list of 10 web testing tools for web applications, in terms of functional and regression testing.

1. Watir – Watir is a simple open-source library for automating web browsers. It allows you to write tests that are easy to read and easy to maintain, and it is optimized for simplicity and flexibility. Watir drives browsers the same way people do: it clicks links, fills in forms, and presses buttons. Watir also checks results, such as whether expected text appears on the page. Watir is a Ruby library that works with Internet Explorer on Windows, and it is currently being ported to support Firefox and Safari.


  • It is a free open source tool; there are no costs to use it.
  • There’s a very active and growing community behind it.
  • It uses Ruby, a full-featured modern scripting language, rather than a proprietary vendor script.
  • It is powerful and easy to use.
  • Don’t just take my word for it: read what its users are saying.

2. Selenium IDE – It is an integrated development environment for Selenium tests. It is implemented as a Firefox extension and allows you to record, edit, and debug tests. Selenium IDE includes the entire Selenium Core, allowing you to easily and quickly record and play back tests in the actual environment in which they will run. Selenium IDE is not only a recording tool: it is a complete IDE. You can choose to use its recording capability, or you may edit your scripts by hand. With autocomplete support and the ability to move commands around quickly, Selenium IDE is the ideal environment for creating Selenium tests no matter what style of tests you prefer.


  • Easy record and playback
  • Intelligent field selection will use IDs, names, or XPath as needed
  • Autocomplete for all common Selenium commands
  • Walk through tests
  • Debug and set breakpoints
  • Save tests as HTML, Ruby scripts, or any other format
  • Support for Selenium user-extensions.js file
  • Option to automatically assert the title of every page

3. QEngine – AdventNet QEngine offers integrated solutions to test and verify the functionality and performance of both web applications and web services. Features include:


  • QEngine toolbar for remote record/playback
  • 100% web-based test script creation and maintenance
  • Multi-user record/playback option
  • One-click access to configured suites and test scripts
  • Single point of control for functional and performance testing
  • Web services functional and performance testing
  • Data-Driven test scripts
  • Error recovery for unattended testing
  • Test scheduling for unattended execution
  • QEngine Issue Manager to track and manage issues

4. MaxQ – It is a web functional testing tool. It includes an HTTP proxy that records your test script and a command-line utility that can be used to play back tests. The proxy recorder automatically stores variables posted to forms, so you don’t have to write that stuff by hand.


  • Free and open source
  • Scripts are written in Jython, an implementation of Python. Python scripts are easy to understand, modify and extend.
  • Alternatively, the captured HTTP sessions may be saved as XML files, according to ISAC format, for (massive) replay with CLIF load testing platform
  • Works from the command line so you can run scripts unattended.
  • Understands cookies.
  • Written in Java so it runs anywhere.
  • Easy to enhance because the source code is simple.
  • Scripts can run as JUnit tests.
  • Works behind proxy servers.

5. Solex – It is a free open source web application testing tool built as a plug-in for the Eclipse IDE. It provides functions to record a client session, adjust it according to various parameters, and replay it later, typically in order to ensure non-regression of the application’s behavior (with stress testing capabilities being added at a later stage). Solex acts as an HTTP proxy and records all HTTP requests and responses going over the wire between a web client (e.g. a web browser) and a web server. Replaying a scenario consists of sending the previously recorded and optionally customized HTTP requests to the server and asserting each response.

  • Solex can record HTTP messages by acting as a Web proxy.
  • Recorded sessions can be saved as XML and reopened later.
  • HTTP requests and responses are fully displayed in order to inspect and customize their content, thanks to replacement rules.
  • Solex allows the attachment of extraction or replacement rules to any HTTP message content, HTTP header or URL parameter.
  • Recorded requests can be filtered to remove or disable unwanted resources, like .jpg, .gif.
  • Solex allows the attachment of assertions to responses in order to validate a scenario during its playback.
  • Solex can replay an HTTP session request by request or all requests at once.
  • Playback results can be exported as XML with an optional XSL transformation

6. SilkTest – It is an automation tool for testing the functionality of enterprise applications in most versions of Windows, Sun Solaris 9 & 10, and Red Hat Enterprise Linux WS 2.1 & 3.0. It is produced by Segue Software, which was acquired by Borland in 2006. SilkTest uses the proprietary 4Test language for automation scripting, an object oriented language similar to C++ that uses the concepts of classes, objects, and inheritance.
  • The SilkTest host contains all the source script files.
  • The SilkTest agent translates the script commands into GUI commands (user actions). These commands can be executed on the same machine as the host or on a remote machine.
  • SilkTest can record mouse movement along with keystrokes (useful for custom objects). It can use either record-and-playback or descriptive programming methods to capture the dialogs.
  • SilkTest identifies all windows and controls of the application under test as objects and defines all of the properties and attributes of each window, thus supporting object oriented implementation.

7. QA Wizard Pro – It automates the functional and regression testing of Web and Windows applications, helping your quality assurance team test more of an application in less time.


  • Powerful, Easy to Use Scripting Language
  • Object-based Record and Playback Engine
  • Global Application Repository
  • Validation Checkpoints
  • Data-driven Testing
  • Seamless Integration with Seapine ALM Tools
  • Remote Script Execution.

8. WebKing – The release of WebKing 3.5 makes it easier than ever to verify and improve application reliability throughout the development process. This tool allows developers and QA testers to work together more efficiently by giving them a single tool that spans the entire development life cycle, from content flow and functionality verification to automated testing and deployment of web applications. The software enables users to run automated testing and analysis of Web applications, addressing four primary areas: Web site risk analysis, functional testing, load and performance testing, and security analysis, the company said. Parasoft officials said WebKing helps to ensure that Web applications meet specific content, performance, reliability, and security goals set by users.


  • Powerful, easy to use scripting language
  • A SOAP Test Wizard automatically creates SOAP test cases from a WSDL
  • Additional Java, JavaScript, and Python scripting options allow users to customize tests
  • A Log File Analysis feature produces traffic analysis reports and automatic recordings
  • Helps teams produce high-quality Web applications and Web services within a short time period

9. TestDrive – TestDrive-Gold is designed to test just about any GUI or browser application ‘out of the box’. It has built-in technology to deal with a multitude of controls and techniques, without you having to worry about them. With some technologies, it is the only solution that works. A couple of the highlights: code-free testing, self-healing technology, innovative script control technology, iSeries server-side testing, and macros that let you record and replay repetitious work.


  • Easily create test scenarios through a simple point-and-click interface.
  • Execute a complete regression test in hours not days, complete with full results, automatic data rules, and analysis.
  • Complex decision-linked tests can be built that integrate with the server functions to give a complete approach to testing.
  • Free of any coding language.
  • Variable data, Tracked Fields and Action map functionality.
  • Schedule playback to run anytime day and night.


10. Rational Functional Tester – It provides testers with automated testing capabilities for functional testing, regression testing, GUI testing, and data-driven testing.


  • Provides testers with automated capabilities for data-driven and keyword testing.
  • Offers testers a choice of scripting language and industrial-strength editor: Java in Eclipse® or Microsoft® Visual Basic .NET® in Visual Studio .NET – for test authoring and customization.
  • Supports version control to enable parallel development of test scripts and concurrent usage by geographically distributed teams.
  • Supports custom controls through proxy SDK (Java/.Net)

Additional Tools:

  • QuickTest Professional (QTP) – It is an automated functional graphical user interface (GUI) testing tool created by the HP subsidiary Mercury Interactive that allows the automation of user actions on a web-based or client-based computer application.

HP QuickTest Professional supports functional and regression test automation that addresses every major software application and environment. This solution uses the concept of keyword-driven testing to simplify test creation and maintenance. It enables testers to build test cases by capturing flows directly from the application screens using specialized capturing technology. Test experts also have full access to the underlying test and object properties via an integrated scripting and debugging environment.


  • Improve collaboration between workgroups with shared function libraries, object management, and flexible asset storage.
  • Collapse test documentation and test creation into a single step.
  • Fix defects faster by fully documenting and replicating defects for developers.
  • Set a test development process and propagate it throughout the organization.

There are various other products available in the market today, but I have compiled this list based on my experience with various testing tools. Please post your experiences with these tools, or add new ones I may have missed. If you like this post, kindly subscribe to our RSS feed for free updates and articles delivered to you.

Web Application Security

How would you determine whether your website has been hacked? Read on to learn how hackers steal information and hack your website, and how you can help prevent your website from being hacked.


Some hackers, for example, will take advantage of web application vulnerabilities and may maliciously inject code into vulnerable web applications to trick users and redirect them towards phishing sites. This technique is called Cross-Site Scripting (XSS) and may be used even when the web server and database engine contain no vulnerabilities themselves.
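The standard defence is to encode user-supplied text before echoing it into a page. A minimal sketch in Java (a hand-rolled escaper for illustration; real projects would typically use a vetted library such as the OWASP encoders):

```java
public class HtmlEscape {
    // Encode the five characters that let injected markup or script
    // break out of an HTML context
    public static String escape(String input) {
        StringBuilder out = new StringBuilder();
        for (char c : input.toCharArray()) {
            switch (c) {
                case '&':  out.append("&amp;");  break;
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#39;");  break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String payload = "<script>alert('xss')</script>";
        System.out.println(escape(payload));
        // &lt;script&gt;alert(&#39;xss&#39;)&lt;/script&gt;
    }
}
```

Escaped this way, the payload renders as harmless text instead of executing in the visitor's browser.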

Is your data really safe?
Just because you think your data is safe does not mean your database of sensitive organization information has not already been cloned and is resident elsewhere, ready to be sold to the highest bidder. To make matters worse, it has recently been discovered that hackers are not simply selling your data; they are also selling the fact that you have vulnerabilities to others, be they hackers, industrial spies, or terrorists.

It all sounds apocalyptic, doesn’t it? Well, rather than being an angel of doom, I’ll let the stats speak for themselves.

Web Hacking Incidents Database

The Web Hacking Incidents Database (WHID) is a Web Application Security Consortium project dedicated to maintaining a list of web application related security incidents.

The database is unique in tracking only media-reported security incidents that can be associated with a web application security vulnerability. We also try to limit the database to targeted attacks only. Please refer to the FAQ for further information on what you will and will not find in WHID.

WHID's goal is to serve as a tool for raising awareness of the web application security problem and to provide information for statistical analysis of web application security incidents. WHID has been featured in eWeek and Slashdot.

If you have additional information on these or other web hacking incidents, you are more than welcome to share it with us.


IndiaTimes.com Visitors Risk High Exposure To

The web site of a leading Indian newspaper is swamped with malware. A recent survey by Websense, cited by The Register, found that of the sites hosting malware, 51% were legitimate sites that had been broken into. This is a major shift in the threat landscape, since sticking to web sites that you know is no longer a good protection strategy, anecdotally undermining Websense's own web site classification technology as a security solution.

SQL injection vulnerabilities
Securing your website and web applications from SQL Injection involves a three-part process:

  1. Analysing the present state of security by performing a thorough audit of your website and web applications for SQL Injection and other hacking vulnerabilities.
  2. Making sure that you follow coding best practice, sanitising your web applications and all other components of your IT infrastructure.
  3. Regularly performing a web security audit after each change and addition to your web components.

Furthermore, the principles you need to keep in mind when checking for SQL Injection and all other hacking techniques are: “Which parts of a website that we thought were secure are open to hack attacks?” and “What data can we throw at an application to cause it to perform something it shouldn’t do?”

Checking for SQL Injection vulnerabilities involves auditing your website and web applications. Manual vulnerability auditing is complex and very time-consuming. It also demands a high level of expertise and the ability to keep track of considerable volumes of code and of all the latest tricks of the hacker’s ‘trade’.

The best way to check whether your web site and applications are vulnerable to SQL injection attacks is by using an automated and heuristic web vulnerability scanner.

An automated web vulnerability scanner crawls your entire website and automatically checks for vulnerability to SQL Injection attacks. It will indicate which URLs/scripts are vulnerable to SQL injection so that you can immediately fix the code. Besides SQL injection vulnerabilities, a web application scanner will also check for cross-site scripting and other web vulnerabilities.

Apache Web Server Security

An increasing number of attacks on high-profile websites show that web security is still one of the most critical issues to be tackled by any business that has a web presence and conducts operations online.

If your web server and/or web applications are vulnerable to attacks, you may be giving hackers free access to sensitive information stored in your backend database.

One of the elements of your network infrastructure that could be vulnerable to attacks is the web server program. A web server program or web server engine runs a service which listens for, and responds to, web requests made by users via their browser. The most widely used web server engines are Apache and Microsoft IIS. These web server programs could very well exhibit security flaws or vulnerabilities, which, for example, could allow a malicious remote user access to your operating system with privileges which are more wide-ranging than those normally provided to a web browser request.

Furthermore, Apache requires a server-side scripting engine (e.g., PHP, ASP, ASP.NET, JSP) if the website is dynamic or if, for example, certain pages require the user to submit personal information such as their name, email address, and credit card details. Web security best practice requires regular auditing to check for scripting engine vulnerabilities, as well as ensuring that users cannot input character combinations that could exploit these or other weaknesses to eventually gain access to sensitive data.

PHP Security

Whether your site is the web presence for a large multinational, a gallery showing your product range and inviting potential customers to come into the shop, or a personal site exhibiting your holiday photos, web security matters. After the hard work put in to make your site look good and respond to your users, the last thing you want is for a malicious hacker to come along, perform a PHP hack and break it somehow.

There are a number of problems in web security, and unfortunately not all of them have definite solutions, but here we'll look at some of the problems that should be considered every time you set out to write a PHP script, so as to avoid a PHP hack attack. These are the problems which, with well-designed code, can be eliminated entirely. Before looking in detail at the solutions, though, let's take a moment to define the problems themselves.

SQL Injection
In this attack, a user is able to execute SQL queries in your website’s database. This attack is usually performed by entering text into a form field which causes a subsequent SQL query, generated from the PHP form processing code, to execute part of the content of the form field as though it were SQL. The effects of this attack range from the harmless (simply using SELECT to pull another data set) to the devastating (DELETE, for instance). In more subtle attacks, data could be changed, or new data added.
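A minimal sketch of the attack and its standard fix, using Python's sqlite3 module for illustration (the table and credentials are invented; the article's examples are PHP, where prepared statements via PDO or mysqli play the same role as the parameterized query here):

```python
import sqlite3

# Set up a throwaway database with one user record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_unsafe(name, password):
    # Vulnerable: the form values are spliced straight into the SQL text,
    # so input like ' OR '1'='1 changes the meaning of the query.
    query = "SELECT * FROM users WHERE name = '%s' AND password = '%s'" % (name, password)
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # Parameterized: values are bound by the driver and never parsed as SQL.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

# The classic injection string defeats the naive version...
print(len(login_unsafe("alice", "' OR '1'='1")))  # 1 row: authentication bypassed
# ...but not the parameterized one.
print(len(login_safe("alice", "' OR '1'='1")))    # 0 rows
```

The same query text, with placeholders instead of string interpolation, is all the fix requires.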

Directory Traversal
This attack can occur anywhere user-supplied data (from a form field or uploaded filename, for example) is used in a filesystem operation. If a user specifies “../../../../../../etc/passwd” as form data, and your script appends that to a directory name to obtain user-specific files, this string could lead to the inclusion of the password file contents, instead of the intended file. More severe cases involve file operations such as moving and deleting, which allow an attacker to make arbitrary changes to your filesystem structure.
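One common guard is to resolve the user-supplied path against the intended base directory and reject anything that escapes it. A hedged Python sketch (the base directory is hypothetical):

```python
import os

BASE_DIR = "/var/www/userfiles"  # hypothetical per-user file directory

def safe_path(user_supplied):
    # Resolve the combined path; normpath collapses "../" sequences,
    # so any escape from BASE_DIR becomes visible and is rejected.
    candidate = os.path.normpath(os.path.join(BASE_DIR, user_supplied))
    if not candidate.startswith(BASE_DIR + os.sep):
        raise ValueError("directory traversal attempt: %r" % user_supplied)
    return candidate

print(safe_path("report.txt"))
# safe_path("../../../../../../etc/passwd")  # would raise ValueError
```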

Authentication Issues
Authentication issues involve users gaining access to something they shouldn't have access to, but to which other legitimate users should. An example would be a user who was able to steal (or construct) a cookie allowing them to log in to your site under an Administrator session, and therefore be able to change anything they liked.

Remote Scripts (XSS)
XSS, or Cross-Site Scripting (also sometimes referred to as CSS, but this can be confused with Cascading Style Sheets, something entirely different!) is the process of injecting script into pages served by a vulnerable site, so that the script executes in the browsers of other users who view those pages, with access to their cookies and session data. A related class of attack, remote script inclusion, occurs when code is included into a running PHP script from a remote location. This is a serious attack which could allow any code the attacker chooses to be run on the vulnerable server, with all of the permissions of the user hosting the script, including database and filesystem access.

Validating Input And Stripping Tags
When a user enters information into a form which is to be later processed on your site, they have the power to enter anything they want. Code that processes form input should be carefully written to ensure the input is as requested: password fields have the required level of complexity; e-mail fields contain at least some characters, an @ sign, some more characters, a period, and two or more characters at the end; zip or postal codes are of the required format; and so on.

Each of these may be verified using regular expressions, which scan the input for certain patterns. An example for e-mail address verification is the PHP code shown below. This evaluates to true if an e-mail address was entered in the field named ’email’.

preg_match('/^.+@.+\..{2,3}$/', $_POST['email']);

This code just constructs a regular expression based on the format described above for an e-mail address. Note that this will return true for anything with an @ sign and a dot followed by 2 or 3 characters. That is the general format for an e-mail address, but it doesn’t mean that address necessarily exists; you’d have to send mail to it to be sure of that.
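The "stripping tags" half of this section's heading deserves an example as well: removing or neutralizing HTML in user input before it is redisplayed. The sketch below is illustrative Python (in PHP, the built-in strip_tags() and htmlspecialchars() functions play the same roles):

```python
import html
import re

def strip_tags(text):
    # Crude removal of anything that looks like an HTML tag.
    return re.sub(r"<[^>]*>", "", text)

def escape_output(text):
    # Safer default: escape rather than strip, so no markup at all
    # survives into the rendered page.
    return html.escape(text)

comment = '<script>alert("xss")</script>Nice site!'
print(strip_tags(comment))     # alert("xss")Nice site!
print(escape_output(comment))  # &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;Nice site!
```

Escaping on output is generally preferred to stripping on input, since it preserves what the user typed while rendering it harmless.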

Authentication Hacking Attacks

HTTP can embed several different types of authentication protocols. These include:

  • Basic – Cleartext username/password, Base64-encoded (trivially decoded)
  • Digest – Like Basic, but the credentials are hashed with a server-supplied nonce rather than sent in cleartext
  • Form-based – A custom form is used to input the username/password (or other credentials) and is processed using custom logic on the backend.
  • NTLM – Microsoft’s proprietary authentication protocol, implemented within HTTP request/response headers.
  • Negotiate – A newer protocol from Microsoft that allows any of the authentication types above to be dynamically agreed upon by the client and server. It also adds Kerberos for clients using Microsoft’s IE v5+.
  • Client-side Certificates – Although rarely used, SSL/TLS provides an option that checks the authenticity of a digital certificate presented by the Web client, essentially making it an authentication token.
  • Microsoft Passport – A single-sign-in (SSI) service run by Microsoft Corporation that allows web sites (called “Passport Partners”) to authenticate users based on their membership in the Passport service. The mechanism uses a key shared between Microsoft and the Partner site to create a cookie that uniquely identifies the user.

These authentication protocols operate right over HTTP (or SSL/TLS), with credentials embedded right in the request/response traffic.
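The note above that Basic credentials are "trivially decoded" is easy to demonstrate. The sketch below (Python, with a made-up username and password) recovers the cleartext from a captured Authorization header:

```python
import base64

# A captured Authorization header from a Basic-auth request
# (the credentials here are invented for illustration):
header = "Authorization: Basic YWxpY2U6czNjcjN0"

# Base64 is an encoding, not encryption: one library call reverses it.
encoded = header.split("Basic ")[1]
username, password = base64.b64decode(encoded).decode().split(":", 1)
print(username, password)  # alice s3cr3t
```

This is why Basic authentication is only as safe as the transport it travels over.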

This kind of attack does not exploit a technological security hole in the operating system or server software. It depends instead on the complexity and secure storage of the passwords, and on how easy it is for the attacker to reach the server (network security).

Fallout From the Fall of CAPTCHAs

CAPTCHA went from a relatively obscure security measure, perfected in 2000 by researchers at Carnegie Mellon University, to deployment by most of the major Web e-mail sites and many other Web sites by 2007. Sites such as Yahoo Mail, Google’s Gmail and Microsoft’s Hotmail all used — and, for that matter, continue to use — CAPTCHA to make sure that only human beings, not bots, could get accounts or make postings.

Those days are long gone.

By January 2008, Yahoo Mail’s CAPTCHA had been cracked. Gmail was ripped open in April. Hotmail’s top got popped during the same month.

And then things got bad.

QA Testing and Developer Awareness

Traditionally, Quality Assurance (QA) teams have not been partners with information security personnel, but trends are showing a shift in thinking. Mercury Interactive, a major player in automated testing tools, recently announced partnerships with some leading application security testing companies that provide an integrated solution between Mercury’s testing products and the vendors’ application vulnerability detection tools.

Does this mean QA teams will become security experts? Quite the contrary. We can expect to see more integrated solutions that allow QA testers to continue automated testing without necessarily needing to understand the underlying security technology. In fact, we will most likely see a shift towards some type of workflow in which the owners of security policies create the appropriate tests and the QA professionals execute and measure against those tests.

We should also expect to see QA teams move from functional testing into areas of compliance testing as well. For example, for compliance with various state and federal privacy laws, QA teams could determine which web pages do not reference a privacy policy or which pages leak sensitive information in the URL of a form submission.

Developers will also benefit from increasingly sophisticated web application vulnerability detection tools. Ideally, detection systems should be able to track defective/insecure lines of code where vulnerabilities might be found. Whenever possible, this would happen as part of a development tool operation such as a compilation of code. Some vendors have created development tools for enhancing code security, but to date, sales of these tools have been relatively poor. In addition, most of these code scanning tools are unable to provide complete application awareness and can only focus on a specific module of code. Thus, for more complex problems that might extend, for example, between a UI module and a database module, code scanners have traditionally not worked very well as stand-alone solutions. It is also foreseeable that we will see integration with bug tracking systems, so that developers can simply follow their current defect tracking methodology and fix security vulnerabilities as simply as functional defects in their code.

Closing the Loop

Eventually, web application security detection tools will be able to provide border appliances, such as intrusion detection systems (IDSs) and firewalls, with information on how to stop an attack until a vulnerability can be resolved. Various standards have emerged, each aligned with a particular set of vendors.

Some of the more prominent standards include the Application Vulnerability Description Language (AVDL) and Web Application Security (WAS), which are both XML-based standards. The shifting marketplace factors heavily into which standard will dominate. For example, Sanctum was recently acquired by Watchfire. It remains to be seen what the new parent company will establish as a strategic direction and/or whether it shifts Sanctum’s original strategy of supporting WAS (which was formed as a competitor response to SPI Dynamics’ involvement in AVDL). While the industry appears to be favoring WAS, it is still unclear which standard will dominate and influence commercial product development. It is also not clear how these standards will help customers. Right now, the focus for companies is to find critical vulnerabilities that they can remediate and thus protect themselves from cyber attacks.


The current use of most web application security testing tools is still focused on the penetration tester/information security professional, with use being extended to QA and audit professionals. We are still a fair distance from holding developers (i.e., software vendors) accountable for writing insecure code, but clearly the trend is moving in that direction. Security has always been a holistic solution, requiring all players and systems to work in concert to form a good defense.

Black Box Testing Strategy

What is a Black Box Testing Strategy?

Black box testing is not a single type of testing; it is a testing strategy that requires no knowledge of internal design or code. As the name “black box” suggests, no knowledge of internal logic or code structure is needed. The types of testing under this strategy are based entirely on the requirements and functionality of the work product/software application. Black box testing is sometimes also called “Opaque Testing”, “Functional/Behavioral Testing” or “Closed Box Testing”.

The basis of the black box testing strategy lies in selecting appropriate data as per functionality and testing it against the functional specifications, in order to check for normal and abnormal behavior of the system. Nowadays, it is becoming common to route testing work to a third party, because the developer of the system knows too much about its internal logic and coding, which makes the developer a poor candidate to test the application objectively.

In order to implement the black box testing strategy, the tester needs to be thorough with the requirement specifications of the system and should know, as a user would, how the system should behave in response to a particular action.

Various testing types that fall under the black box testing strategy are: functional testing, stress testing, recovery testing, volume testing, user acceptance testing (also known as UAT), system testing, sanity or smoke testing, load testing, usability testing, exploratory testing, ad-hoc testing, alpha testing, beta testing, etc.

These testing types are again divided into two groups: a) testing in which the user plays the role of tester, and b) testing in which the user is not required.

Testing methods where the user is not required:

Functional Testing:
In this type of testing, the software is tested for the functional requirements. The tests are written in order to check if the application behaves as expected.
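As a sketch of what such a test looks like in practice, here is a minimal black-box functional check in Python; the `process_login` function and its messages are hypothetical stand-ins for the application under test:

```python
# A hypothetical unit under test: validates a login form submission.
def process_login(username, password):
    if not username or not password:
        return "error: missing credentials"
    return "welcome, %s" % username

# Black-box functional checks: only inputs and expected outputs,
# with no knowledge of the implementation behind them.
assert process_login("alice", "pw") == "welcome, alice"
assert process_login("", "pw") == "error: missing credentials"
print("all functional checks passed")
```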

Stress Testing:
The application is tested against heavy load, such as complex numerical values, a large number of inputs, a large number of queries, etc., to check how much stress/load the application can withstand.

Load Testing:
The application is tested against heavy loads or inputs, such as in the testing of web sites, in order to find out at what point the web site/application fails or at what point its performance degrades.

Ad-hoc Testing:
This type of testing is done without any formal test plan or test case creation. Ad-hoc testing helps in deciding the scope and duration of other testing efforts, and it also helps testers learn the application prior to starting any other testing.

Exploratory Testing:
This testing is similar to the ad-hoc testing and is done in order to learn/explore the application.

Usability Testing:
This testing is also called ‘testing for user-friendliness’. It is done when the user interface of the application is an important consideration and needs to be tailored to a specific type of user.

Smoke Testing:
This type of testing is also called sanity testing and is done in order to check whether the application is ready for further major testing and is working properly, without failing, up to a minimum expected level.

Recovery Testing:
Recovery testing is basically done in order to check how quickly and how well the application can recover from any type of crash, hardware failure, etc. The type or extent of recovery is specified in the requirement specifications.

Volume Testing:
Volume testing is done to check the efficiency of the application. A huge amount of data is processed through the application (which is being tested) in order to check the extreme limits of the system.

Testing where the user plays a role/is required:

User Acceptance Testing:
In this type of testing, the software is handed over to the user in order to find out if the software meets the user expectations and works as it is expected to.

Alpha Testing:
In this type of testing, the users are invited to the development center, where they use the application and the developers note every particular input or action carried out by the user. Any abnormal behavior of the system is noted and rectified by the developers.

Beta Testing:
In this type of testing, the software is distributed as a beta version to users, who test the application at their own sites. As the users explore the software, any exceptions or defects that occur are reported to the developers.

Black Box Testing

beSTORM performs a comprehensive analysis, exposing security holes in your products during development and after release.

beSTORM represents a new approach to security auditing. This approach is sometimes called “fuzzing” or “fuzz testing” (a tool that performs it is a “fuzzer”) and can be used to secure in-house developed applications and devices, as well as applications and devices from external vendors.

Most of the security holes found today in products and applications can be discovered automatically. By using an automated attack tool that tries virtually all different attack combinations, with the ability to detect certain application anomalies and indicate a successful attack, those security holes can be found almost without user intervention.
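As an illustration of the idea (not beSTORM's actual engine), here is a minimal random fuzzer in Python that hammers a hypothetical parser with generated inputs and records the ones that crash it:

```python
import random
import string

def parse_record(data):
    # Hypothetical parser with a planted bug: it assumes the input
    # always contains a ':' separator, and crashes when it does not.
    key, value = data.split(":", 1)
    return key.strip(), value.strip()

random.seed(1)  # fixed seed so the run is reproducible

def random_input(max_len=20):
    alphabet = string.ascii_letters + string.digits + ":;|<>"
    return "".join(random.choice(alphabet) for _ in range(random.randint(0, max_len)))

# Throw many generated inputs at the parser and record any that crash it.
crashes = []
for _ in range(1000):
    case = random_input()
    try:
        parse_record(case)
    except Exception as exc:
        crashes.append((case, exc))

print("crashing inputs found:", len(crashes) > 0)
```

Real fuzzers add protocol awareness and anomaly detection on top of this brute generation, but the core loop — generate, send, watch for failure — is the same.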

How it works

  • Innovative – beSTORM performs an exhaustive analysis to uncover new and unknown vulnerabilities in software products. This differs from older-generation tools that use attack signatures or attempt to locate known vulnerabilities in products. beSTORM does not need the source code to analyze and uncover vulnerabilities.
  • Broad range – Many of the common Internet protocols can be tested by beSTORM – even complex protocols such as SIP (used in Voice over IP products) are supported.
  • Attack prioritization – Special attack-prioritizing algorithms allow beSTORM to start with the attacks most likely to succeed, depending on the specific protocol being audited. This saves considerable time during the audit process and highlights the most important problems first.
  • Report accuracy – beSTORM checks the application externally by triggering actual attacks. Vulnerabilities are reported only if an actual attack has been successful, for example if a buffer overflow has been triggered. Simply put, beSTORM emulates an attacker; if the attacker cannot carry out the attack, beSTORM will not report it, effectively reducing the number of false positives.
  • Protocol compliance – beSTORM is able to convert the protocol standard text into an automated set of tests by converting the BNF description used in technical RFC documents into an attack language. This ensures that the entire functionality of the system is checked, and enables it to quickly find bugs that would otherwise surface only months or years after the product is released to the market.
  • Comprehensive analysis – beSTORM detects vulnerabilities by attaching to the audited process and detecting even the slightest anomalies. By doing so, beSTORM can find attacks as subtle as ‘off-by-one’ attacks, as well as buffer overflow attacks that do not crash the application.
  • Scaling – beSTORM is extremely scalable, with the ability to use multiple processors or multiple machines to parallelize the audit and substantially reduce the testing duration.
  • Extensibility – beSTORM tests the protocol rather than the product, and therefore can be used to test extremely complicated products with a large code base.
  • Flexibility – beSTORM’s protocol analysis can be easily extended to support your proprietary protocol.
  • Language independent – beSTORM tests the binary application, and is therefore completely indifferent to the programming language or system libraries used. beSTORM reports the exact interaction that triggers the vulnerability, and programmers can then debug the application in whatever development environment they wish, to see what causes the fault.

Automated Binary Analysis

beSTORM includes an automated engine that can parse through binary data, decoding ASN.1 structures as well as length-value pairs.

Automated Textual Analysis

beSTORM includes an automated engine that can parse through textual data, recognize multiple forms of data encoding, and decode XML structures.

Custom Protocols

For those protocols that cannot be automatically analyzed, beSTORM includes a graphical interface that can be used to easily add support for your proprietary protocols.

Advanced Debugging and Stack Tracing

beSTORM includes an advanced debugging and stack tracing engine that can not only discover potential coding issues, but also show the stack trace that led to the specific coding issue.


  • Integrates with the existing development strategy – Search for security vulnerabilities during development or as part of your QA process.
  • Source code not necessary – No need for source code – perfect for auditing 3rd-party applications.
  • Reproducible – Vulnerabilities are searched for in a methodical way that can be reproduced.
  • Powerful substitute – beSTORM can be used to replace existing tools used by security auditors and black-box testers.

Testing your web application

Web applications are becoming more prevalent and increasingly more sophisticated, and as such they are critical to almost all major online businesses. As with most security issues involving client/server communications, Web application vulnerabilities generally stem from improper handling of client requests and/or a lack of input validation checking on the part of the developer.

The very nature of Web applications – their ability to collate, process and disseminate information over the Internet – exposes them in two ways. First and most obviously, they have total exposure by nature of being publicly accessible. This makes security through obscurity impossible and heightens the requirement for hardened code. Second and most critically from a penetration testing perspective, they process data elements from within HTTP requests – a protocol that can employ a myriad of encoding and encapsulation techniques.

Most Web application environments (including ASP and PHP, which will both be used for examples throughout the series), expose these data elements to the developer in a manner that fails to identify how they were captured and hence what kind of validation and sanity checking should apply to them. Because the Web “environment” is so diverse and contains so many forms of programmatic content, input validation and sanity checking is the key to Web applications security. This involves both identifying and enforcing the valid domain of every user-definable data element, as well as a sufficient understanding of the source of all data elements to determine what is potentially user definable.

The Root of the Issue: Input Validation

Input validation issues can be difficult to locate in a large codebase with lots of user interactions, which is the main reason that developers employ penetration testing methodologies to expose these problems. Web applications are, however, not immune to the more traditional forms of attack. Poor authentication mechanisms, logic flaws, unintentional disclosure of content and environment information, and traditional binary application flaws (such as buffer overflows) are rife. When approaching a Web application as a penetration tester, all this must be taken into account, and a methodical process of input/output or “blackbox” testing, in addition to (if possible) code auditing or “whitebox” testing, must be applied.

What exactly is a Web application?

A Web application is an application, generally a collection of scripts, that resides on a Web server and interacts with databases or other sources of dynamic content. Web applications are fast becoming ubiquitous, as they allow service providers and their clients to share and manipulate information in an (often) platform-independent manner via the infrastructure of the Internet. Some examples of Web applications include search engines, Webmail, shopping carts and portal systems.

How does it look from the user’s perspective?

Web applications typically interact with the user via FORM elements and GET or POST variables (even a ‘Click Here’ button is usually a FORM submission). With GET variables, the inputs to the application can be seen within the URL itself; with POST requests, however, it is often necessary to study the source of form-input pages (or capture and decode valid requests) in order to determine the user’s inputs.

An example HTTP request that might be provided to a typical Web application is as follows:

GET /sample.php?var=value&var2=value2 HTTP/1.1 | HTTP-METHOD REQUEST-URI PROTOCOL/VERSION
Session-ID: 361873127da673c | Session-ID Header
Host: www.webserver.com | Host Header
<CR><LF><CR><LF> | Two carriage return line feeds

Every element of this request can potentially be used by the Web application processing the request. The REQUEST-URI identifies the unit of code that will be invoked, along with the query string: an ‘&’-separated list of variable=value pairs defining input parameters. This is the main form of Web application input. The Session-ID header provides a token identifying the client’s established session as a primitive form of authentication. The Host header is used to distinguish between virtual hosts sharing the same IP address and will typically be parsed by the Web server, but is, in theory, within the domain of the Web application.

As a penetration tester you must use all input methods available to you in order to elicit exception conditions from the application. Thus, you cannot be limited to what a browser or automatic tools provide. It is quite simple to script HTTP requests using utilities like curl, or shell scripts using netcat. The process of exhaustive blackbox testing a Web application is one that involves exploring each data element, determining the expected input, manipulating or otherwise corrupting this input, and analysing the output of the application for any unexpected behaviour.
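As a sketch of that kind of scripting (illustrative only; the host, session ID and injected value are modeled on the sample request above), the following Python builds a raw request string by hand, so every element is under the tester's control before it is sent over a socket or piped to netcat:

```python
def build_request(path, params, headers):
    # Assemble a raw HTTP/1.1 GET request manually, so the query string
    # and every header can be corrupted at will.
    query = "&".join("%s=%s" % (k, v) for k, v in params.items())
    lines = ["GET %s?%s HTTP/1.1" % (path, query)]
    lines += ["%s: %s" % (k, v) for k, v in headers.items()]
    return "\r\n".join(lines) + "\r\n\r\n"

# Replay the sample request, but with var2 replaced by a probe value
# intended to elicit an exception condition from the application.
raw = build_request(
    "/sample.php",
    {"var": "value", "var2": "'; DROP TABLE users;--"},
    {"Session-ID": "361873127da673c", "Host": "www.webserver.com"},
)
print(raw)
```

The resulting string can be written to a socket directly, or saved to a file and sent with `nc www.webserver.com 80 < request.txt`.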

The Information Gathering Phase

Fingerprinting the Web Application Environment

One of the first steps of the penetration test should be to identify the Web application environment, including the scripting language and Web server software in use, and the operating system of the target server. All of these crucial details are simple to obtain from a typical Web application server through the following steps:

  1. Investigate the output from HEAD and OPTIONS HTTP requests

    The header and any page returned from a HEAD or OPTIONS request will usually contain a SERVER: string or similar detailing the Web server software version and possibly the scripting environment or operating system in use.

    OPTIONS / HTTP/1.0
    HTTP/1.1 200 OK
    Server: Microsoft-IIS/5.0
    Date: Wed, 04 Jun 2003 11:02:45 GMT
    MS-Author-Via: DAV
    Content-Length: 0
    Accept-Ranges: none
    DASL: <DAV:sql>
    DAV: 1, 2
    Cache-Control: private

  2. Investigate the format and wording of 404/other error pages

    Some application environments (such as ColdFusion) have customized and therefore easily recognizable error pages, and will often give away the software versions of the scripting language in use. The tester should deliberately request invalid pages and utilize alternate request methods (POST/PUT/Other) in order to glean this information from the server.

    Below is an example of a ColdFusion 404 error page:

    ColdFusion 404 error page

  3. Test for recognised file types/extensions/directories

    Many Web services (such as Microsoft IIS) will react differently to a request for a known and supported file extension than an unknown extension. The tester should attempt to request common file extensions such as .ASP, .HTM, .PHP, .EXE and watch for any unusual output or error codes.

    GET /blah.idq HTTP/1.0
    HTTP/1.1 200 OK
    Server: Microsoft-IIS/5.0
    Date: Wed, 04 Jun 2003 11:12:24 GMT
    Content-Type: text/html

    <HTML>The IDQ file blah.idq could not be found.

  4. Examine source of available pages

    The source code from the immediately accessible pages of the application front-end may give clues as to the underlying application environment.

    <title>Home Page</title>
    <meta content="Microsoft Visual Studio 7.0" name="GENERATOR">
    <meta content="C#" name="CODE_LANGUAGE">
    <meta content="JavaScript" name="vs_defaultClientScript">

    In this situation, the developer appears to be using MS Visual Studio 7. The underlying environment is likely to be Microsoft IIS 5.0 with .NET framework.

  5. Manipulate inputs in order to elicit a scripting error

    In the example below the most obvious variable (ItemID) has been manipulated to fingerprint the Web application environment:

    ItemID manipulation in a URL

  6. TCP/ICMP and service fingerprinting

    Using traditional fingerprinting tools such as Nmap and Queso, or the more recent application fingerprinting tools Amap and WebServerFP, the penetration tester can gain a more accurate idea of the underlying operating system and Web application environment than through many other methods. Nmap and Queso examine the nature of the host’s TCP/IP implementation to determine the operating system and, in some cases, the kernel version and patch level. Application fingerprinting tools rely on data such as Server HTTP headers to identify the host’s application software.

Hidden form elements and source disclosure

In many cases developers require inputs from the client that should be protected from manipulation, such as a user-variable that is dynamically generated and served to the client, and required in subsequent requests. In order to prevent users from seeing and possibly manipulating these inputs, developers use form elements with a HIDDEN tag. Unfortunately, this data is in fact only hidden from view on the rendered version of the page – not within the source.

There have been numerous examples of poorly written ordering systems that would allow users to save a local copy of order confirmation pages, edit HIDDEN variables such as price and delivery costs, and resubmit their request. The Web application would perform no further authentication or cross-checking of form submissions, and the order would be dispatched at a discounted price!

<FORM METHOD="POST" ACTION="/shop/checkout.htm">
<INPUT TYPE="HIDDEN" NAME="quoteprice" VALUE="4.25">Quantity: <INPUT TYPE="text" NAME="totalnum">
<INPUT TYPE="submit" VALUE="Checkout">
</FORM>

This practice is still common on many sites, though to a lesser degree. Typically only non-sensitive information is contained in HIDDEN fields, or the data in these fields is encrypted. Regardless of the sensitivity of these fields, they are still another input to be manipulated by the blackbox penetration tester.
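The defense against the discounted-order trick above is to treat the hidden field as untrusted input and recompute the price server-side. A hedged Python sketch, with invented item names, field names and prices:

```python
# Authoritative price list held on the server (illustrative values).
PRICES = {"widget": 4.25}

def checkout(form):
    # Never trust the price submitted by the client; recompute it from
    # server-side data and use the client value only as a cross-check.
    quantity = int(form["totalnum"])
    real_price = PRICES[form["item"]] * quantity
    if abs(float(form["quoteprice"]) * quantity - real_price) > 1e-9:
        raise ValueError("tampered price field")
    return real_price

# An honest submission passes; a doctored HIDDEN field is rejected.
print(checkout({"item": "widget", "totalnum": "2", "quoteprice": "4.25"}))
```

With this in place, editing the saved order page and resubmitting a lower quoteprice simply raises an error instead of dispatching a discounted order.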

All source pages should be examined (where feasible) to determine if any sensitive or useful information has been inadvertently disclosed by the developer – this may take the form of active content source within HTML, pointers to included or linked scripts and content, or poor file/directory permissions on critical source files. Any referenced executables and scripts should be probed, and if accessible, examined.

Javascript and other client-side code can also provide many clues as to the inner workings of a Web application. This is critical information when blackbox testing. Although the whitebox (or ‘code-auditing’) tester has access to the application’s logic, to the blackbox tester this information is a luxury which can provide for further avenues of attack. For example, take the following chunk of code:

if (document.forms['product'].elements['quantity'].value >= 255) {
    alert('Invalid quantity');
    return false;
} else {
    return true;
}
This suggests that the application is trying to protect the form handler from quantity values of 255 or more – the maximum value of a tinyint field in most database systems. It would be trivial to bypass this piece of client-side validation, insert a long integer value into the ‘quantity’ GET/POST variable and see if this elicits an exception condition from the application.
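The corresponding server-side check, which cannot be bypassed the way the JavaScript can, might look like this Python sketch (the limits mirror the tinyint rule above; the function name is illustrative):

```python
TINYINT_MAX = 255  # unsigned tinyint ceiling in most database systems

def validate_quantity(raw):
    # Re-apply the client-side rule on the server, where the attacker
    # cannot skip it, and reject non-numeric input outright.
    try:
        quantity = int(raw)
    except (TypeError, ValueError):
        return None
    if not 1 <= quantity < TINYINT_MAX:
        return None
    return quantity

print(validate_quantity("10"))          # accepted
print(validate_quantity("99999999"))    # None: would overflow the column
print(validate_quantity("10; DROP--"))  # None: not a number
```

Client-side validation is a usability feature; only the server-side copy of the rule is a security control.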

Determining Authentication Mechanisms

One of the biggest shortcomings of the Web application environment is its failure to provide a strong authentication mechanism. Of even more concern is the frequent failure of developers to apply the mechanisms that are available effectively. It should be explained at this point that the term Web application environment refers to the set of protocols, languages and formats – HTTP, HTTPS, HTML, CSS, JavaScript, etc. – that are used as a platform for the construction of Web applications. HTTP provides two forms of authentication: Basic and Digest. These are both implemented as a series of HTTP requests and responses, in which the client requests a resource, the server demands authentication and the client repeats the request with authentication credentials. The difference is that Basic authentication sends the credentials in clear text, while Digest authentication hashes them together with a nonce (a time-sensitive value) provided by the server.

Besides the obvious problem of clear-text credentials when using Basic, there is nothing inherently wrong with HTTP authentication, and the clear-text problem can be mitigated by using HTTPS. The real problem is twofold. First, since this authentication is applied by the Web server, it is not easily within the control of the Web application without interfacing with the Web server’s authentication database. Therefore, custom authentication mechanisms are frequently used. These open a veritable Pandora’s box of issues in their own right. Second, developers often fail to correctly assess every avenue for accessing a resource and then apply authentication mechanisms accordingly.

Given this, penetration testers should attempt to ascertain both the authentication mechanism that is being used and how this mechanism is being applied to every resource within the Web application. Many Web programming environments offer session capabilities, whereby a user provides a cookie or a Session-ID HTTP header containing a pseudo-unique string identifying their authentication status. This can be vulnerable to attacks such as brute forcing, replay, or re-assembly if the string is simply a hash or concatenated string derived from known elements.
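To see why a token "derived from known elements" is re-assemblable, consider this hypothetical scheme (Python; the username-plus-hour construction is invented for illustration):

```python
import hashlib

def weak_token(username, login_hour):
    # Hypothetical session scheme: MD5 of the username plus the hour of
    # login. Both elements are guessable, so the token itself is guessable.
    return hashlib.md5(("%s:%d" % (username, login_hour)).encode()).hexdigest()

captured = weak_token("admin", 14)  # token observed on the wire

# An attacker who suspects the scheme can re-assemble the token by
# trying each possible hour for a known username.
guessed = next(
    weak_token("admin", hour) for hour in range(24)
    if weak_token("admin", hour) == captured
)
print(guessed == captured)  # True
```

A session identifier should instead be a long random value with no relationship to any data the attacker can know or guess.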

Every attempt should be made to access every resource via every entry point. This will expose problems where a root level resource such as a main menu or portal page requires authentication but the resources it in turn provides access to do not. An example of this is a Web application providing access to various documents as follows. The application requires authentication and then presents a menu of documents the user is authorised to access, each document presented as a link to a resource such as:


Although reaching the menu requires authentication, the showdoc.asp script requires no authentication itself and blindly provides the requested document, allowing an attacker to simply insert the docid GET variable of his desire and retrieve the document. As elementary as it sounds this is a common flaw in the wild.


Conclusion

In this article we have presented the penetration tester with an overview of web applications and how web developers obtain and handle user inputs. We have also shown the importance of fingerprinting the target environment and developing an understanding of the back-end of an application. Equipped with this information, the penetration tester can proceed to targeted vulnerability tests and exploits. The next installment in this series will introduce code and content-manipulation attacks, such as PHP/ASP code injection, SQL injection, Server-Side Includes and Cross-site scripting.