Most companies go through growing pains when web server traffic increases. Performance issues hurt your conversion rates, sales, and bounce rate, and even undermine your SEO efforts. If you run a virtual private server (VPS) or a dedicated server, you need to identify where the problem stems from, fix it, and then monitor the server for future issues. Here are some ways to rule out factors and narrow down the source of performance problems.
Check CPU and Memory Usage
Most web server software has a feature that lets you monitor CPU and memory usage. In addition to checking overall usage, you should identify the programs consuming the most memory and CPU cycles. If your server software doesn’t expose these statistics, find a third-party application that monitors them.
For instance, on a Windows server, press CTRL+ALT+DEL and choose “Task Manager.” Click the “Performance” tab to view current CPU and memory usage, and use the “Processes” tab to see which programs are consuming the most of each. Third-party applications can make this investigation easier, but use them with caution: only install trusted software on a server.
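The same “which process is eating memory” question can also be answered from a process snapshot. Below is a minimal sketch that ranks processes by memory usage; the snapshot text is hypothetical, but on a real Windows server you could feed it the output of `tasklist /FO CSV`.

```python
import csv
import io

def top_memory_processes(tasklist_csv: str, n: int = 3):
    """Rank processes by memory from a `tasklist /FO CSV`-style snapshot.

    Memory values like "184,532 K" are normalized to plain kilobytes
    so the rows can be sorted numerically.
    """
    rows = csv.DictReader(io.StringIO(tasklist_csv))
    usage = []
    for row in rows:
        kb = int(row["Mem Usage"].replace(",", "").replace(" K", ""))
        usage.append((row["Image Name"], kb))
    return sorted(usage, key=lambda item: item[1], reverse=True)[:n]

# Hypothetical snapshot for illustration only.
snapshot = '''"Image Name","PID","Mem Usage"
"w3wp.exe","4120","184,532 K"
"sqlservr.exe","1988","912,004 K"
"notepad.exe","3021","8,112 K"'''

print(top_memory_processes(snapshot, n=2))
```

A ranked list like this is often enough to tell you whether the database, the web worker process, or something unexpected is the memory hog.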
Check Bots and Crawling
Several bots on the market can slam your servers with requests, especially when you move to a new web server. Bots recrawl the whole website to gather fresh statistics about the new server. Google is especially notorious for crawling a new web server aggressively, but other bots can also play a role in server performance issues.
The best way to check for bot traffic is to look at your server logs. Some sites also use database procedures to insert a record each time a crawler or legitimate user browses the site, usually to drive custom traffic reports. If you are troubleshooting performance issues, however, you probably want to turn off this logging until you isolate the problem.
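As one approach, a short script can tally requests per user agent straight from a combined-format access log, which makes aggressive crawlers stand out immediately. The log lines below are made up for illustration.

```python
import re
from collections import Counter

# In the combined log format, the user agent is the last quoted field.
UA_PATTERN = re.compile(r'"([^"]*)"$')

def user_agent_counts(log_lines):
    """Count requests per user agent to spot crawlers hammering the site."""
    counts = Counter()
    for line in log_lines:
        match = UA_PATTERN.search(line.strip())
        if match:
            counts[match.group(1)] += 1
    return counts

# Hypothetical access-log lines for illustration only.
sample_log = [
    '10.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '10.0.0.2 - - [10/Oct/2023:13:55:37 +0000] "GET /about HTTP/1.1" 200 734 "-" "Mozilla/5.0 (Windows NT 10.0)"',
    '10.0.0.1 - - [10/Oct/2023:13:55:38 +0000] "GET /products HTTP/1.1" 200 912 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
]

for agent, hits in user_agent_counts(sample_log).most_common():
    print(hits, agent)
```

Run against a day of real logs, a report like this quickly shows whether bots or human visitors account for the bulk of your traffic.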
Google offers a feature in Webmaster Tools that lets you lower its crawl rate. If a crawler continues to hammer your site with traffic, use the robots.txt file to block the bot until you identify the main performance issue. Note that some bots don’t honor robots.txt directives, but you can stop bot traffic from the major search engines this way.
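For example, a robots.txt along these lines blocks one misbehaving crawler site-wide while leaving everyone else alone (the bot name here is a placeholder, not a real crawler):

```text
# Block a hypothetical aggressive crawler from the whole site.
User-agent: BadBot
Disallow: /

# Everyone else may crawl normally.
User-agent: *
Disallow:
```

Remember that robots.txt is voluntary; well-behaved crawlers respect it, but truly abusive bots may need to be blocked at the firewall or web server level.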
Reports and Production Database Servers
Small website owners sometimes make the mistake of running reports against a production database. When your departments need reports, replicate production data to a dedicated reporting server and run the reports there, where they have no effect on the production database.
When you’re starting out, database tables are small, traffic is light, and few queries run on the server, so reporting against the production database probably won’t have much of an effect. As the company grows, however, departments create new reports, run them more frequently, and more users query the database. Moving reports to a reporting database greatly improves performance and eliminates reporting as a suspect.
Another issue with letting users run queries directly against a production server is security. Users with direct access to the production database can accidentally edit data and cause major data integrity issues, which in turn cause errors in your public web application and cost you sales and revenue.
Isolate Application Pools
Windows servers have a concept called application pools. You assign websites and web services to application pools. You can assign more than one app to a pool, but doing so makes it harder to isolate issues.
Application pools can have different settings, such as memory and CPU limits. If you find that one application is causing a problem, recycling its application pool will temporarily fix the issue. Note, however, that recycling also takes the application down until the pool finishes recycling.
If you have multiple apps in one application pool, split them into separate pools to isolate each app from the others. This helps you identify which app is causing the memory and CPU spikes, so you can focus on the right code.
Perform a Code Review
Some code is written more efficiently than other code. You can sometimes identify issues by performing a full code review of each app on the web server. You can also perform this step after you isolate application pools if you use a Windows server.
Some coding issues to always check for:
- Unclosed database connections. Older code often fails to close database connections properly, which leaks connections and memory.
- Poorly structured queries. Queries that aren’t structured properly can sometimes cause massive performance issues. Whether it’s stored procedures or inline SQL, always check your queries for performance.
- Third-party integrations. Sometimes it’s not your code that’s the issue, but a third-party DLL or integrated application that doesn’t function well under heavy load.
- Inefficient code. You’ll need someone who knows the language well to identify inefficient code. The same logic can be written several ways, and some are far harder on memory and CPU. As the application grows, these minor offenses add up to major issues within your application and on your web server.
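The first item on the list, connection handling, is the easiest to demonstrate. Here is a small sketch using Python’s built-in sqlite3 module, though any database driver follows the same pattern: wrapping the connection so it is always released, even when a query raises an exception, is what prevents the slow leak that eventually starves a busy web server.

```python
import sqlite3
from contextlib import closing

def count_orders(db_path: str) -> int:
    """Open, query, and always close the connection, even on errors.

    contextlib.closing calls conn.close() when the block exits.
    (Note: using a sqlite3 connection directly as a context manager
    only manages transactions, not closing, hence the wrapper.)
    """
    with closing(sqlite3.connect(db_path)) as conn:
        row = conn.execute("SELECT COUNT(*) FROM orders").fetchone()
        return row[0]

# Set up a throwaway demo database with three rows.
with closing(sqlite3.connect("demo.db")) as conn:
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY)")
    conn.execute("DELETE FROM orders")
    conn.executemany("INSERT INTO orders (id) VALUES (?)", [(1,), (2,), (3,)])
    conn.commit()

print(count_orders("demo.db"))  # prints 3
```

In a code review, any path that opens a connection without an equivalent guarantee of closure is a candidate for exactly the kind of leak described above.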
Code reviews on major applications can take months, but a good code review and tweaks can greatly improve site performance.
Finding performance issues is a major hassle for any website owner or developer. The process can take months and require several people to finally figure out the problem. The best way to identify issues is to tightly monitor your server’s resources and ask your web server’s host to get involved. You might also need to hire outside help.
Picture by Alexander Svensson