Monday, June 29, 2009

Scalability - Increasing The Size Of The Problem

Heuristics can shave some time off a lengthy computation, but they cannot generally find the best solution. And when the number of possibilities rises steeply, as it does in the TSP, the solution may not be all that great. The ability of an algorithm to perform well as the amount of data rises is called scalability. An increase in data usually means an increase in the number of calculations, which results in a longer run time. To scale well, an algorithm's run time should not rise prohibitively with an increase in data.

Consider an algorithm that searches for every instance of a certain word in a document. The simplest algorithm would examine one word, see if there is a match, and then move on. If the size of the document is doubled, then the number of words will be about twice as large, and the search algorithm will take twice as long to search, all other things being equal. A document four times as large would take four times as long. This increase is linear - the increase in run time is proportional to the increase in data.
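
Here is a minimal sketch of such a word search in Python (the language and the example text are my own choices for illustration; the document is assumed to be plain text that can be split into words). The work grows in direct proportion to the number of words scanned.

# Linear scan: examine one word, check for a match, move on.
def count_matches(text, target):
    count = 0
    for word in text.split():               # one comparison per word
        if word.lower() == target.lower():
            count += 1                       # record the match and continue
    return count

# Doubling the document roughly doubles the number of words examined,
# so the run time roughly doubles as well.
print(count_matches("the cat sat on the mat", "the"))   # prints 2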

Now consider the brute force approach to solving a TSP. The number of possible routes rises at an astonishing rate with even a slight increase in the number of cities, which is a much worse situation than in the search algorithm. If the input of the search algorithm goes from four words to eight, then the time doubles; for brute force TSP calculations, an increase from four cities to eight increases the time by a factor of 1,680. The search algorithm scales at a tolerable level, but the brute force TSP is a disaster.
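
To see why the brute force approach scales so poorly, here is a rough Python sketch that enumerates every possible route through a small set of cities. The distance table is made up purely for illustration, and the number of orderings examined grows factorially with the number of cities.

from itertools import permutations

# Hypothetical symmetric distance table for four cities, numbered 0-3.
dist = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]

def brute_force_tsp(dist):
    n = len(dist)
    best_route, best_length = None, float("inf")
    # Fix city 0 as the starting point and try every ordering of the rest.
    for rest in permutations(range(1, n)):
        route = (0,) + rest
        length = sum(dist[route[i]][route[(i + 1) % n]] for i in range(n))
        if length < best_length:
            best_route, best_length = route, length
    return best_route, best_length

print(brute_force_tsp(dist))   # ((0, 1, 3, 2), 80)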

Advances in computer technology have resulted in blazingly fast computers, yet even with these speedy machines, algorithm scalability is important. Linear increases, such as the simple search algorithm displays, do not present much of a problem even with relatively large data sets, but this kind of tame behavior is the exception rather than the rule. The TSP is a notoriously difficult case, but there are many other problems that scale poorly. Even with heuristics and approximations, solving these problems when the data set is large takes an extraordinary amount of time, if it is possible at all. To the chagrin of scientists and engineers, many of the more interesting and important problems in science and engineering belong in this category.

There is another practical concern besides run time. Computers hold instructions and data in memory as the program runs. Although some computers have a huge amount of memory, it is still a limited resource that must be conserved. Scientists and engineers who share the use of the most advanced supercomputers are allotted only a certain amount of memory, and users who pay a supercomputer facility to use one of their machines are often charged for the memory as well as the time their programs consume.

Memory can also be an important factor in the run time of a program. Computer memory consists of fast devices such as random access memory (RAM) circuits that hold millions or billions of chunks of data known as bytes, so retrieving a piece of data or loading the next instruction takes little time. But if a program exceeds the amount of available memory, the computer must either stop or, in most cases, use space on the hard disk to store the excess. Disk operations are extremely slow compared to memory, so this procedure consumes a great deal of time. A personal computer, for example, should have ample memory; otherwise large programs such as video games will be uselessly slow even if the processor is the fastest one available.

The memory versus speed trade-off was more pressing back in the 1980s, when memory was measured in thousands of bytes rather than today's billions. Although memory is still an important factor in certain applications, the main issue in general is time. When a programmer examines an algorithm's efficiency, he or she is usually concerned with its speed.

An algorithm's complexity can be measured by its run time - this is called time complexity - or the amount of memory resources it uses, which is known as space complexity. But speed, like the amount of memory, depends on the computer - some computers are faster than others. A measure of run time will apply only to a certain machine rather than to all of them.

To avoid the need to analyze an algorithm for each kind of computer, a general measure, such as the number of steps required by the algorithm, can be used. The number of steps does not usually change very much from machine to machine, but it does depend on the size of the problem, as well as the specific outcome. For example, a search algorithm may end early if the target is found, truncating the rest of the search. In order to be as general as possible, an analysis of an algorithm usually considers the worst-case scenario, which requires the most steps. For the example of a search algorithm that quits when the target is found, the worst case is that the target is in the last piece of data, or is not found at all.
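
As a rough illustration of worst-case counting, the Python sketch below searches a list and also reports how many comparisons it made; the step counter is added purely for illustration. The worst case occurs when the target is the last item or is not in the list at all.

# Linear search that stops early when the target is found.
def search_with_count(items, target):
    steps = 0
    for index, item in enumerate(items):
        steps += 1                    # one comparison per item examined
        if item == target:
            return index, steps       # early exit: best case is one step
    return -1, steps                  # worst case: every item examined

data = ["ant", "bee", "cat", "dog"]
print(search_with_count(data, "ant"))   # (0, 1)  found immediately
print(search_with_count(data, "emu"))   # (-1, 4) not found: all 4 steps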

Tuesday, June 23, 2009

Computer Algorithms

An algorithm can be written as a series of steps. Computers only do what they are told, so the algorithm must not omit any important steps; otherwise the problem will not be solved. But adding unnecessary steps is unwise, because they will increase the program's run time.

Computers operate with a binary language called machine language - words or strings of 1s and 0s - that is difficult and cumbersome for programmers to use. Although all programs eventually must be translated into machine language, most programmers write programs in languages such as BASIC or C, which contain a set of commands and operations that humans can more readily understand. Programs called compilers or interpreters then translate these instructions into machine language.

Programmers often begin with an outline or a sequence of steps the algorithm is to accomplish. For example, consider the problem of finding the largest number in a list. The steps may be written as follows.

1. Input the list of numbers.
2. Store the first number in a location called Maximum.
3. Store the next number in a location called Next and compare the value of Maximum with that of Next.
4. If Maximum's value equals or exceeds that of Next, discard the number in Next. If not, replace the value of Maximum with Next's value.
5. Repeat steps 3 and 4 until the end of the list is reached.
6. Output Maximum.

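Translated into Python (my choice of language for illustration), the six steps might look like the following sketch.

def find_maximum(numbers):
    # Step 2: store the first number in Maximum.
    maximum = numbers[0]
    # Steps 3-5: compare each remaining number with Maximum.
    for next_value in numbers[1:]:
        if maximum >= next_value:
            pass                      # Step 4: keep Maximum, discard Next
        else:
            maximum = next_value      # Step 4: replace Maximum with Next's value
    # Step 6: output Maximum.
    return maximum

print(find_maximum([12, 7, 31, 9, 24]))   # prints 31
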
More complicated problems involve a lot more steps. Deep Blue, the IBM computer that beat reigning champion Garry Kasparov in chess in 1997, ran a considerably longer program.

Finding the maximum number in a list is a simple problem with a simple solution, but even simple solutions may take a long time. The TSP is an excellent example. Because it is a practical problem as well as representative of a large class of interesting problems in computer science, it has been much studied. Writing in the February 15, 1991, issue of Science, researchers Donald L. Miller, at E. I. du Pont de Nemours and Company, in Delaware, and Joseph F. Pekny, at Purdue University, in Indiana, observed, "Few mathematical problems have commanded as much attention as the traveling salesman problem."

A brute force approach to solving a TSP would calculate the length of every possible route. A more thrifty approach is to use heuristics - shortcuts that speed up the computation, such as the author's assumption, mentioned in the sidebar, that the shortest route would not contain the longest intercity distance. Shin Lin and Brian Kernighan published a heuristic in 1973, "An Effective Heuristic Algorithm for the Traveling-Salesman Problem," in Operations Research, that improves a route by repeatedly exchanging links between cities. But these procedures are not guaranteed to find the optimal solution. Instead, the goal is to find a good solution in a reasonable amount of time.
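
Neither the author's longest-distance rule nor the Lin-Kernighan procedure is reproduced here, but the Python sketch below of the well-known nearest-neighbor heuristic shows the general idea: build a reasonable route quickly instead of examining every possibility. The distance table is again hypothetical, and the result is not guaranteed to be optimal.

# Nearest-neighbor heuristic: from each city, travel to the closest
# city that has not been visited yet.
def nearest_neighbor(dist, start=0):
    n = len(dist)
    route = [start]
    unvisited = set(range(n)) - {start}
    while unvisited:
        current = route[-1]
        closest = min(unvisited, key=lambda city: dist[current][city])
        route.append(closest)
        unvisited.remove(closest)
    return route

dist = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]
print(nearest_neighbor(dist))   # [0, 1, 3, 2]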

Thursday, June 18, 2009

Third Party Disk Defragmentation Programs

As you use your computer, files and data are moved around on your hard drive. This produces what is called fragmentation. As fragments build up, they can slow down your computer. Disk defragmentation should be done every so often to keep your computer running fast and smooth. But the defragmentation program that came with your operating system might not be the best and fastest way to defrag your hard drive. There are many third party programs out there, some of which are less than trustworthy. So I am going to talk about a couple of defrag programs that I have used and liked in the past, all of which I have personally tested. Some are free, some are not. Like everything else I talk about, I have no affiliation with any of these programs, and what I write is my own opinion.

Piriform's Defraggler is a great program. I install it on most computers because it is quick and easy to use. It's also a very small program, so you can easily put it on a flash drive and carry it around with you. It's free, unless you care to pay for priority support. They have two flavors of Defraggler: a home version and a business version. One of the things that drew me to Defraggler is that it comes in both a 32-bit and a 64-bit version.

Diskeeper is a reliable and powerful program for keeping your disks free of fragmentation. I used it a while ago on one of my computers that had three hard drives, and it did wonders. Diskeeper comes in many different versions for home, business, and everything in between, including home servers. They even offer discounts if you are buying in bulk. For networks, they have a management tool that allows a single person to control defragmentation across the whole network. If you are not sure you want to buy it, take a look at their 30-day trial.

PerfectDisk 11 - I have not had the chance to use PerfectDisk 11 a lot, but in the few months that I did use it, it was useful. It not only managed all my drives, but it also had a bunch of features that helped it run faster and even optimized my boot times. While PerfectDisk was running, I was able to keep using my computer, and it ran almost as fast as it did when it was not being defragmented. It also allowed me to control almost every aspect of its operation. You should definitely take a look at this program. Unfortunately you need to pay for it, but you can get a 30-day trial.

These programs should help you defragment your hard drives and keep them defragmented. Enjoy!

Thursday, June 11, 2009

Read Our Sony VAIO VPCEB390X Review

Buying something on the internet has become very difficult these days. Unlike in the past, there are now far more products than buyers, which has increased competition in the marketplace, where every company and brand wants a bigger market share. The same is true of computers and laptops: almost everyone has the same needs. We all want large storage capacity, a good display, long battery life, a dependable processor, and fast performance. The differences lie in design, overall convenience, and price, and these factors decide whether we should buy a particular laptop or not. In this review we will look at both the strengths and the weaknesses of the Sony VAIO VPCEB390X.

There is little argument that Sony laptops are among the most preferred machines for general computer users. The brand has been a symbol of style and elegance for many years. When considering a specific Sony VAIO model, however, we have to lay everything out so that the decision-making process is easy. This Sony VAIO VPCEB390X review will help you better understand its features and any limitations it has. The VAIO VPCEB390X is known for its performance and multimedia functionality, which let you enjoy movies and high-definition videos. However, many customers have reported a serious problem: the machine can freeze while running multiple programs at once.

The Sony VAIO VPCEB390X comes with an Intel Pentium dual-core processor, which sounds like enough to keep up the performance. With 2 GB of memory and a 320 GB hard drive, this machine satisfies users with basic computing needs. It is available in five different colors; women usually prefer purple, sleek red, and light blue, while businesspeople and students like to have it in white and black. The machine comes with genuine Windows 7 Home Edition installed.

Now let's come to another important point, the display: the screen measures 15.5 inches, which lets the user enjoy refined picture editing. Its Intel graphics also help produce a refined-quality display. With all these features, it's very hard to hold yourself back. However, unlike its features, the price of the Sony VAIO VPCEB390X does not really act as a magnet. It starts at $630, which is a bit high compared with laptops of other brands with similar features. This is the reason a number of people think twice even when they fully intend to purchase this notebook.

About the author:

Nadav Snir operates a website which includes coupons and discounts to the best electronics stores on the web. To get those discounts, visit: http://Great-Info-Products.com/Electronics/index.html