Monday 4 June 2012

SSD - Solid State Disk

Much has been written about solid state disks (SSDs) becoming the next big thing in the IT industry. Whether we really understand how they will revolutionize the industry, however, is a different question. How much do we really know about SSDs? If we were to ask people on the street, it would come as no surprise that most don't know much about what an SSD is. Those who do would most likely think of a USB pen drive, a CompactFlash card or a Secure Digital card, which sit on the consumer side of the storage industry.

Technically speaking, they are not wrong. Most web definitions describe an SSD as a high-performance plug-and-play storage device that contains no moving parts. Given that the mobile storage devices mentioned above contain no moving parts, they can certainly be categorized as SSDs.

But there is more to an SSD than being a non-volatile device. The purpose of this article is to give readers a clearer picture of what an SSD is, how it is used, and how it differs from the predominant data storage device of the present day - the hard disk drive (HDD). Beyond non-volatility, this article will show how much potential SSDs have for optimizing the performance of a computing system.

A solid state disk (SSD) is electrically, mechanically and software compatible with a conventional (magnetic) hard disk, or Winchester. The difference is that the storage medium is not magnetic (like a hard disk) or optical (like a CD) but solid-state semiconductor memory such as battery-backed RAM, EEPROM or other electrically erasable, RAM-like chips. This provides faster access times than a disk, because the data can be randomly accessed and does not rely on a read/write head synchronizing with a rotating platter. An SSD also provides greater resilience to physical vibration, shock and extreme temperature fluctuations. The main downside is a higher cost per megabyte of storage.
Abbreviated SSD, a solid state disk is a high-performance plug-and-play storage device that contains no moving parts. SSD components include either DRAM or flash memory boards, a memory bus board, a CPU, and a battery card. Because they contain their own CPUs to manage data storage, they are much faster (18 MBps for SCSI-II and 44 MBps for UltraWide SCSI interfaces) than conventional rotating hard disks and therefore produce the highest possible I/O rates. SSDs are most effective for server applications and server systems, where I/O response time is crucial. Data stored on SSDs should include anything that creates bottlenecks, such as databases, swap files, library and index files, and authorization and login information.
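The random-access advantage described above is something you can observe yourself. The rough Python sketch below (file size and block size are arbitrary choices for the demo) reads the same total amount of data from a scratch file twice: once in order, once at shuffled offsets. On a rotating hard disk the shuffled pattern is dramatically slower because every jump costs a head seek; on an SSD the two times come out close. Note that the operating system's cache can mask the gap unless the file is larger than available RAM.

```python
import os
import random
import tempfile
import time

BLOCK = 4096   # bytes per read
BLOCKS = 4096  # 16 MB scratch file, small enough to keep the demo quick

# Write a scratch file full of random bytes.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(os.urandom(BLOCK * BLOCKS))

def timed_read(offsets):
    """Read one block at each offset and return the elapsed seconds."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
    return time.perf_counter() - start

sequential = [i * BLOCK for i in range(BLOCKS)]
scattered = random.sample(sequential, len(sequential))  # same blocks, shuffled

t_seq = timed_read(sequential)
t_rand = timed_read(scattered)
print(f"sequential: {t_seq:.3f}s  random: {t_rand:.3f}s")
os.remove(path)
```

On a mechanical drive the ratio between the two numbers is essentially the cost of moving the read/write head; on solid-state media there is no head to move, which is exactly the point the definitions above are making.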

Friday 16 March 2012

Spyware Protection

Spyware is one of the biggest threats to online systems today and must not be taken for granted. Unless you actually experience the consequences of spyware infiltrating your system, it is hard to imagine the extent of the damage this malicious software can cause.
You've probably heard about people suddenly finding themselves in debt because somebody else used their credit card information, or businesses that have suffered tremendous losses because of inadequate spyware protection. But perhaps you didn't really pay attention because you thought it couldn't possibly happen to you.
The reality is that it can happen to anyone, and you are no exception. Everyone who goes online faces the threat of having spyware embedded in their computer system. That is why spyware protection is required for everybody.
There are many programs out there that offer spyware protection. Hundreds of anti-spyware programs are available, some more effective than others. There are free anti-spyware programs and there are the more expensive ones that offer a greater amount of spyware protection.
It certainly helps to have such programs installed in your system, but an even more effective spyware protection comes from simply being careful.
Most people fall victim to spyware merely because of carelessness. In fact, more than 90% of spyware damage could have been avoided if the user had only exercised more caution and better judgment while surfing the Internet.
Spyware programs are being developed almost every day, and even the most comprehensive spyware protection software may miss some of the most recently released spyware. This leads us to the conclusion that the most effective spyware protection is still to practice caution.
If you want to have better spyware protection, you will have to stop clicking here and there while surfing. You need to be more discriminating about which sites you visit and which programs you download. Here are some basic ways of being careful on the Internet that everyone should practice:
1. Never click on dubious sites. Ignore unsolicited ads and offers, as these are among the leading carriers of spyware.
2. Never open email attachments from unknown sources. Resist your urge to take a peek. Just delete the file unopened.
3. Never open email attachments from friends when the subject or body of the email looks bogus.
4. Never click on unwanted popups.
5. Delete cookies on a regular basis. This will prevent spyware from getting any useful information from your computer.
6. Update your system and your anti-spyware programs constantly.
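Tip 5, clearing cookies regularly, is easy to automate. The sketch below is a minimal, hypothetical example: the cookie folder path is made up (every browser stores cookies in its own location and format), and it simply deletes cookie files older than a chosen age rather than parsing them.

```python
import os
import time

# Hypothetical cookie folder; real browsers each use their own location.
COOKIE_DIR = os.path.expanduser("~/.browser/cookies")
MAX_AGE_DAYS = 7

def purge_old_cookies(folder=COOKIE_DIR, max_age_days=MAX_AGE_DAYS):
    """Delete cookie files older than max_age_days; return how many were removed."""
    removed = 0
    if not os.path.isdir(folder):
        return removed
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        # Remove only plain files whose last modification predates the cutoff.
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed += 1
    return removed
```

Scheduled weekly (via cron or Task Scheduler), a routine like this makes the habit automatic instead of something you have to remember.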
These are just some of the most basic ways to provide yourself with spyware protection. Once you make these practices a habit, you will be certain that you are providing yourself and your system with the best form of spyware protection there is.

Tuesday 28 February 2012

Say Hello to Giga bps Internet Speed

If there is one thing that seems to baffle people about their connection to the Internet, it's the mysterious concept of connection speed. Your Internet Service Provider (ISP) sold you a connection to the Internet at a certain speed. Maybe you didn't really care what the speed was when you bought your Internet connection, or maybe you did.

Connection speed is generally measured in bps, which stands for bits per second.
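The bits-vs-bytes distinction trips people up: link speeds are quoted in bits per second, but file sizes are quoted in bytes, and there are eight bits in a byte. A quick sketch of the arithmetic (the file sizes and link speeds are just illustrative numbers):

```python
def download_seconds(file_megabytes, link_megabits_per_sec):
    """Time to move a file over a link, ignoring protocol overhead.

    Network speeds are quoted in bits per second and file sizes in bytes,
    so every megabyte of file needs eight megabits on the wire.
    """
    return file_megabytes * 8 / link_megabits_per_sec

# A 700 MB file over a 10 Mbps link vs. a 1 Gbps (1000 Mbps) link:
print(download_seconds(700, 10))    # 560.0 seconds
print(download_seconds(700, 1000))  # 5.6 seconds
```

This is why a "100 Mbps" connection delivers at most about 12.5 megabytes per second, not 100.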

About 10 years ago, the standard computer room ran at an Ethernet speed of 10 Mbps, and the new kid on the block was called Fast Ethernet, running at 100 Mbps. Now Gigabit networking is changing this rapidly.
Analysts estimate that over the next six years, 5 to 10 Gigabit networks will make their appearance in data centers around the world, driven by the need for ever more speed. Indeed, some estimates say that the Gigabit market will leap to $28 billion in sales.

It's interesting how the geometry of server technology will change as the speed race intensifies. Even small and medium businesses are being forced to make the jump now, for fear of being left behind if they don't. Indeed, if you were to look into the wiring closets and computer rooms of today's businesses, you would likely see machines being installed with Gigabit ports that interface directly with the motherboard. The day of the 100 Mbps Fast Ethernet port is just about gone, analysts note. Fear of being left behind aside, it is all being driven by economics.

The hottest growth area today is rack-top switching, where 10G Ethernet ports are becoming the standard. In the topology of the computer room, a network cable runs from a 10G switch to the rack-top switch. The rack-top switch, in turn, takes a jumper straight to the 10G-Base-T connector on a server.

Even these speeds will seem slow by 2020, when experts predict that 40 Gigabit and 100 Gigabit networking will be the norm, driven by the ever-increasing speed needs of business.

Why would a business ever need speeds greater than 10G-Base-T? The answer is both complicated and simple. It all begins at the user's desktop, tablet or even smartphone. With wide-area networking growth and the needs of WiFi users increasing almost exponentially - it takes a lot of bandwidth to move video and audio streams almost instantly from place to place - the infrastructure to support them must be put into place.

And as users become accustomed to having this speed at their fingertips, they expect the data stream to continue into their enterprise. Suddenly the enterprise computing facility needs this kind of speed, and as use grows, both transactional and computational speeds grow with it.

It's a business model that seems to run in reverse. Instead of the enterprise controlling the speeds, the end users are controlling them, and as wide-area networking and "anywhere office" needs push the envelope, that envelope has to be filled.

One key to the revolution will be the drop in pricing. By the close of next year, the average 10G-Base-T port is estimated to cost about US$700 to $1,000 to implement. Within seven or eight years, that price is expected to drop to about $200, even as server prices drop. It's all a function of the speed at which technology is adopted.

Computing speeds over the last 20 years are estimated to have doubled about every 18 months, and with them the need for infrastructure changes to keep pace.

As an example, if you've been in the business for 30 years, you probably remember the early IBM PCs with their anemic clock speeds and anemic network speeds (they didn't seem so at the time). Within 24 months, processor speeds had doubled and tripled, as had networking speeds. It then became almost axiomatic that speeds would double every 18 months in both computing and networking. And it continues today, so that even today's 100 Mbps gold standard is about to become the lead standard as 10G-Base-T networks and servers bypass it.
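The 18-month doubling rule is simple compound growth, and it is easy to see how quickly it outruns intuition. A quick sketch (the starting speed and horizon are just illustrative):

```python
def projected_speed(base_speed, years, doubling_months=18):
    """Project a speed that doubles every doubling_months.

    Six years at an 18-month doubling period is four doublings,
    i.e. a 16x increase over the starting speed.
    """
    doublings = years * 12 / doubling_months
    return base_speed * 2 ** doublings

# Starting from 100 Mbps, project six years out:
print(projected_speed(100, 6))  # 1600.0 Mbps
```

Four doublings in six years turns 100 Mbps into 1.6 Gbps, which is why a port that looks generous today looks anemic before its warranty expires.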

It's like a row of dominoes falling. Within six years, as the pricing of 10G-Base-T ports drops to the $200 level, Gigabit-Base-T ports will drop to about $40 or so, while 100 Mbps ports will drop to $8. At the same time, 10G-Base-T port shipments will jump 400 percent, while 100 Mbps port shipments will fall just as sharply. It all averages out.

Thursday 23 February 2012

How to Choose the Best SEO Service

The concept of online marketing has been adopted by many companies, which has led to the increased popularity of SEO services. With so many SEO specialists in the market, it may be hard for consumers to identify the best SEO service. Before you decide to hire an SEO company, it is imperative to know what you are actually looking for. What are your business goals and objectives? Ensure that you are also familiar with the various SEO methods; this will go a long way toward choosing the best SEO service. Most people access SEO services online, and one of the reasons it may be hard to choose the best SEO service is the prevalence of scams. Among the many websites specializing in Search Engine Optimization, it may be hard to distinguish the genuine ones from the fraudulent. It is imperative to be cautious when outsourcing search engine optimization services to ensure that you do not fall victim to a scam.
When outsourcing SEO services, always consider the reputation and credibility of the SEO company. Conduct a background search of the various companies offering SEO services to identify the best one. You can also identify the best SEO service by making enquiries of other clients and friends. Clients who have outsourced search engine optimization services before are better placed to offer guidance on how to identify companies that deliver what they promise. When seeking the best SEO service, consider the customer testimonials on various websites. Customers regularly post comments about service quality and the level of satisfaction they derived from a service.
It is easy to identify appropriate SEO service providers by reviewing the comments various customers have made. If you find negative customer commentary on an SEO website, it would be wise to avoid hiring that service.
Search engine optimization is an intricate process that may take some time to complete, and there are numerous factors to consider when optimizing a page. As you outsource SEO services, avoid companies that guarantee instant results or the first position in the search engine rankings. As much as the best SEO service provider should deliver promptly, it is imperative to be realistic. A reliable SEO firm will be honest with its customers and offer them realistic expectations regarding SEO submission services as well as SEO rankings. Usually, SEO companies cannot guarantee the first position in the search engines, especially during a first search engine optimization effort.
One factor you cannot fail to consider is the cost of search engine optimization. How much is being charged for the SEO services? Most companies work within the limits of a predefined budget, so the best SEO service is the one that appropriately complements your budget: you can only go for the SEO services you can comfortably afford to pay for. The fact that some SEO services are priced highly does not guarantee their quality. When choosing the best SEO service, price should not be the determining factor; instead, consider the qualifications and competency of the service provider.

Monday 20 February 2012

Effective Virus Removal Is Just A Few Clicks Away

There are many reasons for virus attacks on computers, but the most common among them is visiting websites that pose a potential threat to the user's workstation. With every passing day, new types of viruses are discovered that can cause serious damage to computers. Though many programs are available in the market for effective virus removal, they cannot guarantee the complete safety of a system. It is for this reason that individuals and business organizations across the world are constantly searching for better ways of protecting systems against viruses. They are always looking for an effective service that saves both time and cost.
One option is to call a computer technician to get rid of the viruses on an infected computer. In today's world, however, waiting for the technician to arrive and eventually remove all the viruses can be a time-consuming process. Virus removal tasks can waste a lot of productive time in business organizations where time plays an important role. The newer concept of remote assistance has been adopted in various businesses to solve issues on particular workstations in a networked environment. As computers can be reached over the internet, online virus removal is carried out by certain websites that provide technical support through the internet.
There are certain companies that help individuals and businesses through their online virus removal services. Unplugging a desktop and taking it to the nearest dealer can be a waste of time, and considerable cost is also incurred in transporting computers to the technician's premises and bringing them back after repairs.
Companies offering online technical support for computers have highly trained and experienced professionals who fix computer errors through remote assistance. The prompt services offered by these online computer repair companies help business organizations save a tremendous amount of time and cost.
Recent remote computer support services allow expert computer technicians to access the affected computer from a different location; they could be troubleshooting the machine from any part of the world. The user of the affected computer grants these experts permission to access the system and get the problem fixed over the internet. This process requires an internet connection on both sides and software that facilitates connectivity to the remote service. Online computer repair through this method is gaining popularity due to the ease and speed with which the task is completed.
Affordability is one of the major factors taken into consideration when selecting a remote computer support service. When dealing with all sorts of troubleshooting and virus removal tasks, an expert service provider always ensures that the problem gets solved on time. Companies in the business of providing remote assistance to individuals and businesses have pricing structures to suit different budgets, and an internet search will turn up many such service providers. The best ones are those who do not take much time to solve the issues and are cost-effective at the same time.

Wednesday 8 February 2012

Storage Device History

The first PCs used paper tape and data cassette recorders - the same kind you listen to music with - and using a data cassette for storage was very slow. Removable floppy disks did not become popular as storage devices until 1978, when Apple introduced the Disk II. The term "floppy" accurately fit the earliest 8-inch PC diskettes and the 5.25-inch diskettes that succeeded them: the inner disk that holds the data is usually made of mylar coated with a magnetic oxide, and the outer plastic cover bends easily. The inner disk of today's smaller 3.5-inch floppies is similarly constructed, but it is housed in a rigid plastic case, which is much more durable than the flexible covering on the larger diskettes.
Mid-1800s - Punched cards are used to provide input to early calculators and other machines.
1940s - Vacuum tubes are used for storage.
1950s - Tape drives start to replace punched cards. Only a couple of years later, magnetic drums appear on the scene.
1956 - IBM introduces the 305 RAMAC, the first magnetic hard disk for data storage; the RAMAC (Random Access Method of Accounting and Control) technology soon becomes the industry standard. It required 50 24-inch disks to store five megabytes (million bytes, abbreviated MB) of data and cost roughly $35,000 a year to lease - or $7,000 per megabyte per year. For years, hard disk drives were confined to mainframe and minicomputer installations. Vast "disk farms" of giant 14- and 8-inch drives costing tens of thousands of dollars each whirred away in the air-conditioned isolation of corporate data centers.
1962 - JUN. Teletype ships its Model 33 keyboard and punched-tape terminal, used for input and output on many early microcomputers.
1967 - IBM builds the first floppy disk.
1971 - IBM introduces the "memory disk", or "floppy disk", an 8-inch floppy plastic disk coated with iron oxide.
1973 - IBM introduces the IBM 3340 hard disk unit, known as the Winchester, IBM's internal development code name. The recording head rides on a layer of air 18 millionths of an inch thick.
1976 - AUG. iCOM advertises their "Frugal Floppy" in BYTE magazine, an 8-inch floppy drive, selling for US$1200.
1976 - AUG. Shugart announces its 5.25 inch "minifloppy" disk drive for US$390.
1977 - DEC. At an executive board meeting at Apple Computer, president Mike Markkula lists the floppy disk drive as the company's top goal.
1978 - JUN. Apple Computer introduces the Disk II, a 5.25 inch floppy disk drive linked to the Apple II by cable. Price: US$495, including controller card.
The 1980s - The first small hard disk drives are introduced. The first 5.25-inch hard disk drives packed 5 to 10 MB of storage - the equivalent of 2,500 to 5,000 pages of double-spaced typed information - into a device the size of a small shoe box. At the time, a storage capacity of 10 MB was considered too large for a so-called "personal" computer.
1980 - Sony Electronics introduces the 3.5 inch floppy disk drive, double-sided, double-density, holding up to 875KB unformatted.
1980 - JUN. Seagate Technologies announces the first Winchester 5.25-inch hard disk drive.
1980 - JUN. Shugart begins selling Winchester hard-disk drives.
1982 - JUN. Sony Electronics demonstrates its 3.5 inch microfloppy disk system.
1982 - SEP. Iomega begins production of the Alpha 10, a 10MB 8-inch floppy-disk drive using Bernoulli technology.
1982 - NOV. Drivetec announces the Drivetec 320 Superminifloppy, offering 3.33MB unformatted capacity on a 5.25-inch drive.
1982 - DEC. Tabor demonstrates a 3.25-inch floppy disk drive, the Model TC500 Drivette. Unformatted capacity is up to 500KB on a single side.
1982 - DEC. Amdek releases the Amdisk-3 Micro-Floppy-disk Cartridge system. It houses two 3-inch floppy drives designed by Hitachi/Matsushita/Maxell. Price is US$800, without a controller card.
1982 - At the West Coast Computer Faire, Davong Systems introduces its 5MB Winchester Disk Drive for the IBM PC, for US$2000.
1983 - MAY. Sony Electronics announces the 3.5 inch floppy disk and drive, double-sided, double-density, holding up to 1MB.
1983 - With the introduction of the IBM PC/XT, hard disk drives also became a standard component of most personal computers. The descriptor "hard" is used because the inner disks that hold data in a hard drive are made of a rigid aluminum alloy. These disks, called platters, are coated with a much-improved magnetic material and last much longer than a plastic floppy diskette. The longer life of a hard drive is also a function of the disk drive's read/write head: in a hard disk drive, the heads do not contact the storage media, whereas in a floppy drive the read/write head does contact the media, causing wear.
1983 - Philips and Sony develop the CD-ROM, as an extension of audio CD technology.
1984 - MAY - Apple Computer introduces the DuoDisk dual 5.25-inch floppy disk drive unit for the Apple II line.
By the mid-1980s, 5.25-inch form factor hard drives had shrunk considerably in height. A standard hard drive measured about three inches high and weighed only a few pounds, while lower-capacity "half-height" hard drives measured only 1.6 inches high.
1985 - JUN. Apple Computer introduces the UniDisk 5.25 single 5.25-inch floppy disk drive, with the ability to daisy-chain additional drives through it.
By 1987, 3.5-inch form factor hard drives began to appear. These compact units weighed as little as a pound and were about the size of a paperback book. They were first integrated into desktop computers and later incorporated into the first truly portable computers - laptops weighing under 12 pounds. The 3.5-inch form factor quickly became the standard for desktop and portable systems requiring less than 500 MB of capacity. Height also kept shrinking with the introduction of one-inch-high "low-profile" drives.
1987 - SEP. Microsoft ships Microsoft Bookshelf, its first CD-ROM application.
1990 - JAN. Commodore gives a sneak preview of a proposed "interactive graphics player", based on a variant of the Amiga 500, with 1MB of RAM. The machine includes an integrated CD-ROM drive, but no keyboard.
1990 - NOV. The Multimedia PC Marketing Council sets the minimum configuration required of a PC to run MPC-class software: 10-MHz 286 processor, 2MB RAM, 30MB hard drive, 16-color VGA, mouse, 8-bit audio card, 150KBps CD-ROM drive.
1991 - JAN. Commodore releases the CDTV (Commodore Dynamic Total Vision) package. It features a CD-ROM player integrated with a 7.16-MHz 68000-based Amiga 500. List price is US$1000.
1991 - JUN. Tandy introduces its low-cost CDR-1000 CD-ROM drive for PCs. At US$400, including drive and controller card, it is about half the price of other drives.
1991 - OCT. Insite Technology begins shipping its 21 MB 3.5-inch floppy disk drive to system vendors. The drive uses "floptical" disks, using optical technology to store data.
By 1992, a number of 1.8-inch form factor hard drives appeared, weighing only a few ounces and delivering capacities up to 40 MB. Even a 1.3-inch hard drive, about the size of a matchbox, was introduced. Of course, smaller form factors in and of themselves are not necessarily better than larger ones. Hard disk drives with form factors of 2.5 inches and less currently are required only by computer applications where light weight and compactness are key criteria. Where capacity and cost-per-megabyte are the leading criteria, larger form factor hard drives are still the preferred choice. For this reason, 3.5-inch hard drives will continue to dominate for the foreseeable future in desktop PCs and workstations, while 2.5-inch hard drives will continue to dominate in portable computers.
1993 - OCT. NEC Technologies unveils the first triple-speed (450KBps) CD-ROM drive.
1994 - JAN. NEC Technologies ships its quad-speed CD-ROM, priced at US$1000.
1994 - DEC. Iomega Corp. introduces its Zip drive and Zip disks, floppy disk sized removable storage in sizes of 25MB or 100MB.
Since its introduction, the hard disk drive has become the most common form of mass storage for personal computers. Manufacturers have made immense strides in drive capacity, size, and performance. Today, 3.5-inch, gigabyte (GB) drives capable of storing and accessing one billion bytes of data are commonplace in workstations running multimedia, high-end graphics, networking, and communications applications. And, palm-sized drives not only store the equivalent of hundreds of thousands of pages of information, but also retrieve a selected item from all this data in just a few thousandths of a second. What's more, a disk drive does all of this very inexpensively. By the early 1990s, the cost of purchasing a 200 MB hard disk drive had dropped below $200, or less than one dollar per megabyte.
1997 - NOV. IBM announced the world's highest-capacity desktop PC hard disk drive, built with a new breakthrough technology called Giant Magnetoresistive (GMR) heads. Pioneered by scientists at IBM Research, GMR heads are used in IBM's Deskstar 16GP, a 16.8-gigabyte drive, bringing the cost of storage down to 0.25 cents per megabyte.
1998 - NOV. IBM announced a 25GB hard drive. The first hard disk drive, in 1956, had a capacity of 5 megabytes; IBM's Deskstar 25GP 25-gigabyte (GB) drive has 5,000 times the capacity of that first drive. It can hold the double-spaced typed text of a stack of paper more than 4,000 feet high, more than six full-length feature films, or 20,000 digital images.
1999 - OCT 18. IBM raised the bar in hard drive technology with a new family of record-breaking hard drives and a new technology that protects drives against temperature variation and vibration, led by the 10,000 RPM Ultrastar 72ZX - the world's highest-capacity drive at 73 gigabytes (GB).
2000 - JUN 20, Paris, France. IBM announced the availability of the 1GB Microdrive, the world's smallest, lightest and largest-capacity mobile hard disk, increasing its storage capacity by a factor of three.