best site for it students


Saturday, June 20, 2009




Supercomputer

An extremely fast computer that can perform hundreds of millions of instructions per second. Supercomputers are the fastest type of computer. They are very expensive and are employed for specialized applications that require immense amounts of mathematical calculation. For example, weather forecasting requires a supercomputer. Other uses of supercomputers include animated graphics, fluid dynamic calculations, nuclear energy research, and petroleum exploration. The chief difference between a supercomputer and a mainframe is that a supercomputer channels all its power into executing a few programs as fast as possible, whereas a mainframe uses its power to execute many programs concurrently.

Sunday, May 3, 2009

Computer Components

CPU (Central Processing Unit)
So what's a CPU? It stands for Central Processing Unit. Many users erroneously refer to the whole computer box as the CPU, but the CPU itself is only about 1.5 inches square. The CPU does exactly what its name says: it is the control unit that processes all of the instructions for the computer. Consider it the "brain" of the computer; it does all the thinking. So, would you like to have a fast or slow brain? Obviously, the answer to this question makes the CPU the most important part of the computer, and its speed the most significant specification. The processor's (CPU's) speed is given as a MHz or GHz rating; 3 GHz is roughly 3,000 MHz. In today's computers, the video cards, sound cards, etc. also process instructions, but the majority of the burden falls on the CPU.
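To put those speed ratings in perspective, the conversion is just powers of ten. Here's a quick Python sketch (the function names are my own, purely for illustration):

```python
# Clock speed is a unit conversion: 1 GHz = 1,000 MHz = 1,000,000,000 cycles/second.
def ghz_to_mhz(ghz):
    """Convert a clock rating in GHz to MHz."""
    return ghz * 1000

def cycles_per_second(ghz):
    """Raw clock cycles per second for a given GHz rating."""
    return int(ghz * 1_000_000_000)

print(ghz_to_mhz(3))          # a 3 GHz CPU is a 3,000 MHz CPU
print(cycles_per_second(3))   # three billion clock ticks every second
```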
Motherboard
The best way to describe the motherboard goes along well with my human body analogy that I used for the CPU. The CPU is the brain, and the motherboard is the nervous system. Therefore, just as a person would want to have fast communication to the body parts, you want fast communication between the parts of your computer. Fast communication isn't as important as reliable communication though. If your brain wanted to move your arm, you want to be sure the nervous system can accurately and consistently carry the signals to do that! Thus, in my opinion, the motherboard is the second most important part of the computer. The motherboard is the circuit board to which all the other components of the computer connect in some way. The video card, sound card, IDE hard drive, etc. all plug into the motherboard's various slots and connectors. The CPU also plugs into the motherboard via a Socket or a Slot.
Hard Disk
The hard drive stores most of a computer's information, including the operating system and all of your programs, which makes it very important. Having a fast CPU is not much use if you have a slow hard drive, because the CPU will just spend time waiting for information from the hard drive, twiddling its thumbs. The hard drive stores all the data on your computer - your text documents, pictures, programs, etc. If something goes wrong with your hard drive, it is possible that all your data could be lost forever. Today's hard drives have become much more reliable, but they are still one of the components most likely to fail because they are one of the few components with moving parts. The hard drive has round discs that store information as 1s and 0s, very densely packed around the disc.
Video cards
Video cards provide the means for the computer to "talk" to your monitor so it can display what the computer is doing. Older video cards were "2D" or "3D," but today's are all 2D/3D combos. The 3D is mostly useful for gaming, but it can also be useful in applications such as 3D modeling. Video cards have their own advanced processing chips that make all kinds of calculations to make scenes look more realistic. The many video cards out there are based on a much smaller number of chipsets (run at different speeds or with slight variations). Different companies buy these chipsets and make their own versions of the cards based on them. For the most part, video cards based on the same chipset with the same amount of RAM are about equivalent in performance. However, some brands will use faster memory or other small optimizations to improve speed, and extras like "dual head" (support for two monitors) or better cooling fans can also distinguish one brand from another. At any rate, the first decision to make is what chipset you want your video card to use. If you aren't interested in games, the choice of chipset isn't too difficult - just about any will do for 2D desktop applications, and there's no point in buying a video card over $100 if you don't plan to play games.
Memory
All programs, instructions, and data must be stored in system memory before the computer can use them. The computer will hold recently used programs, instructions, and data in memory if there is room, which provides much quicker access than hard drives. The more memory you have, the more information you will have fast access to and the better your computer will perform. Memory is much like the short-term memory in your brain: it holds your most recent information for quick access. Just as you want to accurately remember this information in your head, you want your computer's memory to hold the correct information as well, or problems will obviously occur. Bad memory is one of the more common causes of computer crashes, and also one of the most difficult problems to diagnose. Because of this, making sure you get good RAM the first time around is very important. There are many, many different types of memory for different tasks. The main ones today are DDR PCxx00 SDRAM DIMMs (this includes PC2700, PC3200, etc.) and Direct RDRAM RIMMs.
CD/DVD-ROM Drive
CD-ROM drives are necessary today for most programs. A single CD can store up to 650 MB of data (newer CD-Rs allow for 700 MB, perhaps more with "overburn"). Fast CD-ROM drives were a big topic in the past, but all of today's drives are sufficiently fast. The extra speed is nice to have, but since CD-ROM drives are generally used just to install a program or copy CDs - tasks most users perform only occasionally - it isn't usually very important. Speed does play a big role if you do a lot of CD burning at high speeds or audio extraction from audio CDs (i.e. converting CDs to MP3s).
CD-R/RW (which stands for Recordable / Rewritable) drives (aka burners, writers) allow a user to create their own CDs of audio and/or data. These drives are great for backup purposes (backup your computer's hard drive or backup your purchased CDs) and for creating your own audio CD compilations (not to mention other things like home movies, multimedia presentations, etc.).
DVD-ROM drives can store up to 4.7 GB of data on a single-layer disc, or roughly seven times the capacity of a regular CD. DVDs look about the same and are the same size as a CD-ROM. DVD drives can also read CD-ROM discs, so you don't usually need a separate CD-ROM drive. DVD drives have become low enough in price that there isn't much point in purchasing a CD-ROM drive instead of a DVD-ROM drive. Some companies even make CD burner drives that will also read DVDs (all in one). DVD's most practical use is movies. The DVD format allows for much higher resolution digital recording that looks much clearer than VCR recordings.
DVD recordable drives are available in a couple of different formats - DVD-R or DVD+R with a RW version of each. These are slightly different discs and drives (although some drives support writing to both formats). One is not much better than the other, so it really boils down to price of the media (and also availability of the media).
SCSI card
A SCSI card is a card that will control the interface between SCSI versions of hard drives, CD-ROM drives, CD-ROM burners, removable drives, external devices such as scanners, and any other SCSI components. Most fit in a PCI slot and there is a wide range of types. The three main types of connectors on these cards are 25-pin for SCSI-1, 50-pin for Narrow SCSI, and 68-pin for Wide SCSI (and Ultra-Wide SCSI, Ultra2-SCSI, Ultra160 SCSI, and Ultra 320 SCSI - all of which use a 68 pin connector).
SCSI controllers provide fast access to very fast SCSI hard drives. They can be much faster than the IDE controllers already integrated into your computer's motherboard. SCSI controllers have their own advanced processing chips, which allows them to rely less on the CPU for handling instructions than IDE controllers do.
For the common user, SCSI controllers are overkill, but for high end servers and/or the performance freaks of the world, SCSI is the way to go. SCSI controllers are also much more expensive than the free IDE controller already included on your motherboard. There is also a large premium in price for the SCSI hard drives themselves. Unless you have deep pockets, there isn't much of a point in going with a SCSI controller.
Many people buy SCSI controllers just for use with their CD-ROM burners and CD-ROM drives (these drives must be SCSI drives of course).
SCSI cards can support up to 15 devices per card, while a single IDE controller is limited to only 4 devices (some motherboards now come with more than one IDE controller though). SCSI cards allow these drives to be chained along the cable. Each drive on the cable has to have a separate SCSI ID (this can be set by jumpers on the drive), and the last drive on the end of the cable (or the cable itself) has to "terminate" the chain - you turn termination on by setting a termination jumper on the drive, or use a cable that has a terminator at the end of it.
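Those SCSI ID and termination rules are easy to get wrong when cabling a chain, so here's a small Python sketch that checks them. It's purely illustrative - the function and its error messages are invented for this example:

```python
def validate_scsi_chain(devices):
    """Check a SCSI chain given as (scsi_id, terminated) pairs in cable order:
    every device needs a unique ID, and only the last device on the cable
    (the end of the chain) should have its termination jumper set."""
    ids = [dev_id for dev_id, _ in devices]
    if len(ids) != len(set(ids)):
        return "error: duplicate SCSI ID"
    if len(ids) > 15:
        return "error: too many devices for one controller"
    *inner, last = devices
    if any(terminated for _, terminated in inner):
        return "error: termination enabled mid-chain"
    if not last[1]:
        return "error: chain is not terminated"
    return "ok"

print(validate_scsi_chain([(0, False), (3, False), (6, True)]))  # ok
print(validate_scsi_chain([(0, False), (0, True)]))              # duplicate ID
```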
Monitors
Monitors obviously display what is going on in your computer. They can run at various resolutions and refresh rates. 640x480 is the default resolution for the Windows operating systems (a low resolution where objects appear large and blocky); it simply means that 640 pixels fit across your monitor and 480 from top to bottom. Most users prefer higher resolutions such as 800x600 or 1024x768, all the way up to 1600x1200 (and higher for graphics professionals). The higher resolutions make objects smaller but clearer (because more pixels are fit in the screen), and you can fit more objects on the screen. Larger monitors are better for running at the higher resolutions. If you run a high resolution on a small monitor, the text may be hard to read because of its small size, despite the clarity. The refresh rate is how fast the monitor can refresh (redraw) the images on the screen - the faster it can do this, the smoother your picture will be and the less "flicker" you will see. The monitor has a lot to do with the quality of the picture, but it doesn't actually "produce" the graphics - the video card does all this processing. Still, if your video card is producing a bright, detailed picture but your monitor is dim and blurry, the picture will come out dim and blurry.
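The resolution numbers are simply the pixel grid, so a little arithmetic shows why higher settings look sharper on the same screen. A quick sketch (the helper names are my own):

```python
def pixel_count(width, height):
    """Total pixels drawn at a given resolution."""
    return width * height

def sharpness_ratio(res_a, res_b):
    """How many times more pixels resolution A packs onto the same screen
    than resolution B (more pixels = smaller, crisper objects)."""
    return pixel_count(*res_a) / pixel_count(*res_b)

print(pixel_count(640, 480))                      # 307200 pixels
print(pixel_count(1600, 1200))                    # 1920000 pixels
print(sharpness_ratio((1600, 1200), (640, 480)))  # 6.25
```

So going from 640x480 to 1600x1200 packs over six times as many pixels into the same screen area, which is exactly why objects shrink but get crisper.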
Printer
As you know, a printer outputs data from your computer onto a piece of paper. There are many different types of printers (most common are laser and inkjet), and some printers are better than others for different tasks (printing photographs, clear text, etc.). Laser printers aren't necessarily better quality than inkjets anymore, although they once were. If you want to print in color on a budget, an inkjet is usually the best option. Some of today's "office inkjet" printers also have other functions including scanning, faxing, and copying. While the scan and copy quality usually isn't great, it is generally good enough for most office / home office situations.
Modem
If you are at home, then you are most likely using a modem to view this page right now (dial-up modem, cable modem, or DSL modem). The modem handles the communication between your computer and the computers you are connecting to over the Internet. If you're on a network, then you're using a network card (most likely an Ethernet card, which may in turn connect to your cable or DSL modem). A dial-up modem uses your phone line to transfer data to and from the other computers. Newer cable modems and DSL modems provide about 10 times the speed of a regular phone modem; these are usually external and plug into a network card in your computer. Modem stands for "modulator / demodulator": it encodes and decodes the signals sent to and from the network servers. Good modems should be able to do all the encoding / decoding work on their own without having to rely on your computer's CPU to do the work.
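That "modulator / demodulator" idea can be sketched in a few lines. Early phone modems such as the Bell 103 sent each bit as one of two audio tones; the toy functions below mimic that mapping (a gross simplification of what real modems do - there's no actual audio here, just the bit-to-tone bookkeeping):

```python
# Two tones stand for 0 and 1, loosely modeled on the Bell 103 originate tones.
TONE_0, TONE_1 = 1070, 1270  # frequencies in Hz

def modulate(bits):
    """Encode a string of bits as a sequence of tone frequencies."""
    return [TONE_1 if b == "1" else TONE_0 for b in bits]

def demodulate(tones):
    """Decode a sequence of tone frequencies back into a bit string."""
    return "".join("1" if tone == TONE_1 else "0" for tone in tones)

signal = modulate("10110")
print(signal)              # [1270, 1070, 1270, 1270, 1070]
print(demodulate(signal))  # "10110" - the round trip recovers the data
```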

First Generation Electronic Computers

Three machines have been promoted at various times as the first electronic computers. These machines used electronic switches, in the form of vacuum tubes, instead of electromechanical relays. In principle the electronic switches would be more reliable, since they would have no moving parts that would wear out, but the technology was still new at that time and the tubes were comparable to relays in reliability. Electronic components had one major benefit, however: they could "open" and "close" about 1,000 times faster than mechanical switches.

The earliest attempt to build an electronic computer was by J. V. Atanasoff, a professor of physics and mathematics at Iowa State, in 1937. Atanasoff set out to build a machine that would help his graduate students solve large systems of simultaneous linear equations. By 1941 he and graduate student Clifford Berry had succeeded in building a machine that could solve 29 simultaneous equations with 29 unknowns. However, the machine was not programmable; it was more of an electronic calculator.

A second early electronic machine was Colossus, built for the British codebreaking effort in 1943 by Tommy Flowers and his colleagues, drawing on the cryptanalytic work at Bletchley Park in which Alan Turing played a central role. This machine played an important part in breaking codes used by the German military in World War II. Turing's main contribution to the field of computer science was the idea of the Turing machine, a mathematical formalism widely used in the study of computable functions. The existence of Colossus was kept secret until long after the war ended, and the credit due to its designers for building one of the first working electronic computers was slow in coming.

The first general purpose programmable electronic computer was the Electronic Numerical Integrator and Computer (ENIAC), built by J. Presper Eckert and John V. Mauchly at the University of Pennsylvania. Work began in 1943, funded by the Army Ordnance Department, which needed a way to compute ballistics during World War II.
The machine wasn't completed until 1945, but then it was used extensively for calculations during the design of the hydrogen bomb. By the time it was decommissioned in 1955 it had been used for research on the design of wind tunnels, random number generators, and weather prediction.

Eckert, Mauchly, and John von Neumann, a consultant to the ENIAC project, began work on a new machine before ENIAC was finished. The main contribution of EDVAC, their new project, was the notion of a stored program. There is some controversy over who deserves the credit for this idea, but none over how important the idea was to the future of general purpose computers. ENIAC was controlled by a set of external switches and dials; to change the program required physically altering the settings on these controls, and these controls also limited the speed of the internal electronic operations. Through the use of a memory large enough to hold both instructions and data, and using the program stored in memory to control the order of arithmetic operations, EDVAC was able to run orders of magnitude faster than ENIAC. By storing instructions in the same medium as data, designers could concentrate on improving the internal structure of the machine without worrying about matching it to the speed of an external control.

Regardless of who deserves the credit for the stored program idea, the EDVAC project is significant as an example of the power of the interdisciplinary projects that characterize modern computational science. By recognizing that functions, in the form of a sequence of instructions for a computer, can be encoded as numbers, the EDVAC group knew the instructions could be stored in the computer's memory along with numerical data. The notion of using numbers to represent functions was a key step used by Goedel in his incompleteness theorem in 1931, work with which von Neumann, as a logician, was quite familiar.
Von Neumann's background in logic, combined with Eckert and Mauchly's electrical engineering skills, formed a very powerful interdisciplinary team.

Software technology during this period was very primitive. The first programs were written out in machine code, i.e. programmers directly wrote down the numbers that corresponded to the instructions they wanted to store in memory. By the 1950s programmers were using a symbolic notation, known as assembly language, then hand-translating the symbolic notation into machine code. Later, programs known as assemblers performed the translation task.

As primitive as they were, these first electronic machines were quite useful in applied science and engineering. Atanasoff estimated that it would take eight hours to solve a set of equations with eight unknowns using a Marchant calculator, and 381 hours to solve 29 equations for 29 unknowns; the Atanasoff-Berry computer was able to complete the task in under an hour. The first problem run on the ENIAC, a numerical simulation used in the design of the hydrogen bomb, required 20 seconds, as opposed to forty hours using mechanical calculators. Eckert and Mauchly later developed what was arguably the first commercially successful computer, the UNIVAC; in 1952, 45 minutes after the polls closed and with 7% of the vote counted, UNIVAC predicted Eisenhower would defeat Stevenson with 438 electoral votes (he ended up with 442).
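The hand-translation step that assemblers automated can be illustrated with a toy example. The mnemonics and opcode numbers below are invented - real instruction sets differ - but the mapping idea is the same:

```python
# A toy assembler: translate symbolic mnemonics into numeric machine words,
# the job 1950s programmers first did by hand.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}  # invented

def assemble(source):
    """Turn lines like 'LOAD 7' into (opcode, operand) machine words."""
    program = []
    for line in source.strip().splitlines():
        parts = line.split()
        opcode = OPCODES[parts[0]]
        operand = int(parts[1]) if len(parts) > 1 else 0
        program.append((opcode, operand))
    return program

print(assemble("LOAD 7\nADD 3\nSTORE 12\nHALT"))
# [(1, 7), (2, 3), (3, 12), (255, 0)]
```

A real assembler also handles labels, addressing modes, and symbol tables, but at heart it is exactly this kind of table lookup.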

Personal Computer

A personal computer is a small, relatively inexpensive, single-user computer based on a microprocessor. In addition to the microprocessor, it has a keyboard for entering data, a monitor for displaying information, and a storage device for saving data. In price, personal computers range anywhere from a few hundred dollars to thousands of dollars. All are based on the microprocessor technology that enables manufacturers to put an entire CPU on one chip. Businesses use personal computers for word processing, accounting, desktop publishing, and for running spreadsheet and database management applications. At home, the most popular use for personal computers is playing games.

Personal computers first appeared in the late 1970s. One of the first and most popular personal computers was the Apple II, introduced in 1977 by Apple Computer. During the late 1970s and early 1980s, new models and competing operating systems seemed to appear daily. Then, in 1981, IBM entered the fray with its first personal computer, known as the IBM PC. The IBM PC quickly became the personal computer of choice, and most other personal computer manufacturers fell by the wayside. One of the few companies to survive IBM's onslaught was Apple Computer, which remains a major player in the personal computer marketplace. Other companies adjusted to IBM's dominance by building IBM clones, computers that were internally almost the same as the IBM PC but cost less. Because IBM clones used the same microprocessors as IBM PCs, they were capable of running the same software. Over the years, IBM has lost much of its influence in directing the evolution of PCs; many of its innovations, such as the MCA expansion bus and the OS/2 operating system, were not accepted by the industry or the marketplace. Today, the world of personal computers is basically divided between Apple Macintoshes and PCs.
The principal characteristics of personal computers are that they are single-user systems and are based on microprocessors. However, although personal computers are designed as single-user systems, it is common to link them together to form a network. In terms of power, there is great variety. At the high end, the distinction between personal computers and workstations has faded. High-end models of the Macintosh and PC offer the same computing power and graphics capability as low-end workstations by Sun Microsystems, Hewlett-Packard, and DEC.

Thursday, April 30, 2009

NETWORKING

Users and network administrators often have different views of their networks. Users who share printers and some servers form a workgroup, which usually means they are in the same geographic location and are on the same LAN. A community of interest has less of a connotation of being in a local area, and should be thought of as a set of arbitrarily located users who share a set of servers, and possibly also communicate via peer-to-peer technologies.
Network administrators see networks from both physical and logical perspectives. The physical perspective involves geographic locations, physical cabling, and the network elements (e.g., routers, bridges, and application layer gateways) that interconnect the physical media. Logical networks, called subnets in the TCP/IP architecture, map onto one or more physical media. For example, a common practice in a campus of buildings is to make a set of LAN cables in each building appear to be a common subnet, using virtual LAN (VLAN) technology.
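Whether two hosts sit on the same logical subnet is just a question of whether their addresses share a network prefix. Python's standard `ipaddress` module makes this easy to check (the addresses below are examples from a private range):

```python
import ipaddress

def same_subnet(ip_a, ip_b, prefix_len):
    """True if two hosts fall on the same logical subnet, i.e. their
    addresses share the same network prefix of prefix_len bits."""
    network = ipaddress.ip_network(f"{ip_a}/{prefix_len}", strict=False)
    return ipaddress.ip_address(ip_b) in network

print(same_subnet("192.168.1.10", "192.168.1.200", 24))  # True - same /24
print(same_subnet("192.168.1.10", "192.168.2.4", 24))    # False - different subnet
```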
Both users and administrators will be aware, to varying extents, of the trust and scope characteristics of a network. Again using TCP/IP architectural terminology, an intranet is a community of interest under private administration usually by an enterprise, and is only accessible by authorized users (e.g. employees).[5] Intranets do not have to be connected to the Internet, but generally have a limited connection. An extranet is an extension of an intranet that allows secure communications to users outside of the intranet (e.g. business partners, customers).[5]
Informally, the Internet is the set of users, enterprises, and content providers that are interconnected by Internet Service Providers (ISPs). From an engineering standpoint, the Internet is the set of subnets, and aggregates of subnets, which share the registered IP address space and exchange information about the reachability of those IP addresses using the Border Gateway Protocol. Typically, the human-readable names of servers are translated to IP addresses, transparently to users, via the directory function of the Domain Name System (DNS).
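That DNS "directory function" boils down to a lookup from names to addresses. The sketch below fakes it with a local table - the entries are made up, and a real resolver queries a hierarchy of DNS servers rather than a dictionary:

```python
# A toy name-to-address table standing in for the Domain Name System.
HOSTS = {
    "www.example.com": "93.184.216.34",
    "mail.example.com": "93.184.216.35",
}

def resolve(name):
    """Translate a human-readable server name to an IP address.
    Returns None for unknown names (a real resolver would query upstream)."""
    return HOSTS.get(name.lower())

print(resolve("WWW.example.com"))   # lookups are case-insensitive
print(resolve("ftp.example.com"))   # None - not in our toy table
```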
Over the Internet, there can be business-to-business (B2B), business-to-consumer (B2C) and consumer-to-consumer (C2C) communications. Especially when money or sensitive information is exchanged, the communications are apt to be secured by some form of communications security mechanism. Intranets and extranets can be securely superimposed onto the Internet, without any access by general Internet users, using secure Virtual Private Network (VPN) technology.
When a network is used for gaming, one computer typically acts as the server while the others connect and play through it.

Computer networking is the engineering discipline concerned with communication between computer systems or devices. Networking, routers, routing protocols, and networking over the public Internet have their specifications defined in documents called RFCs.[1] Computer networking is sometimes considered a sub-discipline of telecommunications, computer science, information technology and/or computer engineering. Computer networks rely heavily upon the theoretical and practical application of these scientific and engineering disciplines.
A computer network is any set of computers or devices connected to each other with the ability to exchange data.[2] Examples of different networks are:
Local area network (LAN), which is usually a small network constrained to a small geographic area.
Wide area network (WAN) that is usually a larger network that covers a large geographic area.
Wireless LANs and WANs (WLAN & WWAN) are the wireless equivalent of the LAN and WAN.
All networks are interconnected to allow communication with a variety of different kinds of media, including twisted-pair copper wire cable, coaxial cable, optical fiber, power lines and various wireless technologies.[3] The devices can be separated by a few meters (e.g. via Bluetooth) or nearly unlimited distances (e.g. via the interconnections of the Internet[4]).

Wednesday, April 29, 2009


Product Specifications
• Brand Dell 1537
• Processor Intel Core 2 Duo (2.0GHz)
• Screen 15.4" Diagonal Display
• Ram 4GB RAM
• Hard Drive 320GB Hard Disk
• Optical Drive Super Multi Drive
• Communication WiFi, WebCam
• Operating System FreeDOS
• Warranty 1 Year Limited Manufacturer's Warranty

Monday, April 27, 2009

ANTI-VIRUS REMOVER

One reason you may want to remove Norton AntiVirus is because the program may have come preinstalled in your computer. It usually comes in the form of a trial version that encourages you to sign up for the full-featured version. If you’re not really impressed with what Norton has to offer, you may want to uninstall it. Also, you may want to install a `better’ anti-virus program. The new program may require you to uninstall Norton AntiVirus first before you could proceed with its installation. Another aspect you would want to consider is there are some viruses created to attack Norton and disable it, so that these viruses can freely attack your system and wreak havoc on it.
If you’ve been using Norton AntiVirus for quite a while and regularly updating it, you would have noticed that it has the tendency to slow down your system. It could also be that Norton wrongly cuts off Internet access to protect your system.
In such cases, uninstalling Norton AntiVirus and replacing it with a freeware antivirus program like AVG is a wise option. Let's look at how we can go about uninstalling Norton AntiVirus.

COMPUTER HARDWARE

Relationship to computer hardware
Computer software is so called to distinguish it from computer hardware, which encompasses the physical interconnections and devices required to store and execute (or run) the software. At the lowest level, software consists of a machine language specific to an individual processor. A machine language consists of groups of binary values signifying processor instructions which change the state of the computer from its preceding state. Software is an ordered sequence of instructions for changing the state of the computer hardware in a particular sequence. It is usually written in high-level programming languages that are easier and more efficient for humans to use (closer to natural language) than machine language. High-level languages are compiled or interpreted into machine language object code. Software may also be written in an assembly language, essentially, a mnemonic representation of a machine language using a natural language alphabet. Assembly language must be assembled into object code via an assembler.
The term "software" was first used in this sense by John W. Tukey in 1958.[4] In computer science and software engineering, computer software is all computer programs. The theory that is the basis for most modern software was first proposed by Alan Turing in his 1936 paper "On Computable Numbers, with an Application to the Entscheidungsproblem".
Practical computer systems divide software systems into three major classes: system software, programming software and application software, although the distinction is arbitrary, and often blurred.

System software
System software helps run the computer hardware and computer system. It includes:
device drivers,
operating systems,
servers,
utilities,
windowing systems,
(these things need not be distinct)
The purpose of systems software is to unburden the applications programmer from the details of the particular computer complex being used, including such accessory devices as communications, printers, readers, displays, and keyboards, and to partition the computer's resources, such as memory and processor time, in a safe and stable manner.

Programming software
Programming software provides tools to assist a programmer in writing computer programs in different programming languages in a more convenient way. The tools include:
compilers,
debuggers,
interpreters,
linkers,
text editors,
An Integrated development environment (IDE) is a single application that attempts to manage all these functions.

COMPUTER SOFTWARE

Computer software, or just software, is a general term used to describe a collection of computer programs, procedures and documentation that perform some task on a computer system.[1]
The term includes:
Application software such as word processors which perform productive tasks for users.
Firmware which is software stored in electrically programmable memory devices on mainboards or other integrated hardware carriers.
Middleware which controls and co-ordinates distributed systems.
System software such as operating systems, which interface with hardware to provide the necessary services for application software.
Software testing is a domain independent of development and programming. It consists of various methods to test and declare a software product fit before it can be launched for use by an individual or a group. Modern testers run many tests on functionality, performance and appearance, using tools such as QTP and LoadRunner and techniques such as black box testing, checking the developed code against a checklist of requirements. ISTQB is a certification that is in demand for engineers who want to pursue a career in testing.[2]
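What a functional-test checklist looks like in practice can be shown with a tiny, self-contained example. The discount function and its requirements below are invented purely for illustration:

```python
def apply_discount(price, percent):
    """The code under test: price after a percentage discount, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("discount must be between 0 and 100 percent")
    return round(price * (1 - percent / 100), 2)

def run_checklist():
    """Each assert is one requirement from the test checklist."""
    assert apply_discount(100.0, 10) == 90.0    # basic discount
    assert apply_discount(100.0, 0) == 100.0    # zero discount changes nothing
    assert apply_discount(19.98, 50) == 9.99    # result is rounded to cents
    try:
        apply_discount(100.0, 150)              # invalid input must be rejected
        raise AssertionError("expected a ValueError")
    except ValueError:
        pass
    return "all checks passed"

print(run_checklist())
```

Dedicated tools automate running, reporting, and scaling such checks, but each individual check is exactly this: expected behavior asserted against actual behavior.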
Testware which is an umbrella term for all utilities and application software that serve in combination for testing a software package but do not necessarily contribute to operational purposes. As such, testware is not a standing configuration but merely a working environment for application software or subsets thereof.
Software includes websites, programs, video games, etc. that are coded by programming languages like C, C++, etc.
"Software" is sometimes used in a broader context to mean anything which is not hardware but which is used with hardware, such as film, tapes and records.[3]

Overview

Computer software is often regarded as anything but hardware, meaning that the "hard" parts are tangible while the "soft" parts are the intangible objects inside the computer. Software encompasses an extremely wide array of products and technologies developed using different techniques like programming languages, scripting languages, or even microcode or an FPGA configuration. The types of software include web pages developed with technologies like HTML, PHP, Perl, JSP, ASP.NET, and XML, and desktop applications like Microsoft Word and OpenOffice developed with technologies like C, C++, Java, and C#. Software usually runs on an underlying operating system such as Microsoft Windows or Linux. Software also includes video games and the logic systems of modern consumer devices such as automobiles, televisions, and toasters.

COMPUTER HARDWARE

Saturday, April 25, 2009

Find simple ways to extend the life of your computer hardware and accessories. The key to making the most of computer hardware isn't just knowing where to purchase the cheapest replacement parts; it's knowing how to make the components you already have last as long as possible. The most common problems facing the standard business computer user are shortened laptop battery life, hard drive failure or slowdown, and mysterious computer issues and component failure due to ...




TIPS & TRICKS
12 Ways To Secure Your Servers
Servers store your most important business data, deliver your e-mail, and run your Web site, but keeping them out of harm's way is a full-time job. The 12 tips and tricks in this how-to guide will help keep your data secure.




1. Configure off-line
The question is not whether an unsecured server placed on the Internet will be infected, but rather how many minutes (not hours) it will take. That's why it's important to configure your server off-line and load initial patches from a CD. The goal is to make your server as secure as possible before putting it online.
2. Control access
Administrator accounts have access to and control over your entire system; only a few, ultra-trustworthy individuals should have administrator privileges, and only they should know the password.
3. Keep it simple
Though it's possible to combine multiple functions on a Web server (such as database and e-mail server functions), it is not a good idea if that server is exposed to the Internet. A single server running multiple applications is a rich target for hackers and has more potential for software incompatibilities. It's considered best practice to devote one function per server.
Once you've decided to dedicate a server to a single purpose, remove all software and services not directly related to that purpose -- simply disabling unused services through configuration settings may not stop hackers from using them. Among the things you'll want to consider removing are unused network services, language compilers, and system development tools.
Also very important: make sure there's no instant messaging software on your server as it can be a gateway for intruders. And if your organization has an internal, private Web site on its intranet, it is best not to host that on the same server as the public Web site.
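The "one function per server" advice above can be turned into a simple audit: compare the services actually listening on a host against an approved list and flag everything else. A minimal sketch in Python, where the port-to-service map and the allowed list are invented for illustration (in practice the listeners would come from a tool such as netstat or ss):

```python
def audit_listening_ports(listening, allowed):
    """Return the ports that are open but not on the approved list.

    listening: dict mapping port number -> service name observed on the host
    allowed:   set of port numbers this server is supposed to expose
    """
    return {port: service for port, service in listening.items()
            if port not in allowed}

# A dedicated web server should expose only HTTP, HTTPS, and an admin SSH port.
allowed_ports = {22, 80, 443}

# Suppose an audit of a "web-only" server finds these listeners (example data):
observed = {22: "sshd", 80: "httpd", 443: "httpd",
            3306: "mysqld", 5222: "jabberd"}

for port, service in sorted(audit_listening_ports(observed, allowed_ports).items()):
    print(f"port {port} ({service}) should be removed or firewalled")
```

Here the database (3306) and instant-messaging (5222) listeners would be flagged, matching the guidance above: move the database to its own server and remove the messaging software entirely.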

Thursday, April 23, 2009





INTRODUCING THE NEW DELL VOSTRO 1320, 1520 & 1720 LAPTOPS
The next generation of Vostro laptops can make the mobile experience come together for your small business at an amazing value - offering new solutions for increasing productivity, enhancing collaboration, and protecting your data like never before!
Updated with some of the latest technologies, the new Vostro laptops deliver increased performance and a spectacular visual experience. And they're now energy efficient, helping you to save money and protect the environment.
With the latest Vostro laptops you can:
GET MORE — More of the features and services you need and nothing you don’t – all at a great price.
SECURE MORE — Enhanced multi-level security options to protect your business critical data.
DO MORE — Vostro laptops simply make it easy and fast to achieve more no matter where you are.




Dell Vostro Desktops: Reliable desktop solutions designed for small business. Choose Vostro Desktops
Dell Vostro Laptops: Stay productive and mobile with laptops designed for small business. Choose Vostro Laptops


Manufacturer: HP

Configuration:

CN Number: CN0010932
Category: Computer Systems, Desktop Computers


AMD Turion 64 X2 Dual-Core Mobile Technology TL-52 processor (1.6GHz), Genuine Windows Vista Home Premium, integrated system with computer, monitor, wir...


Manufacturer: AOC

Configuration:

CN Number: CN0012114



Get ready to be enchanted by an elegant design and thin profile that saves space and adds class to any room, whether it be at home or in the office. J...

Embedded software shapes our world through cell phones, satellites, medical and home appliances, and automotive components—with defects causing life-threatening situations and delays creating huge costs. In this issue, we provide an overview of techniques and methods that impact embedded-software engineering. We also look at digital rights management, cloud computing costs, reliable distributed storage, and a Web-based question-answering system.
Editor’s note: Due to a printing error, the wrong cover captions were presented on Computer’s April issue. The correct captions are provided here: Biologically Inspired Computing, p. 95; Digital Wallet, p. 100; Requirements-Based Visualization Tools, p. 106. Computer regrets this error and offers sincere apologies for any inconvenience it may have caused our readers.
Features, April 2009, Perspectives: How Viable Is Digital Rights Management? by Seong Oun Hwang. Technologies that aim to protect digital content have fallen short of their mission. The computing community must find ways to make protection schemes interoperable and adopt a use model that lifts restrictions on paid-for protected content.
March 2009, Computing Practices: Puzzling Problems in Computer Engineering, by Behrooz Parhami. University faculty have designed an engaging puzzle-based freshman seminar intended to motivate and retain computer engineering students.