A graphics card, or video card, processes and outputs images to the computer's monitor. Less expensive graphics cards are integrated directly into the motherboard, but more powerful cards come as a separate component that you can replace without having to change the entire motherboard.
The graphics card receives information from the CPU about what to display, decides how to use the pixels on the screen to display that image, and sends that information to the monitor. For 3-D images, the graphics card first creates everything out of straight lines, called a "wireframe," and then fills in all the lighting, texture, and color. In a fast-paced game, it has to do this around sixty times per second.
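To make that "sixty times per second" figure concrete, a quick back-of-the-envelope calculation shows how little time the card has to build each complete image. This is simple arithmetic, not a measurement; the refresh rates listed are just common examples:

```python
# Time budget per frame at some common refresh rates.
# 1000 ms in a second, divided by frames per second.
for fps in (30, 60, 144):
    budget_ms = 1000 / fps
    print(f"{fps} frames per second -> about {budget_ms:.1f} ms per frame")
```

At sixty frames per second, the card has roughly 16.7 milliseconds to turn a wireframe into a fully lit, textured, colored image before the next frame is due.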
Many people think that graphics cards are just used for playing computer games, but they are also useful for graphic designers, video editors, and 3-D animators, who usually need the best display possible.
If your graphics card is not integrated into the motherboard, it's very simple to replace. Before you buy a new one, though, make sure you know what you need and what your system can support. Some monitors can't display the highest resolution that an expensive graphics card can produce, and some graphics cards use the computer's memory rather than their own to produce their display.