Takeaway: Based on conversations with IT leaders and technology companies, here are Jason Hiner’s top tech trends for the enterprise in the year ahead.
A lot of the critical technology trends that dominated the business world in 2010 will continue to accelerate in 2011, while several new trends will develop enough momentum to become significant.
Based on my conversations with IT leaders and tech vendors and my daily observations of the latest developments in the industry, here are my top five tech trends that businesses should keep a close eye on for the year ahead.
Also read: My 2010 list of trends to watch
5. The enterprise warms to Apple and Android
In 2010, a surprising number of enterprises embarked on iPhone deployments after an extended period of internal testing and convincing Apple to update iOS to improve security and IT manageability. This even included a number of companies in the highly security-conscious financial services industry, which has previously been a BlackBerry stronghold.
The iPhone testing also opened the door for enterprise iPad trials and deployments. That momentum will likely continue in 2011, since BlackBerry — the enterprise smartphone incumbent — has done little to erode the iPhone’s massive usability advantage.
Ironically, the iPhone/iPad breakthrough will also open the door for many enterprises to experiment with Android smartphone and tablet deployments, since, like the iPhone, Android connects through Exchange ActiveSync, and Google has been making similar security and manageability modifications to please enterprise IT departments. Plus, Android phones are available at steeper discounts, and devices such as the Motorola Droid Pro offer hardware keyboards to accommodate BlackBerry veterans.
4. The shrinking private data center
It would have been easy to add cloud computing to this list, but that term has become such a widely used cliché that it no longer means much. It's also part of a larger move to resources delivered over the Internet, one that transcends any single trend and touches many different aspects of today's IT departments.
One of the most direct consequences of the cloud is that IT departments are shrinking their private data centers as they move to purchasing some of their apps through third-party companies that deliver them over the Internet (e.g. Salesforce.com).
A couple of other factors are also driving data center consolidation and shrinkage. The distributed server movement of the 1990s is clearly over: companies are buying bigger (but far fewer) server boxes and then using virtualization to divide them into as many logical servers as they need.
The next stage of this trend will come in 2-3 years when some businesses move toward renting server capacity on demand rather than running their own servers at all.
3. IT consumerization marches on
Last year, I had IT consumerization as No. 5 on my list of trends to watch. For 2011, it’s only going to accelerate as more employees choose to use their own tools rather than the ones provided by their companies, and more IT departments support worker-owned devices as a cost-saving mechanism that can reduce or postpone hardware purchases.
The other factor that will impact consumerization in 2011 will be the spread of multi-touch tablets. The growing legions of workers with iPads and Android tablets will want to use these devices for work, and many IT departments will make room for them in their employee policies, using guidelines similar to those for workers who use their personal smartphones to access corporate apps and data.
2. Desktop thinning
I will not predict that 2011 will be the year that thin clients replace a lot of desktop PCs. That false promise has been proclaimed for over a decade but has only become a reality in a few niche industries and never gained mass acceptance. It’s not going to happen this year either.
However, we are going to begin to see a lot more companies experimenting with desktop virtualization. By taking the company's standard software image (the default OS configuration and all of the company apps) and putting it on a virtual machine, the IT department can enable a new level of flexibility that appeals to both IT administrators and workers. The virtualized desktop is hosted on a server and can be accessed from a company PC (even an old underpowered one), a worker's personal PC, a thin client device, and even some tablets and smartphones. While the end user controls the access device, IT has complete control over the software and settings in the virtual machine. With the rise of IT consumerization, the appeal should be obvious.
We could also see a surprising number of companies run large-scale experiments with Google Chrome OS systems, which are little more than bootable Web browsers. The number of enterprises that are already considering this and starting to test it might surprise you. We're talking about big names like American Airlines, Kraft Foods, and Virgin America. Google's partnership with Citrix, and the fact that it is strongly considering an enterprise version of Chrome OS, are indicators that the company sees a lot of potential for this product in the business market.
1. Business units absorb more IT
The biggest trend of 2011 will be the continued decline of the traditional centralized IT department. More companies will continue to align their IT professionals with individual business units rather than in a central services group. The demand for corporate-savvy IT professionals who can serve as business analysts and project managers will continue to grow.
Meanwhile, many of the technical roles in IT — from server administrators to help desk technicians to network engineers to software developers — will get outsourced to companies that specialize in those areas. Keep in mind that "outsource" in this context doesn't necessarily mean "off-shoring" to another country. In many cases, local companies, or at least local branches of larger companies, will be the beneficiaries of this shift in IT labor. It's simply a matter of companies sticking to their core competencies, and for most companies IT is not a core competency. This is especially true in small and medium organizations, but plenty of big companies are thinking along the same lines.
In return for giving up some control, these organizations will get 24/7/365 service and a fleet of IT professionals with more specialized skills at their disposal. This doesn’t mean that there will be a net loss of IT jobs in the market, but many of the jobs will shift from individual companies to service providers that work for lots of different companies. Again, the exception to the rule will be business analysts and project managers who will be able to bridge the gap between IT expertise and practical business solutions.
Tuesday, January 25, 2011
Dell Inspiron Zino HD (Inspiron 410)
You can certainly use the Dell Inspiron Zino HD (Inspiron 410) ($849.99 direct) as the centerpiece of your home-theater setup. This compact desktop takes up about as much shelf space as the recent Apple Mac mini (HDMI) ($699 list, 4.5 stars), even though it's a bit taller. It has a quad-core processor, decent discrete graphics, a large 750GB, 7,200-rpm hard drive, an MCE remote with integrated IR sensor, 6GB of DDR3 memory, and a Blu-ray drive. It's almost the perfect PC companion to an HDTV; the only thing missing is a TV tuner, which would let the system work as a DVR. If you already have a cable-company DVR, though, the system becomes the perfect HDTV companion. At this moment, the Dell Inspiron Zino HD is the best candidate for your home theater PC, so we award it the Editors' Choice for compact desktop PCs.
Design and Features
The Zino HD (Inspiron 410)'s outer appearance is essentially unchanged from the Inspiron 400 version. It has the same 3.5-by-8-by-8-inch (HWD) chassis, and you can order one of three different colored lids in addition to the standard glossy black lid. Our review unit had the Mercury Silver lid; the other available colors are Tomato Red and Peacock Blue. There's a tray-loading optical drive and a 4-in-1 media card reader in the front, and a plethora of ports in the back. Most notably, the Zino HD (Inspiron 410) has an HDMI-out port for connections to large monitors or HDTVs, a VGA port, an S/PDIF port for digital audio, and two eSATA ports for external hard drives. It's a good thing the system is so well connected, because internal expansion is impossible: the Zino HD (Inspiron 410) isn't user serviceable.
Our review unit came configured with a bunch of high-end upgrades. The desktop has a Blu-ray player (BD-ROM) with DVD- and CD-burning capabilities. The system came with 802.11a/b/g/n Wi-Fi, which means you can connect to dual-radio routers with both 2.4-GHz and 5-GHz connectivity. The 5-GHz channel is a lot less crowded than the more common 2.4-GHz channels.
Specifications
Type = Multimedia, Digital Entertainment System
Processor Family = AMD Phenom II
RAM = 6 GB
Storage Capacity (as Tested) = 750 GB
Graphics Card = ATI Radeon HD 5450
Primary Optical Drive = Blu-ray Disc
Operating System = Microsoft Windows 7 Home Premium
My unit came bundled with a full-sized wireless keyboard and mouse; Dell also sells a more compact wireless keyboard with a built-in trackball, which is better suited for couch use. Since we're nit-picking, the Zino HD (Inspiron 410)'s external power brick makes the overall package a little bulky: the Mac mini's internal power supply eases installations and keeps the system's svelte lines intact.
The desktop is free of the bloatware and extraneous ads that sometimes show up on retail PCs. It comes with a usable copy of Office 2010 Starter preloaded, and you can of course upgrade that to a full version online for a fee. The system comes with Cyberlink PowerDVD software integrated into the Media Center interface, so you can view home videos, DVDs, and Blu-ray movies without leaving MCE. The system also comes with Roxio CD/DVD burning software. The only (slight) stumble is the short 30-day subscription to McAfee SecurityCenter (antivirus, anti-malware, etc.); it really should be 15 months, as it was on the previous iteration, the Dell Inspiron Zino HD (Inspiron 400).
Performance
The Zino HD (Inspiron 410) is certainly fast enough to support HD viewing on an HDTV or large-screen monitor. The system smoothly displayed Web videos, DVDs, and Blu-ray movies. The system's quad-core AMD Phenom II X4 P940 processor and ATI Mobility Radeon HD 5450 graphics account for its smooth video playback and passable multimedia performance. Its scores of 4 minutes 11 seconds on our Handbrake test and 9:44 on Photoshop CS5 are in the same class as the Apple Mac mini (HDMI), our small/compact-form-factor Editors' Choice. Likewise, the Zino HD far outperforms compact nettops like the Giada Slim-N20 ($449 direct, 3.5 stars) (12:19 Handbrake, 23:29 CS5) and the Lenovo IdeaCentre Q150 ($399 direct, 4 stars) (12:31 Handbrake, 23:38 CS5).
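To put those benchmark numbers in perspective, here is a quick, purely illustrative sketch that converts the Handbrake times quoted in this review and computes the speedup (the helper name is made up for this example):

```python
def to_seconds(t):
    """Convert an 'M:SS'-style benchmark time to seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + int(seconds)

# Handbrake times from the review: Zino HD vs. the Giada Slim-N20 nettop
zino = to_seconds("4:11")    # 251 seconds
giada = to_seconds("12:19")  # 739 seconds

# The Zino HD finishes the same encode roughly 2.9x faster than the nettop
print(f"{giada / zino:.1f}x")  # → 2.9x
```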
The Zino HD (Inspiron 410) isn't a high-end 3D gaming performer, but it is able to run Crysis at a respectable 22 frames per second (fps) and Lost Planet 2 at 15 fps. Though you wouldn't want to play these games at those frame rates, it does mean you have a good chance of playing older and less strenuous 3D games on your HDTV, like Doom 3 or Halo. It's certainly got enough oomph to play simpler 3D games like World of Warcraft and Spore. This performance is a vast improvement over the nettops, which can't run these 3D games at all.
The Zino HD (Inspiron 410) is a prime example of a compact entertainment PC meant to hook up to an HDTV or large-screen monitor. Since it has full Windows compatibility, it's more powerful on the Web and more flexible than Google TV adjuncts like the Logitech Revue or Sony's Blu-ray player with Google TV. It's a smidge bigger than the Mac mini, but it also comes with so much more in that larger chassis, and both systems take up about the same amount of shelf space. The Mac mini is a faster system if you're going to be creating and manipulating videos and photos, but the simple fact of the matter is that the Zino HD is a better multimedia consumption device, thanks to its Blu-ray player, much larger hard drive, more memory for better multitasking, and Windows 7 operating system. Sure, you can install Windows on the Mac mini in Boot Camp, but on the Mac you'll have to buy Windows Home Premium for $100. Though services like Netflix, Hulu, and Boxee are available on Macs, unless you're monogamous with iTunes, Windows has many more entertainment options, like CinemaNow, Media Center, and Rhapsody. About the only things keeping the Zino HD (Inspiron 410) from earning a higher overall score are the lack of a TV tuner and the short 30-day Internet security subscription.
Sunday, January 23, 2011
Cisco to Buy Flip Camcorder Maker for $590M
Cisco Systems said Thursday that it would buy Pure Digital, manufacturer of the popular Flip pocket camcorder line, to enhance its consumer offerings.
After the deal's close, Flip will be folded into Cisco's Consumer Business Group, home to the Linksys by Cisco home networking products, as well as media storage. Pure Digital CEO Jonathan Kaplan will be made the general manager of the group.
"The acquisition of Pure Digital is key to Cisco's strategy to expand our momentum in the media-enabled home and to capture the consumer market transition to visual networking," said Cisco senior vice president Ned Hooper, in a statement issued today. "Pure Digital has revolutionized the way people capture and share video with Flip Video. This acquisition will take Cisco's consumer business to the next level as the company develops new video capabilities and drives the next generation of entertainment and communication experiences."
Cisco is expected to pay $590 million in stock for all of Pure Digital's shares, plus $15 million for "retention-based equity incentives for continuing employees." The deal is expected to close in Q4 2009.
Since first introducing the Flip, Pure Digital has sold more than two million devices, including, most recently, the Mino and Mino HD.
Saturday, January 22, 2011
Razer Unveils 'Switchblade' Mobile Concept Gaming PC
For the past two years, Razer has been developing something far more advanced than gaming peripherals. In collaboration with Intel, Razer has been creating its very own gaming PC, but nothing like the Maingear and Origin custom builds you see on the market: it's a handheld gaming PC.
Codename "Switchblade" is still a concept design, though there's a working model at CES. It's powered by an Intel Atom processor that, according to Razer, handles video and 3D tasks better than the previous generation. It's uncertain, however, whether the processor will be up to snuff driving the multi-touch screen as well.
The user interface (UI) is different from what you'd find on a typical 7-inch PC. The less-than-full-size chiclet keyboard has been replaced with a tactile one that changes according to the game you're playing. Razer ideally plans to couple the device with its own software that will let you download profiles pre-built for popular titles. Custom macro building will also be integrated into this program. Razer is currently working with partners to create products based on this design.
"The main problem with mobile PC gaming so far is that no one has been able to port the full mouse and keyboard experience onto a small size portable solution," said Min-Liang Tan, CEO and Creative Director, Razer. "By combining adaptive on-the-fly controls and display, we managed to maintain the full tactile keyboard in a miniature computer while saving valuable screen estate."
The Switchblade could potentially breathe new life into the PC gaming market by tapping into the mobile user market. However, it's uncertain whether this handheld gaming PC will compete in the netbook ($300-$500) or DS ($179.99) and PSP ($169.99) market space. In images and videos, Razer has demonstrated that the device can be held like a DS, but the internal hardware and popularity will ultimately determine pricing.
The Switchblade measures 6.77 by 4.52 by 0.98 inches, but the weight is currently unknown. Integrated Wi-Fi and 3G connectivity will be available with the unit. Ports are few: just mini HDMI, USB 3.0, and audio. Aside from the Razer gaming program, the only other software included is Windows 7.
There is no word on availability.
Thursday, January 20, 2011
How to Switch from a PC to a Mac
You've decided to "think different" and become a Mac user. Here are the things to consider and steps to take as you make a big change to your computing lifestyle.
Acclimating to MacOS
Once your files are transferred to your new Mac, you're ready to go, right? Not exactly. If you use programs that work on both platforms, like Firefox, Thunderbird, and LastPass, you'll find you can get up and running quickly. But while 90% of how you compute isn't all that different, there are some Mac methods that can be especially hard for experienced Windows users to acclimate to.
1) Desktop and Finder: You'll easily recognize the desktop on your screen, where you can store icons for files and folders, in particular your "Macintosh HD" drive icon. But the differences from Windows quickly become apparent. First, icons flow from the top right down, instead of the top left. There's a dock full of icons on the bottom, which is similar in some ways to the Windows taskbar. The top edge of the desktop has a bar running all the way across that changes depending on the program you open; this is where you access menus that in Windows would be part of each individual program window. On the right side of that bar, however, are some permanent MacOS system tools. The Finder is the MacOS equivalent of Windows Explorer, for finding files and folders to work with.
2) Installing Programs: Instead of double-clicking a .EXE file like you do in Windows and then watching the installation program copy files to who-knows-where on your hard drive, the Mac makes it simple. The installation files typically end in .DMG, though calling one an "installation" file is a bit of a misnomer: it's just a disk-image distribution file with the application inside. The better DMG files will, when double-clicked, show a graphical Finder shortcut to indicate that you should copy the application file to your Applications folder on the Mac. If one doesn't, you can manually go to your Macintosh HD, look for Applications, and put it in the folder yourself. If you want the new app on the dock at the bottom of the Mac desktop, just drag its icon there from the Applications folder.
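For the command-line curious, the manual DMG steps above map to a short sequence of shell commands built around hdiutil, macOS's disk-image tool. Here's a minimal sketch that just assembles those commands as strings (the Example.dmg name and "Example" volume are hypothetical, chosen for this illustration):

```python
def dmg_install_steps(dmg_path, app_name, volume_name):
    """Return the shell commands that mirror the manual DMG install flow on macOS."""
    return [
        f"hdiutil attach '{dmg_path}'",  # mount the disk image under /Volumes
        f"cp -R '/Volumes/{volume_name}/{app_name}.app' /Applications/",  # copy the app bundle over
        f"hdiutil detach '/Volumes/{volume_name}'",  # unmount the image when done
    ]

# Hypothetical example: installing Example.app from Example.dmg
for step in dmg_install_steps("Example.dmg", "Example", "Example"):
    print(step)
```

The sketch only prints the commands rather than running them, so it's safe to try anywhere; on an actual Mac you would paste the three lines into Terminal.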
3) Uninstalling Programs: Again, simplicity rules. Instead of using a control panel to find and uninstall programs, like you must do on Windows, you can just go into Applications, find the program you no longer want, and drag it to the trash.
4) Closing, Maximizing, and Minimizing: In Windows, all the controls for closing, maximizing, and minimizing a window are on the upper right. On the MacOS, they're color-coded round buttons on the upper left, and they don't work the same as the Windows tools. For example, the green expand button on MacOS doesn't make a window go full-screen unless the contents of the window need that much screen real estate. To fill the screen, you need to drag a window to the upper left, then "pull" on its lower right corner. Or download a program like Right Zoom to force windows to zoom the same way as the Windows maximize function.
5) Input Madness: Undoubtedly, the hardest thing about adjusting to the Mac will be using the keyboard and mouse. On a Windows keyboard, you'd typically pair the Ctrl key with letters to make a shortcut (Ctrl+C to copy, Ctrl+X to cut, Ctrl+V to paste, for example). The letters are the same for Mac shortcuts, but instead of the "control" key (as Apple calls it), you use the "command" key. That might not be hard to master, except that command sits right next to the space bar, where the Alt key is found on a typical Windows keyboard. It will take some practice to overcome your fingers' natural desire to hit certain keys. You can find a full list of MacOS keyboard shortcuts online.
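Since the letters stay the same and only the modifier changes, the Windows-to-Mac translation is mechanical. A tiny, purely illustrative sketch (the function name is made up for this example):

```python
def win_to_mac_shortcut(shortcut):
    """Translate a Windows Ctrl-based shortcut to its Mac equivalent.

    Only the modifier changes; the letter stays the same.
    """
    modifier, _, key = shortcut.partition("+")
    if modifier == "Ctrl":
        return f"Command+{key}"
    return shortcut  # non-Ctrl shortcuts pass through unchanged

print(win_to_mac_shortcut("Ctrl+C"))  # → Command+C (copy)
print(win_to_mac_shortcut("Ctrl+V"))  # → Command+V (paste)
```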
6) No Right Click: The old-style Apple-provided mouse didn't even have buttons; the entire mouse was the button. So how do you "right-click" if you're using an older mouse? First, when you click, hold the button down and wait, and a contextual menu will appear, if appropriate. The other option: put two fingers on the Mac trackpad and then click. That brings up the contextual menu instantly. The latest Apple Magic Mouse has a multi-finger touch surface on top, including a spot to right-click, so if you get a new Mac, you can get access easily.
7) Ctrl+Alt+Delete Equivalents: You use that magic keystroke to do a lot in Windows, but it doesn't do anything on the MacOS. If you want to force an application to quit, press Option+Shift+Command+Esc and hold for three seconds (use Option+Command+Esc to get a list of all running apps), or select Force Quit from the Apple menu in the upper left. If you want to check things like RAM usage, open Activity Monitor, found in the Applications folder under Utilities.
Acclimating to MacOS
Once your files are transferred to your new Mac, you're ready to go, right? Not exactly. If you use programs that work on both platforms –like Firefox, Thunderbird, LastPass, etc. (see below)—you'll find you can get up and running quickly. But while 90% of how you compute isn't all that different, there are some Mac methods that can be especially hard for experienced Windows users to acclimate to.
1) Desktop and Finder: You'll easily recognize the desktop on your screen, where you can store icons for files and folders, in particular your "Macintosh HD" drive icon. But the differences from Windows quickly become apparent. First, icons flow from the top right down, instead of the top left. There's a dock full of icons on the bottom, which is similar in ways to the Windows taskbar. The top edge of the desktop has a bar running all the way across that changes settings depending on the program you open—this is where you access menus that in Windows would be part of each individual program window. On the right side of that bar, however, are some permanent MacOS System Tools. The Finder is the MacOS equivalent of Windows Explorer, for finding files and folders to work with.
2) Installing Programs: Instead of double clicking on a .EXE file like you do in Windows and then watching the installation program copy files to who-knows-where on your hard disk drive, the Mac makes it simple. The installation files typically end with .DMG; to call it an "installation" file is a bit erroneous however. It is just a disk-image distribution file, with the application inside. The better DMG files will, when double-clicked, show a graphical Finder short cut to indicate that you should copy the application file to your Applications folder on the Mac. If it doesn't, you can manually go to your Macintosh HD, look for Applications, and put it in the folder yourself. If you want the new app on the dock at the bottom of the Mac desktop, just drag one from the Applications folder.
3) Uninstalling Programs: Again, simplicity rules. Instead of using a control panel to find and uninstall programs, like you must do on Windows, you can just go into Applications, find the program you no longer want, and drag it to the trash.
4) Closing, Maximizing, and Minimizing: In Windows, all the controls for closing, maximizing, and minimizing a window sit in the upper right. On the MacOS, they're color-coded round buttons in the upper left, and they don't work the same way as the Windows controls. For example, the green zoom button doesn't make a window fill the screen unless the window's contents need that much screen real estate. To go full screen, you need to drag the window to the upper left and then "pull" on its lower-right corner, or download a program like Right Zoom to force windows to zoom the way the Windows maximize function does.
5) Input Madness: Undoubtedly, the hardest part of adjusting to the Mac is the keyboard and mouse. On a Windows keyboard, you typically pair the Ctrl key with letters to make a shortcut (Ctrl+C to copy, Ctrl+X to cut, Ctrl+V to paste, for example). The letters are the same for Mac shortcuts, but instead of the "control" key (which Apple's keyboards also have), you use the "command" key. That might not be hard to master, except that command sits right next to the space bar, where the Alt key is found on a typical Windows keyboard. It will take some practice to overcome your fingers' natural desire to hit certain keys. You can find a full list of MacOS keyboard shortcuts online.
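The translation rule is mechanical: the letter stays the same and only the modifier changes. This small Python sketch (purely illustrative; the key labels are our own, not any official API) captures the pattern:

```python
# Windows-to-Mac shortcut equivalents: the letter is unchanged,
# only Ctrl becomes Command.
WIN_TO_MAC = {
    "Ctrl+C": "Command+C",  # copy
    "Ctrl+X": "Command+X",  # cut
    "Ctrl+V": "Command+V",  # paste
    "Ctrl+Z": "Command+Z",  # undo
}

def mac_equivalent(win_shortcut):
    """Translate a Ctrl-based Windows shortcut to its Mac counterpart."""
    modifier, _, rest = win_shortcut.partition("+")
    if modifier != "Ctrl":
        raise ValueError("only Ctrl-based shortcuts are handled here")
    return "Command+" + rest

print(mac_equivalent("Ctrl+S"))  # Command+S
```

The table and the function agree by construction; the hard part, as noted above, is retraining your thumb, not remembering the letters.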
6) No Right Click: The old-style Apple mouse didn't even have buttons; the entire mouse was the button. So how do you "right-click" with an older mouse? First option: click, hold the button down, and wait, and a contextual menu will appear if one applies. The other option: put two fingers on the Mac trackpad and then click, which brings up the contextual menu instantly. The latest Apple Magic Mouse has a multi-finger touch surface on top, including a spot for right-clicking, so if you get a new Mac, you get right-click access easily.
7) Ctrl+Alt+Delete Equivalents: That magic keystroke does a lot in Windows, but it does nothing on the MacOS. To force an application to quit, hold Option+Shift+Command+Esc for three seconds (or press Option+Command+Esc to get a list of all running apps), or select Force Quit from the Apple menu in the upper left. To check things like RAM usage, open Activity Monitor, found in the Applications folder under Utilities.
Wednesday, January 19, 2011
Next Generation Computing: Touch
Pinch-zoom and other multi-touch gestures are the beginning of a new interface model for computing—and it will be here sooner than you think.
The touch screen revolution is upon us. From the Apple iPad, to Android cell phones, to Microsoft's brilliant Surface table, multi-touch interfaces have given us the power to control computers with simple hand gestures in ways that were mere science fiction just a few years ago.
Thanks to recent technological advances, touch screens are only now becoming mainstream. But the idea is nothing new. Think of the old pen-based graphics tablets for the Atari 800 and Apple IIe in the 1980s, or early stylus-based handhelds like the Apple Newton and the PalmPilot in the 1990s. Those latter devices contained pressure-sensitive resistive screens, which consisted of two separate layers and worked best with a plastic or metal stylus.
Contrast that to today's Apple iPad and the Motorola Droid X, which feature large, single-layer, glass capacitive touch screens. Capacitive screens allow for greater control than what you'd find on an ATM or an older car navigation system. They respond accurately to the lightest fingertip touches and support multi-touch gestures like two-finger zoom and rotation. Some vendors have added haptic feedback, which couples each touch with a mild vibration to simulate the feel of a physical key press.
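Under the hood, a gesture like two-finger zoom reduces to simple geometry: the zoom factor is the ratio of the distance between the two fingers now to the distance when the gesture began. Here's a rough sketch of that calculation (our own illustrative code, not taken from any particular touch toolkit):

```python
import math

def distance(p, q):
    """Euclidean distance between two touch points given as (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_zoom_factor(start_touches, current_touches):
    """Zoom factor implied by a two-finger pinch gesture.

    Returns > 1.0 when the fingers spread apart (zoom in)
    and < 1.0 when they pinch together (zoom out).
    """
    d0 = distance(*start_touches)
    d1 = distance(*current_touches)
    if d0 == 0:
        raise ValueError("starting touch points must be distinct")
    return d1 / d0

# Fingers start 100 px apart and spread to 200 px: a 2x zoom.
print(pinch_zoom_factor([(0, 0), (100, 0)], [(0, 0), (200, 0)]))  # 2.0
```

The same two-point arithmetic, applied to the angle between the fingers instead of the distance, gives the rotation gesture mentioned above.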
Touch screens have also finally come to desktop PCs, most notably from HP—albeit with mixed results. Ergonomic issues render desktop touch screens a dubious proposition: who wants to keep an arm outstretched all day long? One stellar exception is the Wacom Cintiq 21UX, a 1280-by-1024-pixel, highly configurable drawing surface for artists, photo editors, and graphic designers that doubles as a multi-position LCD monitor. And if you've watched cable news recently, you've probably seen Perceptive Pixel's giant Multi-Touch Collaboration Wall, which lets anchors flip between images and zoom into maps with hand gestures as they deliver the latest breaking news.
CES 2011 has already come and gone, and two of its main themes were the proliferation of touch-screen cell phones and tablet computers. While most of the tablets mimicked the iPad, others, like the transformable, dockable Motorola Atrix 4G phone, allow for multiple methods of interaction. There's also the Toshiba Libretto W100, an innovative dual-screen tablet that doubled as a touch-based laptop and saw a very limited production run in late 2010. While that model wasn't successful, there's a good chance we haven't seen the last of the concept.
There's much more coming, and it's just over the horizon. Microsoft recently applied for a patent for a new kind of touch screen, dubbed a light-induced, shape-memory polymer display, which offers actual tactile feedback. It contains a special layer that, when activated via ultraviolet light, can raise or lower individual pixels in order to give the display some texture. For example, a developer could create a "real" physical keyboard out of the screen that gives lifelike feedback whenever it's typed on, only to disappear when it's no longer needed.
Meanwhile, Hitachi Displays has developed a projection-based, electrostatic capacitance multi-touch panel. It works through gloves, plastic, or other insulating materials, which means it isn't dependent on electrical impulses from your bare fingertip the way a capacitive touch screen normally is.
Other advances could dispense with the glass panel altogether. By now, most PCMag readers have heard of Microsoft Kinect for Xbox 360, which dispenses with physical controllers and lets players use their entire body to interact with each game. PrimeSense, the company behind the technology in the Kinect, has already demonstrated a Minority Report-style interface that lets someone manipulate photos and other objects on an HDTV screen using both hands in the air. And Light Blue Optics' Light Touch interactive projector, first announced last year, turns any surface into a 10.1-inch touch screen using holographic laser projection.
Some pundits predict that in the not-so-distant future, we'll all be controlling our computers, cars, home appliances, and other devices using touch-based interfaces. We've already begun to develop standard methods of interaction, such as the aforementioned pinch-zoom and the ability to flip between images with a single finger swipe. But for every successful advance, we're likely to see many design failures along the way. The past few decades are littered with formerly cutting-edge user interface paradigms like pen-based computing and virtual reality. Both debuted to much fanfare, only to sputter out and disappear over time.
Still, the recent introduction of capacitive touch screens and finger-based user interfaces marked a turning point. More than one of us has witnessed a formerly technophobic person pick up an iPad and start using it right away; it's a sight to behold. As touch-based hardware and software continue to evolve, we can only imagine how everyday computing will change even further over the next decade. One thing is for certain: it's going to be a wild ride.
Tuesday, January 18, 2011
Because You Need Better Security
If you can read this, you probably run Windows XP in administrator mode. There's no cause and effect here. It's just that, while Win XP allows and recommends creation of Limited user accounts that make many exploits impossible, a vast number of programs and common activities don't work under a Limited account, so typically, everybody's an Administrator. This is one of Win XP's biggest security problems.
Vista's User Account Control (formerly User Account Protection) should go a long way toward fixing this. Microsoft aims to allow the widest possible range of activity in a Standard (don't call it Limited!) account and handles legacy programs that assume greater privilege. Any Administrator can grant one-time permission when necessary. And now, even Administrators run at Standard level, with a warning from Vista when elevated privilege is required. Unfortunately the frequent "Windows needs your permission . . . " warnings can get pretty annoying.
The implementation of Windows Service Hardening, a related feature, has been complete since Vista Beta 1. It minimizes the impact of malware that exploits Windows services by allowing each service access only to the resources it actually needs.
These protections may make it harder for hackers to take advantage of the OS, but today the bad guys are just as likely to try to take advantage of you instead. That's why IE7 in Vista now implements Microsoft's antiphishing scheme. I tried it on known phishing sites and one with an invalid security certificate—it red-flagged them and blocked access. Sites with verified security are green-flagged; others get a yellow rating if analysis of their HTML code reveals suspicious elements. Spyware protection from Windows Defender (for a review, see go.pcmag.com/vista) is now built into Vista, and it automatically scans any files downloaded through IE. And as IE7 is a prime target, Vista safeguards it further by having it run in Protected Mode, a still-lower privilege level designed to thwart browser-subverting malware.
In Beta 1, Vista's Parental Controls system could limit access to games by name, content, or ESRB rating. That was nice, but most parents are more concerned about what their kids are doing online and how long they're spending soldered to their systems. Now that the Parental Controls system has all its planned features, it addresses these issues, going much further than in Beta 1 and rivaling some third-party products. The administrator can limit the days and times each user is allowed on the computer and establish a list of approved programs (denying access to all others). Web-site filtering based on content, whitelists, and blacklists is available on a per-user basis. And Activity Monitoring will report a wealth of detail about each user's actions, including the top ten sites visited, blocked sites, files downloaded, applications launched, and more. I put a Standard account under Parental Control and tried to hack the protection, but I couldn't break it.
Unlike the firewall in Windows XP, the Windows Firewall in Vista protects against unauthorized outbound connections. New in the latest CTP, this outbound protection has a configuration interface. It's not for the faint of heart, but an expert user can tweak dozens of existing exceptions or create new exceptions to let specific programs do their job.
When I travel with my notebook, I'm always concerned that someone will swipe it and use the many tools out there to get a peek behind the PCMag curtain. Vista's BitLocker Drive Encryption (formerly Secure Startup Volume Encryption) will help, making the drive unreadable to all but me. In the earlier beta this would work only on a system containing a Trusted Platform Module chip. Mercifully, BitLocker now works on any system, encrypting all but a sliver of the boot disk. Systems can be decrypted at start-up after you supply the password or insert a coded USB drive. A laptop thus protected may still be lost or stolen, but the data on it won't be revealed.
Digging into the Group Policy settings, I discovered two useful security secrets. The new Removable Storage Access policy can deny read, write, or execute status to removable devices—from CD drives to personal audio players—to prohibit "slurping" files into an iPod or thumb drive. And the Device Installation Restrictions policy limits the types of devices that can be installed, though you need to know the Device ID or Device Class.
Will hackers find weak spots in Vista? Most definitely—but from what I can see, they're going to have to work a lot harder at it.
Monday, January 17, 2011
Could Intel-Nvidia deal finally do in AMD?
Last November, ARM co-founder Hermann Hauser predicted that Intel was doomed. The reason: the chip giant was floundering in the mobile space because it kept trying, and failing, to build a viable mobile chip that could compete with the dominant ARM licensing model.
"People in the mobile phone architecture [business] do not buy microprocessors. So if you sell microprocessors you have the wrong model. They license them," Hauser told the Wall Street Journal.
Evidently, the powers that be at Intel have seen the wisdom in Hauser's words. The company has penned a staggering $1.5 billion multiyear cross-platform licensing deal with Nvidia, giving it access not only to Nvidia's portfolio of GPUs, but also its mobile processors, such as Tegra 2, which had a strong showing at this year's CES. In addition, the deal puts to rest all ongoing lawsuits between the two companies.
The deal should give Intel a much-needed push in the mobile market, a space in which it and Microsoft have struggled to gain traction, perhaps held back by a PC-world mentality. In fact, combine the $1.5 billion Nvidia deal with Intel's $1.4 billion purchase of Infineon's wireless division last September, and you have a company that may very well be able to make the transition to a mobile world -- if it can build something with all the shiny tools it's added to its arsenal through hard cash rather than in-house innovation.
The Nvidia-Intel deal could, unfortunately, be the final nail in the coffin for AMD. Yes, observers have predicted AMD's imminent demise before, but now the company appears to be in genuinely dire straits. Like Intel, AMD seems to have spent too much time worrying about its standing in the PC chip market and hasn't brought any winning technologies to the mobile space. AMD has certainly demonstrated innovation in PC chips, including its newly minted Fusion APUs (accelerated processing units) and, at CES, a new line of Phenom II processors. But mobile doesn't seem to be a priority for AMD.
Meanwhile, the deal with Nvidia also gives Intel access to the sort of graphics-intensive processing technologies that can help it compete against the aforementioned AMD offerings.
Further, unlike Intel, AMD doesn't have a couple of billion dollars on hand to buy up potentially winning mobile technologies. The fact that AMD's CEO Dirk Meyer stepped down on the very same day the Nvidia-Intel deal was announced doesn't bode well for the chipmaker either. Sure, new leadership brings a chance for a shakeup and a different course, but this time it may be too late for AMD.
Sunday, January 16, 2011
2011 New Years Computer Resolutions
This year, instead of making unachievable resolutions, better yourself by adopting these universal computer user resolutions.
Every year about this time, people make resolutions. The origins of this practice are shrouded in mystery, but suffice it to say that for most people resolutions have become a joke. Few people actually follow through on any of them, mostly because the resolutions are unachievable. This year, I have a list of universal resolutions for computer users that may solve part of this problem and help them improve themselves in the process.
Here are the 2011 universal computer user resolutions. Adopt as many as you can.
* Develop a genuine data backup strategy that works. Have you ever gone through a complete year without hearing about a friend who stupidly lost all his data because of an equipment failure? You always ask them about backing up, and they tell you they meant to back up but never did. This can happen to anyone if they do not have some process to do it. Even if you have to do it by hand, buy yourself one of the new external hard drives and back up everything. I saw a 2TB hard disk for $99 recently.
* Test your backup. At least once a year, test your backup on a neutral machine to see if you can actually restore the data. Often, you'll discover you are not really backing up anything. Test, test, test.
* Keep a backup computer on hand. Buy an inexpensive box to use in an emergency; even a laptop would suffice. Ask yourself: what would happen on a Sunday night if your machine blew up? Say the motherboard gives out. What would you do? Save yourself the hassle and get a backup machine.
* Upgrade your machine. Most people out there have computers that are over two years old and some can barely deal with video streams. Get a new display card or look around for a new machine. Then use the old machine as the backup (see above).
* Scan your machine. Every so often, run a second-opinion scanner, one that isn't part of your regular anti-virus software, just to make sure your anti-virus software is catching everything. You'd be surprised at the different results you get from different scanners.
* Learn Linux. Resolve to get a decent book on Linux and learn how to use it. Better still, download Ubuntu or another Linux distribution and install it on a new machine or a very old machine you have in the closet. Play with it until you feel comfortable. I can assure you that you'll love it.
* Learn how to really use Adobe Photoshop and Illustrator. Again, there are a lot of guides to using these products as well as tutorial websites. There are also classes galore. Learn to use these or other products out there, and you will improve your computing knowledge in the process.
* Learn sound and video editing. You do not have to be a pro to know how to do some simple sound and video editing. If you use a digital camera and like to take short movies, do everyone a favor and edit them a little bit before you put them on YouTube. By this, I don't mean for you to produce the short video as if you are a Hollywood director showing off. I mean, do some simple editing to make the video watchable.
* Sell your excess gear. Many people have an extra printer that they will never use or old SCSI cables or other weird junk around the office and closet. Pack this stuff up and get rid of it by donating or selling it to people who will use it. In other words, do your spring cleaning early!
* Organize your photos. Get a copy of some good organizational software, such as ThumbsPlus, and organize your digital photos. Most people with a digital camera soon accumulate thousands of pictures. The date and time are never set correctly on the camera, and individual shots end up lost in the pile. Sort and organize them before it's too late.
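Even the "do it by hand" backup from the first resolution can be a few lines of script rather than a chore. This Python sketch (the paths in the comment are placeholders, not a recommendation for any particular tool) mirrors a folder onto an external drive, copying only files that are new or have changed:

```python
import os
import shutil

def mirror(src, dst):
    """Copy everything under src into dst, skipping files that already
    exist there with the same size and an equal-or-newer timestamp."""
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            if (not os.path.exists(d)
                    or os.path.getmtime(s) > os.path.getmtime(d)
                    or os.path.getsize(s) != os.path.getsize(d)):
                shutil.copy2(s, d)  # copy2 preserves timestamps

# Example (placeholder paths -- substitute your own folders):
# mirror("/Users/me/Documents", "/Volumes/BackupDrive/Documents")
```

Run it weekly against that $99 external drive and, per the second resolution, occasionally restore a few files from the copy to confirm the backup actually works.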
I'm sure you can come up with a few more universal computer user resolutions for 2011. These are easy and simple. All you have to do is make sure you do them. Have a Happy New Year.
Every year about this time, people make resolutions. The origins of this practice are shrouded in mystery, but it suffices to say that for most people they have become a joke. Few people actually follow through on any of them. This is mostly because they are unachievable resolutions. This year, I have a list of universal resolutions for computer users that may solve part of this problem and help them improve themselves in the process.
Here are the 2011 universal computer user resolutions. Adopt as many as you can.
* Develop a genuine data backup strategy that works. Have you ever gone through a complete year without hearing about a friend who stupidly lost all his data because of an equipment failure? You always ask them about backing up, and they tell you they meant to back up but never did. This can happen to anyone if they do not have some process to do it. Even if you have to do it by hand, buy yourself one of the new external hard drives and back up everything. I saw a 2TB hard disk for $99 recently.
* Test your backup. At least once a year, test your backup on a neutral machine to see if you can actually restore the data. Often, you'll discover you are not really backing up anything. Test, test, test.
* Put in play an entire backup computer. You should buy an inexpensive box to use in an emergency. Even a laptop would suffice. Ask yourself: what would happen on a Sunday night if your machine blew up? Let's say the motherboard gives out. What would you do? Save yourself the hassle, and get a backup machine.
* Upgrade your machine. Most people out there have computers that are over two years old and some can barely deal with video streams. Get a new display card or look around for a new machine. Then use the old machine as the backup (see above).
* Scan your machine. Every so often, run a second-opinion scanner, one that is not from your normal anti-virus vendor, just to make sure your anti-virus software is catching everything. You'd be surprised at the different results you get from different scanners.
* Learn Linux. Resolve to get a decent book on Linux and learn how to use it. Better still, download Ubuntu or another Linux distribution and install it on a new machine or a very old machine you have in the closet. Play with it until you feel comfortable. I can assure you that you'll love it.
* Learn how to really use Adobe Photoshop and Illustrator. Again, there are a lot of guides to using these products as well as tutorial websites. There are also classes galore. Learn to use these or other products out there, and you will improve your computing knowledge in the process.
* Learn sound and video editing. You do not have to be a pro to know how to do some simple sound and video editing. If you use a digital camera and like to take short movies, do everyone a favor and edit them a little bit before you put them on YouTube. By this, I don't mean for you to produce the short video as if you are a Hollywood director showing off. I mean, do some simple editing to make the video watchable.
* Sell your excess gear. Many people have an extra printer that they will never use or old SCSI cables or other weird junk around the office and closet. Pack this stuff up and get rid of it by donating or selling it to people who will use it. In other words, do your spring cleaning early!
* Organize your photos. Get a copy of some good organizational software, such as ThumbsPlus, and organize your digital photos. Most people with a digital camera soon accumulate thousands of pictures. The date and time are never set correctly on the camera, and the pictures end up lost in the pile. Sort and organize them before it's too late.
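Two of the resolutions above, backing up and then actually restore-testing that backup, can be sketched in a few lines of Python. This is a minimal illustration, not a backup product: the directory names are invented for the demo, and a genuine strategy would add scheduling, versioned copies, and off-site media.

```python
import filecmp
import shutil
import tempfile
from pathlib import Path

def back_up(source: Path, dest: Path) -> None:
    """Copy the whole source tree to the backup destination."""
    if dest.exists():
        shutil.rmtree(dest)  # naive: replace the previous backup wholesale
    shutil.copytree(source, dest)

def verify(source: Path, dest: Path) -> bool:
    """Restore-test: compare the backup against the original, file by file."""
    cmp = filecmp.dircmp(source, dest)
    return not (cmp.left_only or cmp.right_only or cmp.diff_files)

# Demo with throwaway directories standing in for your data and an external drive.
with tempfile.TemporaryDirectory() as tmp:
    data = Path(tmp) / "my_documents"
    backup = Path(tmp) / "external_drive" / "my_documents"
    data.mkdir()
    (data / "notes.txt").write_text("resolutions for 2011")
    back_up(data, backup)
    print("backup verified:", verify(data, backup))  # prints: backup verified: True
```

The verify step is the part most people skip: a backup that has never been compared against the original is, as the resolution above puts it, often not really backing up anything.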
I'm sure you can come up with a few more universal computer user resolutions for 2011. These are easy and simple. All you have to do is make sure you do them. Have a Happy New Year.
Saturday, January 15, 2011
How Attackers Get Away With Data
Just as breaking into a bank is pointless without a getaway plan, so too is breaking into a network without the ability to sneak away with data.
The exfiltration stage of data theft often garners less attention than the methods used to infect computers, but is no less important. At Black Hat DC, Sean Coyne, a security consultant at Mandiant, is offering attendees a look at some of the more advanced ways attackers sneak data out of the digital doors of enterprises.
“We’ll be covering the basics of data exfiltration along with a few examples of more advanced methods and tricks we’ve seen in recent incident investigations that we have performed for clients,” Coyne told eWEEK. “Specifically, we’ll be talking about archiving and compression, selection of file-staging locations, encrypted tunnels, and malware that uses public Web mail or chat programs to transmit data. We’ll also highlight how many of the common data-theft techniques are simple, use common tools and protocols, yet are widely utilized because they still work.”
Like an autoimmune disease, sophisticated attackers sometimes turn a target’s IT infrastructure against the target itself. For example, Coyne explained, most organizations maintain a relatively “flat” internal Windows network with little-to-no network-layer access controls restricting workstation-to-workstation and workstation-to-server communication.
“The same environments tend to reuse local administrator passwords and allow most of their domain users to log in to their own systems with elevated privileges,” he said. “This makes it trivial for an attacker to start out with a small number of victim users, compromised via something like a spear-phishing e-mail, and leverage administrator privileges for lateral movement and access to servers and other systems housing critical data.”
As a compromise evolves, the attacker may also leverage existing remote access tools such as VPNs after they have obtained the requisite credentials, he added.
Attackers temporarily compile the data they steal in a staging area on the victim’s network before moving it out to their own systems, the researcher said. The staging point is usually a workstation, but could also be a server or other system on the network with outbound access to the Internet.
“Attackers typically choose an existing system directory on the host, such as the parent RECYCLER or System Volume Information (Restore Point) folders, and set attributes so that [victims] won’t come across the files during normal usage,” he said. “Once the data has been sent out of the network from the staging point, the attacker can clean the area by removing the files and any other traces of activity.”
“Host-based detection of in-progress data-theft activity is extremely rare; attackers tend to use and reuse the same staging systems and directories to facilitate consistent processes, minimize their footprint and clean up their tracks,” he continued. “For both victims and investigators, it’s important to be aware of these methods and quickly recognize an attacker’s habits so that they know what to look for throughout a compromised environment.”
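A responder could act on these habits with a simple sweep for archives parked in the staging directories Coyne names. The sketch below is an illustration, not Mandiant's tooling: the directory names come from the quote above, while the archive extensions and the helper's name are assumptions, and a production sweep would also inspect file attributes and timestamps.

```python
import os
import tempfile

# Staging directories attackers reportedly reuse (named in the quote above),
# plus archive extensions that suggest compressed, exfiltration-ready data.
SUSPECT_DIRS = {"RECYCLER", "System Volume Information"}
ARCHIVE_EXTS = {".rar", ".zip", ".7z", ".cab"}

def find_staged_archives(root):
    """Flag archive files sitting inside the known staging directories."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        components = set(os.path.normpath(dirpath).split(os.sep))
        if components & SUSPECT_DIRS:
            for name in filenames:
                if os.path.splitext(name)[1].lower() in ARCHIVE_EXTS:
                    hits.append(os.path.join(dirpath, name))
    return hits

# Demo on a throwaway tree standing in for a compromised host's drive.
with tempfile.TemporaryDirectory() as drive:
    staging = os.path.join(drive, "RECYCLER", "S-1-5-21-demo")
    os.makedirs(staging)
    open(os.path.join(staging, "payroll.rar"), "w").close()
    open(os.path.join(drive, "readme.txt"), "w").close()
    print(find_staged_archives(drive))  # flags only the archive under RECYCLER
```

Because attackers "use and reuse the same staging systems and directories," even a crude recurring sweep like this gives an investigator a concrete habit to look for across a compromised environment.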
Attackers also look to dodge DLP (data-loss-prevention) tools. Most organizations, Coyne said, have not implemented true end-to-end DLP tools to cover both endpoints and the network. Many of these products appear to be designed to detect unsophisticated or unintentional disclosure of protected data by users rather than protecting against a targeted attack, he added.
“A more significant problem is that most organizations still give all of their users Local Administrator privileges to their own systems…Specific to DLP solutions that rely on network traffic inspection–our talk covers a myriad of techniques for compressing, encrypting and otherwise obfuscating stolen data in ways that easily bypass this type of monitoring,” he said.
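Coyne's point about compressed and encrypted data slipping past traffic inspection is easy to demonstrate. In this sketch the substring scan stands in for a naive signature-based DLP rule, and the marker string is invented for the demo; real products are more sophisticated, but the principle is the same.

```python
import zlib

SECRET = b"CUSTOMER-SSN: 123-45-6789"  # invented marker a DLP rule might match on

def naive_dlp_match(payload: bytes) -> bool:
    """Stand-in for signature-based traffic inspection: a plain substring scan."""
    return b"CUSTOMER-SSN" in payload

plaintext = SECRET * 100
compressed = zlib.compress(plaintext)          # the archiving/compression step
xored = bytes(b ^ 0x5A for b in compressed)    # trivial stand-in for encryption

print(naive_dlp_match(plaintext))   # True  - caught when sent in the clear
print(naive_dlp_match(compressed))  # False - compression alone hides the marker
print(naive_dlp_match(xored))       # False - the obfuscated copy sails through
```

A single pass of off-the-shelf compression is enough to destroy the byte pattern the rule matches on; the XOR layer stands in for the encrypted tunnels mentioned earlier, which defeat content inspection entirely.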
The best strategies to contain and limit attacks include enforcing strict network segmentation, privilege management, and fundamental Windows security-architecture best practices, he added.
Coyne's talk is scheduled for Jan. 18. The conference itself will run from Jan. 16-19 at the Hyatt Regency Crystal City hotel in Arlington, Va.
Thursday, January 13, 2011
The CompTIA Authorized Service Center Program
The CompTIA Authorized Service Center Program designates businesses that have above a certain percentage of A+-certified service technicians. CompTIA certifications have proven to be a strong predictor of employee success.
As a global, vendor-neutral credential, the CompTIA Authorized Service Center name proves to clients that their technicians are knowledgeable, capable professionals. Major hardware and software vendors, distributors and resellers accept CompTIA certifications as the industry standard in foundation-level, vendor-neutral certifications for service technicians.
Certified individuals can handle more service calls and are more comfortable dealing with technology changes and customer complaints. Businesses that employ certified individuals report higher customer satisfaction and lower customer turnover. And companies with more certified technicians meet the requirements of more bid proposals and land more jobs.
Certkingdom has a variety of CompTIA 220-701 practice materials and other exam preparation products to suit your interests and make study easier. The Certkingdom 220-701 testing-engine download gives you detailed, logical coverage of the CompTIA 220-701 exam objectives and recreates the real exam environment; these products are built by IT examiners, so you experience the real exam's features in the product.
This program will prepare you for CompTIA A+ certification. You can earn this certification after you pass two exams. The A+ Essentials exam, 220-701, covers the foundational knowledge a PC support technician should know. The Practical Application exam, 220-702, tests practical knowledge and troubleshooting skills.
Monday, January 10, 2011
Datacenter for the paranoid
It looks like it’s time to break out the tinfoil hats and watch for the black helicopters, not to mention adding 512-bit encryption to all your tweets and Facebook posts. The NSA will soon be ready to come after everyone electronically with the groundbreaking of its new 1,000,000-square-foot datacenter in Utah.
The disinformation has already started, with various news articles reporting the cost of the project as either $1.2 billion or $1.5 billion, and civilian contractors (the project is being overseen by the US Army Corps of Engineers) being quoted as saying “we can’t talk about it.” Fortunately US Senator Orrin Hatch, who represents Utah, where the project is being constructed on a 240-acre site within the Camp Williams training facility, isn’t quite so reticent, saying “This country defends itself against cyber-attacks every day. It’s an arena where we need to defend ourselves,” and adding, “This center will support the effort to better understand that threat.”
As to why the facility is being built in Utah, there was apparently a detailed selection process using over 130 criteria that determined this was the best location, though US Army Corps of Engineers Brigadier General Peter Deluca was quoted in the Salt Lake Tribune as saying there are “at least 50 perfect states to build a data center.”
I’m really hoping that was a tongue-in-cheek comment. Sort of like my lead paragraph.
The datacenter itself will occupy 100,000 sq. ft. within the facility, with the rest of the space housing other NSA and civilian employees presumably focused on cyber-warfare issues, though some sources have stated that this is strictly a technical facility with no plans for analysts on-site. Harvey Davis, associate director of installations for the NSA, described the hardware going into the datacenter component of the facility as the essence of the NSA’s work. The datacenter is considered part of the Comprehensive National Cybersecurity Initiative.
The facility is scheduled to go online in October 2013, a timeline that includes not only building the datacenter itself but also the power, and presumably networking, infrastructures needed to support the undertaking.
Friday, January 7, 2011
Free CompTIA A+ Certification Online School
CompTIA A+ is a Computing Technology Industry Association (CompTIA) certification program designed to provide standard accreditation to technology professionals interested in becoming professional computer technicians. Demonstrating competency as a computer technician, the A+ designation requires 500 hours of hands-on training, specialization in certain technical areas, and passing two multiple-choice exams (hardware and operating system technologies) where different answers can be graded correctly.
Although considered vendor neutral, CompTIA certifications tend to favor Microsoft operating systems when testing the core exam topics of IRQs, direct memory access, practical computer repair, and installing and repairing hard drives, modems, network cards, CPUs, power supplies, and printers. CompTIA A+ is primarily a hardware exam and requires a score of 515/900 (hardware) and 505/900 (OS) in order to pass. The A+ certificate is just one of many CompTIA certifications in such areas as networks, servers, and computer security, but it is widely regarded as the first step in pursuing such a path. A+ holders usually earn about $31,000, although this salary figure increases quickly with the completion of additional certificates.
CompTIA A+ certification validates the knowledge of computer service technicians with the equivalent of 500 hours of hands-on experience. Earning CompTIA A+ certification proves that a candidate has a broad base of knowledge and competency in core hardware and operating system technologies, including installation, configuration, diagnosis, preventive maintenance and basic networking.
If you plan to build, repair, configure or upgrade computers or computer networks, or plan to perform the other functions of a computer service technician, the 220-701 exam will make you much stronger in your career. Enhanced Career Opportunities: Recruiters prefer certified candidates for job placements, and being A+ certified can also play a role in getting a promotion at a job. Many companies, such as CompuCom, CompUSA and IBM, have made A+ certification mandatory for their service technicians.
Credits Towards Other Certifications:
Popular certification programs such as those from Microsoft, Hewlett-Packard, Cisco, Novell and Certiport recognize CompTIA A+ certification in their advanced certification tracks.
Prerequisite For Training:
Many companies require candidates to be A+ certified to qualify for their corporate and vendor-specific training programs.
Wednesday, January 5, 2011
220-701 Exam torrent
It is well known that the latest 220-701 exam is the hot exam of CompTIA A+ Essentials. Certkingdom offers you all the Q&A of the real 220-701 test. It is the perfect combination, and it will help you pass the 220-701 exam on your first try!
Certkingdom offers exclusive CompTIA 220-701 study materials for a detailed and accurate look inside the current CompTIA 220-701 exam objectives. Our CompTIA 220-701 study materials provide you with an ultimate source of study for the CompTIA 220-701 certification exam. Our CompTIA 220-701 study guide covers 100% of the CompTIA 220-701 exam objectives while preparing you for practical work at the same time.
Demand for our 220-701 study guide has been rapidly increasing because our certification practice-exam study guides contain the most accurate and up-to-date training material, which makes Certkingdom the leading 220-701 provider in the world.
Acquiring CompTIA A+ certification is becoming a huge task in the field of IT. Moreover, exams like the 220-701 are continuously updated, and taking on this challenge is itself a task. This 220-701 practice test is an important part of CompTIA certification, and at A+ braindumps we have the resources to prepare you for it.
The 220-701 exam is an essential, core part of CompTIA certification, and once you clear the exam you will be able to solve real-world problems yourself. Want to take advantage of the Real 220-701 Value Pack and save time and money while developing your skills to pass your CompTIA A+ exam? Let Certkingdom help you climb that ladder of success and pass your 220-701 now!
We are all well aware that a major problem in the IT industry is the lack of quality study materials. Our exam preparation material provides everything you will need to take a certification examination. Like actual certification exams, our practice tests use multiple-choice questions (MCQs). Our CompTIA 220-701 exam product provides you with exam questions and verified answers that reflect the actual exam, giving you the experience of taking the real test. High quality and value for the 220-701 exam: a 100% guarantee to pass your A+ exams and get your A+ certification.
Monday, January 3, 2011
A Brief Introduction of Comptia A+ Certification
CompTIA offers many certifications, and CompTIA A+ is one of them. The CompTIA A+ 2009 version is going through changes and gaining a few extra topics to gauge the expertise and knowledge required of an entry-level computer support and service technician. The previous revision of these test requirements was in 2006.
CompTIA certifications such as the 220-602 are a well-known credential across the IT industry, validating ground-level IT understanding and skills. CompTIA offers twelve certification programs in major technology fields. To help complete the modifications, CompTIA sent out surveys to aid the exam development process and to clarify the content of the CompTIA A+ Essentials exam.
There were two surveys: one for the future "Essentials" exam requirements and one for the proposed "Technician" exam requirements. Both surveys focused on the existing draft of the exam requirements and asked how important each topic was, and how often a borderline-qualified applicant would perform each task.
The surveys covered two of the four available CompTIA A+ exams:
• CompTIA A+ Essentials
• CompTIA IT Technician
CompTIA A+ Essentials validates knowledge of fundamental computer hardware and operating systems. The CompTIA IT Technician exam is aimed at those who work, or intend to work, in a mobile or corporate technical environment with a high degree of in-person client communication. Since the A+ certification exam would feature new knowledge requirements, CompTIA A+ certification training (like the CompTIA SY0-101 training) would be very helpful in learning the fresh requirements.
While the current certification requirements were a draft edition still in progress, some things stood out from the survey. The new draft was not that much different from the existing major-category domain arrangement; however, many fresh technologies were listed, such as SD cards, TV tuner cards, Blu-ray, DDR3, and Bluetooth. Windows XP, Windows 2000, and Windows Vista were also listed. A number of older networking technologies were dropped and the most recent cabling standards were added, while cellular networking was mentioned often. The security section was updated to include encryption, authentication technologies, and updated biometric technologies.
Taking an A+ training course (such as CompTIA N10-003 training) would certainly help you start thinking about and learning what you may meet on the new requirements catalog, as well as teach you the essentials that never change on the exam.
CompTIA 220-602 certifications are a well-known credential all over the IT industry, validating ground level IT understanding and skills. Comp TIA offers twelve certification programs in main technology fields. To help complete the modifications, CompTIA sent out surveys to help in the exam growth process and to clarify the substance for the CompTIA A+ Essentials Exam.
There were two reviews: one for the future "Essentials" exam necessities and the next for the projected "Technician" exam requirements. All those two surveys focused on the existing draft of the exam requirements and asked for a view on how important each topic was, and how often a borderline fit applicant would do each task.
The surveys enclosed two of the four obtainable CompTIA A+ exams:
• CompTIA A+ Essentials
• CompTIA IT Technician
Comp TIA A+ Essentials authenticates knowledge of fundamental computer hardware and operating systems. The CompTIA IT Technician exam is beset for those who work or have it in mind to work in a mobile or business technical atmosphere with a high point of personally client communication. Those might add in both surveys since the A+ certification exam would attribute new knowledge requirements, many CompTIA A+ certification CompTIA SY0-101 training would be very supportive in learning the fresh requirements.
Saturday, January 1, 2011
CompTIA Network+ practice exams
The CompTIA Authorized Service Center Program designates businesses that employ above a certain percentage of CompTIA Network+ certified technicians. CompTIA certifications have proven to be a strong predictor of employee success.
As a global, vendor-neutral credential, the CompTIA Authorized Service Center name proves to customers that their technicians are knowledgeable, capable professionals. Major hardware and software vendors, distributors, and resellers accept CompTIA certifications as the industry standard in foundation-level, vendor-neutral certifications for service technicians.
Certified individuals can handle more service calls and are more comfortable dealing with technology changes and customer complaints. Businesses that employ certified individuals report higher customer satisfaction and lower customer turnover. And companies with more certified technicians meet the requirements of more bid proposals and win more jobs.
This program will prepare you for CompTIA A+ certification. You can earn the certification after you pass two exams. The A+ Essentials exam, 220-701, covers the basic knowledge a PC support technician should know. The Practical Application exam covers applied knowledge and troubleshooting skills.
Answer sample questions. The CompTIA website also provides a number of sample questions for each certification exam it offers, available after filling out a form on the CompTIA website. Enroll in a CompTIA Learning Alliance training center. There are numerous CompTIA Learning Alliance training centers across the United States; go to the CompTIA website for a list of approved training centers in your area. Such training centers provide some of the best Security+ certification exam preparation available.
Every CompTIA 220-701 test comes with a matching CompTIA 220-701 download for the exam engine that powers our simulator. This application simulates the actual testing environment and lets you select which areas of the exam you want to focus on. Create or use CompTIA 220-701 notes as you go and revisit questions that you missed. If you find an area of the test where you need help, you can limit the CompTIA 220-701 simulator to serve only those questions.
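As a rough illustration of the drill-down behavior described above, a practice-exam tool that limits itself to missed questions in a chosen domain might filter its question bank like this. The question data, field names, and function here are invented for the sketch; they are not part of any real CompTIA simulator.

```python
# Hypothetical question bank; ids, domains, and text are made up for illustration.
QUESTION_BANK = [
    {"id": 1, "domain": "hardware", "text": "Which slot type fits DDR3 RAM?"},
    {"id": 2, "domain": "networking", "text": "What is the maximum run length of Cat5e?"},
    {"id": 3, "domain": "security", "text": "Name one biometric authentication method."},
]

def filter_questions(bank, missed_ids=None, domain=None):
    """Return only the questions the learner wants to drill:
    previously missed ones, optionally limited to a single exam domain."""
    result = bank
    if missed_ids is not None:
        result = [q for q in result if q["id"] in missed_ids]
    if domain is not None:
        result = [q for q in result if q["domain"] == domain]
    return result

# Drill only the missed questions, narrowed to the networking domain.
drill = filter_questions(QUESTION_BANK, missed_ids={2, 3}, domain="networking")
```

With the sample data above, `drill` contains only the networking question that was missed; dropping the `domain` argument would instead replay every missed question.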
