Wednesday, December 23, 2015

As 2015 ends, Ubuntu Linux misses its 200 million user goal

The popular Linux distribution set a lofty goal four years ago, and hasn’t come anywhere close.

With the end of 2015 imminent, Ubuntu appears to have fallen far short of the 200 million user goal it set back in 2011.

“[Our] goal is 200 million users of Ubuntu in four years,” Canonical CEO Mark Shuttleworth said at a developer summit in May 2011. “We’re not playing a game for developers’ hearts and minds—we’re playing a game for the world's hearts and minds, and to achieve that we’re going to have to play by a new set of rules.”

As Linux site Phoronix points out, reports on Ubuntu server and desktop installations have yet to even pass 100 million. Ubuntu’s own website says the desktop operating system has more than 40 million users. Linux as a whole accounted for 1.61 percent of desktops accessing the Internet last month, according to Net Applications. By comparison, Windows 10 hit 9 percent of that market in November, the same month that Microsoft announced 110 million users of its latest OS.

With the PC market in decline overall, it’s unlikely that Ubuntu will get much help from the desktop side in hitting 200 million installs. As such, Ubuntu steward Canonical has made only minor changes to the desktop operating system lately, and has turned greater attention to smartphones and a converged smartphone-desktop OS. Still, Ubuntu’s phone efforts are slow-going so far, with one estimate claiming just 25,000 users as of September.

The “new set of rules” Shuttleworth spoke about may have referred to putting Ubuntu on other rapidly growing device categories such as TVs and connected cars. But while Linux-based systems as a whole are making some strides in these areas, Ubuntu’s influence has been minimal.

Why this matters: The unmet goal of 200 million users underscores how difficult it is for a platform like Ubuntu to shift from desktops to other devices where the potential for growth is greater. It’s worth noting that Microsoft has set a similarly lofty ambition of 1 billion Windows 10 devices within three years—something that may be difficult to achieve unless Windows can latch onto new product categories such as smart homes, robots, and augmented reality.

Tuesday, December 8, 2015

Patch Tuesday: Microsoft released 12 patches, 8 rated critical, 1 for a zero-day

For Patch Tuesday December 2015, Microsoft released 12 security updates: eight fixes rated critical for remote code execution vulnerabilities and one “important” patch for a zero-day hole.

Way to go! Congratulations on suffering through another year of deploying security patches. Microsoft released 12 security bulletins for the last Patch Tuesday of 2015, eight of which are rated as critical for remote code execution vulnerabilities. Hopefully none will result in exceedingly uncool changes like Microsoft snuck into Windows 10 last month to reset privacy settings and default programs.

Although Microsoft regards MS15-135 only as “important,” it would be wise to jump on this one: it fixes a zero-day vulnerability in the Windows kernel that attackers are actively exploiting to escalate privilege, according to Qualys CTO Wolfgang Kandek. You wouldn’t know it from the “important” rating, as Redmond’s security team said only that it resolves flaws in Windows kernel-mode drivers. Nils Sommer of bytegeist, working with Google Project Zero, is credited with reporting three CVEs associated with this patch.

Rated critical
MS15-124 is a cumulative security update that resolves Internet Explorer flaws tied to 30 CVEs. Twenty-two deal with IE and “Microsoft browser” memory corruption vulnerabilities. Two fix XSS filter bypass flaws, another addresses an ASLR (Address Space Layout Randomization) bypass, one closes a “Microsoft browser” hole that could lead to elevation of privilege, and a separate CVE is earmarked as an Internet Explorer XSS filter bypass. There are also patches for an IE information disclosure bug, a scripting engine information disclosure vulnerability, and scripting engine memory corruption flaws that could allow remote code execution.

MS15-125 is the cumulative security update for Microsoft Edge, fixing a variety of security shortcomings which range from memory corruption flaws, XSS filter bypass, ASLR bypass, elevation of privilege and spoofing vulnerabilities.

While discussing the importance of keeping browsers as up-to-date as possible, since they are constantly used in attack scenarios like drive-by-downloads and spear phishing, Kandek noted, “Edge has ‘only’ 15 issues, with 11 duplicates from IE and four issues native to Edge itself.”

MS15-126 is a cumulative security patch for the Windows VBScript and JScript scripting engines; if left unpatched, an attacker could abuse the bugs for information disclosure or, worse, to pull off remote code execution.

MS15-127 addresses a vulnerability in Microsoft Windows DNS that could allow RCE if an attacker sent maliciously crafted requests to a DNS server.

Regarding MS15-127, Bobby Kuzma, CISSP, systems engineer at Core Security, said:

“Microsoft has really given us a doozy of a Christmas present, with the ability for attackers to work a remote code execution with a DNS query! If your organization runs public facing DNS servers on Windows, you’ve got a problem. If you’ve got internal DNS servers running Windows, then you’ve got an easy escalation path for attackers who are able to phish end users.”

Kandek also remarked on MS15-127, writing, “Attackers that exploit MS15-127 in Microsoft’s DNS server would gain control over the server and execute code in the system context. The attack is remote and does not require authentication, no workarounds are available. Bring your Microsoft DNS servers up to date as soon as possible, with the required testing and soak time for such a fundamental service.”

MS15-128 patches Microsoft graphics component to resolve “vulnerabilities in Microsoft Windows, .NET Framework, Microsoft Office, Skype for Business, Microsoft Lync, and Silverlight. The vulnerabilities could allow remote code execution if a user opens a specially crafted document or visits a webpage that contains specially crafted embedded fonts.”

Core Security’s Kuzma was amusing in his font vulnerability frustration last month when he asked, “What genius decided that FONT HANDLING belonged in the most sensitive parts of the operating system kernel?” He suggested that “allowing untrusted fonts into an environment is bordering on negligence,” especially since this type of flaw happens again and again. After seeing MS15-128, Kuzma said, “Another font handling issue? Didn’t we have one of these LAST month?”

MS15-129 fixes several vulnerabilities in Microsoft Silverlight that could allow information disclosure up to more critical remote code execution attacks.

MS15-130 resolves a flaw in Microsoft Uniscribe that could be exploited for RCE. Core Security’s Kuzma remarked, “More fonts. If you love your users, block fonts at the firewall. PLEASE.”

MS15-131 addresses numerous vulnerabilities in Microsoft Office, with the most severe being RCE. The flaws are fixed, Microsoft explained, “by correcting how Office handles objects in memory.”

Rated important
MS15-132 fixes vulnerabilities which could lead to remote code execution, but like the zero-day fix, Microsoft – in all its wisdom – rates this only as important.

MS15-133 is a security update for Windows PGM to resolve yet another monthly EoP problem. Microsoft wrote, “Microsoft Message Queuing (MSMQ) must be installed and the Windows Pragmatic General Multicast (PGM) protocol specifically enabled for a system to be vulnerable. MSMQ is not present in default configurations and, if it is installed, the PGM protocol is available but disabled by default.”

MS15-134 addresses bugs in Windows Media Center which could allow remote code execution. Core Security has an advisory write-up on the Media Center flaws as well as proof-of-concept attack code.

MS15-135: As mentioned above, Kandek said this should be your top priority as it addresses a zero-day currently in use by attackers for elevation of privilege. Jon Rudolph, principal software engineer at Core Security, also mentioned the patch rated as “important” being a zero-day which could allow an escalation of privilege.

“In addition to these updates,” Rudolph noted, “an increasing number of users are getting more curious about the techniques Microsoft is using to encourage users to upgrade to windows 10, and about some of the data collection practices enabled by default as well as the always-on data collection. It’s clear that there’s value in knowing what your users are up to and there are times where we don’t seem to care as users, but it’s one step closer to a market where all successful tools come with a string attached - data collection by default.”

Don’t forget Adobe’s security update for the dreaded Flash

A wise person resigned to deploying patches, and who hasn’t sworn off Flash, might also hop on Adobe’s new version of Flash as Kandek said, “APSB15-32 addresses a record number of 78 vulnerabilities. All but three of the vulnerabilities could be used by an attacker to gain code execution running under the user in the browser. From there a second vulnerability would have to be used to become system on the machine (look at MS15-135 for an example), but then the attacker would have full control. Flash based attacks have been a favorite for attackers for the year with many exploit kits providing very up-to-date exploits – include this in your high priority items.”

Happy patching!
 

Sunday, November 29, 2015

10 affordable cities that offer good IT jobs

IT is everywhere, not just Silicon Valley. Here are the top 10 'off metro' regions for IT workers, based on median salary and unemployment rate.


Affordable cities that offer good IT jobs

Across the U.S., smaller, "off metro" cities are home to talent that's just as accomplished -- you just need to know where to look. Here, CIO.com has put together a list of the 10 best cities for IT -- ones that don't involve sky-high rents and impossible costs of living.

Data on median income and percentage of workforce in technology comes from the U.S. Bureau of Labor Statistics. Data on population, unemployment rate -- which covers all workers with at least a bachelor's degree -- and cost of living is from the U.S. Census Bureau and its American Community Survey.

Omaha, Nebraska
Omaha's economy is booming, putting it squarely at the top of the list, and that's great news for tech workers looking for work and for companies looking to lower overhead. With 3.7 percent of Omaha's workforce in tech, an average salary of $74,710, and an unemployment rate of just 2.5 percent, it's a great place to be an IT professional.

Huntsville, Alabama
Huntsville's proximity to a number of military installations and aerospace research facilities makes it a thriving tech center. Though the city boasts a population of only about 180,000, nearly 13,000 of those residents, or 6.5 percent, work in technology, and the unemployment rate is a respectable 3.7 percent. With a median salary of $89,050, IT workers in Huntsville make more than their Los Angeles counterparts ($88,940) and with a lower cost of living.

Springfield, Illinois
As the capital of Illinois, it's not surprising that the largest employer in the state's sixth-largest city is the State of Illinois. But that offers plenty of opportunity for tech workers, who earn an average $85,000 a year, make up 3.2 percent of the overall workforce and enjoy an unemployment rate of just 3 percent.

Fort Worth, Texas
Fort Worth has proximity to some major IT employers, including Lockheed Martin Aeronautics, Bell Helicopter and American Airlines. With an average salary of $83,900, an unemployment rate of 3.7 percent and no state income tax, it's a great city for the IT workers that make up 2.6 percent of its workforce.

Rochester, Minnesota
Rochester's main claim to fame is the internationally renowned Mayo Clinic, and the city's second-largest employer is IBM. There are approximately 5,000 IT workers in the region, making up 4.8 percent of the workforce. An average salary of $83,900 and an unemployment rate of just 1.9 percent make it an incredible location for IT talent.

Lynchburg, Virginia
Though the Lynchburg metro area is fairly large, with a population of about 250,000, the city of Lynchburg itself comprises only about 80,000 residents, of whom 2.6 percent work in technology fields. The unemployment rate is just 1.9 percent and the average salary is a respectable $79,440.

Des Moines, Iowa
Des Moines is rapidly growing as a heartland tech hub, adding over 700 technology jobs in 2014, according to Bureau of Labor Statistics data, an increase of about 6 percent that brought tech-sector participation up to 3.8 percent of the workforce. The median salary is $76,840, and the unemployment rate is a steady 2.6 percent.

Columbus, Ohio
Ohio State University and the Ohio Supercomputer Center contribute to Columbus's standing as a central U.S. tech hub. The average salary is $78,790, the unemployment rate is 3.6 percent and the percentage of workers in the technology sector is 4 percent.

Houston, Texas
"Space City," with its proximity to NASA and other aeronautics industries, as well as its focus on biomedical engineering and the energy industry, has a solid standing as a tech-centric city. The average salary is $88,230, the unemployment rate is 4.1 percent and the number of workers in the IT industry is 2.9 percent.

St. Louis, Missouri
There are more than 40,000 technology workers in the St. Louis metro region, making up 3.3 percent of the total workforce. These numbers are bolstered by organizations like T-REX, a tech incubator and co-working space located downtown, and LaunchCode, a non-profit that helps train IT workers and connect them with jobs. The average salary is $79,170 and the unemployment rate is 4.1 percent.



Saturday, November 21, 2015

74-343 Managing Projects with Microsoft Project 2013


QUESTION 1
You use Project Professional 2013 to manage a project that has customer-required milestone
completion dates. You need to see graphically that your milestones have slipped beyond your
customer-required dates. What should you add to the project milestones?

A. a Must finish on constraint
B. a finish date
C. a deadline
D. a Finish no later than constraint

Answer: C


QUESTION 2
You manage a project by using Project Professional 2013. Your project is updated with changes to
the baseline for selected tasks. You back up your project file before you start your next tracking
cycle. After completing the tracking cycle, you notice that the baseline duration values on some
of the summary tasks are not correct. You need to correct these values before re-entering the
tracking data. You open the backed up version of the project schedule. What should you do next?

A. Manually update the summary tasks with the new duration values.
B. Reset the summary tasks to manual scheduling.
C. Ensure the task durations are of the same denomination values. Then reenter the tracking data.
D. Reset the baseline checking the Roll up baselines to all summary tasks option. Then reenter
the tracking data.

Answer: D


QUESTION 3
You are a program manager. Your project managers use Project Professional 2013 to manage
projects. The project managers want to utilize the same resources across their projects. You need
to combine the projects, as well as the project resources, to see allocations across the projects.
What should you do?

A. Copy and paste all resource assignments into a Master file.
B. Share resources from an external resource pool.
C. Create a Master project and insert subprojects by using Link to project.
D. Open all projects in a new window.

Answer: B


QUESTION 4
Your company uses Project Standard 2013 to track project progress. You need to accurately
calculate cost performance index (CPI) as a health indicator. Which three actions should you
perform? (Each correct answer presents part of the solution. Choose three.)

A. Ensure there is a value in the Status Date field.
B. Ensure there is a value in the Standard Rate field.
C. Ensure Task Dependencies exist in the schedule.
D. Enter actual progress information.
E. Level resources within available slack.

Answer: ABD
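As background on why answers A, B, and D are needed: the cost performance index is earned value (BCWP) divided by actual cost (ACWP), which Project can only calculate once a status date, resource rates, and actual progress have been entered. A minimal sketch of the underlying formula, with illustrative values not taken from the exam:

```python
def cost_performance_index(earned_value, actual_cost):
    """CPI = EV / AC (BCWP / ACWP); a value above 1.0 means under budget."""
    if actual_cost == 0:
        raise ValueError("actual cost must be nonzero")
    return earned_value / actual_cost

# Example: $8,000 of budgeted work performed at an actual cost of $10,000
cpi = cost_performance_index(8000, 10000)
print(round(cpi, 2))  # 0.8 -> the project is over budget
```

A CPI below 1.0 signals that each dollar spent is earning less than a dollar of planned work, which is why it serves as a project-health indicator.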


QUESTION 5
You are a project manager who uses Microsoft Excel 2013. Your company decides to migrate all of
the current projects in Microsoft Excel 2013 to Project Professional 2013. They allow all
employees to spend 8 hours migrating each project plan. Your current and unique project plan
has 462 tasks with duration in days, and resources have been assigned and named. You have a
status meeting in two days. You need to provide your project's information by using Project
Professional 2013. What should you do?

A. Create a new project plan in Project Professional 2013 and use the Gantt Chart Wizard to
import from an Excel Workbook.
B. Rename the Excel file from .xlsx to .mpx, and open it by using Project Professional 2013,
activating the Mapping Excel Workbook feature. Map tasks, durations, and resources
assigned into Microsoft Project fields.
C. Open the Excel File .xlsx directly from Project Professional 2013, which will convert and map
tasks, durations, and resources assigned into Project fields.
D. Create a VBA macro by using the Excel Record Macro feature to import all tasks, durations,
and resources assigned from Excel into your new Project Professional 2013 project plan.

Answer: C

Wednesday, November 4, 2015

18 companies that suck at customer service

A recent Forrester Research report shines a spotlight across different industries on companies whose people skills leave a bit to be desired, to say the least.

Customer experience can make or break a company. Whether customer service is very good or very bad, it gets people's attention, creates customer loyalty and affects a company's bottom line.

Forrester Research launched the U.S. Customer Experience Index as an annual report in 2007. This year, for the first time, the company published it twice, and from now on it will release two reports per year. The company looked at 299 brands in 18 industries.

"The impetus was that companies were increasingly looking to customer experience as a way to set themselves apart and how important it is in terms of customer loyalty,” says Megan Burns, an analyst at Forrester and the report's principal author. “As companies decided to focus on this, they wanted to measure themselves in some way. The problem is there's a disconnect between selling and marketing and what's actually happening."

According to Forrester, analyzing and tracking customer buying behavior to ensure that they keep coming back is only the beginning. Customer experience improvements also drive revenue, ensure customer loyalty and benefit investors.

The edition published in May presents the example of AT&T, which grew its U-verse service 28 percent after improvements in the TV and Internet service provider product experiences. Forrester also found that customer experience "explains anywhere from 47 percent of the variation in loyalty for investment firms on the low end to 76 percent of that variation for car rental companies." It also states that Watermark Consulting has analyzed differences in stock performance between customer experience "leaders and laggards," accounting for 80 percentage points of difference, and a gap in the S&P 500 Index by 26 percentage points.

According to Burns, "The audience is broader than just CRM executives. It's a growing population of people. In addition to the vice president of customer experience and CMO, increasingly CEOs have cited the data on investor calls. Sprint, for example, has done this. They say, 'The way we'll compete is on customer experience, and we need a benchmark to measure that.' Even COOs, interestingly, are part of the audience, because they're operationalizing customer experience."

Burns continues: "We have 25 different drivers per industry. Things like how quickly did they solve a problem? The report publishes a synopsis, the best of the best. Now, companies don't aspire to be the best in their industry, but to be the best in any industry. Regardless of what industry they're in, they want to benchmark against Amazon or Zappos."

The second report for 2015, released in October, reveals that most companies have a long way to go, "but they're trying," says Burns. The lowest scorers in each industry have their work cut out for them. The list of the worst scorers reads like a who's who of organizations that have plenty of publicity, and proved the old saw about any publicity being good publicity dead wrong.

It’s hard to climb out of the cellar

Comcast gets double the bad news in the report, earning the worst score in two industry categories.

In an effort to improve customer service, last year, the company appointed Charlie Herrin as senior vice president of customer experience. In the press release on the new hire, Comcast's Neil Smit, president and CEO, Comcast Cable, and executive vice president, Comcast Corp., said "Transformation isn’t going to happen overnight. In fact, it may take a few years before we can honestly say that a great customer experience is something we’re known for. But that is our goal and our number one priority … and that’s what we are going to do."

But Comcast continued to struggle with its image, and in May, after federal government regulators denied its plan to buy Time Warner Cable, Comcast pledged to devote $300 million to customer service. The plan included hiring 5,500 new customer service representatives, new technicians, and an app called Tech Tracker, building three new call centers, and redesigning its bills.

Comcast didn't fare any better in other customer service reports. It scored the third lowest in the American Customer Satisfaction Index's 2015 Telecommunications and Information Report. The company's satisfaction score slipped 10 percent from 2014. Comcast's Internet service was rated last in that industry.

For its part, Consumer Reports placed Comcast near the bottom in the telecom industry. According to the Philadelphia Inquirer, the company's hometown newspaper, Comcast CEO Brian Roberts said that resources that would have gone toward merging with Time Warner Cable would instead be directed at improving the customer experience. Neil Smit, head of the cable division, said customer experience would come to be viewed as the company's best product, and announced a 10-point plan to achieve that.

U.S. Cellular survived widespread billing problems in 2013, including cases where customers using electronic payment were billed multiple times, and some weren't billed at all. The company has taken a long fall from its position in first place among wireless carriers in Forrester's 2012 and 2013 U.S. Customer Experience Index reports.

After a renewed focus on turning profits on its smartphones, Acer made a push in China by inking deals with Chinese carriers in 2012. While that may have helped the bottom line, the company still gets bad marks from Forrester in terms of customer experience. The company took the bottom spot in both 2015 reports.

In response to our request for comment, Acer's vice president of customer service, Mark Groveunder, responded as follows:

"With the goal of raising the bar on our customer experience, we've leveraged Forrester's U.S. Customer Experience Index this year to begin implementing key changes within our organization with a priority on customer service. Customer needs are changing and we're working to adapt to not only accommodate them, but to really exceed their expectations. We believe we've always provided good service, but we are taking it to an even higher level. In that spirit, we launched "Project CX" that includes investing heavily in improving employee involvement, employee training on the importance of the customer experience and leveraging journey mapping to determine from the customer's perspective what works best and what doesn't in terms of our internal processes. We're using this training to create cross-functional teams to identify the moments that matter in the various customer journeys and working to implement changes to improve the customer experience at these critical moments. Toward the end of this year, we'll start implementing the changes based on our research, while we're also providing feedback to our product design teams on how to improve future products and to our web team on how we can improve the customer experience there."
It can get crowded at the bottom

The banks that were "too big to fail" already had their work cut out for them after the 2008 financial crisis and federal government bailout. Consumer Reports’ Consumerist website, in its annual poll to nominate the Worst Company in America, has consistently placed Bank of America in the top three worst performers. At the same time, the Consumer Financial Protection Bureau has recorded tens of thousands of complaints about the banking giant over the past several years, most of them about how it services mortgages. Bank of America has received more than 30,000 mortgage complaints in the database since 2011, representing about 23 percent of all mortgage-related complaints.

The company has made annual appearances on other worst-performers lists, including the Zogby Analytics and 24/7 Wall Street Customer Service Hall of Shame since 2009, and the American Customer Satisfaction Index in five of the last six years, leading the pack in 2014.

Don Vecchiarello, a spokesman for Bank of America, responded to our request for a comment with the following statement:

"We take our clients’ feedback very seriously, and we are confident in our strategy and the investments we have made to improve their experience with us. Everything we do now is focused on making the financial lives of our clients better. We believe over time, we will earn a better perception in the eyes of our clients."

For its part, Cigna spokesman Joe Mondy provided this statement:

"The 2015 customer experience index runs contrary to the findings of Cigna's own Net Promoter Score (NPS) research, as well as the findings of other third-party studies that measure customer experience that we have been part of over the past several years. We are reviewing the recently released data, as well as what impact the recent changes in methodology the company has instituted may have had, but our efforts to improve customer experience have been showing great results, and we will continue to invest in efforts to improve the experience customers have with Cigna."
Lower-cost doesn’t mean lowered expectations

Low cost is the name of the game for Frontier Airlines, but travelers should still expect a modicum of customer service. Earlier this year, CEO Dave Siegel stepped down, and was replaced by the company's chairman, Bill Franke, and president, Barry Biffle.

In May, a government report ranked Frontier last among U.S. carriers in on-time performance and customer complaints. Complaints were logged at the rate of 8.2 per 100,000 boardings, compared with 2.98 in May 2014. At the time, Biffle told the Denver Post that the new leadership was focused on improving reliability by changing the design of the schedule. As a result, the airline's on-time performance improved.

Still, in 2015, Frontier ranked last in J.D. Power's North America Airline Satisfaction Study. More than one third of the company's flights were at least 15 minutes late in March 2015. The federal government said that Frontier had the highest complaint rate and worst on-time performance among the nation's leading airlines in March.

Customers are paying for more than just their car rental at Dollar Rent a Car, even if they didn't know it. Among rental car agencies, Dollar Rent a Car scored lowest, possibly due in part to complaints that it signed customers up for insurance even when they explicitly did not want it.

Anyone who expected not to see Walmart take the last spot hasn't been on the Internet or watched TV in the past few years. In 2014, the American Customer Satisfaction Index found that overall satisfaction with retail stores fell 1.4 percent after improving steadily for three years. Walmart scored its worst rating since 2007, at the bottom of the list.

Early this year, the company announced it would spend $1 billion to raise pay for employees. CEO Doug McMillon vowed to make customer service a priority after a rash of complaints from customers about empty store shelves and not being able to find store clerks, even as pallets of merchandise piled up in warehouses. Bloomberg recently reported that even though stores grew 13 percent, the number of employees only grew 2 percent.

Analytics firm ForeSee's annual ranking of customer satisfaction during the holiday shopping season in 2012 ranked Gilt at the bottom for the eighth consecutive year. The company attributed the poor result to bad Web site functionality.
The Feds aren’t getting off the hook, either

In an audit by the federal government's Office of Inspector General in May, the U.S. Postal Service was charged with having an excess of rude employees, which could cost the government $288.5 million in lost revenue. The audit stated, "While the Postal Service's goal is 90 percent customer satisfaction, we found that more than 20 percent of its customers in FY 2013 responded to the POS [point of service] survey that they have been treated 'worse than other retailers' when visiting Postal Service retail counters. Dissatisfied customers exist, in part, because procedures for improving customer service are not functioning as intended."

The agency also took a hit in a survey conducted by Accenture in 2013, in which the company evaluated 24 government-operated postal organizations and two private companies that account for 75 percent of the world's mail. For its poor showing, the USPS blamed pensions and "restrictive" universal service obligations.

It has been a battle for HealthCare.gov. When it was rolled out in 2013, the Web site was full of bugs and agonizingly slow. The site is due for upgrades that will take effect with the start of 2016 open enrollment. The changes are designed to make it easier to browse taxpayer-subsidized health insurance plans, among other things.

Monday, October 19, 2015

70-336 Core Solutions of Microsoft Lync Server 2013


QUESTION 1
When Web conferencing with users from a partner company the users must be able to download
the Web conference content.
How would you enable this?

A. By configuring a Conferencing Policy.
B. By configuring a Group Policy Object.
C. By modifying the meeting configuration.
D. By configuring a Client Policy.

Answer: A

Explanation:


QUESTION 2
When Web conferencing with users from a partner company the users must be able to record the
Web conference.
How would you enable this?

A. By configuring a Conferencing Policy.
B. By configuring a Group Policy Object.
C. By modifying the meeting configuration.
D. By configuring a Client Policy.

Answer: C

Explanation:


QUESTION 3
You need to troubleshoot the error message received by the test users. The Centralized Logging
Service (CLS) is running with its default settings.
You plan to use ClsController.exe to help you use the logs to troubleshoot the issue.
In which order should you run the commands?

A. ClsController -start, ClsController -flush, ClsController -search, ClsController -stop.
B. ClsController -start, ClsController -stop, ClsController -flush, ClsController -search.
C. ClsController -flush, ClsController -start, ClsController -stop, ClsController -search.
D. ClsController -search, ClsController -flush, ClsController -start, ClsController -stop.

Answer: B

Explanation:
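Answer B maps to the usual CLS workflow: start logging, have the test users reproduce the error, stop logging, flush the cached log data to disk, then search it. A hedged sketch follows (the pool name borrows this scenario's domain; the scenario and component names are illustrative, so check the exact switches in your deployment):

```
ClsController.exe -start  -scenario AlwaysOn -pools fepool1.westbridgeind.com
rem ...have the test users reproduce the error...
ClsController.exe -stop   -scenario AlwaysOn -pools fepool1.westbridgeind.com
ClsController.exe -flush  -pools fepool1.westbridgeind.com
ClsController.exe -search -pools fepool1.westbridgeind.com -components UserServices -loglevel error
```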


QUESTION 4
You need to request a certificate for the additional Front End Server.
Which of the following should you use as the common name in the certificate?

A. lyncfe2.westbridgeind.com
B. webext.westbridgeind.com
C. fepool1.westbridgeind.com
D. lyncfe1.westbridgeind.com
E. lyncdiscover.westbridgeind.com

Answer: C

Explanation:


QUESTION 5
You need to resolve the audio streaming problems reported for conferences between users in the
New York and Atlanta offices.
Which of the following solutions would resolve the issue?

A. A Lync Server Location Policy.
B. Separate Active Directory Sites for each office.
C. A faster WAN link between the offices.
D. A domain controller in the Atlanta office.

Answer: C

Explanation:


Sunday, September 27, 2015

How important is CompTIA Certification

The need to advance or improve one's credentials has become an important factor in today's job market. To beat the competition, it helps to earn certifications and stay a step ahead of others. There are many certifications available; based on your expertise, you can choose the most relevant one. Among the most valued is CompTIA, which brings both recognition and career growth and covers a variety of fields such as computer networking, IT security, and Linux administration.

For beginners in the IT field, CompTIA A+ is the most widely recommended certification, as it prepares you to work as a computer technician. Topics covered under A+ include installation, preventative maintenance, networking, security, and troubleshooting. The exam is internationally recognized and vendor neutral, and many employers see it as proof of your ability to work with computers. No wonder experts call the A+ certification a step in the right direction toward more advanced CompTIA certifications.

Once you have earned CompTIA A+, the natural next level is CompTIA Network+. This certification demonstrates proficiency in installing, configuring, running, maintaining, and troubleshooting computer network infrastructures, skills that matter most to companies whose networks connect large numbers of employees. Demand for IT networking professionals is high, so getting your CompTIA Network+ certification is a good next step after you complete your A+ certification.


There are many more certifications offered by CompTIA; based on your interest and passion, you can choose the one most relevant to your career growth.


Wednesday, September 23, 2015

Windows through the ages

Twenty-five years later, we look back at Microsoft's popular operating system.

A Brief History of Windows
In 1985, Microsoft released version 1.0 of the graphical desktop operating system known as Windows. Over the next 25 years, the industry would see over 16 significant versions of the now-ubiquitous OS. In honor of this anniversary, let's take a look back at Windows' major revisions through the ages.

Special thanks to GUIdebook and GUI Gallery for help with the screenshots.

Editor's note: This article was originally published on November 16, 2010, but was updated by Brad Chacos on August 24, 2015 to include Windows 8, 8.1, and 10.

Windows 1.01 (1985)
The first version of Windows was a primitive affair. To avoid legal issues with the Mac OS, Microsoft had to ensure that application windows did not overlap and that there was no trash can to be seen. However, Windows 1.x did include a taskbar across the bottom of the screen.

Windows 2.0 (1987)
Windows 2.0 introduced overlapping windows and slightly improved graphics, while nixing the taskbar of its predecessor. It included a suite of familiar, simple applications (Paint, Terminal, Clock) and a file manager called MS-DOS Executive. Soon after its release, Microsoft ported Word and Excel to Windows for the first time.

Windows 3.0 (1990)
Windows 3.0 introduced a host of new features: Program Manager, Solitaire, support for VGA and virtual memory, and a new "3D" look. Third-party support was stronger than ever, which (when combined with the new features) made Windows 3.0 the first widely used Windows release.

Windows 3.1 (1992)
Windows 3.1 improved significantly upon its predecessor with the inclusion of scalable TrueType fonts, turning Windows into a serious platform for desktop publishing. This version also improved drag-and-drop functionality and OLE (Object Linking and Embedding), and it introduced multimedia support for the first time. And then there's Minesweeper...

Windows for Workgroups 3.1 (1992)
This version of Windows took the standard Windows 3.1 base and added native networking support--especially for LANs, which were gaining popularity in businesses at the time. It included relatively sophisticated networking tools along with the usual complement of simple Windows applications.

Windows NT 3.1 (1993)
Windows NT sprang forth from work on OS/2. It represented an entirely new 32-bit OS (previous Windows editions were graphical shells for MS-DOS) that targeted high-end workstation and server applications. Its first version sported a look and feel similar to that of Windows 3.1, but its underpinnings were completely original.

Windows 95 (1995)
Microsoft continued development of its DOS-based consumer OS line with Windows 95, which didn't require a separate DOS install. It introduced the now famous taskbar and Start menu, along with dozens of other improvements that garnered strong sales and cemented Microsoft's domination of the desktop-OS marketplace.

Windows NT 4.0 (1996)
Windows NT 4.0 integrated Windows 95-like interface improvements with the very stable Windows NT kernel, further advancing the NT line to a point where it became Microsoft's most popular business release for the next few years.

Windows CE 1.0 (1996)
Here we're taking a slight detour from our survey of desktop Windows versions to consider the first pocket version, Windows CE. This completely independent OS appeared alongside the advent of a new generation of handheld computers in the mid- to late 1990s, and it lives on as part of Windows Mobile today.

Windows 98 (1998)
Microsoft’s Windows 98 made the Internet Explorer Web browser an inextricable part of Windows, allowing Web pages to render in Explorer windows or on the desktop. It also added the quick launch toolbar and native USB support, among other features. It proved to be a popular (albeit unstable) OS.

Windows 2000 (2000)
As a member of the NT line, Windows 2000 added Windows 98's Web integration to a stable, relatively secure OS designed for workstation and server environments. It also eliminated the need for many reboots when people installed software or changed system settings. It saw heavy use as a desktop OS.

Windows ME (2000)
As the last gasp of MS-DOS-based Windows, this long-in-the-tooth, highly unstable release was reviled by users and critics alike. Windows ME emphasized multimedia and user-friendliness above all, but reality veered far from Microsoft's intentions and proved how badly we needed XP.

Windows XP (2001)
At long last, Windows XP integrated Microsoft's stable and mature NT OS line with its 9x line of consumer OSs. The result was the company's most popular operating system to date--a release that provided a colorful new interface and security enhancements while retaining backward compatibility with previous versions of Windows.

Windows Server 2003 (2003)
Windows Server 2003 continued the tradition of NT-based server OSs while including interface improvements that had arrived with Windows XP two years earlier.

Windows Vista (2007)
By 2007, Microsoft was feeling the heat from Mac OS X, which made Windows XP look two decades out of date. The folks in Redmond answered with a new OS dubbed Windows Vista, cloaked in the flashy, translucent Aero interface and sporting larger icons, Gadgets, and a new Start menu. Critics weren't impressed.

Windows Server 2008 (2008)
Windows Server 2008 was essentially the server version of Windows Vista--minus the dizzyingly flashy Aero environment. Microsoft had long since learned that fancy, colorful graphics didn't go over well in business circles.

Windows 7 (2009)
With Microsoft getting a shellacking in the press for the Vista debacle, the OS giant rushed a successor to market that addressed many of Vista's problems. The result was Windows 7, which gained critical praise and renewed Windows' viability in the face of threats from both mobile platforms and Apple.

Windows 8 (2012)
Then there was Windows 8, which was nothing short of an epic flop after the stunning success of Windows 7. Windows ditched the Start menu in favor of a full-screen “Start screen” glittering with shifting Live Tiles tied to Windows apps—a new sort of software, delivered through the Windows Store, that consumed your entire screen and didn’t play nice with traditional desktop software. Many core settings were hidden behind invisible “hot corners,” and the Live Tile and desktop portions of the operating system had entirely separate settings in entirely separate areas.

The desktop-to-app dichotomy was jarring at best, and frustrating at worst. Despite Windows 8’s impressive under-the-hood tweaks and convenient cloud features, PCWorld officially recommended that happy Windows 7 users stick with Windows 7.

Windows 8.1 (2013)
Windows 8.1 wasn’t a full-fledged new release, but this free update smoothed over many of Windows 8’s most egregious sins. Users could opt to boot to the desktop rather than the Start screen, and the Start button made its return—though not the Start menu; clicking the Start button merely dropped you onto the Start screen. Microsoft also introduced new tutorials with Windows 8.1 to help users become accustomed to the radical new Windows 8 experience.

Windows 10 (2015)
Finally, an operating system that Windows 7 users can happily upgrade to.

Windows 10 simultaneously re-embraced the desktop experience (by shoving Windows apps into desktop windows, bringing the Start menu back, and introducing virtual desktop support) and pushed Windows into the future by embracing an “operating system as a service” ethos. Rather than shipping a mostly stationary release destined to be replaced by a major upgrade three years from now, Microsoft plans to constantly update Windows 10 with new features and services.

Windows 10 is easily one of the best operating systems Microsoft has ever released, and it’s stuffed with all sorts of new features and productivity-enhancing tricks.

Best Microsoft MCTS Certification, Microsoft MCP Training at certkingdom.com

Tuesday, September 1, 2015

VMware rounds out data center virtualization stack

VMware has added more components to its software-defined data center, updating vCloud, NSX and its OpenStack distribution

VMware has updated its stack of data center virtualization software, rounding out capabilities that allow an organization to run an entire data center operation and related cloud services as a single unified entity.

Among the new additions are components to the vCloud Air suite of software for running cloud services. The company has expanded its network virtualization software to provide more ways of moving a workload across a data center. And it has also released a new version of its OpenStack distribution for running cloud workloads.

VMware's vCloud Air is the company's answer to the success of cloud service providers such as Amazon Web Services. The software lets organizations run their own IT operations as a set of cloud services. It also provides a unified base for multiple cloud service providers to offer vCloud services that interoperate with each other as well as with customers' internal vCloud deployments.

vCloud Air now has a number of new options for storing data, such as vCloud Air Object Storage for unstructured data. The object store features built-in redundancy, eliminating the need to make separate backups, and the data can be accessed from anywhere in the world.

The company also has a new database-as-a-service, called vCloud Air SQL, which provides the ability to store relational data on a pay-as-you-go model. Initially, vCloud Air SQL will be compatible with Microsoft SQL Server, but plans are to make it compatible with other relational databases.

The company has updated its VMware vCloud Air Disaster Recovery Services, which provide a way to ensure that operations continue even if the enterprise's data center goes offline. It now has a new management console for testing, executing and orchestrating disaster recovery plans.

VMware also updated its software for virtualizing network operations. VMware NSX 6.2 allows a virtual machine to be copied across a single data center, or even two different data centers, while retaining its networking and security settings.

NSX 6.2 now can recognize switches through the Open vSwitch Database (OVSDB) protocol, providing new ways for the users of such switches to segment their physical servers into smaller working groups. VMware NSX 6.2 also has a new central command line interface and a set of troubleshooting capabilities, called TraceFlow.

VMware says NSX is now being used by more than 700 customers, more than 100 of whom are running it in production deployments.

VMware vRealize Operations, which provides a single interface to watch the operational health of applications running on VMware, has been updated to include capabilities to find the best set of resources within a data center to place a workload. It also does rebalancing to move workloads around for most efficient use of data center resources.
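VMware doesn't publish the placement algorithm here, but the idea behind finding "the best set of resources" for a workload can be illustrated with a simple best-fit sketch. This is purely hypothetical and is not VMware's actual method; host names and capacities are made up:

```python
def best_fit_placement(workloads, hosts):
    """Assign each workload (CPU demand) to the host with the least
    spare capacity that can still hold it. Illustrative only; this
    is not VMware's actual algorithm."""
    placement = {}
    free = dict(hosts)  # host name -> remaining CPU capacity
    for name, demand in sorted(workloads.items(), key=lambda kv: -kv[1]):
        candidates = [h for h, cap in free.items() if cap >= demand]
        if not candidates:
            raise RuntimeError(f"no host can fit workload {name}")
        host = min(candidates, key=lambda h: free[h])  # tightest fit
        free[host] -= demand
        placement[name] = host
    return placement

# Example: two hosts with 16 and 8 CPU units, three workloads.
print(best_fit_placement({"web": 6, "db": 8, "cache": 2},
                         {"esx1": 16, "esx2": 8}))
# → {'db': 'esx2', 'web': 'esx1', 'cache': 'esx1'}
```

Rebalancing, as described above, would amount to re-running a placement like this periodically and migrating workloads whose best host has changed.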

Also on the management side, the company has updated its logging software, which is now capable of ingesting 15,000 messages per second. The software also now offers new ways to chart and search through operational data.

The newly released VMware Integrated OpenStack 2 is based on the latest release of the open source OpenStack software, which was codenamed Kilo and released in April. The new release has a load-balancing feature as well as the ability to automatically scale up workloads should they require more resources.
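The article doesn't describe how the auto-scaling decides when to act, but the common pattern behind such features (including OpenStack's scaling policies) is a threshold rule with upper and lower bounds. The thresholds and limits below are illustrative assumptions, not OpenStack defaults:

```python
def autoscale(current_instances, cpu_utilization, low=0.25, high=0.75,
              min_instances=1, max_instances=10):
    """Return the new instance count for a threshold-based scaling rule.
    Thresholds and bounds are hypothetical, for illustration only."""
    if cpu_utilization > high and current_instances < max_instances:
        return current_instances + 1   # scale out under load
    if cpu_utilization < low and current_instances > min_instances:
        return current_instances - 1   # scale in when idle
    return current_instances           # within the comfort band

print(autoscale(3, 0.90))  # → 4
print(autoscale(3, 0.10))  # → 2
print(autoscale(3, 0.50))  # → 3
```

A real deployment would feed this kind of rule from telemetry and add a cooldown period so the group doesn't oscillate.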


Wednesday, August 26, 2015

OpenStack is redefining the business model for data solutions

Want proof? Industry leading vendors are snatching up OpenStack-based companies

This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.

IT is headed toward being something more akin to a utility service, transformed by OpenStack’s open standardized cloud architecture, which will improve interoperability and render vendor lock-in a thing of the past.

Initially adopted by smaller ISVs lacking the capital to build private clouds, OpenStack-based cloud solutions are shaping up to be the logical choice for large enterprises as industry leaders, including IBM, Cisco, EMC, HP and Oracle, bet on its value for defining the next-generation model for business computing.

These industry giants have been snatching up OpenStack-based companies over the past couple of years, building up their capabilities around the architecture. IBM and Cisco are some of the latest to close deals, with their respective acquisitions of Blue Box and Piston Cloud Computing. Other relevant acquisitions include EMC’s purchase of Cloudscaling, Oracle’s Nimbula acquisition, and Cisco’s MetaCloud acquisition.

OpenStack’s value for business lies in its capacity for facilitating seamless private-to-public scalability and extensive workload portability, while removing the need to lay out capital to acquire and maintain depreciating commodity hardware.

These companies see that innovations in open clouds will inevitably win out as the premier solution for business data management. The days of commodity hardware and internally managed datacenters are rapidly fading. With cloud services available on a pay-as-you-go basis and infrastructure as a service (IaaS) removing the need to invest in commodity hardware, customers will look at performance, pricing and quality of service as the most important factors in choosing a cloud provider, while maintaining the freedom to easily switch if a better option comes along.

OpenStack’s core strength is interoperability, allowing for seamless scaling across private and public environments, as well as easier transition and connectivity across vendors and networks.

Companies like IBM and Cisco buying up OpenStack-based providers to bolster their own hybrid cloud solutions does not mean the architecture will lose touch with its open-source roots. Open standards and interoperability go hand-in-hand and are at the heart of OpenStack’s unique capabilities.

What we are seeing is the maturation of OpenStack, with major names in business computing positioned to mainstream its adoption by leveraging their financial, IP, R&D resources and brand trust to meet complex demands and ensure confidence from large enterprise organizations transitioning to the cloud.

Cisco listed OpenStack’s capabilities for enhancing automation, availability and scale for hybrid clouds as playing a major role in its new Intercloud Network, while HP is utilizing OpenStack to facilitate its vendor-neutral Helion Network, which will pool the services of Helion partners to offer global workload portability for customers of vendors within their network.

Adoption of OpenStack by these providers signals a major shift for the industry, moving away from dependence on hardware sales and heavy contractual service agreements to a scalable IaaS utilities model, where customers pay for what they need when they need it and expect it to just work. Providers may need to shoulder the burden of maintaining datacenters but will reap the reward of pulling the maximum value from their commodity investments.

Interoperability may seem like a double-edged sword for companies that were built on their own software running exclusively on their own hardware. But the tide is shifting and they realize that closed platforms are losing relevance, while open architecture offers new opportunities to expand their business segments, better serve customers, and thrive with a broader customer base.

Cisco recently added new functionalities for its Intercloud offering, extending virtual machine on-boarding to support Amazon Virtual Private Cloud and extending its zone-based firewall services to include Microsoft Azure. Last year, IBM partnered with software and cloud competitor Microsoft, each offering their respective enterprise software across both Microsoft Azure and the IBM Cloud to help reduce costs and spur development across their platforms for their customers. OpenStack furthers these capabilities across the quickly expanding list of providers adapting the cloud architecture, enabling a vendor-agnostic market for software solutions.

Open standardized cloud architecture is the future of business IT, and OpenStack currently stands as the best and only true solution to make it happen. Its development was spurred by demand from small ISVs who will continue to require its capabilities and promote its development, regardless of whether large enterprise service providers are on board.

However, its inevitable development and obvious potential for enterprise application are forcing the hand of IT heavyweights to conform. Regardless of whether they’d prefer to maintain the status quo for their customers, the progress we’ve seen won’t be undone and the path toward vendor neutrality has been set.


Monday, August 17, 2015

Apple releases final OS X Yosemite non-security update

Signals September launch of El Capitan

Apple on Thursday updated both OS X Yosemite and iOS 8, probably for the last time other than for security fixes.

The updates' appearance Thursday signaled that Apple is just weeks away from introducing their successors, OS X El Capitan and iOS 9. The latter will almost certainly be unveiled Sept. 9, the best-guess date for Apple's debut of new iPhones, while the former may release to customers shortly after.

The Mac's operating system update, pegged as 10.10.5, boasted a short list of reliability and compatibility improvements to Yosemite's baked-in Mail, Photos and QuickTime Player apps. Meanwhile, iOS 8.4.1 focused on bug fixes for the new Apple Music, the subscription-based streaming music service the Cupertino, Calif. company kicked off June 30. Both also featured dozens of vulnerability patches.

Yosemite 10.10.5's arrival pointed to an earlier-rather-than-later release of El Capitan, aka OS X 10.11.

Over the last two cycles, Apple has released the fifth -- and final -- non-security update of its then-current Mac operating system five to six weeks before the successor goes public. Last year, for instance, Mavericks' final non-security update, 10.9.5, made the scene on Sept. 17, or 29 days before Yosemite's launch. In 2013, the fifth non-security update for Mountain Lion reached users 40 days before Mavericks' release.

If Apple hewed to the same schedule, El Capitan would materialize between Sept. 11 and Sept. 22, or about a month earlier than the last two upgrades.
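That window is easy to check with a little date arithmetic. The 10.10.5 release date comes from the article's "Thursday" (Aug. 13, 2015); Yosemite's Oct. 16, 2014 launch date is an assumption used to verify the 29-day figure:

```python
from datetime import date, timedelta

# Sanity-check the article's "29 days": Mavericks 10.9.5 -> Yosemite launch.
mavericks_10_9_5 = date(2014, 9, 17)
yosemite_launch = date(2014, 10, 16)   # assumed Yosemite release date
assert (yosemite_launch - mavericks_10_9_5).days == 29

# Project the same 29-to-40-day gap forward from Yosemite 10.10.5.
yosemite_10_10_5 = date(2015, 8, 13)   # the "Thursday" in the article
earliest = yosemite_10_10_5 + timedelta(days=29)
latest = yosemite_10_10_5 + timedelta(days=40)
print(earliest, latest)  # → 2015-09-11 2015-09-22
```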

There have been other clues that Apple may beat the calendar with El Capitan: Its developer preview build release tempo has accelerated in the last month, with three versions since July 21.

Also on Thursday, Apple issued what was likely the final security update for OS X Mountain Lion, patching 33 vulnerabilities in the three-year-old operating system.

OS X 10.10.5 (Yosemite) and Security Update 2015-006 (Mountain Lion and Mavericks) can be downloaded from Apple's website or installed using the operating system's built-in update service.

iOS 8.4.1 can be downloaded over the air from iPhones, iPads, iPad Minis and iPod Touches, or through iTunes.



Thursday, August 13, 2015

Will your network be ready for the Internet of Things?

Preparing the network to transition seamlessly to the connected environment.

It seems like we can't go a day without hearing about the hottest new connected device or the latest smart [fill in the blank] technology. The much-discussed Internet of Things (IoT), and the varied devices that comprise it, could be considered the biggest trend in the industry today. We are in the midst of an exciting era of technology. And who doesn't love a cool gadget? I'm sure many of us are looking forward to the day when it's common practice for our fridge to tell us we're out of milk, and when we can lock our doors remotely if we accidentally forget. Our kids will definitely be thrilled about the gamified toothbrush that controls the movements of video game characters.

According to Gartner, 4.9 billion connected things will be in use this year, and this number will skyrocket to 25 billion by 2020. But what is most exciting about IoT isn't the devices. It's how the devices work. What is going on behind the scenes that most people won't ever see? The possibilities for IoT are endless, but there are so many variables to consider, and many questions remain unanswered. Perhaps most importantly, how do IT professionals prepare their infrastructures to ensure a seamless transition to this ultra-connected world of things?
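A quick back-of-the-envelope check of Gartner's figures: growing from 4.9 billion connected things in 2015 to 25 billion by 2020 implies a compound annual growth rate of roughly 38 percent:

```python
# Implied compound annual growth rate from Gartner's 2015 and 2020 figures.
start, end, years = 4.9e9, 25e9, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # → 38.5%
```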

For enterprises, things like medical devices, smart utility services, HVAC systems, physical security systems, supply chain logistics, and even vending machines will become increasingly smarter and require more connectivity than ever before. Hung LeHong, research vice president at Gartner, in a 2013 interview with Network World, refers to this as operational technology, "where enterprise assets such as manufacturing equipment, fleet trucks, rail cars, even patient monitoring equipment in hospitals, become networked devices." All of these assets have the potential to become end points on the enterprise network, just like smartphones and tablets, explained LeHong.

The fact that the IoT and what it entails is still so uncertain can be unsettling for IT professionals tasked with building and maintaining technology infrastructures, and for the business leaders that rely on them to move their organizations forward. An article from Computerworld's Patrick Thibodeau puts this concern into perspective: "IoT development today is at an early stage, perhaps at a point similar to 1995, the same year Amazon and eBay went online, followed by Netflix in 1997 and Google in 1998. People understood the trend at the time, but the big picture was still out of focus."

Without knowing the direct impact IoT will have on your day-to-day business operations, how do you effectively plan or prepare? There's no right answer for this yet, but it's important to start having that conversation. By building an understanding now, and analyzing the possibilities and the challenges it could create for your business, you can ensure you're not left in the dust when the IoT truly starts to come to life.

As more organizations and consumers start to use IoT technologies, it's evident that the networks that support them will need to maintain increased bandwidth and data capacity. Is my current infrastructure performing at the levels necessary to meet existing demands, and is it scalable for future growth? In addition to the network, is my layer one cabling infrastructure equipped to handle increased speeds and data flow? Will we be ready to handle the security challenges that will certainly be present with IoT? These are some of the questions IT executives should be asking and discussing with their teams. Map out a plan, knowing it will likely change, and start from square one.
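One concrete way to start answering the bandwidth question is a rough aggregate-throughput estimate. The device count, reporting rate, and payload size below are placeholder assumptions for illustration, not measurements:

```python
# Hypothetical fleet: 10,000 sensors, each reporting every 5 seconds
# with a 512-byte message (payload plus protocol overhead, assumed).
devices = 10_000
msgs_per_sec = 0.2
payload_bytes = 512

aggregate_bps = devices * msgs_per_sec * payload_bytes * 8
print(f"{aggregate_bps / 1e6:.1f} Mbit/s")  # → 8.2 Mbit/s
```

Even crude numbers like these make the scalability conversation concrete: double the fleet or the reporting rate and the network budget doubles with it.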

According to Information Age, "One of the major challenges of the IoT involve the volumes of data created by connected devices, and the connected devices that produce and respond to it. For the enterprise network, all of the same considerations need to be made - primarily reliability, performance and security, and in anticipation of IoT, network administrators should also consider extensibility and scalability."

The successful implementation of IoT technologies in the enterprise can lead to real business outcomes. For example, an IoT-connected network helped a medical center reduce patient-bed turnaround time by 51 minutes. With smart IoT fixtures, a major city was able to reduce energy costs by 62%. What outcomes could IoT create for your organization?

We all aim to provide the best experience for our end users – whether that's patients, students, employees, clients or customers. While widespread adoption of the IoT may not happen overnight, it's hard to deny that it will provide a number of opportunities. The earlier we all consider what these opportunities entail, the more prepared we'll be to take advantage of them when the time is right.


Saturday, August 8, 2015

Ultimate guide to Raspberry Pi operating systems, part 1

Raspberry Pi
Since we published a roundup of 10 Raspberry Pi operating systems, the number of choices has exploded. In this piece I’m including every option I could find (and for you pickers of nits, yes, I’m counting individual Linux distros as individual operating systems, so sue me). If you know of anything I’ve missed or a detail that’s wrong, please drop me a note at feedback@gibbs.com and I’ll update the piece and give you a shout out.

Want to know immediately when the next installment of this guide is published? Sign up and you’ll be the first to know.

Now on with the awesomeness …

Adafruit - Occidentalis v0.3
Occidentalis v0.3 is the result of running Adafruit’s Pi Bootstrapper on a Raspbian installation to build a platform for teaching electronics using the Raspberry Pi. Arguably not a true distro (the previous versions were), it’s included because it’s kind of cool.

Arch Linux ARM
Arch Linux ARM is a fork of Arch Linux built for ARM processors. This distro has a long history of being used in a wide range of products, including the Pogoplug as well as the Raspberry Pi. It’s known for being both fast and stable. There is no default desktop, but the screenshot above shows Openbox as one option.

BerryTerminal
BerryTerminal has not been updated for several years: “BerryTerminal is a minimal Linux distribution designed to turn the Raspberry Pi mini computer into a low-cost thin client. It allows users to login to a central Edubuntu or other [Linux Terminal Server Project] server, and run applications on the central server.”

DarkELEC
DarkELEC: “None of the currently available solutions do a perfect job running XBMC on the Pi, however OpenELEC comes by far the closest, in spite of its locked down nature. [The DarkELEC] fork aims to remedy the very few flaws in its implementation and to focus 100% on the Pi, while also sticking to the upstream and incorporating its updates.”

Debian 8 (“Jessie”)
Debian 8 (“Jessie”) is the latest and greatest version of Debian and Sjoerd Simons of Collabora appears to be the first person to get it running on the Raspberry Pi 2 back in February this year. As of this writing, there isn’t an “official” release of Debian 8 for the Raspberry Pi so, if you go down this path, expect a few bumps (and complexities) on the way.

DietPi
DietPi: “At its core, DietPi is the go to image for a minimal Raspbian/Debian Server install. We've stripped down and removed everything from the official Raspbian image to give us a bare minimal Raspbian server image that we call DietPi-Core.” DietPi is optimized for all Pi models and has a 120MB compressed image, fits on a 1GB or greater SD card, has only 11 running processes after boot, requires just 16MB of memory after boot, and, “unlike most Raspbian minimal images, ours includes full Wifi support.” An LXDE desktop is optional.

Fedora Remix (Pidora)
Fedora Remix (Pidora): Pidora is a Fedora Remix, a customized version of the Unix-like Fedora system, running on the ARM-based Raspberry Pi single board computer, and it moves faster than a politician taking a donation. First released in 2003, Fedora has a long history and is noted for its stability. Given that there are thousands of packages available in the Pidora repository, you’ll be able to find pretty much any functionality or service you need for your project.

GeeXboX ARM
GeeXboX ARM is a free and Open Source Media Center Linux distribution for embedded devices and desktop computers. GeeXboX is not an application, it’s a full-featured OS that can be booted from a LiveCD, from a USB key, an SD/MMC card or installed on an HDD. The core media delivery application is XBMC Media Center 12.2 “Frodo”.

IPFire
IPFire is a specialized version of Linux that operates as a firewall. Designed to be highly secure and fast, it’s managed through a Web-based interface.

Kali Linux
Kali Linux is one of my favorite flavors of Linux because of its excellent collection of penetration testing and diagnostic tools (plus it has a great logo). Being able to run this bad boy on a Raspberry Pi means you can have your own custom pen tester in your pocket.

Lessbian 8.1 (“Raptor”)
Lessbian 8.1 (“Raptor”): A stripped-down, bare-minimal Debian “Jessie”. The goal of Lessbian is to “provide a small and fast jessie image for servers and wifi security testing without the madness of system.” This release is described as “A bootable wifi system optimized for throughput, performance, and encryption” and it’s a great platform for running a Tor relay.

Minepeon
Minepeon: There’s gold in them thar' Bitcoin mines! You can get it out using the Minepeon operating system, based on Linux and running on a Raspberry Pi. Of course you’re going to need a lot of machines to get your digital “quan” given how much more “work” is needed to mine Bitcoin today, but given the price of the Raspberry Pi you won’t go broke assembling a roomful of miners. Show me the digital money!
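To see why so much more “work” is needed today, consider a toy sketch of the proof-of-work search miners perform (this is an illustration of the idea, not Bitcoin’s actual protocol, which compares the double-SHA-256 digest of a real block header against a 256-bit target):

```python
import hashlib

def mine(header: bytes, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce whose double-SHA-256 digest
    begins with `difficulty` zero hex digits. Bitcoin's real check
    compares the digest against a numeric target, but the brute-force
    search loop is the same idea."""
    target = "0" * difficulty
    nonce = 0
    while True:
        payload = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1
```

Each additional hex digit of difficulty multiplies the expected number of hashes by 16, which is why a roomful of Raspberry Pis still mines very slowly compared to dedicated ASIC hardware.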

Moebius
Moebius: A minimal ARM HF distribution that needs just 20MB of RAM for the entire operating system and fits on a 128MB SD card. Version 2 is the current stable version. An LXDE desktop is optional.

nOS
nOS: Based on Ubuntu and KDE, this distro has been abandoned: “Development of nOS has stopped, existing versions will continue to work and receive updates from the package manufacturers until April 2019. The only things that will no longer be issued are updates for nOS specific software and the monthly image releases (they haven't been going for a while anyway).”

OpenELEC
OpenELEC, an acronym for Open Embedded Linux Entertainment Center, is a Linux-based OS that runs the popular XBMC open source digital media center software. The first release of OpenELEC was in 2013 and, according to the OpenELEC Wiki, “Installing OpenELEC for Raspberry Pi from a Linux computer is a very simple process and whether you're new to Linux or a hardened *NIX user, you shouldn't have any problems.”

OpenWrt for Raspberry Pi
OpenWrt for Raspberry Pi is “a Linux distribution for embedded devices.” Systems based on OpenWrt are most often used as routers and, with something like 3,500 optional add-on packages, its features can be tailored in pretty much any way imaginable. Want an ultraportable, incredibly tiny wireless router that can be run anywhere? OpenWrt on a Raspberry Pi running off a battery with a USB WiFi dongle can only be described as “epic.”

Raspberry Digital Signage
Raspberry Digital Signage is based on Debian Linux running on a Raspberry Pi and is used in Web kiosks and digital signage (including digital photo frames). A really well thought out system, Digital Signage is designed to be easily administered while being as “hacker-proof” as possible.

Raspberry Pi Thin Client
Raspberry Pi Thin Client: Creates a very low-cost thin client that supports Microsoft RDC, Citrix ICA, VMware View, OpenNX and SPICE.

Raspbian Pisces R3
Raspbian Pisces R3: Another unofficial distro, Raspbian Pisces, created by Mike Thompson, is an SD image of Raspbian that provides a minimal Debian installation with the LXDE desktop.

Raspbian Server Edition
Raspbian Server Edition: A stripped-down version of Raspbian with some extra packages that boots to a command prompt. It is an excellent tool to use for testing hard float compilations and running benchmarks.

Raspbmc
Raspbmc: Yet another distro that is designed for the popular XBMC open source digital media center, Raspbmc is lightweight and robust.

RaspEX (Edition 150706)
RaspEX (Edition 150706): RaspEX is a full Linux desktop system with LXDE and many other useful programs pre-installed. Chromium is used as the Web browser and Synaptic as the package manager. RaspEX uses Ubuntu’s software repositories so you can install thousands of extra packages if you want.

Raspbian Debian 7.8 (“Wheezy”)
Raspbian Debian 7.8 (“Wheezy”): The Raspbian Debian “Wheezy” distro for the Raspberry Pi is a fully functional Debian Wheezy installation containing the LXDE desktop, the Epiphany browser, Wolfram Mathematica, and Scratch. It supports the Raspberry Pi and the Raspberry Pi 2 and is the current Debian version supported by the Raspberry Pi Foundation.

Red Sleeve Linux
Red Sleeve Linux: “RedSleeve Linux is a 3rd party ARM port of a Linux distribution of a Prominent North American Enterprise Linux Vendor (PNAELV). They object to being referred to by name in the context of clones and ports of their distribution, but if you are aware of CentOS and Scientific Linux, you can probably guess what RedSleeve is based on. RedSleeve is different from CentOS and Scientific Linux in that it isn’t a mere clone of the upstream distribution it is based on – it is a port to a new platform, since the upstream distribution does not include a version for ARM.”

RISC OS Pi
RISC OS Pi: Originally developed and released in 1987 by UK-based Acorn Computers Ltd., RISC OS is, as the RISC OS Web site claims, “its own thing – a very specialized ARM-based operating system… if you’ve not used it before, you will find it doesn’t behave quite the same way as anything else.” RISC OS Pi has been available on the Raspberry Pi since 2012.

SliTaz GNU/Linux Raspberry Pi
The SliTaz GNU/Linux Raspberry Pi distribution is “a small operating system for a small computer! The goal is to provide a fast, minimal footprint and optimized distro for the Raspberry Pi. You can setup a wide range of system types, from servers to desktops and learning platforms.”

Windows 10 IoT Core Edition
Windows 10 IoT Core Edition’s GUI stack is limited to Microsoft’s Universal App Platform, so there’s no Windows desktop or even a command prompt. With PowerShell remoting you get a PowerShell terminal from which you can run Windows commands and see the output of native Win32 apps. The current preview release has no support for Wi-Fi or Bluetooth.

outro
In our next installment of Network World’s Ultimate Guide to Raspberry Pi Operating Systems we’ll be covering a whole new collection: Bodhi, Commodore Pi, FreeBSD, Gentoo, ha-pi, I2Pberry, Kano OS, MINIBIAN, motionPie, Nard, NetBSD, OSMC, PiBang Linux, PiBox, PiMAME, PiParted, Plan 9, PwnPi, RasPlex, Slackware ARM, SlaXBMCRPi, slrpi, Tiny Core Linux, Ubuntu, Volumio, XBian, and more.

Want to know immediately when the next installment is published? Sign up and you’ll be the first to know.
Want more Pi? Check out 10 Reasons why the Raspberry Pi 2 Model B is a killer product and MIPS Creator CI20: Sort of a challenge to the Raspberry Pi 2 Model B. What could be the next RPi? Check out Endless: A computer the rest of the world can afford and How low can we go? Introducing the $9 Linux computer!

Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com

Monday, July 27, 2015

Slide show: Best tools for email encryption

 The products we reviewed show good signs that encryption has finally come of age.

Email encryption

Recipients of encrypted email once had to share the same system as the sender. Today, many products have a “zero knowledge encryption” feature, which means you can send an encrypted message to someone who isn’t on your chosen encryption service. Today’s products also make sending and receiving messages easier, with advances like Outlook or browser plug-ins that give you nearly one-button encryption. And the products we reviewed offer controls such as setting expiration dates, revoking unread messages, and preventing messages from being forwarded. (Read the full review.)
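The delivery controls mentioned above can be pictured as policy metadata attached to each message and checked before display. This is purely an illustrative sketch with invented names, not how any of the reviewed products actually implement it (real services enforce these rules server-side):

```python
import time
from typing import Optional

class SecureMessage:
    """Toy model of a message carrying delivery controls:
    an expiration time, a revocation flag, and a no-forward flag."""

    def __init__(self, body: str, expires_at: float, forwardable: bool = False):
        self.body = body
        self.expires_at = expires_at
        self.forwardable = forwardable
        self.revoked = False

    def read(self, now: Optional[float] = None) -> str:
        """Return the body only if the policy still allows it."""
        now = time.time() if now is None else now
        if self.revoked:
            raise PermissionError("message revoked by sender")
        if now > self.expires_at:
            raise PermissionError("message expired")
        return self.body
```

The sender’s revoke action simply flips a flag that the recipient’s next read attempt honors, which is why revocation only works for messages that have not yet been opened.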

AppRiver CipherPost Pro
Basically, you layer CipherPost Pro on top of your existing email infrastructure via a plug-in. It has mobile apps for iOS, Android, Windows phones and BlackBerry 10s that offer the ability to send and receive encrypted messages, but not attachments. To correspond with people outside your email domain, send a message with a Web link, which recipients click on and register with the system. The heart of the product is a special “Delivery Slip” sidebar that appears on the page as you are composing your message. This is where controls are located to enable message-tracking options, and to add an extra security layer. These are all nice features. If you have to send large attachments, then CipherPost should be on your short list.

DataMotion SecureMail
DataMotion has a very mature offering that makes use of a gateway to process mail. Getting it set up will require a couple of hours, and most of that is in understanding the many mail processing rules. Users need to append a [SECURE] tag in the subject line to trigger the encryption process. You can also set up rules that will encrypt messages containing sensitive information. DataMotion doesn’t have any limits on the size of the user’s inbox. However, it does place a limit of up to 500MB worth of messages that can be sent in a user’s Track Sent Folder. Features include the ability to see exactly when your recipient opened the message and the attachment.
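A mail-processing rule of the kind described above boils down to a predicate over each outgoing message. The sketch below is a hypothetical illustration (not DataMotion’s actual rule engine): encrypt when the subject carries the [SECURE] tag, or when the body matches a sensitive-data pattern such as a U.S. Social Security number.

```python
import re

# Hypothetical sensitive-data pattern: a U.S. SSN in NNN-NN-NNNN form.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def should_encrypt(subject: str, body: str) -> bool:
    """Decide whether an outgoing message should be encrypted."""
    if "[SECURE]" in subject.upper():  # explicit tag from the sender
        return True
    return bool(SSN.search(body))      # content-based rule
```

Real gateways chain many such rules (keywords, regulated-data dictionaries, recipient domains), but each one is essentially this kind of match-then-act check.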

HP/Voltage SecureMail
Voltage was recently purchased by HP and rebranded. The technology is an email gateway, software that sits on either a Linux or Windows server or in the cloud and inserts the encryption process between mail client and server. There are numerous add-on modules that come as part of this ecosystem. You administer the gateway via a Web browser, and there are dozens of options to set, similar to the DataMotion product. Voltage has a zero download client, as it calls its software that can be used to exchange messages with someone not on their system. While parts of Voltage are showing their age, the overall experience is quite capable, and the add-ons for mobile and Outlook/Office are quite nifty.

Hushmail for Business
Hushmail is the easiest of the products we tested to set up and use. There is no software to install on the client side; all mail is accessed in one of two ways. The first is via a secure webmail client that connects to the Hush servers; this is the only way you can send encrypted email to someone who isn’t part of the Hush network. The second method is for users fond of their existing email clients who are communicating with other Hush users. In this situation there is literally nothing for them to do: they use their existing client to send an encrypted message. Between the client and the Hush server, mail is encrypted using either SSL or TLS. Once it arrives on the server, it is then encrypted via PGP. Hush has a 20MB limit on attachment size, and this could be a deal breaker for some businesses.

ProtonMail

Proton is one of the newer encrypted email services that have come along post-Snowden, with an emphasis on keeping your emails private. It makes a point of this by being based in Switzerland. However, the company is still building its product out and as a result it has a very simple Web UI for its client and admin tool. Proton uses double password protection. The first password is used to authenticate the user. After that, encrypted data is sent to the user. The second password is a decryption key used to decrypt data on your device. Proton never sees that latter key, so it does not have access to the decrypted data. On top of all this encryption, Proton also employs SSL connections so your data is encrypted across the Internet to and from its servers. There is no option for on-premises servers. While Proton is not really suitable for an enterprise deployment, it shows what the latest encryption products can deliver.

Tutao Tutanota
Of the products tested, Tutanota is the least reliable and least feature-laden. Tutanota uses a variety of clients to set up encrypted mail connections across your existing email infrastructure. There are no changes to your servers and you can continue using Outlook for sending unencrypted communications. We had some trouble with the installation, mainly because the installer’s instructions are in German and it installs the German version of the .Net Framework. Once installed, though, the menus and commands are in English. Tutanota is based in Germany, which could be important for customers concerned about American email privacy. One of its distinguishing features is that its zero knowledge encryption process hides the message subject. Most of its competitors still send this information in the clear.

Virtru Pro
Virtru has a nice balance of plug-ins and mobile apps that support its easy-to-use encryption operations across a variety of email circumstances. If you have installed the necessary plug-in, when you want to send something, there is a small toggle switch on the top of the compose screen. Turning that on will bring up a “send secure” button to encrypt your message. There are tool tips that appear as you hover over the various options with your mouse, a nice touch. These include the ability to add an unencrypted introductory message that will introduce your recipient to the context of the message that you are sending, and why you want to encrypt the remainder of the message. You can also set when your message will expire or disable any forwarding for additional security.

Virtru also supports zero knowledge encryption, although it adds a separate activation step when a new user receives the first encrypted message.



Tuesday, July 21, 2015

Next-generation endpoint protection not as easy as it sounds

Endpoint protection technology is making strides and may soon be touted as anti-virus
Rather than looking for signatures of known malware as traditional anti-virus software does, next-generation endpoint protection platforms analyze processes, changes and connections in order to spot activity that indicates foul play. While that approach is better at catching zero-day exploits, issues remain.

For instance, intelligence about what devices are doing can be gathered with or without client software. So businesses are faced with the choice of either going without a client and gathering less detailed threat information or collecting a wealth of detail but facing the deployment, management and updating issues that come with installing agents.

Then comes the choice of how to tease out evidence that incursions are unfolding and to do so without being overwhelmed by the flood of data being collected. Once attacks are discovered, businesses have to figure out how to shut them down as quickly as possible.

Vendors trying to deal with these problems include those with broad product lines such as Cisco and EMC, established security vendors such as Bit9+Carbon Black, FireEye, ForeScout, Guidance Software and Trend Micro, and newer companies focused on endpoint security such as Cylance, Light Cyber, Outlier Security and Tanium. That’s just a minute sampling; the field is crowded, and the competitors are coming up with varying ways to handle these issues.

The value of endpoint protection platforms is that they can identify specific attacks and speed the response to them once they are detected. They do this by gathering information about communications that go on among endpoints and other devices on the network, as well as changes made to the endpoint itself that may indicate compromise. The database of this endpoint telemetry then becomes a forensic tool for investigating attacks, mapping how they unfolded, discovering what devices need remediation and perhaps predicting what threat might arise next.
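The forensic use of that telemetry database can be sketched in miniature. The toy store below (an illustration, not any vendor’s product) records host-to-host connections; walking the connection graph outward from a compromised host suggests which machines an attacker could have reached and may need remediation:

```python
from collections import defaultdict

class TelemetryStore:
    """Toy endpoint-telemetry store: who connected to whom."""

    def __init__(self):
        self.connections = defaultdict(set)  # host -> peers it contacted

    def record_connection(self, src: str, dst: str):
        self.connections[src].add(dst)

    def reachable_from(self, start: str) -> set:
        """Hosts transitively contacted from `start` -- candidates
        for investigation if `start` turns out to be compromised."""
        seen, stack = set(), [start]
        while stack:
            host = stack.pop()
            for peer in self.connections.get(host, ()):
                if peer not in seen:
                    seen.add(peer)
                    stack.append(peer)
        return seen
```

Real platforms enrich each edge with process names, users and timestamps, but the mapping-how-the-attack-unfolded query is fundamentally this kind of graph traversal.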

Agent or not?
The main aversion to agents in general is that they are one more piece of software to deploy, manage and update. In the case of next-gen endpoint protection, they do provide vast amounts of otherwise uncollectable data about endpoints, but that can also be a downside.

Endpoint agents gather so much information that it may be difficult to sort out the attacks from the background noise, so it’s important that the agents are backed by an analysis engine that can handle the volume of data being thrown at it, says Gartner analyst Lawrence Pingree. The amount of data generated varies depending on the agent and the type of endpoint.

Without an agent, endpoint protection platforms can still gather valuable data about what machines are doing by tapping into switch and router data and monitoring Windows Network Services and Windows Management Instrumentation. This information can include who’s logged in to the machine, what the user does, patch levels, whether other security agents are running, whether USB devices are attached, what processes are running, etc.

Analysis can reveal whether devices are creating connections outside what they would be expected to make, a possible sign of lateral movement by attackers seeking ways to victimize other machines and escalate privileges.
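A hedged sketch of that baseline idea: flag any connection a host has never made before, the kind of deviation the article describes as possible lateral movement. Real products weigh far more signals than this, so treat the function as a conceptual illustration only.

```python
def unusual_connections(baseline: dict, observed: dict) -> dict:
    """baseline/observed map host -> set of peers contacted.
    Return, per host, the peers seen now but absent from the baseline."""
    flagged = {}
    for host, peers in observed.items():
        unexpected = peers - baseline.get(host, set())
        if unexpected:
            flagged[host] = unexpected
    return flagged
```

A workstation that suddenly contacts a domain controller it has never touched would surface here, while routine traffic to its usual mail and file servers would not.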

Agents can mean one more management console, which means more complexity and potentially more cost, says Randy Abrams, a research director at NSS Labs who researches next-gen EPP platforms. “At some point that’s going to be a difference in head count,” he says, with more staff being required to handle all the consoles and that translates into more cost.

It’s also a matter of compatibility, says Rob Ayoub, also a research director at NSS Labs. “How do you insure any two agents - of McAfee and Bromium or Cylance – work together and who do you call if they don’t?”

Security of the management and administration of these platforms should be reviewed as well, Pingree says, to minimize insider threat to the platforms themselves. Businesses should look for EPP with tools that allow different levels of access for IT staff performing different roles. It would be useful, for example, to authorize limited access for admins while incident-response engineers get greater access, he says.

Analysis engines
Analysis is essential but also complex, so much so that it can be a standalone service such as the one offered by Red Canary. Rather than gather endpoint data with its own agents, it employs sensors provided by Bit9+Carbon Black. Red Canary supplements that data with threat intelligence gathered from a variety of other commercial security firms, analyzes it all and generates alerts about intrusions it finds on customers’ networks.

The analysis engine flags potential trouble, but human analysts check out flagged events to verify they are real threats. This helps corporate security analysts by cutting down on the number of alerts they have to respond to.

Startup Barkly says it’s working on an endpoint agent that locally analyzes what each endpoint is up to and automatically blocks malicious activity. It also notifies admins about actions it takes.

These engines need to be tied into larger threat-intelligence sources that characterize attacks by how they unfold, revealing activity that leads to a breach without using code that can be tagged as malware, says Abrams.

Most of what is known about endpoint detection and response tools is what the people who make them say they can do. So, if possible, businesses should run trials to determine features and effectiveness first-hand before buying. “The downside of emerging technologies is there’s very little on the testing side,” Pingree says.

Remediation
Endpoint detection tools gather an enormous amount of data that can be used tactically to stop attacks but also to support forensic investigations into how incursions progressed to the point of becoming exploits. This can help identify what devices need remediation, and some vendors are looking to automate that process.

For example, Triumfant offers Resolution Manager, which can restore endpoints to known good states after detecting malicious activity. Other vendors offer remediation features or say they are working on them; the trend is toward using the same platforms to fix the problems they find.

The problem businesses face is that endpoints remain vulnerable despite the efforts of traditional endpoint security, which has evolved into suites of anti-virus, anti-malware, intrusion detection, intrusion prevention and more. Progressively addressing the problem this way has created another one.

“They have actually just added more products to the endpoint portfolio, thus taking us full circle back to bloated end points,” says Larry Whiteside, the CSO for the Lower Colorado River Authority. “Luckily, memory and disk speed (SSD) have kept that bulk from crippling endpoint performance.”

As a result he is looking at next-generation endpoint protection from SentinelOne. Security based on what endpoints are doing as opposed to seeking signatures of known malicious behavior is an improvement over traditional endpoint protection, he says. “Not saying signatures are totally bad, but that being a primary or only decision point is horrible. Therefore, adding behavior based detection capabilities adds value.”

So much value that he is more concerned about that than he is about whether there is a hard return on investment. “The reality is that I am more concerned about detection than I am ROI, so I may not even perform that analysis. I can say that getting into a next-gen at the right stage can be beneficial to an organization,” he says.

Anti-virus replacement?
So far vendors of next-generation endpoint protection have steered clear of claiming their products can replace anti-virus software, despite impressive test results. But that could be changing. Within a year, regulatory hurdles that these vendors face may disappear, says George Kurtz, CEO of CrowdStrike.

Within a year rules that require use of anti-virus in order to pass compliance tests will allow next-generation endpoint protection as well, he says. “That’s really our goal,” he says. “From the beginning we thought we could do that.”

He says everyone is focused on malware, but that represents just 40% of attacks. The rest he calls “malware-less intrusions” such as insider theft where attackers with credentials steal information without use of malware.

Until regulations are rewritten, it’s important for regulated businesses to meet the anti-virus requirement, Abrams says, even though other platforms may offer better protection. “In some cases that’s actually more important than the ability to protect because you won’t be protected from legal liabilities.”

Meanwhile, the need to run overlapping anti-virus and next-gen endpoint protection means larger enterprises, rather than smaller businesses with fewer resources, are the likely customers for now, he says. But even for smaller businesses the cost may be worth it.

“What do they have to lose and how much does it cost to lose this information vs how much does it cost to protect it?” Abrams says.
