
Intel confirms remote code execution hole in all Intel CPUs since 2008 (Security)
According to security researchers, media, and now Intel itself, a security hole allowing remote code execution (RCE) has been present in Intel CPUs since 2008. The exploit targets Intel Active Management Technology (AMT), Intel Standard Manageability (ISM), and Intel Small Business Technology (SBT). Those features are present in nearly every computer with an Intel CPU from the last ten years, and the hole allowed remote execution of code on the machine. Charlie Demerjian at SemiAccurate first reported the news earlier today: [quote][color=Teal]“The short version is that every Intel platform with AMT, ISM, and SBT from Nehalem in 2008 to Kaby Lake in 2017 has a remotely exploitable security hole in the ME (Management Engine) not CPU firmware.”[/color][/quote]
Intel confirms escalation of privilege vulnerability
SemiAccurate had known about the exploit for over five years before releasing the news earlier today. SemiAccurate, along with many others such as Richard Stallman, had been warning that something like this could happen and likely had happened already. Today, May 1st, 2017, we have confirmation. Intel confirmed the issue in a security advisory later in the day, thanking Maksim Malyutin from Embedi for reporting it. Intel has released a firmware fix, which will be distributed as soon as possible. The CPU maker admitted that the vulnerability allowed “an unprivileged attacker to gain control of the manageability features provided by these products.” The manageability features allow all sorts of shenanigans: Intel confirmed to SemiAccurate that AMT can be used to “bare metal image a dead machine over a cellular connection.” Needless to say, if they can do that, they can do anything. The security advisory also states that this vulnerability does not exist on consumer PCs, only on business machines; that claim has not yet been independently verified. If your computer doesn’t have vPro, then it doesn’t have AMT and isn’t vulnerable.
It’s also worth noting that Apple Macs do not use Intel AMT, and thus were not vulnerable.
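Administrators who want a first hint of whether a machine on their network exposes AMT can probe its management ports. Below is a minimal sketch in Python, a generic TCP probe rather than Intel's own detection tooling; the port list comes from public AMT documentation, and an open port only flags a host worth investigating, not a confirmed vulnerability:

```python
import socket

# TCP ports conventionally used by Intel AMT's management services:
# 16992/16993 (HTTP/HTTPS web UI) and 623/664 (ASF / secure management).
AMT_PORTS = [623, 664, 16992, 16993]

def probe_amt_ports(host: str, timeout: float = 1.0) -> list:
    """Return the AMT-associated ports on `host` that accept a TCP connection."""
    open_ports = []
    for port in AMT_PORTS:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, filtered, or unreachable
    return open_ports

if __name__ == "__main__":
    # Probes localhost by default; point this at a host you administer.
    print("AMT-style ports open:", probe_amt_ports("127.0.0.1", timeout=0.5) or "none")
```

A machine that answers on 16992 or 16993 is advertising the AMT web interface and should get the firmware fix as soon as its vendor ships it.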

A New Physics Theory of Life (Misc)
Why does life exist? Popular hypotheses credit a primordial soup, a bolt of lightning and a colossal stroke of luck. But if a provocative new theory is correct, luck may have little to do with it. Instead, according to the physicist proposing the idea, the origin and subsequent evolution of life follow from the fundamental laws of nature and “should be as unsurprising as rocks rolling downhill.” From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat. Jeremy England, a 31-year-old assistant professor at the Massachusetts Institute of Technology, has derived a mathematical formula that he believes explains this capacity. The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. This could mean that under certain conditions, matter inexorably acquires the key physical attribute associated with life. “You start with a random clump of atoms, and if you shine light on it for long enough, it should not be so surprising that you get a plant,” England said. England’s theory is meant to underlie, rather than replace, Darwin’s theory of evolution by natural selection, which provides a powerful description of life at the level of genes and populations. “I am certainly not saying that Darwinian ideas are wrong,” he explained. 
“On the contrary, I am just saying that from the perspective of the physics, you might call Darwinian evolution a special case of a more general phenomenon.” His idea, detailed in a recent paper and further elaborated in a talk he is delivering at universities around the world, has sparked controversy among his colleagues, who see it as either tenuous or a potential breakthrough, or both. England has taken “a very brave and very important step,” said Alexander Grosberg, a professor of physics at New York University who has followed England’s work since its early stages. The “big hope” is that he has identified the underlying physical principle driving the origin and evolution of life, Grosberg said. “Jeremy is just about the brightest young scientist I ever came across,” said Attila Szabo, a biophysicist in the Laboratory of Chemical Physics at the National Institutes of Health who corresponded with England about his theory after meeting him at a conference. “I was struck by the originality of the ideas.” Others, such as Eugene Shakhnovich, a professor of chemistry, chemical biology and biophysics at Harvard University, are not convinced. “Jeremy’s ideas are interesting and potentially promising, but at this point are extremely speculative, especially as applied to life phenomena,” Shakhnovich said. England’s theoretical results are generally considered valid. It is his interpretation — that his formula represents the driving force behind a class of phenomena in nature that includes life — that remains unproven. But already, there are ideas about how to test that interpretation in the lab. “He’s trying something radically different,” said Mara Prentiss, a professor of physics at Harvard who is contemplating such an experiment after learning about England’s work. “As an organizing lens, I think he has a fabulous idea. 
Right or wrong, it’s going to be very much worth the investigation.” At the heart of England’s idea is the second law of thermodynamics, also known as the law of increasing entropy or the “arrow of time.” Hot things cool down, gas diffuses through air, eggs scramble but never spontaneously unscramble; in short, energy tends to disperse or spread out as time progresses. Entropy is a measure of this tendency, quantifying how dispersed the energy is among the particles in a sys[...]
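For reference, the textbook quantities behind this passage: Boltzmann's entropy counts a system's microstates, and the second law says total entropy never decreases. (These are the standard statements, not England's own formula, which bounds dissipation for driven, self-replicating matter and is more involved than shown here.)

```latex
S = k_B \ln \Omega, \qquad \Delta S_{\text{total}} \geq 0
```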

“Most serious” Linux privilege-escalation bug ever is under active exploit (updated) (Security)
[color=Red][b]“Most serious” Linux privilege-escalation bug ever is under active exploit (updated)[/b][/color] A serious vulnerability that has been present for nine years in virtually all versions of the Linux operating system is under active exploit, according to researchers who are advising users to install a patch as soon as possible. While CVE-2016-5195, as the bug is cataloged, amounts to a mere privilege-escalation vulnerability rather than a more serious code-execution vulnerability, there are several reasons many researchers are taking it extremely seriously. For one thing, it's not hard to develop exploits that work reliably. For another, the flaw is located in a section of the Linux kernel that's a part of virtually every distribution of the open-source OS released for almost a decade. What's more, researchers have discovered attack code that indicates the vulnerability is being actively and maliciously exploited in the wild. "It's probably the most serious Linux local privilege escalation ever," Dan Rosenberg, a senior researcher at Azimuth Security, told Ars. "The nature of the vulnerability lends itself to extremely reliable exploitation. This vulnerability has been present for nine years, which is an extremely long period of time." The underlying bug was patched this week by the maintainers of the official Linux kernel. Downstream distributors are in the process of releasing updates that incorporate the fix. Red Hat has classified the vulnerability as "important." As their names describe, privilege-escalation or privilege-elevation vulnerabilities allow attackers with only limited access to a targeted computer to gain much greater control. The exploits can be used against Web hosting providers that provide shell access, so that one customer can attack other customers or even service administrators. Privilege-escalation exploits can also be combined with attacks that target other vulnerabilities. 
A SQL injection weakness in a website, for instance, often allows attackers to run malicious code only as an untrusted user. Combined with an escalation exploit, however, such attacks can often achieve highly coveted root status. The in-the-wild attacks exploiting this specific vulnerability were found by Linux developer Phil Oester, according to an informational site dedicated to the vulnerability. It says Oester found the exploit using an HTTP packet capture, but the site doesn't elaborate. Update: In e-mails received about nine hours after this post went live, Oester wrote: [quote]Any user can become root in < 5 seconds in my testing, very reliably. Scary stuff. The vulnerability is easiest exploited with local access to a system such as shell accounts. Less trivially, any web server/application vulnerability which allows the attacker to upload a file to the impacted system and execute it also works. The particular exploit which was uploaded to my system was compiled with GCC 4.8.5 released 20150623, though this should not imply that the vulnerability was not available earlier than that date given its longevity. As to who is being targeted, anyone running Linux on a web facing server is vulnerable. For the past few years, I have been capturing all inbound traffic to my webservers for forensic analysis. This practice has proved invaluable on numerous occasions, and I would recommend it to all admins. In this case, I was able to extract the uploaded binary from those captures to analyze its behavior, and escalate to the appropriate Linux kernel maintainers.[/quote] The vulnerability, a variety known as a race condition, was found in the way Linux memory handles a duplication technique called copy on write. Untrusted users can exploit it to gain highly privileged write-access rights to memory mappings that would normally be read-only. More technical details about the vulnerability and exploit are available here, here, and here. 
Using the acronym derived from copy on write, some researchers have dubbed the vuln[...]
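The copy-on-write mechanism at the center of the bug is easy to observe from user space. The Python sketch below is an illustration of COW semantics via a private mapping, not the exploit itself: writes to an ACCESS_COPY mapping land in a private copy of the page, and the backing file is untouched. Dirty COW was a race in the kernel code that performs exactly this private duplication, letting an attacker's write reach the read-only original instead of the copy.

```python
import mmap
import os
import tempfile

def cow_demo() -> tuple:
    """Write through a copy-on-write mapping; return (in-memory, on-disk) contents."""
    fd, path = tempfile.mkstemp()
    try:
        os.write(fd, b"original data")
        # ACCESS_COPY gives this process a private, copy-on-write view:
        # writes are visible in memory but never reach the file.
        with mmap.mmap(fd, 13, access=mmap.ACCESS_COPY) as mem:
            mem[0:8] = b"modified"        # hits the private copy only
            in_memory = bytes(mem[:])
        os.lseek(fd, 0, os.SEEK_SET)
        on_disk = os.read(fd, 13)         # the file still holds the original
        return in_memory, on_disk
    finally:
        os.close(fd)
        os.unlink(path)
```

Here `cow_demo()` returns `b"modified data"` for the mapping but `b"original data"` for the file; CVE-2016-5195 broke precisely that guarantee.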

Open Source Licenses (Security)
Licenses

Open source licenses grant permission to everyone to use, modify, and share licensed software for any purpose, subject to conditions preserving the provenance and openness of the software. The following licenses are arranged from the one with the strongest of these conditions (GNU AGPLv3) to the one with no conditions (Unlicense).

GNU AGPLv3
Permissions of this strongest copyleft license are conditioned on making available complete source code of licensed works and modifications, which include larger works using a licensed work, under the same license. Copyright and license notices must be preserved. Contributors provide an express grant of patent rights. When a modified version is used to provide a service over a network, the complete source code of the modified version must be made available.

GNU GPLv3
Permissions of this strong copyleft license are conditioned on making available complete source code of licensed works and modifications, which include larger works using a licensed work, under the same license. Copyright and license notices must be preserved. Contributors provide an express grant of patent rights.

GNU LGPLv3
Permissions of this copyleft license are conditioned on making available complete source code of licensed works and modifications under the same license or the GNU GPLv3. Copyright and license notices must be preserved. Contributors provide an express grant of patent rights. However, a larger work using the licensed work through interfaces provided by the licensed work may be distributed under different terms and without source code for the larger work.

Mozilla Public License 2.0
Permissions of this weak copyleft license are conditioned on making available source code of licensed files and modifications of those files under the same license (or in certain cases, one of the GNU licenses). Copyright and license notices must be preserved. Contributors provide an express grant of patent rights. However, a larger work using the licensed work may be distributed under different terms and without source code for files added in the larger work.

Apache License 2.0
A permissive license whose main conditions require preservation of copyright and license notices. Contributors provide an express grant of patent rights. Licensed works, modifications, and larger works may be distributed under different terms and without source code.

MIT License
A short and simple permissive license with conditions only requiring preservation of copyright and license notices. Licensed works, modifications, and larger works may be distributed under different terms and without source code.

The Unlicense
A license with no conditions whatsoever which dedicates works to the public domain. Unlicensed works, modifications, and larger works may be distributed under different terms and without source code.

The above licenses represent the entire spectrum of open source licenses, from highly protective to unconditional. One of these should work for most new open source projects. Many other open source licenses exist, including older versions of and close substitutes for some of the above.
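When adopting one of these licenses, many projects also declare the choice machine-readably with an SPDX identifier at the top of each source file. A small sketch, with identifiers taken from the public SPDX license list (the helper function is my own, not part of any standard tooling):

```python
# SPDX short identifiers for the licenses discussed above, strongest to
# weakest ("-only" marks the plain GPL variants, not "or-later").
SPDX_IDS = {
    "GNU AGPLv3": "AGPL-3.0-only",
    "GNU GPLv3": "GPL-3.0-only",
    "GNU LGPLv3": "LGPL-3.0-only",
    "Mozilla Public License 2.0": "MPL-2.0",
    "Apache License 2.0": "Apache-2.0",
    "MIT License": "MIT",
    "The Unlicense": "Unlicense",
}

def spdx_header(license_name: str) -> str:
    """Render the one-line comment many projects put at the top of each file."""
    return "# SPDX-License-Identifier: " + SPDX_IDS[license_name]
```

For example, `spdx_header("MIT License")` yields `# SPDX-License-Identifier: MIT`.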

10 strange novels of the British countryside (Misc)
The best creepy stories take place against the creepy backdrop of the British countryside. Here are ten of our favourites... From the moors to the mountains, from gnarled woods to deserted beaches, Britain has spectacular countryside. But it's also, in the right literary hands, very creepy countryside. So many of the best UK novels make use of that strange, unsettling quality of the British landscape, and our relationship with it. Spooky houses, dangerous relationships, ancient folk figures or alien invaders - here's a look at ten novels that use the countryside to connect us to a time and a place, and to make us realise that it's not all sweet tweeting birdies and green rolling hills in Britain. There's something dark at the heart of our landscape, and these writers know how to show it:

1. On The Black Hill by Bruce Chatwin
On the border between Wales and England lies a farm called The Vision. Twins Lewis and Benjamin Jones live there, working the land, sleeping in the same double bed, and this book is consumed by their relationship to the land and to each other. From the beginning of the twentieth century, through the span of their lives, there is the sense that they are part of each other and of the farm in a way that cannot be explained in words. Bruce Chatwin was a great travel writer, visiting places such as Australia and Patagonia and conveying a sense of their mysteries to the reader. He did the same with the Welsh borders in this book. It reminds us of what is being lost as time moves on, and how unquantifiable an understanding of the land is.

2. Rawblood by Catriona Ward
Set in a gothic house in the middle of Dartmoor, with a ghost that haunts it through the years, turning those who see it mad: Rawblood is a brilliant throwback to all those stories about generational families with their terrible secrets, and things that go bump in the night.
But it delves deeper into the whys of these strange events, and has a great understanding of the characters of the Villarca family that live there. The west-country moors have been the setting for some of the most memorable of unsettling stories in British literature (see the book just below for another one); there's something about that unconquerable space that threatens us, even now. Rawblood is a continuation of our long-standing fear of the bleak expanse of the wide-open, so far from shelter or from normality, and it's brilliant.

3. An English Ghost Story by Kim Newman
So the Naremores, a family with a few problems, move into their new home miles from the nearest town after a particularly rash decision to purchase, and come to realise they're not alone. It's not a new idea, but old ideas done well can be just as fun. Of course there are ghosts in the spooky house in the middle of nowhere. But what I love about Newman's ghost story is the relationship the spirits have with the new owners of the Hollow, a big house in Somerset surrounded by fruit trees and hedgerows: from bounteous beginnings to the big freeze, everything follows the seasons to suggest that the ghosts are not abominations, but part of nature too. I love that idea.

4. Jamaica Inn by Daphne Du Maurier
Du Maurier wrote a number of books that used their west-country setting to maximum effect, and I think Jamaica Inn is the spookiest of them, being a tale of terrible happenings on Bodmin Moor, and of the hard, desperate men and women who live there. The book begins with one of the great descriptions of English weather at its worst. It's two o'clock in the afternoon in late November and a clammy mist has settled over the moor as our heroine journeys towards the inn in a rickety carriage that isn't keeping out the damp in the least. She's shivering, and heading towards an unwelcoming inn, and an amoral uncle who will frighten and abuse her. Brrrrrrrr.

5.
Puffball by Fay Weldon
The juxtaposition between the city and the country is done so well in Puffball, which sees y[...]

Congress Keeps Holding Repeated, Pointless Hearings Just To Punish The FCC For Standing Up To (Government)
[size=3]Congress Keeps Holding Repeated, Pointless Hearings Just To Punish The FCC For Standing Up To ISPs On Net Neutrality[/size] In the year since the FCC passed net neutrality rules, ISP allies in Congress have run the agency through an [url=]endless gauntlet[/url] of show-pony hearings. While most of these hearings profess to be focused on agency transparency and accountability, they're really geared toward one single purpose: to publicly shame the agency for standing up to deep-pocketed telecom campaign contributors. Given that the only real way to overturn the rules is for ISPs to prevail in court or via Presidential election, this showmanship has been little more than a stunning display of wasted taxpayer dollars and stunted intellectual discourse. Undaunted, the Senate held yet another "FCC accountability" (read: pointless tongue-lashing) hearing last week, during which Senators [url=]pummeled FCC boss Tom Wheeler[/url] with many of the same repeatedly debunked claims net neutrality opponents have been making since the rules were approved. Among them was the claim that the rules somehow hampered broadband investment, despite the fact that objective data (including quarterly ISP earnings reports) repeatedly shows that [url=]simply isn't the case[/url]. For somebody who's had his time repeatedly wasted simply for upsetting the telecom status quo, Wheeler remains impressively cool under fire. For example, when fellow FCC Commissioner (and former Verizon regulatory lawyer) Ajit Pai took to the hearing to again trot out the industry-backed think tank claim that broadband investment had suffered under net neutrality, Wheeler casually highlighted that repetition does not magically forge reality: [quote] "With all due respect to my colleague, what he has just portrayed as facts are not,” Wheeler responded. He said that investment in broadband increased, along with a 13% jump in fiber investment, as well as Internet usage and increased revenue per subscriber. 
But Pai insisted that is not the case. “The FCC’s policies have failed. The administration’s policies on broadband has failed,” he said. "We are not seeing a decline in broadband infrastructure investment. You can say it and say it and say it, but that does not make it a fact,” Wheeler responded. Pai said he would offer up sworn declarations from Internet providers showing how the new rules had caused them to slow their investment. But Wheeler, too, offered to submit corporate statements on Internet investment, which would face Securities and Exchange Commission penalties if they were misleading.[/quote] When the rules were approved, you might recall that net neutrality opponents also tried to claim that the White House "improperly influenced" the creation of the rules, since the White House vocally supported the Title II approach [url=]in November of 2014[/url], and Wheeler voiced his support for Title II [url=]in February of 2015[/url]. This, net neutrality opponents argued, was clear evidence of an unholy cabal. Net neutrality opponents can't point out what law was broken (because none was) and FCC history is filled with exa[...]

Netflix launches a site to show how fast (or slow) your Internet connection really is (New Releases)
Netflix really wants to show you how fast (or slow) your Internet connection is, and to do so it has launched a new website at [url=][/url] that conveys the real-time speed of your connection to the Web. It’s designed to give people “greater insight and control of their Internet service.” It looks like Netflix procured the domain last month, and, according to a filing with the United States Patent and Trademark Office (USPTO), is in the midst of trademarking the Fast logo. In the application, which was filed on May 5, Netflix said it was for:
[*] Providing a website featuring non-downloadable software for testing and analyzing the speed of a user’s Internet connection
[*] Downloadable computer software for testing and analyzing the speed of a user’s Internet connection
There is no word on any downloadable software yet, but the website is certainly now live and it’s a fairly straightforward offering. It works globally, on mobile Internet or domestic broadband, and it displays the speed of your connection, in megabits-per-second, at any given moment. The launch comes just a few weeks after Netflix introduced new cellular data controls that let users [url=]tweak the quality of their video streams[/url], so that those on patchier connections can select lower-quality streams. The new site does seem like a random product on the surface, especially given the myriad existing speed-test tools already available online, but Netflix wanted something of its own, that works in real time, and that is devoid of any annoying distractions (such as ads and other bells and whistles). Netflix’s new speed test tool actually links out to another speed-test service to allow you to compare the results (we can confirm that they are accurate). “We all want a faster, better Internet, yet Internet speeds vary greatly and can be affected by other users on your network or congestion with your Internet service provider,” said David Fullagar, vice president of content delivery architecture at Netflix, [url=]in a blog post[/url]. 
“When you’re experiencing streaming issues, the site allows you to check the download speeds you’re getting from your Internet service provider. Using Netflix servers, it works like other globally available speed-test tools, and the results should be similar in most cases.”
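The arithmetic behind any such speed test is simply bytes transferred over elapsed time, converted to megabits per second. A minimal Python sketch of that calculation (the `measure` helper needs a real test-file URL from whichever service you use; none is assumed here):

```python
import time
from urllib.request import urlopen

def throughput_mbps(num_bytes: int, seconds: float) -> float:
    """Convert num_bytes transferred in `seconds` to megabits per second."""
    return (num_bytes * 8) / (seconds * 1_000_000)

def measure(url: str, chunk_size: int = 64 * 1024) -> float:
    """Time a full download of `url` and return the observed speed in Mbps."""
    start = time.monotonic()
    total = 0
    with urlopen(url) as resp:
        while True:
            data = resp.read(chunk_size)
            if not data:
                break
            total += len(data)
    return throughput_mbps(total, time.monotonic() - start)

# Example: 1,250,000 bytes delivered in one second is 10 Mbps.
```

Real tools refine this by running several parallel streams and discarding the ramp-up period, but the reported number is this same ratio.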

Think you're not being tracked? Now websites turn to audio fingerprinting to follow you (Misc)
New research into web-tracking techniques has found some websites using audio fingerprinting for identifying and monitoring web users. During a scan of one million websites, researchers at Princeton University have found that a number of them use the AudioContext API to identify an audio signal that reveals a unique browser and device combination. "Audio signals processed on different machines or browsers may have slight differences due to hardware or software differences between the machines, while the same combination of machine and browser will produce the same output," the researchers explain. The method doesn't require access to a device's microphone, but rather relies on the way a signal is processed. The researchers, Arvind Narayanan and Steven Englehardt, have published [url=]a test page[/url] to demonstrate what your browser's audio fingerprint looks like. "Using the AudioContext API to fingerprint does not collect sound played or recorded by your machine. An AudioContext fingerprint is a property of your machine's audio stack itself," they note on the test page. The technique isn't widely adopted but joins a number of other approaches that may be used in conjunction for tracking users as they browse the web. For example, one script that they found combined a device's current charge level, a canvas-font fingerprint and a local IP address derived from WebRTC, the framework for real-time communications between two browsers. The researchers [url=]found[/url] 715 of the top one million websites are using WebRTC to discover the local IP address of users. Most of these are third-party trackers. Another, more widely used method is canvas-font fingerprinting, based on the HTML Canvas API, which aims to deduce the fonts installed on a browser. They found 3,250 first-party sites using this technique. Meanwhile, canvas fingerprinting was found on 14,371 sites with scripts loaded from 400 different domains. 
The researchers analysed canvas fingerprinting in 2014, and note three changes since then. "First, the most prominent trackers have by and large stopped using it, suggesting that the public backlash following that study was effective. Second, the overall number of domains employing it has increased considerably, indicating that knowledge of the technique has spread and that more obscure trackers are less concerned about public perception. Third, the use has shifted from behavioral tracking to fraud detection, in line with the ad industry's self-regulatory norm regarding acceptable uses of fingerprinting." The other key finding, which may be good news depending on your attitude to Google, Facebook and Twitter, is that the number of third-party trackers that users will encounter on a daily basis is small. "All of the top five third parties, as well as 12 of the top 20, are Google-owned domains. In fact, Google, Facebook, and Twitter are the only third-party entities present on more than 10 percent of sites," the researchers note. The researchers say their data suggests there has been a consolidation in the market for third-party tracking, which contrasts with the perception that there has been an explosion in third-party trackers. And that could be good news in terms of pressuring the industry to make privacy-enhancing improvements. "For 100 or so third parties that are prevalent on one percent or more of sites, we might expect that they are large enough entities that their behavior can be regulated by public-relations pressure and the possibility of legal or enforcement actions," they argued. [b]News sites have the most trackers, according to the researchers.[/b] [img][...]
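All of the techniques described, whether audio, canvas, fonts, battery level, or WebRTC-derived IPs, follow one pattern: gather device-dependent signals and hash them into a stable identifier. A generic Python sketch of that pattern (the signal names below are illustrative, not taken from any tracker's actual code; real trackers derive them from browser APIs such as AudioContext and Canvas):

```python
import hashlib
import json

def fingerprint(signals: dict) -> str:
    """Hash a dict of device-dependent signals into a stable identifier."""
    canonical = json.dumps(signals, sort_keys=True)  # stable regardless of key order
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Illustrative signals only: values like these stay constant for one
# browser/device combination, so the hash becomes a tracking ID.
example = {
    "audio_hash": "d41d8cd9",
    "canvas_fonts": ["Arial", "Helvetica"],
    "local_ip": "192.168.1.23",
    "battery_level": 0.87,
}
```

Because the same machine keeps producing the same signals, `fingerprint(example)` returns the same identifier on every visit, with no cookie required.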

Microsoft and Canonical partner to bring Ubuntu to Windows 10 (Hot n Happening)
According to sources at Canonical, Ubuntu Linux's parent company, and Microsoft, you'll soon be able to run Ubuntu on Windows 10. This will be more than just running the Bash shell on Windows 10. After all, thanks to programs such as Cygwin or MSYS utilities, hardcore Unix users have long been able to run the popular Bash command line interface (CLI) on Windows. With this new addition, Ubuntu users will be able to run Ubuntu simultaneously with Windows. It will run not in a virtual machine, but as an integrated part of Windows 10. The details won't be revealed until tomorrow's morning keynote speech at Microsoft Build. It is believed that Ubuntu will run on top of Windows 10's recently and quietly introduced Linux subsystem in a new Windows 10 Redstone build. Microsoft and Canonical will not, however, sources say, be integrating Linux per se into Windows. Instead, Ubuntu will primarily run on a foundation of native Windows libraries. This would indicate that while Microsoft is still hard at work on bringing containers to Windows 10 in project Barcelona, this isn't the path Ubuntu has taken to Windows. That said, Canonical and Microsoft have been working on bringing containers to Windows since last summer. They've been doing this using LXD. This is an open-source hypervisor designed specifically for use with containers instead of virtual machines (VMs). The fruits of that project are more likely to show up in Azure than Windows 10. It also seems unlikely that Ubuntu will be bringing its Unity interface with it. Instead the focus will be on Bash and other CLI tools, such as make, gawk and grep. Could you run a Linux desktop such as Unity, GNOME, or KDE on it? Probably, but that's not the purpose of this partnership. Canonical and Microsoft are doing this because Ubuntu on Windows' target audience is developers, not desktop users. 
In particular, as Microsoft and Canonical continue to work more closely together on cloud projects, I expect to find tools that will make it easy for programmers to use Ubuntu to write programs for Ubuntu on the Azure cloud. So is this MS-Linux? No. Is it a major step forward in the integration of Windows and Linux on the developer desktop? Yes, yes it is.

Salary Survey Extra: What is the value of your certification? (Career)
Salary Survey Extra is a series of periodic dispatches that give added insight into the findings of our most recent Salary Survey. These posts contain previously unpublished Salary Survey data.
What is the value of your IT certification?
What determines the “value” of a certification? There are many different reasons that people choose to get certified. So on some level, it’s a personal question. Your certification is valuable to you for whatever reasons are most important to you. Two questions, however, are often discussed more than any others when it comes to determining whether it’s “worth it” to get certified. Will the certification add to my knowledge and understanding of a given IT topic? Will the certification increase my earning power? For the Salary Survey, we isolated those two items and asked the question directly: Which is the most important benefit of earning a certification? At least in terms of perception among IT professionals, there’s no contest. A commanding 71.8 percent of those surveyed said that gaining increased knowledge and skills is the most important benefit of earning a certification. Just 15.8 percent feel that gaining increased earning power is the most important benefit of earning a certification. And we did hear from a contrarian 12.4 percent of survey respondents who believe that neither increased earning power nor increased knowledge and skills is the most important benefit of earning a certification. For those people, the most important benefit of certification probably lies somewhere in the follow-up question that we asked: Besides education and salary, what are the most important benefits of getting a certification? We asked each respondent to name his or her two top choices. 
Here are the options, along with the percentage of all survey respondents who chose that option:

Gain qualifications for a future job — 49.4 percent
Improve or confirm my qualification for my current job — 48.2 percent
Gain greater confidence in my own skills — 40 percent
Become eligible for positions of greater responsibility with my current employer — 34.9 percent
Gain prestige and recognition among colleagues — 28.7 percent
Gain advanced access to technical data — 25.7 percent
My employer requires this certification — 21.9 percent
Enjoy belonging to a community of certified professionals — 19.8 percent
Enjoy receiving increased support from IT vendors — 5.6 percent

Finally, what about the value of certifications over time? This is a battleground, of sorts. Some people like to argue that certification is less valuable than it used to be. Time and the rapid advance of technology have changed the playing field, certain skills are invalid or outdated, and so forth. So we asked IT professionals that question as well: What will happen to the overall worth and impact of certifications over the next five years? For the most part, people either don’t see the overall picture changing much, or they’re optimistic. A considerable 47.9 percent of those surveyed think that certifications will become more valuable and impactful over the next five years, while 39.8 percent see things remaining about the same. A little less than 10 percent of those surveyed believe that certifications will become less valuable and impactful, while 2.7 percent think they will become irrelevant.
MMMMM, HOT DOGS
In October of last year, the World Health Organization rained on everyone’s parade. Not quite literally, of course, but it was close. Parades happen in summer, after all, and summer means cookouts, and cookouts need hot dogs … and hot dogs (and a lot of other meats) cause cancer. Boooo! And waaaah! 
You're going to miss me when I'm gone ...While it seems unlikely that hot dogs (and other processed meats) will immediately vanish from circulati[...]

25 Highest Paying Jobs in America for 2016 (Career)
Nearly seven in ten (68%) people report that salary and compensation is among their leading considerations when determining where to work. So for people who really want to earn a big paycheck, which jobs offer the highest salaries? According to Glassdoor’s latest report highlighting the 25 Highest Paying Jobs in America for 2016*, physicians, lawyers and research & development managers are bringing home the biggest paychecks. This report is based entirely on people with these jobs who have shared their salaries on Glassdoor over the past year. Which other jobs offer the highest salaries? Check out the complete results:

1. Physician – Median Base Salary: $180,000; Number of Job Openings: 2,064
2. Lawyer – Median Base Salary: $144,500; Number of Job Openings: 995
3. Research & Development Manager – Median Base Salary: $142,120; Number of Job Openings: 112
4. Software Development Manager – Median Base Salary: $132,000; Number of Job Openings: 3,495
5. Pharmacy Manager – Median Base Salary: $130,000; Number of Job Openings: 1,766
6. Strategy Manager – Median Base Salary: $130,000; Number of Job Openings: 701
7. Software Architect – Median Base Salary: $128,250; Number of Job Openings: 655
8. Integrated Circuit Designer Engineer – Median Base Salary: $127,500; Number of Job Openings: 165
9. IT Manager – Median Base Salary: $120,000; Number of Job Openings: 3,152
10. Solutions Architect – Median Base Salary: $120,000; Number of Job Openings: 2,838
11. Engagement Manager – Median Base Salary: $120,000; Number of Job Openings: 1,452
12. Applications Development Manager – Median Base Salary: $120,000; Number of Job Openings: 263
13. Pharmacist – Median Base Salary: $118,000; Number of Job Openings: 4,502
14. Systems Architect – Median Base Salary: $116,920; Number of Job Openings: 439
15. Finance Manager – Median Base Salary: $115,000; Number of Job Openings: 2,582
16. Data Scientist – Median Base Salary: $115,000; Number of Job Openings: 1,985
17. Risk Manager – Median Base Salary: $115,000; Number of Job Openings: 1,137
18. Creative Director – Median Base Salary: $115,000; Number of Job Openings: 696
19. Actuary – Median Base Salary: $115,000; Number of Job Openings: 175
20. Data Architect – Median Base Salary: $113,000; Number of Job Openings: 762
21. Tax Manager – Median Base Salary: $110,000; Number of Job Openings: 1,495
22. Product Manager – Median Base Salary: $107,000; Number of Job Openings: 7,758
23. Design Manager – Median Base Salary: $106,500; Number of Job Openings: 510
24. Analytics Manager – Median Base Salary: $106,000; Number of Job Openings: 988
25. Information Systems Manager – Median Base Salary: $106,000; Number of Job Openings: 147

However, a handsome salary doesn’t always equate to job satisfaction, as recent Glassdoor Economics Research suggests. “This report reinforces that high pay continues to be tied to in-demand skills, higher education and working in jobs that are protected from competition or automation. This is why we see several jobs within the technology and healthcare industries,” said Dr. Andrew Chamberlain, Glassdoor Chief Economist. “There’s no doubt that pay is among the leading factors most job seekers weigh when determining where to work. However, our research shows that a big paycheck isn’t necessarily tied to long-term satisfaction in your job. Instead, when we dig deeper into what keeps employees satisfied once they’re in a job and with a company, we find that culture and values, career opportunities, and trust in senior leadership are the biggest drivers of employee satisfaction.”

Advance Your Career through Cisco’s Specialist Certification Programs (Career)
The #CertsandLabs team is always working to ensure that we keep you in the know about our certifications and how you can use them to grow your IT career. This month we’d like to highlight some of our programs that you may be less familiar with: our Specialist programs. The Specialist program consists of eight topic areas offering certification programs that are specialized for a variety of IT industry needs and markets. The eight topics range from business essentials, which give management a holistic understanding during the IT decision-making process, to data center essentials, which let IT professionals demonstrate their technical skills and abilities to design, install, and support a data center networking solution. Some of the specialist certifications offered require no prior technical knowledge, such as the business certifications. Others let IT professionals validate their specific technical proficiency by learning additional skills in a niche market. Professionals who earn a specialist certification, like the network programmability certification, can provide invaluable contributions to their company’s IT department because these certifications are designed to address more specific industry needs.

Below is a breakdown of the eight specialist certification areas. If you are interested in the topics below and have the desire to grow your IT skills, we encourage you to review the information and find a certification that meets your skills development and career goals. Once you have reviewed the summary, use the certification link to learn which specific certifications are available in each category and find the requirements for earning the certification.

1. Business – The business specialist certifications are holistic and reflect multiple topics often described in today's IT roles. These subjects include business analysis, technology trends, finance, business-focused solution design, organization change / IT adoption, and effective communications.

2. Data Center – With the Data Center specialist programs, you can enhance your skills and abilities to design, install, and support a data center networking solution. Data center specialist certifications geared toward unified computing, unified fabric, or FlexPod can enhance your technical skills, your confidence and the value you bring to your IT department.

3. Internet of Things (IoT) – The IoT specialist program includes the Industrial Networking Specialist certification. This certification is for information technology (IT) and operational technology (OT) professionals in the manufacturing, process control, and oil and gas industries. It evaluates foundational skills to manage and administer networked industrial control systems, and it provides plant administrators, control system engineers and traditional network engineers with an understanding of the networking technologies needed in today's connected plants and enterprises.

4. Network Programmability – The Cisco Network Programmability Specialist certifications enhance your networking skills through foundational networking knowledge and allow you to use your software skills to develop network applications in programmable environments.

5. Operating System Software – The Cisco Operating System Software Specialist program validates proficiency in Cisco internetwork operating systems and includes the Cisco IOS XR Specialist certification.

6. Service Provider – These specialist certifications enhance your skills and ability to design, install, and support specific aspects of service provider networks. Service Provider certifications can help expand your career path by allowing you to become a valuable resource to your organization in such a niche market.

7. Coll[...]

DoD Invites Vetted Specialists to ‘Hack’ the Pentagon (Security)
WASHINGTON, March 2, 2016 — The Defense Department is launching a pilot program in April to allow vetted computer security specialists to do their best to hack DoD public web pages, Pentagon Press Secretary Peter Cook said today. “Hack the Pentagon” is the first cyber bug bounty program in the history of the federal government, Cook said in a statement issued today. Bug bounty programs are offers by software developers and company websites to reward people who report bugs related to vulnerabilities or hacking exploits. Jarrett Ridlinghafer, at the time a technical support engineer for Netscape, created the first “bugs bounty” program in 1995, according to the entrepreneur’s website. Today there is a directory of 369 such programs offered by everyone from Adobe and Amazon to Twitter and Sony.

Commercial-Sector Crowdsourcing

“We can't hire every great ‘white hat’ hacker to come in and help us,” a senior defense official said today on a media call, “but [Hack the Pentagon] allows us to use their skill sets, their expertise, to help us build better, more secure products and make the country more secure.” Cook said the department will use commercial-sector crowdsourcing to allow qualified participants to conduct vulnerability identification and analysis on the department's public webpages. “The bug bounty program is modeled after similar competitions conducted by some of the nation's biggest companies to improve the security and delivery of networks, products and digital services,” Cook said. The pilot is the first in a series of programs designed to test and find vulnerabilities in the department's applications, websites and networks, he added.

Bug Bounty

The Pentagon’s bug bounty participants will have to register and submit to a background check before being involved in the program. Once vetted, Cook said, the hackers will participate in a controlled, limited-duration program during which they’ll be able to identify vulnerabilities on a predetermined department system.
“Other networks, including the department's critical, mission-facing systems, will not be part of the bug bounty pilot,” he added, noting that bug bounty hunters could receive monetary awards and other recognition. The program, Cook said, shows Defense Secretary Ash Carter’s commitment to driving the Pentagon to identify new ways to improve the department's cybersecurity.

Enhancing National Security

Carter said he’s confident the initiative will strengthen DoD’s digital defenses and ultimately enhance national security. The department’s Defense Digital Service, launched by Carter last November, is leading Hack the Pentagon. Cook said the DDS is an arm of the White House's cadre of technology experts at the U.S. Digital Service and includes a small team of engineers and data experts meant to improve DoD’s technological agility. “Bringing in the best talent, technology and processes from the private sector not only helps us deliver comprehensive, more secure solutions to the DoD, but it also helps us better protect our country,” DDS director and technology entrepreneur Chris Lynch said. Hack the Pentagon, Cook said, “is consistent with the administration's Cyber National Action Plan announced on Feb. 9 that prioritizes near-term actions to improve our cyber defenses and codifies a long-term strategy to enhance cybersecurity across the U.S. government.” The pilot program will launch in April, and the department will provide more details on requirements for participation and other ground rules in the coming weeks, he said. A live asset will be chosen as the target for the hackers, the senior defense official said, but one that is under constant attack and has [...]

10 Reasons a Degree Alone is No Longer Enough (Career)
Education is, of course, one of the foundations of a successful career. But in IT, traditional education becomes that much more powerful when paired with certifications, which keep pace with the rapidly evolving needs of the IT job market. With its CompTIA Academy Partner Program, CompTIA is deeply invested in enabling aspiring IT pros to succeed in their education and further thrive in their careers via certification. The CompTIA Academy Partner Program provides academic institutions with resources and tools to educate students effectively on CompTIA certification exams, along with promotional materials and teaching tips. These services are offered at far below retail cost, because CompTIA cares about students and their future careers.

“Our Academy team has been focused on two words as our ultimate goal – student success,” Alan Rowland, director of business development for the CompTIA Academy Partner Program, said. “We consider the goal of our program to be the same as that of a high school or post-secondary institution – to help students be as successful as possible. [Educators sometimes] say, ‘I’m an industry expert, what I’m teaching my students is very valuable.’ There’s no question about that. But having a credential that addresses certain specifics within a class is absolutely essential.”

The following are ten reasons why certification should be a cornerstone of any IT education:

1. More Jobs Require a Certification Just to Get in the Door

The number of jobs that won’t even entertain an application from an uncertified professional is growing, either for non-negotiable compliance purposes, such as in government jobs, or because of the validation of an applicant’s skills that certifications offer. Kirk Smallwood, senior director of U.S. academic sales at CompTIA, pointed out that some 85 percent of jobs in various parts of the IT field now require certification.
“There are network companies that will tell you flat out, ‘If a resume doesn’t have CompTIA A+ on it we’re not going to grant [a candidate] an interview,’” Smallwood said.

2. Sometimes Non-Certified Resumes Don’t Even Come Up in a Search

The increasing use of job search databases means that headhunters looking for talent depend on keyword searches to find qualified candidates, and CompTIA certifications are keywords that carry clout. Without a CompTIA A+ or a CompTIA Security+ by your name, you may get skipped.

3. Certifications Have Global Prestige

The global economy is changing rapidly, and with countries throughout the world putting countless resources into developing their IT infrastructures, so is the shape of the international IT workforce. Job seekers both in the U.S. and abroad stand to benefit by holding CompTIA certifications. “It’s an economic passport allowing you to cross borders or frontiers, because our certification exam means exactly the same thing in the U.S. as it does in London or South Africa or Tokyo or any of the other places where we operate,” Rowland said, “whereas a degree from your local community college in Anytown, USA may not be easily transferable even to one of the other 49 states in this country.”

4. Degrees Can Get Dusty, Certifications Stay Current

People sometimes find when returning to the IT workforce after a break that a computer science degree from years ago doesn’t carry the same heft it once did. Certifications, on the other hand, endorse current skills. IT professionals have to keep their certifications up to date with continuing education or by retaking exams. Many of CompTIA’s certifications are likewise updated to conform to Department of Defense regulations, so they always test on current information.

5. Cert[...]