
Can Campus Networks Ever Be Secure?

http://www.theatlantic.com/technology/archive/2015/10/can-campus-networks-ever-be-secure/409813/?single_page=true

Universities are struggling to find a balance between academic openness and the need for computer security across their networks.

The Internet was built on university campuses. It was built by academics for academics, and without any notion of the new kinds of commerce, crime, and espionage it would enable.
As the Internet spread beyond purely academic circles and became a valuable tool for corporations and governments, as well as a target for criminals and terrorists, a vast—and growing—security industry sprang up to develop new controls and defenses that would protect us against threats. But the academic world, from which the Internet originally came, has not wholeheartedly embraced this growing market of new security tools and tactics.

A highly secured network that closely regulates and restricts new devices, users, or traffic streams may be desirable in a secretive government agency or a company jealously guarding its intellectual property and strategy. But it may make less sense on the campuses of higher-education institutions, which draw intellectual sustenance from the regular arrival of new people from all over the world, carrying their own devices, looking to study and collaborate alongside everyone else.

If you run a college or university, you don’t necessarily aspire to a computing environment that fortifies your chosen “insiders” and their data and keeps out everyone else, along with their potentially infected laptops and smartphones. At the same time, you still want to protect the personal information and intellectual property of your staff and students, and make sure that their computing resources are not being taken advantage of, or used to attack or infiltrate other targets. So, on those same campuses where the Internet was born, administrators, academics, information-technology staff, and students are struggling to figure out the right way to balance their academic research and educational missions with the need for computer security.
This is a struggle that goes back more than a decade. The 2003 U.S. National Strategy to Secure Cyberspace called for higher-education institutions to “make IT security a priority.” More recently, the company SecurityScorecard generated attention when it highlighted campus information-security postures in its 2015 Higher Education Security Report, which compares 485 colleges and universities and ranks the top and bottom 10 schools.
The value of such lists is questioned even by the institutions that get the highest scores because the rankings raise complicated issues around what it means—and what it should mean—for a campus to have good computer security. Does it make sense to apply the same kinds of metrics and criteria to a college or university that might be used to assess the security of a for-profit company? What happens to the research mission and intellectual environment of higher education in a world where everyone’s network security is interconnected, and breaches at one institution may rapidly lead to more serious breaches somewhere else?


The SecurityScorecard report is “very limited in its depth so I’ve asked everyone here to take it with a grain of salt,” said Mitch Parks, the director of information technology at the University of Idaho, which ranked eighth in SecurityScorecard’s list of the most secure campuses. “I wouldn’t really consider us that much more secure than a number of other institutions of higher education,” he added. “We’ve probably been more lucky than skilled.”
Several colleges have not been lucky when it comes to information security. So far this year, the University of California-Berkeley, Harvard University, and the University of Connecticut have all reported data breaches. “There are parts of what a university does that are just like anyone else—we have credit cards, we have social-security numbers, we have health records, we have educational records—all of which we have to, by law, lock down in just as firm a fashion as corporations do,” said Jim Waldo, a computer-science professor and the chief technology officer at Harvard.
But in other ways, campus networks call to mind those belonging to coffee shops or hotels rather than corporations. For one, colleges usually have relatively little control over the devices that students and faculty use on campus. Universities often welcome visitors from all over, who come bearing devices of their own and expecting to be able to connect them. And beyond just playing host to foreign visitors, an increasing number of U.S. schools are opening or partnering with international campuses and sending their faculty and students overseas to study and teach at these satellite campuses.
This international back-and-forth can provide opportunities for foreign infiltration of U.S. university networks, which may, in turn, serve as jumping-off points for attacks directed at other U.S. targets. For instance, when The New York Times reported in early 2013 that its computer systems had been breached by Chinese hackers, the subsequent investigation by the security firm Mandiant found that the attacks had been routed through compromised computers at U.S. universities. In other words, 10 years after the publication of the National Strategy to Secure Cyberspace, higher-education networks were still serving as platforms for enterprising spies and criminals to target companies like the Times, thanks to significant computing resources and relatively relaxed access policies at the schools.


“The whole notion of a university is that it thrives on collaboration and exchange of scholarship and ideas, both with people inside the university and outside the university,” Waldo said. “Building an infrastructure for IT that is based around those assumptions is pretty different from the kind of things that can be done in a corporation where you can dictate to your customer base what they can and can’t do, and where you really want to keep the outside out and the inside under control—we can’t do either of those things.”
Instead, several university representatives said, they try to segment and partition their campus networks as much as possible so that the most sensitive, administrative information can be protected adequately while still allowing for relatively open parts of the network to support educational and research agendas. But given how decentralized campuses can be, with individual departments and research groups often managing their own data and resources, it’s not always straightforward for a university to keep track of where all of its most sensitive data is stored and what’s going on across all parts of its network.
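The segmentation approach those representatives describe can be sketched as a simple zone-based policy: classify each host into a network zone and permit traffic only between zone pairs that are explicitly allowed. Everything here — the zone names, subnets, and rules — is illustrative and does not reflect any particular university’s layout.

```python
import ipaddress

# Hypothetical campus zones: administrative systems are walled off,
# while research and residence networks stay relatively open.
ZONES = {
    "admin":     ipaddress.ip_network("10.1.0.0/16"),   # payroll, records
    "research":  ipaddress.ip_network("10.2.0.0/16"),   # lab clusters
    "residence": ipaddress.ip_network("10.3.0.0/16"),   # student devices
}

# Zone pairs allowed to communicate; anything not listed is denied.
ALLOWED = {
    ("research", "research"),
    ("residence", "research"),   # students can reach lab resources
    ("admin", "admin"),          # admin traffic stays inside its zone
}

def zone_of(addr):
    """Return the zone name containing this address, or None."""
    ip = ipaddress.ip_address(addr)
    for name, net in ZONES.items():
        if ip in net:
            return name
    return None

def allowed(src, dst):
    """Permit traffic only between explicitly whitelisted zone pairs."""
    pair = (zone_of(src), zone_of(dst))
    return None not in pair and pair in ALLOWED

print(allowed("10.3.4.5", "10.2.0.9"))   # residence -> research: True
print(allowed("10.3.4.5", "10.1.0.2"))   # residence -> admin: False
```

The default-deny posture is the point: sensitive administrative subnets stay reachable only from within their own zone, while the open parts of the network remain open to each other.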
“Universities are extremely attractive targets,” explained Richard Bejtlich, the chief security strategist at FireEye, which acquired Mandiant, the firm that investigated the hacking incident at the Times. “The sort of information they have can be very valuable—including very rich personal information that criminal groups want, and R&D data related to either basic science or grant-related research of great interest to nation state groups. Then, on the infrastructure side they also provide some of the best platforms for attacking other parties—high bandwidth, great servers, some of the best computing infrastructure in the world and a corresponding lack of interest in security.”
FireEye responds to security incidents at “plenty of universities,” Bejtlich added, noting that in the past the dominant trend has been campus networks being used as launch points for attacks on third parties but, in the past few years, an increasing number of schools are themselves the targets of breaches and are being drained of their intellectual property, not just used to launder connections to other targets. Academic institutions’ resistance to stricter security measures on campus is the result of misguided notions about how restrictive security controls are, Bejtlich said. “There’s a definite idea in schools that security equals constraint and that constraint is anathema to the open exchange of information,” he explained. “You have to get over this idea that security is going to prevent you from achieving your mission.”
There are two schools of thought on computer security at institutions of higher education. One is that universities are lagging behind companies in their security efforts and need to embrace a more locked-down, corporate approach to security. The other holds that companies are, in fact, coming around to the academic institutions’ perspective on security—with employees bringing their own devices to work, and an increasing emphasis on monitoring network activity rather than enforcing security by trying to keep out the outside world.

For instance, some hold up academic institutions as models for dealing with the security threats posed by bring-your-own device (BYOD) environments. “I think we’re maybe reaching the point where other institutions outside higher ed may need to look at higher ed a little more for solutions,” said Joanna Grama, the director of the Cybersecurity Initiative at Educause, a nonprofit focused on IT use in higher education. “Higher ed has always had this thought that corporations have it so much easier when it comes to security because they can lock down their devices and staff and we can’t do that, but now we’re seeing that corporations are having to struggle with these BYOD paradigms.”
Kim Milford, the executive director of the Research and Education Networking Information Sharing and Analysis Center (REN-ISAC), echoed that sentiment. “The BYOD debate—we figured that out 10 to 15 years before corporations did,” she said. REN-ISAC, which started in 2002 and coordinates information sharing about computer-security threats and countermeasures among higher-education institutions, has seen considerable growth over the past several years, reflecting the growing interest in security on campuses across the country, Milford said. In April 2009, there were 264 REN-ISAC members collaborating to exchange security information. By September 2015, that number had risen to 452.

“The information sharing is the biggest differentiator in higher education,” Grama said. “We see lots of sharing about threats and how to mitigate them. I don’t have experience in other industry sectors but it really does strike me that that level of sharing and helpfulness is unique.” It helps, perhaps, that universities are not generally direct competitors, unlike the members of the other sector-specific ISACs devoted to areas like financial services, aviation, and oil and gas.
Universities have also been ahead of the curve in monitoring network behavior for anomalies, Waldo said. “We assume that we are always under attack and that the adversary is always inside the network and we’ve been taking that view longer than anyone else because we have more open networks so we have to just assume that people we don’t want there are going to get in there,” he explained. “We tend to watch the behavior of the network much more closely rather than trying to build a perimeter around what we do.” This is a view that’s become increasingly prevalent in companies and organizations outside higher education, as it has become clear that no set of security controls or defenses is sufficient to completely keep out a determined adversary. But corporations can still exercise a much greater degree of control over their employees’ online activity and devices than can most educational institutions.
A company, for instance, may mandate a new security system or patch for everyone on its network, but a crucial element of implementing security measures in an academic setting is often providing users with options that will meet their needs, rather than forcing them to acquiesce to changes. For example, Parks said that at the University of Idaho, users are given the choice to set passwords that are at least 15 characters in length and, if they do so, their passwords last 400 days before expiring, whereas shorter passwords must be changed every 90 days (more than 70 percent of users have chosen to create passwords that are at least 15 characters, he added).
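The Idaho policy Parks describes amounts to a simple tiered rule: a password of 15 or more characters lasts 400 days, a shorter one lasts 90. The thresholds come from the article; the function and its structure below are a sketch, not the university’s actual code.

```python
from datetime import date, timedelta

# Tiered password-expiry rule like the University of Idaho policy
# described above: longer passwords earn a longer lifetime.
LONG_PASSWORD_MIN = 15    # characters
LONG_LIFETIME_DAYS = 400
SHORT_LIFETIME_DAYS = 90

def password_expiry(password_length, set_on):
    """Return the date a password expires under the tiered policy."""
    if password_length >= LONG_PASSWORD_MIN:
        lifetime = LONG_LIFETIME_DAYS
    else:
        lifetime = SHORT_LIFETIME_DAYS
    return set_on + timedelta(days=lifetime)

# A 16-character password set on Jan 1, 2015 lasts 400 days;
# a 10-character one must be changed after 90 days.
print(password_expiry(16, date(2015, 1, 1)))  # 2016-02-05
print(password_expiry(10, date(2015, 1, 1)))  # 2015-04-01
```

Offering the longer lifetime as an incentive, rather than mandating long passwords outright, is what makes this an opt-in design in the spirit Parks describes.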



Mark Silis, the associate vice president for information systems and technology at the Massachusetts Institute of Technology, also stressed the importance of offering users choices when it comes to security, including allowing users on campus to opt out of a default firewall. “Traditional technologies often made [security] an all-or-nothing choice and in an environment where that was the choice that was not for MIT,” Silis said. MIT, which SecurityScorecard ranked last in its Higher Education Security Report, has “always been an environment that has valued openness and creating a platform for people doing incredible, creative things,” he added.

“In order to support high-velocity collaboration, high-velocity innovation, high-velocity research and do it at scale—where the collaborators and the interdisciplinary work happens even beyond the MIT community, globally, we have to make sure we don’t put barriers in place that make it more difficult,” said John Charles, MIT’s vice president for information systems and technology.
Still, MIT, which was the victim of attacks redirecting its homepage in January 2013, has rolled out several new security policies over the course of the past few years. These measures include a new password policy, a new firewall, and, most recently, a new two-factor authentication system—none of which would seem out of place at a corporation.
But Charles said he would like to see the conversation about security on campus shift away from adding more controls to existing systems and instead focus on “design and engineering work that allows these systems to recover quickly or not be taken down by these attacks.” It’s a goal that would require some significant advances in research and in our understanding of computer networks and security threats—advances on par with the ones that brought us the Internet in the first place, the same advances for which we are, in large part, indebted to research universities. So perhaps it’s understandable that those same universities are a little irked by the idea that they don’t understand how to protect their computer systems, or that they should be ranked according to how well they conform to standards embraced by companies and governments.
Many of these institutions are neither ignorant nor amateurs when it comes to dealing with the Internet. They are the places where this world-changing technology first emerged, and the places where it continues to figure in intellectual discourse and creative research endeavors—partly because it has not been bound up in too many restrictions governing what you can download, send, and run.
Universities foster a form of technological freedom that is, undoubtedly, not suited to every environment or to data in need of the strongest forms of protection. But it is, all the same, a freedom that may warrant protection in its own right.



