More Content - Including Podcasts

Tuesday, June 19, 2012

IBM and Evidence Based Medicine Decision Support Systems

Jeffrey Betts from IBM makes a repeat appearance at the conference to discuss how the IBM Watson project can be leveraged for evidence based medicine and clinical decision support systems.

The Watson system understands natural language, generates and evaluates hypotheses, and learns by honing its own decision algorithms. The solution was developed to be hardware agnostic, but is generally run on parallel HPC systems for optimal response times.
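
To make the generate-and-evaluate pattern concrete, here is a minimal sketch in Python of evidence-weighted hypothesis ranking. To be clear, this is my own illustration of the general DeepQA-style approach, not IBM's code; the hypotheses, evidence sources, scores, and weights are all hypothetical.

```python
# A minimal sketch (not IBM's actual code) of evidence-weighted hypothesis
# ranking: generate candidate answers, score each against evidence sources,
# rank by combined confidence. All scores and weights are hypothetical.

def rank_hypotheses(candidates, weights):
    """candidates: {hypothesis: {evidence_source: score 0..1}}"""
    ranked = []
    for hypothesis, evidence in candidates.items():
        confidence = sum(weights[src] * score for src, score in evidence.items())
        ranked.append((confidence, hypothesis))
    return sorted(ranked, reverse=True)

candidates = {
    "adjuvant chemotherapy": {"literature": 0.8, "guidelines": 0.9, "trials": 0.6},
    "radiation therapy":     {"literature": 0.7, "guidelines": 0.4, "trials": 0.5},
}
weights = {"literature": 0.3, "guidelines": 0.5, "trials": 0.2}

for confidence, hypothesis in rank_hypotheses(candidates, weights):
    print(f"{hypothesis}: {confidence:.2f}")
```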

The key point of this lecture is that computers have historically been poor at responding to unstructured data. Human minds naturally spot patterns in unstructured data, and that ability is what a true expert system such as Watson aspires to replicate.

Jeffrey takes us through screenshots of a case of an oncologist using Watson to assist with a consult. It was an interesting update, for which no one in the audience had any questions.



- Posted using BlogPress from my iPad

Location:Hwy 97 S,Kelowna,Canada

itManageCast Review of 12th Annual Western Cdn Health Summit

This week I've been in Kelowna attending my third Western Canadian Healthcare Summit, and while I have been blogging summaries of some of the more interesting sessions, this entry is intended to be an overall review of the conference, the venue, and most importantly, the value of the time spent in Kelowna.

Let me start by saying that I am rarely if ever disappointed to have to be in Kelowna, so Reboot Communications Ltd. couldn't pick a better city for this in my opinion. While the weather has been crappy (like everywhere in BC) so far this week, and I still can't swing a golf club, my colleagues at UBCO, the UBC Southern Medical Program, and Interior Health make it worthwhile for me to be here, outside of the conference itself.

So what unholy deal exists between Carla Tadla and Keith Baldry? While I certainly don't have anything against Keith personally, it would be a nice change to have a fresh face as the event emcee/moderator. Not that he takes much speaking time, but it's just getting a little long in the tooth. Something that isn't long (enough) is the hashtag for the event - #HCYLW. I get what it means, because I'm here and I've thought about it a bit, but is it really a tag people would search on? I would suggest something a bit more descriptive like #WstrnHCSummit, or some shorter variant that is slightly more self-explanatory.

And on the topic of technology, at the start of day two we have no conference wifi, and ergo, my blogging & tweeting will be restricted. Perhaps by design? :-) Those are some extreme measures to keep rogue bloggers like me under control. I'm thinking it's by design (or GE Healthcare only paid for one day?), as the passcode changed on the second day, but this wasn't announced until enough people complained around 10:00. I'd encourage the organisers to address that more quickly in the future, or at least let attendees know that it was a technical issue if that was the case. Otherwise, it just appears to be disorganisation, which my experience with Reboot makes me think unlikely.

The theme for Tuesday seems to be big data, so I'm enjoying this, although I must admit I was surprised to see Jeffrey Betts from IBM presenting for the second year on IBM Watson. That said, it was interesting insofar as he was able to provide us with a case study of its use in an oncological patient discussion to provide deep and wide evidence-based patient care.

The big data panel discussion was excellent and was a highlight for me of the second day. Excellent panel, and great job by the conference planners.

However, since it's only 1.5 days of session content, I think the organisers should be more consistent with the level of speakers and topics. The audience seems to be mostly clinical administration, so I get that there needs to be logistics/supply chain discussions and I can tolerate that, but I get the feeling that some of the panels were really stuck for speakers, as not all were of equal calibre, or regarded by the audience as the appropriate subject matter experts for their topic area.

If the organisers could get more content like we had on Monday morning and all day Tuesday, I'd encourage adding a half day to the conference. All in all, my biggest takeaway each year is the networking, and I have to thank HP Canada in large part for their facilitation of that!


- Posted using BlogPress from my iPad

Big Ideas About Big Data in Health Care

Lindsay Kislock, ADM in the BC Ministry of Health, introduced the panel and started the discussion on the premise that health has to be proactive with the data at our disposal, and be responsible and forward thinking about how we can turn health data into positive health outcomes.

Dr. Tom Karson spoke first, informing us that globally, we are in the zettabyte era, and we cannot do anything with this volume of data in healthcare without big data analytics. Genomic sequencing & epigenomic analysis are given as examples of big data that medical research and practice need to have and manage daily.

Dr. Karson described the reality in our healthcare systems, where there is no governance or standardisation of the data sets that individual groups within the provincial health system maintain, and noted that this is the lowest step of the DELTA five-stage maturity model for data analytics. Dr. Karson's point was that we need to evolve through this maturity model to better use the data, but that we cannot do that without establishing and maturing governance over this data in parallel.

Additionally, Dr. Karson insisted that we must develop, recruit, and educate the appropriate talent pool to manage big data, and be able to turn data into information, and then into insight.

Julie Lockner from Informatica followed to discuss how we prepare our data centres for big data. Questions came up around pure capacity, security and governance, and obtaining the skills needed to manage these systems.

When asked what is stopping people from dealing with big data better, the answer was: "Time constraints on business analysts and lack of skills for staff in how to manage big data."

We are next introduced to the concept of Hadoop, an open-source solution that allows massive data sets to be processed in parallel on standard hardware platforms.
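
For those who haven't seen it, here is a tiny self-contained sketch of the MapReduce model that Hadoop implements, run in-process for illustration; a real Hadoop job distributes the map, shuffle, and reduce phases across a cluster of commodity machines.

```python
# A minimal, self-contained sketch of the MapReduce model Hadoop implements,
# run in a single process here purely for illustration.
from collections import defaultdict

def map_phase(records):
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)          # emit (key, value) pairs

def shuffle(pairs):
    grouped = defaultdict(list)              # group values by key
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    return {key: sum(values) for key, values in grouped.items()}

records = ["big data in health care", "big data analytics"]
print(reduce_phase(shuffle(map_phase(records))))
# {'big': 2, 'data': 2, 'in': 1, 'health': 1, 'care': 1, 'analytics': 1}
```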

Our last speaker on this topic is Rachel Debes, a biostatistics researcher from Cerner. Rachel states that the two biggest drivers towards big data solutions are electronic medical/health records and the emergence of an accountability framework for the Canadian healthcare system. I would suspect that she is overlooking medical research requirements and data generation/analysis, but I'll assume she's targeting the clinical administrative audience here.

ADM Kislock asked the panel "is big data bad?" and the response was that it is not, but it's all about the governance and skills to handle that big data responsibly and effectively.

A question came up from the audience as to whether the protections we put in place around big data in the possession of healthcare are nullified by patients and the general population freely placing health and health-care information in the public domain via social media, which can be mined by anyone who wishes to invest in that.
My thoughts are that people will place this information in the public domain along with all kinds of other things for which, if that data were placed into government care, we would be held accountable. The fact that people are irresponsible with their information, or that individuals don't feel certain information is actually "private", doesn't absolve us of our responsibility to protect the data given into our care. If the public voice eventually changes the definition of what is "private" or "personal information", then we will adapt our levels of governance accordingly.

Dr. Karson provided an answer to this question that mostly aligned with my thoughts, and cited the regulations we work under in the healthcare industry.

Dan Gonos from HP asked the panel their thoughts on the challenges of mining unstructured data. Dr. Karson answered that unstructured data is best mined when you have discrete unstructured data and you understand the data sources, so that the algorithms can be tuned to assume contexts. Julie added that certain vernacular can complicate free-form data, further to Dr. Karson's point.



- Posted using BlogPress from my iPad

Location:Hwy 97 S,Kelowna,Canada

Monday, June 18, 2012

Innovating in Health Care While Managing Fiscal Constraints

Ida Goodreau, Board of Directors, Genome BC & Adjunct Professor, Sauder School, UBC

The premise is that the public sector always views innovation as something that drives up costs. The private sector doesn't see it that way, because it is selective about which innovations it allows into the business.

Ironically, while health care costs across the Western developed world are increasing rapidly, hospital spending specifically has trended down in the last 12 years. The well known and discussed paradox is the demand for high quality care within a fiscally sustainable system. Patients (who are taxpayers) want improvements in life expectancy and functionality, while taxpayers (who are also patients) want greater system efficiencies at lower costs.

The target set out is to diminish the gap between GDP and health care spending within 20 years.

Innovation in health care over the past twenty years can be measured as successful if we use extended life expectancy as a metric, but not if we measure it against the cost. It seems cold to put a price against the length of a life, but this is the reality that the population wants, per the paradox we discussed earlier.

So the crux is how to adopt innovations that improve health outcomes at reduced costs. The innovations in question are technology (devices, drugs, and information), process redesign, and overall system redesign. We know what all these innovations can and should look like. We need a model for cost-effective integration of these innovations, and an agreed-upon set of metrics for measuring progress and assessing risks.

Ida suggests we need to look at systems around the globe where healthcare is privatized, as those are driven by business economics to be the most innovative. The key factors to be considered are:
Lower cost and consumer direct payment
Simplification
Closer to the patient
Re-invention of delivery by use of existing technologies
Right-skilling the workforce
Standardised operating procedures
Copying and then building

Ida also proposes that innovations should be frugal, delivering superior value at a fraction of the costs typically seen, and that new technologies should be designed to work within an integrated continuum of care.

Ida reiterates that the core issue with Canada's healthcare system, and the reason we cannot make the urgent changes needed to innovate cost effectively, is that "no one is really in charge."

The innovations and the cost reductions are both necessary, and have been put off for years, but time is running out, as we are approaching a tipping point. Ida proposes that Canadian health care leaders must align, and agree on how to make the public system leverage the optimizations that privatized health care solutions use.


- Posted using BlogPress from my iPad

Location:Water St,Kelowna,Canada

Complexity Exceeds Cognition - How Analytics Will Transform Healthcare

Dr. Graham Hughes, Chief Medical Officer, SAS

We are challenged to make meaningful and useful health care information available online to every Canadian. Big data is about how much data we are getting, from where, how quickly, and how to turn it into meaningful information.

This was an interesting exploration of one of the game-changing technology disruptors, and how it can, should, and is being leveraged to improve health outcomes.

Health information is still in silos, and needs to be integrated or federated in meaningful ways to enable clinical decision support systems (CDSS). Carolina's Health Systems in North Carolina has been innovative in this area.

Structured and unstructured data continue to expand rapidly, not all of it is electronic, and most of it continues to grow in silos. Data-intensive megatrends include population-based patterns, personal signatures, genomics, home monitoring, mobility, and social media.

Home Depot in the US expects to have an aisle dedicated to home medical monitoring systems in two years.

IT consumerisation and mobility have provided us a platform for ubiquitous bidirectional access to health care resources. We are introduced to Fitbit, which provides 24x7 wearable health monitoring; this is worthy of further investigation. Health-oriented apps are growing rapidly, allowing EMR access by patients. Telemedicine continues to evolve and reduce the demand for face-to-face health care provisioning. Game mechanics continue to improve the engagement of people in preventative healthcare and wellness: the immediate impact and benefit that might be missing from personal engagement in wellness is provided by gamification of people's health monitoring. Influence networks leveraging social media provide a platform for quicker responsiveness to health care interactions.

The price of sequencing a human genome is below $1k; this is an example of how meaningful use of health care information is being made affordable, but the challenge is to use these innovations to improve specific patient outcomes through primary care and wellness. Again, we create mounds of data, but need to turn this into useful information accessible to patients and health care providers in a proactive manner.

Predictive analysis feeding into CDSS allows us to better understand risks in individual and population-level health actions. We can identify potential patient cohorts who need intervention based on health habits and target them with the appropriate wellness services. At an individual level, we can better understand how our health situation may be impacted by various health care or wellness decisions. Treatment sequences informed by population demographics can ensure better medical outcomes in clinical situations.
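
As a rough illustration of the cohort idea, here is a sketch that trains a risk model on historical outcomes and then flags patients above a risk threshold for outreach. Everything here (features, data, threshold) is synthetic and hypothetical; it shows the shape of the workflow, not any vendor's actual model.

```python
# A hedged sketch of cohort identification: fit a risk model on historical
# outcomes, then flag high-risk patients for wellness outreach.
# All data, features, and thresholds below are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# synthetic features: [age, smoker(0/1), bmi]
X = np.column_stack([rng.integers(20, 80, 500),
                     rng.integers(0, 2, 500),
                     rng.normal(27, 4, 500)])
# synthetic outcome loosely tied to the features
y = (0.03 * X[:, 0] + 1.2 * X[:, 1] + 0.08 * X[:, 2]
     + rng.normal(0, 1, 500)) > 4.5

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]          # individual risk scores
cohort = np.where(risk > 0.7)[0]             # candidates for intervention
print(f"{len(cohort)} patients flagged for proactive outreach")
```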

The problem is that we are going to be overwhelmed by a tidal wave of data; the opportunity is that we will have the information we need to improve health outcomes.




- Posted using BlogPress from my iPad

Location:Water St,Kelowna,Canada

Healthcare Success Stories from Western Canada

Western Cdn Healthcare Summit 2012

Healthcare leaders from the Yukon, Alberta and British Columbia shared their stories of challenges and visions in the search for effective and efficient healthcare delivery throughout Western Canada.

Introduced by: Donna Lommer, VP Residential Services & CFO, Interior Health Authority of B.C.
Speakers:
Graham Whitmarsh, Deputy Minister of Health, Province of British Columbia
Chris Mazurkewich, EVP & COO, Alberta Health Services
Stuart Whitley, QC, Deputy Minister of Health & Social Services, Government of Yukon

Chris spoke first and shared some statistics about the scope and depth of AHS (Alberta Health Services). AHS is further down the road that HSSBC is taking, and has insight to share with BC folks. A key comment is that the merger was done first, with the details being worked out post-change.

A primary metric that AHS uses to measure success is hip and knee elective primary replacements per annum. The integration of EMS is a current large initiative, and response times are publicly available to ensure transparency and availability. EMS has been a stand-alone entity, but this is changing; the integration gives EMS responders greater support and options, and lets them provide deeper care more quickly.

AHS believes that having a deeper and wider integration of clinical services across the province allows for quicker innovation and response to discovered administrative or clinical challenges.

AHS has asked clinicians and staff to identify game changers in health care, and some identified are standardized discharge methodologies and metrics, allowing communities and families to be better prepared for when patients are discharged back into the communities.

Wellness is a major push, with strong private-sector demand to license a successful regionalized program. AHS feels they are ready to move their primary care networks to the next level, but have identified that good governance is vital to that success. What that next level really looks like was not described.


Stuart shared an anecdote that illustrated that it is important to focus on need by examining where risk is.

Stuart asked us rhetorically how we innovate and transform health care in Canada, referencing the innovations happening in the EU Nordic countries. Extraordinary technological innovations are occurring daily, but the cost to accommodate and implement them is a barrier to adoption. Transformation therefore must occur in the management and funding, as well as the current culture, of health care. Negotiations with practitioners are the place to begin.

Acute disorders are stealing attention from chronic issues, which are increasing among Canadians as high-risk health behaviours rise dramatically in school-aged children. Interventions must start here; we must look further upstream to be more preventative and intervene before conditions become acute and warrant more expensive treatment.

The top 30 users of the Yukon health care system each incur, on average, more than $150k in costs per year. A key factor in their acute issues is prolonged alcoholism.

Many small, simple innovations focussed on the upstream aspects of the health care system will reduce the costs and improve the delivery of health care, as has been proven in the Yukon.


Graham Whitmarsh presented the innovation and change agenda diagram. The three pillars of the program are:
1. Effective health promotion & prevention
2. Integrated & targeted primary & community health care
3. High quality hospital services

A series of metrics was shared for each of these three pillars to identify what worked in the past year, what didn't, and where to focus efforts this year. Over $250M in savings is expected from HSSBC next year.

Improvements at Royal Columbian & St. Paul's are planned and committed to.

ThinkHealthBC is where this strategy is shared with the public. We were provided a teaser, but if you are interested in learning more about the strategy, current results, and next steps, you should check out the site.



- Posted using BlogPress from my iPad

Thursday, June 14, 2012

Navigating Internet Privacy

Shifts in the modern Internet landscape are creating new challenges and business imperatives for security, IT and legal professionals. Join our panel of experts as they examine the legal, regulatory and public policy initiatives that are impacting online businesses, Internet usage and Internet security today, and tackle the most pressing questions in today's marketplace, including: prospects for new privacy legislation; the potential impact on how companies operate and design products; conflicts that may arise with the development of cloud computing; legal jurisdiction over international data flows in "the cloud"; the progress of online tracking and advertising; the impact of increasing calls for Privacy by Design from policymakers and organizations; and the rise of class action lawsuits in the privacy sphere.
Justin Weiss, Senior Director, International Privacy and Policy, Yahoo and
Trevor Hughes, President and CEO, IAPP

We started with the topic of kids. COPPA is currently under review, including verifiable parental consent issues. The business mechanism for that consent is payment card requirements.


I asked about teen use of social media and the right to be forgotten, particularly when teens post things online that they might regret later. Trevor's answer started with the fact that social media adoption by youth was higher than in any other age group last year; for the under-thirteens, the parents are complicit in kids getting onto Facebook. As a society we are evolving in how we think about our personal histories. We can't teach our kids how to use social media; they will teach us. The answer is that our perception of personal pasts is shifting with this young generation in North America, and it doesn't mean the same thing to us as adults that it will to the next group of adults. So we are told that teens won't regret later what they post today. I don't really agree with this moral philosophy standpoint, but it is interesting that privacy experts are using this as an argument.

Next we discussed the right to be forgotten, specifically in the EU. This includes the portability of your data, which implies that social network providers would allow you to vote with your data. The expectation currently does not exist that you can control information about you on the Internet; but how much of this was uploaded by you? How much information about you is uploaded by someone else? Who owns this, and who has the right to remove it? The counter to this is that just because it is difficult to do doesn't mean that it shouldn't be done. The thought came up that more practical than the right to be forgotten is the right to know what is online about yourself. Trevor & Justin indicated that there are already businesses coming online to purge and/or correlate aggregated public information about you. The last question on this topic was what happens when you die? When you die, your data and how it is handled should be the responsibility of your estate, but compliance regulation is not enforcing this - yet.

We move to the discussion about the e-privacy directive, cookies, and tracking. Cookies originally came about because of the stateless nature of the web. They have become much more sophisticated and dangerous, and easily abused. Any modern website can contain 16-20 cookies on its front page, and this number is more than likely on the very small side of the average. The EU is proposing informed consent for each cookie. Alternative state-management and information-grooming tools are being developed to proactively circumvent any legislation.
So we need to clarify that there is a difference between tracking technologies and cookies. By focussing the issue on cookies we are looking at the more transparent technology, but others that are far less transparent do not use cookies to track you, so legislators and the public need to be aware of this difference. The caveat considered is "unless the tracking is expressly requested by the consumer of the online service." Advertisers and third-party data collectors should be, and are, the ones targeted by this legislation.
Browsers are the interface here, and we have four or five real vendors of note, so it may well be that browser settings will be the key to finding a solution. In the end we come back to the risk of technology-specific legislation versus focusing on principles of privacy. Browsers including "private browsing" options are shown as a case in point of how the market can respond to demand by simplifying the interface to give us what we actually want.
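
As a small experiment you can run yourself, here is a sketch that lists the cookies a server sets on first contact with a front page. Note that this only sees first-party, server-set cookies; script-set and third-party tracking cookies require a real browser to observe, which is exactly the transparency gap discussed above. The URL is a placeholder.

```python
# Fetch a front page and list the cookies the server sets on first contact.
# This only observes first-party, server-set cookies; tracking cookies set by
# scripts or third-party domains need a real browser to see.
import requests

response = requests.get("https://example.com", timeout=10)  # placeholder URL
for cookie in response.cookies:
    print(cookie.name, cookie.domain, "expires:", cookie.expires)
print(f"{len(response.cookies)} first-party cookies set on first load")
```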

Trevor explains that OBA (online behavioural advertising) is intended to be targeted to ensure that it is beneficial to you, but often comes across as invasive and creepy. Other privacy issues are starting to overtake this one as industry slowly starts to self-regulate. It's not by any means a perfect state, but it is progress. Yahoo provides icon solutions to let users know which ads are targeted and which are not.
There is a scope creep issue here, because cookies are often involved in gathering the data for OBA, so law makers need to approach this topic, once again, very carefully to not paint any forthcoming legislation into a toothless corner.
I asked about the scenario where Facebook provides hook-up and dating site advertisements to 14-year-old boys, and this became an interesting conversation around whether the advertisers or the host holds responsibility for the advertisements, and whether they are allowed to have enough information to know your age. My opinion is that since Facebook has this info about us already, the host shares responsibility for what advertisements reach minors.

Justin's prediction for this year's online privacy headline was "a Google technology team bypassed default preference settings in the Safari browser" - which was today's headline. This will continue to fan the flames of regulation, because it is apparent that self-regulation isn't working.

Trevor expects a $20M settlement over a privacy issue in the US that will drive more compliance.


- Posted using BlogPress from my iPad

Location:13th Security & Privacy Conference

Monday, June 4, 2012

Game Changers - A Day of Innovation Keynote Summary

Sukhi Gill, CTO EMEA & HP Fellow

The idea is to gauge innovation value by how excited the business units outside of IT get with the concepts. Conversations are sparked by discussing major disrupters and how to get budget to keep the lights on and innovate.

Case in point: the new Galaxy phone has computing power equivalent to a rack of servers and storage from 2000. How do we get the business to take the "cool" things like incredible hand-held power, pervasive network access, and social media, and fund IT to make the things they want to use usable?

Adopting new technologies without a business model to support them is setting yourself up for failure. In most organizations, IT innovation lags behind business, and we're constantly playing a game of catch-up.

So what are the top three disruptors?
Consumerisation of IT, or BYOD as it's more familiarly known in our industry. Start with allowing email access through consumerised devices, then address the mobility challenges of which apps should be allowed, and which should not. Virtualisation and mobile delivery of the core user functions of business applications is where we should focus our attention.

Cloud continues to pervade, and the convergence of clouds will make public cloud offerings more appealing. The IT department will have to change, as it increasingly becomes a broker of cloud services and must manage the IT supply chain with extreme diligence.

Big data and real-time analytics. 85% of organisational data is not formatted in a way that makes metadata available to readily locate the data and understand its context. We are increasingly forced to understand unstructured data, both by legislation and by its sheer quantity. Sukhi offered the idea of placing sensors in our luggage so we can track it via our mobile devices: most of the actual information is already available and the technology is all around us, but the data isn't used, or it isn't even understood where it can be found. This example shows how consumer demand drives business technology change.

If we as IT leaders lead budget conversations with infrastructure upgrades and software revisions, we won't make our case. If we talk about real business problems, and how we can address them with game changing technologies, we are more likely to get the funding needed to balance operations and innovation. Identify the biggest business problems, break down the complexity of the problems, and look for options to solve those components. We must be bold in addressing the disruptive technologies with the business.

Consider writing a business briefing document that poses "what if our competitors did this before us?"

Have an annual budget planning meeting with the business units where you share a roadmap on how your business could be disrupted, and spark a conversation you've prepared for on how to adapt proactively. Scenario planning is vital, as is holding workshops using demonstrations. As an example, corning.com and hp.com have publicly available videos of the future technologies they are developing.
Combine innovation with infrastructure refresh projects.








- Posted using BlogPress from my iPad

Thursday, May 3, 2012

#BCNET_HPCS Issues and Challenges in Engaging in Research with Private and Proprietary Data

Stephen Neville, UVic
Rozita Dara, IPC-ON
Patricia Brantingham, SFU
Caitlin Hertzman, Population Data BC

This session was intended as four case study presentations of how to manage the security and availability of big data in research.

The claim from Stephen is that the responsibility fundamentally falls on the researcher to understand the issues and ensure that the data and results are handled correctly.

Stephen outlined several existing scenarios including single computers or drives with encryption, external service providers or "cloud" solutions, private clouds, and the pluses and minuses of all of these options. All of these are very common to us who work in these research environments and there were no surprises here.

EmuLab in Utah was cited as an HPC facility that VLANs out the various research computing inside a secure data centre. Compute Canada is waiting on a response to a proposal it has submitted to build a similar facility.

When asked how to ensure that data destruction is complete at the end of a research project, Stephen said he takes complete responsibility, buys self-encrypting drives for servers, and does everything himself, and that's the solution. This approach was challenged from the university management and privacy responsibility perspective. I agree that Stephen's approach is at best a stop-gap measure, and is not efficient, institutionally accountable, or scalable.

Rozita spoke to us about virtual tool use to protect information freedom. The relationship between the amount of data becoming available, the value of that data, and the legislation and controls available to govern the use of that data was posed to us as an increasing challenge.

Rozita summarised the challenges as:
Data overload
Unauthorised use
Over-regulation of data
Privacy in context (one ring will not rule them all)

We are encouraged to check out the site http://PrivacyByDesign.ca for a summary of her research on these challenges and her proposed solution, "SmartData", which is effectively tagging data with metadata and building a parallel architecture to manage this - at least, that is how I understood what she proposed. A SmartData symposium will be held in Toronto for those interested.

Population Data BC is a clearing house for health, demographic, occupational, environmental, and educational data from various public bodies. They work out of UVic, SFU, and UBC to provide the data and training on how to use it. They do not conduct research themselves, but provide the data to researchers.

Three models of privacy are suggested: enterprise risk management, information governance, and privacy by design. A best practice is to understand each of these and apply the aspects that best suit your organisation and the data you collect, store, and use.

Best practices that are used by PopData BC are:
Physical zoning with fobbed access and alarms
Video surveillance
Fortification of walls
Sign in and escort for visitors
Network zoning with two factor authentication
Dummy terminals (physically different computers for working with secure data than for general administrative work)
Separation of identifiers from content
Proactive linkage (data anonymization; see the sketch after this list)
Auditing, logging, monitoring
Secure research environment (a VPN for researchers to access data pools)
Encryption (full data lifecycle protection)
Data destruction methods
External auditing
Data access request formats
Agreements
Privacy policy, incident response plan
Privacy training (and testing after)
Criminal records check
Close working relationship with OCIO et al
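
To illustrate the "separation of identifiers from content" and proactive linkage items above, here is a minimal sketch of keyed pseudonymization. This is my own illustration, not PopData BC's actual method; the key, field names, and record are hypothetical.

```python
# A minimal sketch (my illustration, not PopData BC's method) of separating
# identifiers from content: replace direct identifiers with a keyed pseudonym
# so analysts work on de-identified rows, while the linkage key is held by a
# separate custodian. Key and record below are hypothetical.
import hmac, hashlib

LINKAGE_KEY = b"held-by-a-separate-custodian"   # never stored with the data

def pseudonym(phn: str) -> str:
    """Deterministic keyed hash so the same patient links across data sets."""
    return hmac.new(LINKAGE_KEY, phn.encode(), hashlib.sha256).hexdigest()[:16]

record = {"phn": "9876543210", "diagnosis": "J45", "year": 2011}
deidentified = {"pid": pseudonym(record.pop("phn")), **record}
print(deidentified)   # {'pid': '...', 'diagnosis': 'J45', 'year': 2011}
```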

Patricia gave us illustrations of the criminology studies, data, and data representations used at SFU. Plans are for a provincial data store of criminology-relevant data, and a Commonwealth internetwork to share research.

Lorenzo from SFU gave us a vague but useful explanation of the complexities involved in building the five-layer secure environment for housing this research data.


- Posted using BlogPress from my iPad

Location:W Hastings St,Vancouver,Canada

Building a "Zero Energy" Data Centre

Firstly, as opposed to zero energy, we're talking about a movement towards a zero net emissions HPC data centre.

The Energy Efficient HPC Working Group focuses on driving energy conservation measures and energy efficiency in HPC data centre design. The group is open to all parties and can be found online at http://eehpcwg.lbl.gov.

There are three subcommittees:
The infrastructure committee is working on liquid cooling guidelines, metrics (ERE, total PUE), and energy efficiency dashboards.
The system team is working on workload based energy efficiency metrics, and system measurement, monitoring, and management.
The conferences team puts on a webinar on the second Tuesday of each month, and is primarily focussed on awareness.

A related pay-for-membership group is The Green Grid, which for a $400 annual fee provides access to top resources to learn and apply.

75% of the top 500 supercomputing facilities are in the US, China, and Japan. The top system (in Japan) uses 12.66 MW; the average for the top 10 is 4.56 MW, with an average efficiency of 464 Mflops/watt.

The WestGrid HP compute system at UBC is the 189th most powerful supercomputer in the world, but only the 398th most efficient; the latter is a derived number, as only half the systems in Canada have submitted their figures to the EE HPC WG.

There are three tiers of power measurement quality:
1. Sampling rate; more measurements mean higher quality
2. Completeness of what is being measured; measuring more of the system means higher quality
3. Common rules must be followed for start/stop times.

The EE HPC WG has a beta methodology that is being tested in Quebec.

Energy use in US data centres doubled between 2000 and 2006, from 30 billion kWh per year to 60 billion. Awareness, efficiency, and the economic downturn affected that trend, and in 2011 the growth since 2006 was calculated to have slowed to 36%.

PUE = total facility energy divided by IT energy.
This is equivalent to (cooling + power distribution + miscellaneous + IT) divided by just the IT energy consumption.

PUE average is 1.91 according to EPA Energy Star. Intel has a data center operating at 1.41, and the Leibniz Supercomputing Centre is predicted to operate at 1.15.

PUE does not allow for energy re-use, but the ERE does. ERE is the same as PUE, except that reused energy is subtracted from total energy before dividing by IT energy.
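
A quick worked example of both metrics, with made-up energy figures (all in the same units, e.g. MWh/year):

```python
# Worked example of the two metrics as defined above; the energy figures are
# invented for illustration only.
def pue(cooling, power_dist, misc, it):
    return (cooling + power_dist + misc + it) / it

def ere(cooling, power_dist, misc, it, reused):
    return (cooling + power_dist + misc + it - reused) / it

it, cooling, dist, misc, reused = 1000.0, 600.0, 250.0, 60.0, 300.0
print(f"PUE = {pue(cooling, dist, misc, it):.2f}")          # 1.91, the EPA average
print(f"ERE = {ere(cooling, dist, misc, it, reused):.2f}")  # 1.61 with heat re-use
```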

HP discusses next steps in re-thinking servers and data centre designs. We're told the median PUE for data centres in 2009 was over 2; today, efficient systems and the use of chillers and water cooling can get you to about 1.3, and the FreeAir/EcoPOD methodology can get you to 1.1, theoretically.

The lowest a PUE calculation can reach is 1, so we're challenged to look for efficiencies within the 1, paying attention to both the numerator and the denominator of the fraction. CPU (48%) and DRAM (18%) are the biggest energy/heat pigs in HPC systems. HP now tells us about Project Moonshot, which features a workload-tuneable compute-to-I/O ratio, leveraging the cost structures of commodity processors. The reference is made to how ARM processors operate, and how this methodology is applicable to efficient processing.

Water is roughly 50 times more efficient at removing heat than air, making the argument that using air to cool our systems is significantly less efficient than water. Compare liquid-cooled to air-cooled engines: air-cooled is much simpler, but highly inefficient.

Liquid cooling has been around for ages, but has not been attractive from a cost perspective. Power costs are rising, and liquid cooling options are now becoming available and (almost) commodity. The argument from HP is that this is what we will be using in our data centres in the immediate future, although nothing is readily commercially available to the higher-ed market space.

Last year, ASHRAE issued a whitepaper on liquid cooling guidelines. This includes standard architectures and metrics that must be measurable to achieve success in liquid cooling for your data centres. The specs are rated in 5 levels of increasing efficiency.

Large improvements have been made in the past ten years in energy efficiency, and the focus will now turn to total sustainability, which means looking at metrics for carbon footprint, water usage, and energy consumption.
A key consideration needs to be the location of the data centre, taking into account the temperature and humidity of the locale. Based on all these factors, Canada is actually the best location for efficient data centres in North America. Yet all the new data centres are being built in US locales where either land or power is cheap, but the overall efficiency is poor.

An example of a great pilot site is the cow-manure-powered data centre outside of Calgary, AB. The discussion moved to total carbon footprint, and the GreenPeace "So-Coal Network" video is shared as an example of two things: the poor decision making around coal-powered data centres, and the pressure that can and should be put on North American (and global) organisations to make the right decisions.

The challenge is that it is short-term profitable to pollute. We're posed with the idea of using our campus data centres as a carbon-offsetting tool. By heating buildings with the heat carried off by the computer-cooling water, we can claim not only reduced natural gas use, but also carbon credit compensation in a cap-and-trade situation, reducing operating costs.

Lake Mead's loss of water, and its potential complete loss by 2021, is cited as part of the challenge for us to think hard about evaporative cooling solutions. Stay tuned for the site www.top50DC.org, which will create a playing field and world stage for accountability in truly green computing and data centres.

- Posted using BlogPress from my iPad

Location:W Hastings St,Vancouver,Canada

Next Steps for Canada's HPC Platform

Jill Kowalchuck, Interim Executive Director, Compute Canada

Compute Canada hosts 1,273 researchers and PIs and 2,379 grad students, and has been the infrastructure supporting 3,500 publications. The services Compute Canada has built for $29M would cost the research community $50M to access without a centralised, subsidised service. 160k cores and 15 PB of disk, at 75%-90% utilisation, make up the Compute Canada infrastructure deployed across the country.

Marshall Zhang, a 16-year-old medical researcher, identified a new drug cocktail that could treat cystic fibrosis using Compute Canada resources, under the direction of a PI in Ontario.

Expansion is planned into secured data centres and expanded storage to enable medical/biomedical, criminology, and other research using sensitive industrial data. The information security program is currently under revision in support of these initiatives, to ensure compliance federally and provincially.

The three current priorities are:
Governance
Cost benefit analysis of data centres
CEO search

Compute Canada officially launched their new website today at http://www.computecanada.ca

HPCS 2013 will be hosted in Ottawa; dates and venue TBD.


- Posted using BlogPress from my iPad

Location:W Hastings St,Vancouver,Canada

Wednesday, May 2, 2012

Mobility & Security on Campus

Panel from BCNET Application Advisory Committee
Phil Chatterton, UBC
Paul Stokes, UVic
Leo de Sousa, BCIT
Hugh Burley, TRU

UBC figures there are 180,000+ mobile devices on campus:
70% iOS, 20% Android, 10% other.
There is a shift underway, but this is the current state.

A mobile-web-first approach is in place as of this year. The Kurogo Mobile Platform from MIT & Harvard is in use, and iOS and Android apps are in development campus-wide. A campus-wide whole-disk encryption (WDE) program has launched, and an examination of mobile use and security programs is ongoing.

UVic claims to be far less mature than UBC when it comes to the ability to serve the demands of a mobile-hungry user community. Faculty, staff, and students each have different needs, and need to be supported and managed differently.

66% of mobile devices in use at UVic are iOS based. The focus is intended to be on teaching and learning when it comes to UVic's IT planning. A focus on privacy and security is also essential.

BCIT has had a more administrative focus on BYOD. They noticed a significant uptake in iOS and tablet use from an employee point of view as of this past Christmas. Employees wanting a work-life blend will want to use the tools they are comfortable with, hence the consumerisation of systems for staff at BCIT.

Heavy use of Citrix to deliver applications has been a stronger focus than managing mobile platforms or delivering virtual desktops. Hosting virtual desktops will be a focus for next year, as will network access control in a controlled but not closed methodology; systems will always get at least Internet access.

A vulnerability has made its presence known at BCIT - Hole 196, a WPA2 flaw, allows a man-in-the-middle attack by an internal user; the advice from Leo is to run HTTPS on all your servers.

Hugh, from TRU, states that part of their success comes from a centralised IT group, and that the most important achievement is in establishing standards and governance. This has led to an understanding of what they are trying to protect, and why.

A mobile device management server architecture is in place at TRU for iOS/Android and Blackberry. This helps address the issues with mobile devices, which are best understood when we understand how we use the devices and why.

TRU is seeing an exponential growth of mobile devices used to access campus electronic services for administration and learning, as well as the fact that people have multiple devices they want connected wirelessly.

When asked whether these technologies are being pushed, pulled, or dragged by campus IT services groups, the consensus was a mixture of all three, varying at each campus.

Leo posited that access to resources via mobile and consumer devices should be determined by the security and privacy requirements of the service you are trying to access. This is how we address the challenge of using NAC, mobile device management, and application delivery to ensure that we support the delivery of education and research.
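
Leo's principle can be sketched as a simple lookup: the classification of the service, not the device, decides which access paths are allowed. The service names, tiers, and controls below are hypothetical.

```python
# A sketch of classification-driven access: the sensitivity tier of the
# service determines the allowed access paths. All names are hypothetical.
SERVICE_TIERS = {
    "public-website":  "public",
    "lms":             "internal",
    "student-records": "restricted",
}
ALLOWED_ACCESS = {
    "public":     {"any-device"},
    "internal":   {"registered-device", "vpn"},
    "restricted": {"managed-device+mfa", "virtual-desktop"},
}

def access_paths(service: str) -> set:
    return ALLOWED_ACCESS[SERVICE_TIERS[service]]

print(access_paths("student-records"))  # {'managed-device+mfa', 'virtual-desktop'}
```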

I asked if research users are being considered within a similar approach to academic and administrative users, and the panel agreed that they are unique in many respects, and that the basic principle of educating before enforcing is vital with that community.

- Posted using BlogPress from my iPad

Location:W Hastings St,Vancouver,Canada

BC's Freedom of Information & Privacy Act - Implications for Higher Ed

Paul Hancock, UBC
Bill Trott, UVic
Craig Neelands, SFU

All three are members of the BCNET security working group.

Paul provided a privacy primer for everyone: a quick history of how we've gotten to where we are today in Canada, the paradigm of informational privacy, and the distinction between security and privacy. Security is about protection from threats; privacy is related, but different. Privacy is more about what you can and cannot do with information.

We are subject to FIPPA, one of Canada's most stringent rule sets governing the collection, storage, protection, retention, use, and disclosure of information. Significant implications around storage come from the requirement that information can only be stored in Canada.

Failure to comply affects us not only financially, but reputationally.

Paul shifted to discuss the privacy impacts of cloud computing. After defining cloud computing, Paul reminded us of the constant impetus we have in Higher Ed to move to cloud-based services. The primary implications are foreign storage, and access issues such as the US Patriot Act. Consent may be a loophole allowing this, but it isn't bullet-proof. Even encrypting the data does not make this acceptable in the eyes of the law.

Security, retention, jurisdiction, all pose challenges - in fact roadblocks - to moving services like email to a cloud solution with foreign data storage or movement.

Recent developments in the act may provide interesting options as the Minister is apparently being given powers to waive compliance.

The privacy impact assessment topic was next covered by Bill. When a breach is noted, the first two questions will be "was it encrypted" and "is there a privacy impact assessment"?

A PIA is not so much a 19-page form as it is a process. A PIA is a compliance tool, a risk assessment and mitigation tool, a decision making tool for the executive, and most importantly, an educational tool.

Section 69.5.3 in the recently revised FIPPA clarifies that we have a responsibility to conduct PIAs, and while it's not clear that it is mandatory, we should be erring on the side of caution. Several situations were brought up by Bill where we need to be doing a PIA, and they all came across as common sense.

The root is to build trust inside and outside our organisation, show leadership in privacy, and have the best defence in the event of a breach.

Craig defined privacy breaches to us, and cited examples common in the higher ed sector. Craig showed that there is a difference between privacy and data breaches, so that we can focus responses to privacy breaches.

We should start with a framework for privacy breach responses: acceptable use policies that are in effect and understood, breach response processes and tools, and an understanding of when and how to notify the Office of the Information & Privacy Commissioner. Many of these tools are available from the OIPC website.

SFU had 10 breaches last year, which has led to revisions to processes and tools, and raised awareness of the need to account for the financial impacts.

The question came up as to whether Google Analytics was a challenge, and UVic noted that they've developed their own system to deal with that. It was noted that Google Analytics has the option to turn off collecting the last octet of IP addresses, and that may or may not be a solution.

A great question came up asking if BCNET was in violation due to the pathway through Blaine WA that has data transmission through the US for traffic to and from UVic. The answer is that transmission does not legally equal access or storage, so at present we believe we are compliant. This discussion spun out further to an excellent debate with no solid answer.



- Posted using BlogPress from my iPad

Location:W Cordova St,Vancouver,Canada

BCNET's Advanced Network Projects & Shared Resources

Marilyn Hay
Andre Toonk
Scott Jamieson

Scott presented on building a fibre network on the Saanich Peninsula, from 540 Blanchard to UVic, 9.7 km.

Scott reviewed the key decisions that needed to be undertaken in the planning process, reviewed the complex permits process, and then shared some of the key challenges in the construction process.

The network management topic was brought up, starting with fibre inventories and ensuring that fibres through public venues are clearly labelled and recorded, to ensure clarity of ownership. Your fibre management system should also track splices, OTDR readings, and transport circuits.
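
Here is a hedged sketch of the inventory data model the talk implies: each strand carries ownership, splice, and OTDR test records, so disputes over fibre in public venues can be settled from the database. The field names are my own, not those of any particular fibre management product.

```python
# A sketch (my own field names, not a real product's schema) of a fibre
# inventory: strands with ownership, splice, and OTDR-test records.
from dataclasses import dataclass, field

@dataclass
class Splice:
    location: str
    loss_db: float            # measured insertion loss

@dataclass
class OTDRReading:
    date: str
    length_km: float
    events: int               # reflective/loss events found on the trace

@dataclass
class FibreStrand:
    strand_id: str
    owner: str
    route: str
    splices: list = field(default_factory=list)
    otdr: list = field(default_factory=list)

strand = FibreStrand("F-0001", "UVic", "540 Blanchard -> UVic")
strand.splices.append(Splice("Saanich vault 12", 0.08))
strand.otdr.append(OTDRReading("2012-04-30", 9.7, 3))
print(strand.owner, len(strand.splices), "splice(s) on record")
```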

The point made at the end was to always add as much spare capacity of fibre and ducting as you can.

Marilyn presented on the recent WDM implementation. A ring from UBC to BCIT to SFU was created using 4 strands, allowing a diverse path with cross-connects. Enter the WDM solution.

An RFP was issued, requesting multiple wavelengths (five 10 Gb waves to each site), scalability, and the lowest reasonable cost. A decision on DWDM vs. ROADM was needed, and when the results came back, a ROADM solution from ADVA was selected.

The ROADM solution with equipment at each site allows for changes to channels and wavelengths without interruption to pass throughs. The solution is scalable to add wavelengths for each site and can run as protected or unprotected circuits enabling high availability as needed.

All the sites will be brought up over the summer of 2012.

Andre shared with us the network management tools at BCNET. The first challenge was to have a CMDB; currently a homegrown solution is in use. They are planning to open-source this solution, which is based on PHP, Perl, Nagios plug-ins, SNMP, & MySQL.

This system provides device management, documenting all devices in the network, interface info, & statistics collection. Location, contact, and IP address information are also core functions of the tool.
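
As a rough illustration of the statistics-collection piece (the BCNET tool itself is PHP/Perl with Nagios plug-ins, not Python), here is a minimal SNMP poll using the classic pysnmp high-level API; the host and community string are placeholders.

```python
# A rough Python illustration of CMDB statistics collection: poll a device's
# sysDescr over SNMP. Host and community string are placeholders; requires
# the pysnmp package (classic 4.x high-level API shown).
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

error_indication, error_status, error_index, var_binds = next(getCmd(
    SnmpEngine(),
    CommunityData("public"),                       # placeholder community
    UdpTransportTarget(("router.example.net", 161)),
    ContextData(),
    ObjectType(ObjectIdentity("SNMPv2-MIB", "sysDescr", 0)),
))
if error_indication:
    print(error_indication)
else:
    for name, value in var_binds:
        print(f"{name} = {value}")                 # feed this into the CMDB
```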

The tool has been built with a view towards managing at the service level, by documenting the network services provisioned to clients, with the provisioning information collected into customer-service-oriented views.

From an event management perspective, inspiration was taken from the Nagios framework to include key incident and reporting functions. Additional components, including one for change management, are built in.

Different algorithms are available in the incident management component to allow customisation of the logic around alarm escalation and incident impact.

Andree.toonk@bc.net is interested in working with anyone who wishes to collaborate.

Check it out at https://wiki.bc.net/atl-conf/display/bcnetcmdb

Marilyn next updated us on the Virtual Routing Service implemented via the Juniper devices in use at BCNET. This solution supports BGP, IPv4, IPv6, IPSec, & GRE tunnels, and allows clients to manage their own virtual routers. What this provisions for customers is an IaaS model, leveraging BCNET staff & resources. This is a new service offering, with more information available from the site.

- Posted using BlogPress from my iPad

Location:W Cordova St,Vancouver,Canada

HP Discover 2012 - Why I'm Going

Good morning, and thanks for checking in again. Amidst a series of blogs I'm writing on the BCNET HPCS conference in Vancouver, I thought I'd throw in a bit of a diversion to give some insight on HP Discover 2012 in Las Vegas.

For the past few years I've been a member of the Vivit Worldwide board of directors, helping run the official, unbiased, and truly global user community for HP Software users. Before that, I was a local chapter leader in Vancouver, and I have spoken at least three times at the conferences.

I've been going to HP Discover for many years now, through its various iterations as OpenView Forum, HP Software Universe, and now HP Discover. I would have thought that I'd be burnt out by now, and a couple of years ago I was getting close to that, until I got involved in the SoMe (social media) area. Attending a conference as an unofficial (hopefully one day more official!) blogger has been eye-opening, and has allowed me to view the conference through a different paradigm.

Instead of looking at sessions, booths, and plenaries thinking "what can I get out of this?", I now look at abstracts, tracks, keynote speakers, and the floor show with the perspective of "what can I learn? What would be interesting to the colleagues back at the office?"

One of the key things I'm interested in this year, from the big picture perspective, is what Meg Whitman has planned for HP. With the rapid succession of CEOs in the past few years, I think everyone is holding off a bit to see what the software and hardware giant will undertake in 2012/13. On a more tactical scale, I'm planning to meet up with some of the HP technical crew to get a first-hand look at the new Operations Manager appliance. You can bet that I'll have some blog entries for Vivit on both of these topics, plus some additional input where I can.

If you are planning to be at the conference, come find me in the Community Lounge or the Bloggers area on the show floor; I'm always happy to meet new people and learn your stories of why YOU are interested in HP Discover.


- Posted using BlogPress from my iPad

Location:W Cordova St,Vancouver,Canada

Tuesday, May 1, 2012

Impact of Consumerization, BYOD, & Social Media on Campus

The CIOs (or representatives thereof) formed a panel to discuss this topic.

Greg Conden, UNBC
Michael Thorsen, UBC
Stephen Lamb, BCIT
Paul Stokes, UVic
Jay Black, SFU
Brian MacKay, TRU

Each speaker in turn shared their perspective on how they are managing these challenges.

Some examples of technological projects related to these needs are: identity-based firewalling, platform-agnostic application delivery, virtualised desktops, application virtualisation, a social intranet, and self-subscribed mass notification.

Acknowledgement is made of a cultural shift we need to respond to rather than keep our heads in the sand. We need to engage in the discussions around technologies we aren't comfortable with, but students live and function within.

How does the institution keep control over the data that the University is accountable for? FOIPPA in BC is pretty serious business, and the institution can be liable regardless of the actions of the individual.

A comparison was made to a K-12 situation, with recommendations from the Manitoba region:
IT must buy and manage all access points.
Move to IPv6
Understand the relationships of people to computers for function
Vendor agnostic, industry standard environment
Delineation between business network and public network, logically separate. Students have ONLY internet access.
Access control to the secure network; full NAC for business VLAN
Anyone connected to the secure network must agree to all T&C's for access.
Training and support plans
Acceptable use policy for users of public network (students)

TRU piloted giving nursing students iPads, and the use cases evolved beyond what was originally expected.

The question was raised whether these technologies are really about teaching and learning. Stephen proposed that's a moot point, as we must bridge the world of academia and the workplace of the future, and that we should be facilitators and support the evolutions in pedagogy.

It is proposed that these are the most important changes we can support for the faculty and students over the next two years; without getting on board, we alienate those we are there to build a teaching environment for.

Stephen suggested that the CIOs should not be afraid to engage in SoMe. Michael asserted that if you aren't going to participate, you should at least claim your space or it'll be claimed for you.




- Posted using BlogPress from my iPad

Location:W Cordova St,Vancouver,Canada

CANARIE's Next Mandate: The Way Forward

Speaker is Jim Ghadbane, CTO of CANARIE.

CANARIE just finalized a new 2-year funding cycle at $40M. The funding will be used to improve the effectiveness of research in Canada and accelerate the growth of Canada's ICT industry. This year a new connection between Calgary and Edmonton will be lit up; Thunder Bay to Winnipeg is also being lit up.

CANARIE runs Canada's ultra-high-bandwidth research network, with primary investment from the Government of Canada.

The strategic objectives are:
Create a world leading collaboration network
Research platform infrastructure
Stimulate ICT innovation
Demonstrate operational excellence
Evolve funding models and reduce the risk and impact of funding cycle
Bridge the gap and lower barriers between research, education, and the private sector

Research traffic growth is projected to require a tenfold increase in network bandwidth over the next two years, and the current network capacity will be exceeded by mid-2012/13. By 2017 the estimate is 526,224 TB per annum. Research traffic grew from 6.7 PB to 46.1 PB between 2007 and 2011.
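
A quick sanity check on those growth figures (my own arithmetic, not CANARIE's):

```python
# Back-of-envelope check: 6.7 PB (2007) to 46.1 PB (2011) implies roughly a
# 62% compound annual growth rate. A naive constant-growth extrapolation to
# 2017 overshoots the quoted 526,224 TB estimate, but lands in the same
# order of magnitude.
start_pb, end_pb, years = 6.7, 46.1, 4
cagr = (end_pb / start_pb) ** (1 / years) - 1
print(f"CAGR 2007-2011: {cagr:.0%}")             # ~62%
projected_2017 = end_pb * (1 + cagr) ** 6
print(f"Naive 2017 projection: {projected_2017:,.0f} PB")
```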

Commodity services used by CANARIE will be outsourced.



- Posted using BlogPress from my iPad

Location:W Cordova St,Vancouver,Canada

#BCNET_HPCS Shared IT Services for Research & Higher Ed

Mike Hrybyk, BCNET CEO discussed the expanding mandate for BCNET.

UBC, SFU, UVic, BCIT, UNBC, & TRU are the core members of BCNET, and these member institutions run the network as a consortium, a unique model in North America. The 24x7 NOC for BCNET is based at UBC.

The transit exchange in each major centre serviced is the interconnect point for public and private sectors to buy into the services of BCNET.

Having provided a background on Internet transit services at BCNET, Mike transitions us to the kinds of shared services BCNET is expanding its mandate to provision: shared data centres, cloud computing, backup services, & video conferencing.

A proposal is currently with the BC government to run three provincial shared data centres. Storage procurement and data backup services are at $250/TB per year for backup, with the ability to leverage the BCNET deal on storage hardware.

A small test cluster using Eucalyptus is currently in play to provision cloud computing infrastructure.

Bluejeans, a cloud-based MCU, is being leveraged to provide cloud-based VC services. It integrates with Skype, GoogleTalk, and HDLink. Mike indicates his feeling that this is a game-changing solution and service offering.

BCNET partnered with the Canadian Access Federation and was the first organisation in Canada to join eduroam; 36 campuses nationally, including 11 in BC, offer wireless roaming.

BCNET is considering the use of ServiceNow as a shared service desk solution across all member campuses. This may be a bigger bite than is being presented, given our experience at UBC.

BCNET intends to provision integrated HD video conferencing services, plus video storage and streaming for academic and research purposes, while managing the challenges around intellectual property rights in an inter-institutional paradigm.

Shared network management tools, fibre optic asset management, and a unified client portal are on the roadmap for BCNET.

Future service areas summarised are:
Networks
Software
Service management
Video conferencing
Elastic computing
Storage and backup

- Posted using BlogPress from my iPad

Location:SFU Harbour Centre, Vancouver BC

#BCNET_HPCS Plenary

So it's the first session, the opening plenaries for this 12th annual, three-day conference. The focus is on higher-ed shared network and computing services. We're pretty bought into this, as we use BCNET to provision our provincial video conferencing system that facilitates the distributed medical program.

Jay Black, Chairman of the BCNET board, welcomed us all and noted that this is the first year that HPCS and BCNET have partnered on this event. Jay shared that the lineup of speakers was chosen to ensure quality information for the research and IT attendees, and that the sessions are not only being recorded for future playback but also webcast in real time.

A copy of Backbone magazine was handed out to attendees; I'll be reading through that later and will provide my thoughts.

Hashtag for the event is #BCNET_HPCS.

Mike Hrybyk, CEO of BCNET and a founder of Canadian Internet services, is up to welcome us and explain some of the background for the event. Mike explained the track breakouts and how speakers are selected by working groups in the five different topic areas, ensuring that the content is focussed on information attendees will find of value.

Jill Kowalchuk, Executive Director for Compute Canada was introduced next. Jill shared some information about the involvement and support of various vendors and institutions to ensure the success of the HPCS part of the event.

The last introduction was for Joe Thompson, Acting ADM, Ministry of Advanced Education. Joe shared some of the government's perspective on the importance of BCNET for education, research, and innovation in post-secondary. Joe discussed recent conversations he has had with higher-ed technology leadership about the increasing rate of change in the demand for services in academics, and the responsiveness we as tech leaders in higher ed must have to these demands. Joe announced a suite of online tools launched recently to support students across BC.

Stephen Wheat, GM of Intel's HPC business is the final introductory speaker before our keynote speaker, and shared some industry perspectives on the conference themes. Stephen proposed that HPC is on the verge of commoditisation, that a balance between task complexity and user accessibility is coming. The road to this state aligns with the connect, compute, and collaborate theme of this conference.

The keynote speaker, Leonard Brody, was introduced. Leonard's theme is "This Monumental Shift." Leonard tells us his job is to look three to five years out to see trends, which is ironic since the last time I saw him speak, two months ago, he said there's no point looking forward more than 365 days. And then the lightbulb goes on for me as he launches into effectively the same presentation I saw from him two months ago at a leadership session.

Leonard states we are at a conjunction of four major changes in civilisation: economic, environmental, technological, and generational. To understand these changes and be prepared, we need to understand the historical context of how we got here, and the impacts and drivers of human behaviour. Humans are changing faster now than they ever did, and need a compass and roadmap for where we are going in the next year, and leadership that understands all these factors.

Leonard poses that we should consider why the Internet matters. As a historical reference, he points to the uniting of the US West and East via the railway, and how the US shifted in just over a decade to become a world economic leader.

All significant movements in media throughout history were restricted by cost and government intervention. Until the Internet. This is a paradigm shift that may preclude using the past to predict the next shifts in technology and sociology.

Sidney Crosby in 2010 is compared to Paul Henderson in 1972: 3.5 million status updates on Facebook in 30 minutes when Sid scored the "Golden Goal." The planet has moved to a level of interconnectivity unprecedented in the rate of social and technological change. The point is that we must pay attention to cycles, natural and man-made, as they are shrinking and will impact us and the world around us.

Our physical & virtual lives are at a confluence, and key markers that illustrate how we are changing are trust, relationships, memory, brain physiology/use, political governance, and the differentiation between the markers in our physical & virtual lives.

In the end, it was actually great that I had the opportunity to hear Leonard's presentation again, as I took different ideas from it. I'll also be looking to watch the movie "Waiting for Superman" that Leonard recommends.

- Posted using BlogPress from my iPad

Location:SFU Harbour Centre, Vancouver

Friday, February 17, 2012

Policing in the Digital Revolution

Session 13 - Keynote Speaker

Dale McFee, President, Canadian Association of Chiefs of Police


The police of Canada are active players in the digital revolution.
There are four goals here for the CACP:
Quality of service
The public's right and responsibility to participate
Innovative solutions to crime and public order
Community partnerships

The CACP is not a privacy advocate, but a very interested party in access to information in alignment with those four goals.

Four areas of the CACP relate to privacy and security:
Counter terrorism and national security
Electronic crime
Emergency management and informatics
Law amendments committee

"... The police are the public and the public are the police."
Sir Robert Peel was quoted to make the point that policing the electronic domain is not exclusively the domain of the police; the public must contribute and participate. Case in point is the widespread use of portable digital recording devices such as phones and cameras.

Digitally recorded information from the public shifted the perspective of the police as to who the instigators of the Stanley Cup riots were.
Hackers recently outed neo-Nazi groups in Canada to the RCMP and prevented hate crimes.

Court acceptance of digital evidence varies by region and by individual judge, as the law is vague and open to interpretation. Police will continue to test the bounds of digital privacy in the interest of keeping the peace and preventing crime, while respecting the Canadian Charter of Rights and Freedoms. That document is silent on the right to privacy, or at best, vague.

Personal information (PI) is not defined in the Criminal Code of Canada.

Information is claimed as the lifeblood of policing, so the plea is for access to the information desired. The police are asking for checks and balances but not roadblocks. The need for privacy is acknowledged, but the need for information is vital. Lawful access debates have been going on for 10 years, and the police are asking for a balance between privacy and safety.




- Posted using BlogPress from my iPad

Location:13th Privacy and Security Conference

Privacy, accountability and the digital revolution

Luncheon Keynote Address
(Salon AB)

Elizabeth Denham, Privacy and Information Commissioner of British Columbia

Privacy, accountability and the digital revolution
Just as the computer revolutionized how we work and the internet revolutionized how we connect with people, we must revolutionize the way we think about privacy in today’s digitized world.
Join B.C.’s Information and Privacy Commissioner Elizabeth Denham for an engaging discussion about how we fuse privacy with technology as the digital revolution unfolds, including case examples and practical tools to help organizations demonstrate their compliance with B.C.’s privacy laws.

This year marks the 20th anniversary of our privacy legislation. No one predicted how technology would transform our lives: in June 1993 there were 130 websites, Mosaic was brand new as a graphical browser, the Apple Newton had just been released, and the US White House had two email addresses.

Today 1/3 of the world population is online, and the number of people seeking to mine the data of our online transactions is growing rapidly.

Privacy is not an add-on or upgrade, nor is it a lens applied to data moving across borders. Privacy must be part of an organisation's DNA.

The encouragement is for us all to approach privacy proactively, not reactively. Last year, her team was split into an investigatory group and a development team that looks forward, guiding organisations and individuals and helping the Office be proactive.

The topic of smart meters was discussed; the investigation led to the discovery that BC Hydro did not provide its customers with adequate notice of its intent. The question was not only whether Hydro is complying with privacy laws, but whether it can manage the data it collects. BC Hydro is complying with all 13 recommendations.

The playoff riot is the next topic, and ICBC's offer to leverage facial recognition to identify rioters. This led to a realisation that most BC citizens did not know ICBC had and used this technology. The data-matching offer was denied because it did not align with the original intended use of the technology. ICBC's data and privacy management program was subsequently reviewed, and recommendations have been reported.

Both of these show the value of strong data governance.

BC's movement into identity management (IDM) is exciting in our national leadership, but the people, policies, and practices that ensure privacy is baked in have been, and continue to be, essential to this effort.

Tools are becoming available to all organisations for the development of privacy policies and incident response. This relates to the workshop I attended Wednesday morning. In all situations, the bottom line for privacy protection is accountability.

An accountability tool is announced: "Getting Accountability Right for Privacy Management Frameworks" is a document that will be publicly available in two to three weeks.

Bill C-30 combines previous bills that failed to pass the House. Police and other authorities are granted access to private information with much lower thresholds of access control than ever before; Canadians' fundamental rights to privacy and confidentiality are at risk, and concerns about lawful access need to be brought to your MPs. Elizabeth clarified that she has concerns about the bill as it stands, and that should be a consideration for us all.


- Posted using BlogPress from my iPad

Location:13th Privacy & Security Conference

Cyber Security Panel Debate

Panel B: Cyber Security
(Theatre)

Privacy and security are truly symbiotic, yet because each has its own focus and proponents, there is often contention. This esteemed panel of experts will work towards ending some of that conflict. We will begin with a simple question: What are the top 3 things that security experts can offer the privacy sector that have not yet been adopted or integrated? Why are they so important and how can they benefit the goals of privacy professionals? In a PowerPoint free setting, this issues-oriented panel is designed to be highly interactive, encouraging audience questions and spirited debate so attendees come away with new insights and approaches.

Moderator: Winn Schwartau, President, Interpact Inc. Author of Information Warfare, Cyber Shock, Time Based Security & Internet & Computer Ethics for Kids
Speakers:
1. John Engels, Group Product Manager, Enterprise Mobility Group, Symantec
2. Robert Dick, Director General, National Cyber Security Directorate
3. Steve Hutchens, Director, Global Government Industry, HP
4. Paul Laurent, Public Sector Director of Cybersecurity Strategy, Oracle Canada
5. Eddie Schwartz, Chief Security Officer, RSA

Our moderator starts with a position on the critical infrastructure interdependencies between nation states, and the related privacy issues.

Robert rebuts the moderator's proposal that the US invade Canada to protect power reserves with a reference to the 100th anniversary of the War of 1812.
Robert moves on to note the seriousness of command and control infrastructure and its protection, in addition to protecting Canadian citizens' privacy. The solution proposed is not to go it alone as a nation state, but to partner wisely to protect national security. Suspects are nation-state actors as well as private criminal organisations, and failures in infrastructure that may be outside the direct control of Ottawa require clarity of communication between business and government, not draconian government actions. Debate on these topics to find collaborative opportunities is encouraged. We need to understand where the lines of responsibility are drawn between the public and private sectors for protecting against risk to all the infrastructure that supports the functioning of our nation.

It is proposed that 70-80% of successful attacks can be defended against by proper infrastructure maintenance (patch management, security controls, audit, etc.), but there is a small but vital percentage of very determined and well-backed attackers against whom there is no easy defence, so we need a capable and prepared response.

John spoke to the risk mobility poses, not just to the PI of average citizens, but to those in positions of power and leadership in industry and government; consider the risk of the bad guys knowing where the PM's kids are or will be.
There is also a need to manage and secure not only what information is taken, but what information leaks due to unaware consumers of mobile platforms using the technology improperly. Tools and applications are great, but awareness and education are core. John claims that as an industry we must be more advanced in how we manage mobile devices and the data that moves back and forth to them; an auto-delete button at central control is great, but not an ideal solution for the consumer.

Steve brings a different perspective, stating that the soft part of IT security is policy, which must be kept in the context of the need to use or share information in a crisis to maximise the well-being of citizens. Understand who your customers and consumers are, who might want to obtain that information, and why; Steve considers this the root of risk analysis and management. Balance all of this with appropriate access to the information for the right people at the right time, and be prepared to do this with minimal interference in a critical situation. Steve cites the example of physicians bypassing network security for ease of access when working remotely from the site where the EMR systems are, and argues that our policies must bridge the need for access with the need for privacy. Steve proposed the concept of "secret shoppers": employees who will share their feedback on the security of the operational infrastructure and the availability of the information they need.

Paul feels that data classification is the starting point for calculating risk, as you must know what you have before you determine how best to protect it. The extension nationally is how much effort we should place on critical infrastructure versus how much we protect the civil liberties of Canadians. Paul states that in the privacy discussion, the people involved should be outward facing, as public trust is at the core.

Eddie has three points to share, considering security from a perspective of control and visibility.
The first point is that security is broken. Investing more in technology doesn't really move the security level higher. The prevention game is a game of catch-up, but detection and response is a far more useful place to invest. Step back and ask: what do I have today that was relevant 10 years ago, and what is relevant today? Rethink information security.
Second, if we think changes are needed in the doctrine of security management, make them. How do we usefully measure our risk level? Almost all metrics available are arbitrary and don't consider all assets at their relative values to the organisation. Eddie cites the recent RSA breach and asks what the actual objective was. What are your high-value assets to you, to your customers, and to the attackers? What is your ability to collaborate outside your organisation in response and in preparation? What is your ability to take what you learn about an adversary, or the value of your assets, and apply that knowledge dynamically to improve your security stance?
This segues to the third: how do you evaluate your performance metrics? Rate yourself on your effectiveness and continue to move that bar. We can't have compliance be the driver for security and privacy programs; we have to get security right first.

The topic of graceful degradation was brought up by Winn: how much can we consider shooting back as a mechanism of protection? The answer proposed is layers and segregation as a defence concept. Adaptive network defence is also brought up, but that risks creating your own DoS on yourself. The rush to shut down, re-image, and take other reactive actions is a risk to your business continuity; you need to understand the attack vector and respond accordingly to balance protection and service delivery. Paul brings up a really valid point, which is "what does normal look like?" as a necessary understanding of our own enterprises, so that we can not only detect an incident but understand its scope and impact and assess the correct response.





- Posted using BlogPress from my iPad

Location:13th Privacy & Security Conference

The Elements of a Data Governance Program: People, Practices, Policies and Technology

Joe Alhadeff, Vice President for Global Public Policy, Chief Privacy Officer, Oracle Corporation
(Theatre)
The Elements of a Data Governance Program: People, Practices, Policies and Technology
This keynote will focus on the evolving needs of organizational governance and accountability. Governance and accountability are multifaceted concepts that must be applied in ways that are accessible to the individual, credible at the level of the organization and extensible across the ecosystem. The elements of such a program are based in organizational policies and processes, the technology that supports them and people that oversee and implement them. Today’s accountability and governance program must be developed collaboratively across disciplines to assure that each element supports and underpins the other. Where technology may have limitations to secure data beyond the transaction; policies, processes and contracts may supplement. Technology may support policies and processes through identity management, rights allocation, audit and other tools. When all of these elements function together the whole is greater than the sum of its parts. As part of this keynote we will also consider trends in Canadian law and practice as well as specific applications of technology in identity and privilege management

Global data flows and big data can be "something really cool and marvellous that happens when you get enough data together" or they can be Big Brother.

Privacy questions span generations, but change as they do; again, theme of the continuously moving target of privacy definitions and requirements that legal bodies are continually playing catch-up with.

"Canada has the PhD on accountability" when it comes to privacy leadership worldwide. We are moving from a compliance of objects to an accountability and governance approach.

At the core of privacy and data management, we are tasked with getting the right data to the right people at the right time. This is reflective of the Wednesday morning workshop I attended at the conference.

Reference made to the TAS3 project in the EU: Trusted Architecture for Securely Shared Services. This is a public-private partnership (PPP) project where technology, governance, law, and policy were co-developed in support of privacy and security. Technology assures the first hop, but law and policy fill the ecosystem and value-chain gaps.

A visual is shared: a sign from Quebec that states "Fair-play, SVP" (fair play, please). Being prepared means being a good neighbour and playing fair, and successful preparation for information management involves:
Stewardship of information
Transparency
Controls
Proof/audit/testing
Information lifecycle
Training
Learning organisation

We are encouraged to look at compliance as an opportunity; privacy impact assessments must be user friendly to be valuable. Make it an opportunity to learn, and to teach. Security and privacy are visualised as a Venn diagram, and we want to operate in the sweet spot, which is compliance, and which optimises operational costs in the long term. Have the back end understand compliance, and governance bodies understand security.






- Posted using BlogPress from my iPad

Location:13th Privacy & Security Conference

Know Your Enemy: Understanding the Threat Landscape, Challenges, and Best Practices

Cheri F. McGuire, Vice President, Global Government Affairs & Cybersecurity Policy, Symantec Corporation
Know Your Enemy: Understanding the Threat Landscape, Challenges, and Best Practices
Sensitive information is under attack from a wide variety of sources, including well-meaning insiders, organized crime rings, nation states and advanced persistent threats (APTs). The private and public sectors are facing a changing information technology landscape that sees more information stored on smart phones, tablets and cloud services. The speaker will discuss the current global threat landscape, identify key security challenges, and apply critical best practices and solutions to protect your environment.

Key trends and security drivers
1. Sophisticated attacks
97% of 2009 breaches used customised malware
75% of enterprises reported a cyber attack
2. Complex and changing infrastructure
More than 1B mobile devices connected to the Internet
Cloud computing expected to double by 2014, enterprise architectures at greater risk
3. Information explosion
Corporate info grows by 66% each year
4. Consumerisation of IT
BYOD, telecommuting, and the opening of corporate and public service networks to greater risks

Trends changing the threat landscape
1. Moving from a signature model to a reputational model
2. Desktop to mobile
3. Physical to virtual

Security approaches must move from being system-centric to information-centric to adapt and protect.
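
To make the signature-to-reputation contrast concrete: instead of matching a known-bad signature, a reputational model scores an object by context, such as how widely and how long it has been seen and whether it is signed. The toy sketch below is my own invention with made-up thresholds, not Symantec's actual algorithm:

```python
# Toy illustration of reputation-based scoring: rather than matching a
# known-bad signature, score a file by how widely and how long it has
# been seen in the wild. All thresholds and weights are invented.
def reputation_score(prevalence: int, age_days: int, signed: bool) -> float:
    """Higher score = more trustworthy (0.0 to 1.0)."""
    score = 0.0
    score += min(prevalence / 10_000, 1.0) * 0.5   # widely seen files earn trust
    score += min(age_days / 365, 1.0) * 0.3        # long-lived files earn trust
    score += 0.2 if signed else 0.0                # a valid signature earns trust
    return score

def verdict(score: float) -> str:
    return "allow" if score >= 0.5 else "quarantine for analysis"

# A rare, brand-new, unsigned file is suspicious even with no known signature
print(verdict(reputation_score(prevalence=3, age_days=1, signed=False)))
```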

Threat landscape trends, as noted in a report to be published in two months:
1. Targeted attacks continue to evolve
2. Social networking leveraged via social engineering
3. Hide and seek: 0-day vulns and rootkits
4. Attack kits are becoming more easily leveraged and accessed and complexity of attack is simplified in delivery
5. Mobile threats are increasing dramatically, as the PI on mobile devices is a high value target

Symantec proposes that hacking remains the highest-impact breach type, with an average cost to resolve of $7.2M. I think these numbers are inflated by a small number of high-profile attacks, and think that insider attacks deserve far more attention. This smells of marketing scare tactics to sell security tools.

Mobile devices are noted as being primarily subject to trojans as the preferred attack vector, and often these are tied into social media avenues to gain access to PII and PCI data; this I agree with.

Critical infrastructure (SCADA) attacks are cited by Symantec as an increasing risk area. In reality these have always been high risk; it's simply that awareness of them has increased, I would suggest.

Device management, device security, content security, and identity & access are the defences against mobile threats proposed by Symantec. I wonder if they sell any products that do this? Yes, that was sarcasm.

The bottom line was to present a layered and clear security technology approach, with which I can agree, but I would place an increased focus, in parallel with the technologies, on building both awareness and governance.

And at the tail end of the discussion, Cheri comes to plans and policies, so now we are in agreement. She suggests we start with governance: policies and plans socialised and established in the enterprise, including security requirements built into acquisition contracts, buying from trusted sources, effective backup and recovery plans, and support for setting and enforcing security policies from the top of the organisation.

Cloudsecurityalliance.org, onlinetrustalliance.org, SAFECode.org are cited as useful sources for preparedness and practice planning.

The suggestion came for collaboration between the public and private sectors to increase visibility, adaptability, and optimisation of plans, policies, and preparedness.


- Posted using BlogPress from my iPad

PII & the Law

Session 9 – Keynote Speaker

Daniel J. Solove, Professor of Law, George Washington University Law School
and Paul Schwartz, Professor of Law at the University of California, Berkeley School of Law.

Personally identifiable information (PII) is one of the most central concepts in information privacy regulation. The scope of privacy laws typically turns on whether PII is involved. The basic assumption behind the applicable laws is that if PII is not involved, then there can be no privacy harm. At the same time, there is no uniform definition of PII in information privacy law. Moreover, computer science has shown that the very concept of PII can be highly malleable. Because PII defines the scope of so much privacy regulation, the concept of PII must be rethought. Professors Paul Schwartz (Berkeley Law School) and Daniel Solove (George Washington University Law School) will argue that PII cannot be abandoned; the concept is essential as a way to define regulatory boundaries. Instead, they will propose a new conception of PII, one that will be far more effective than current approaches.

Daniel is the founder of the organisation TeachPrivacy.

They introduced themselves as the Bert & Ernie of the privacy world.

Technology changes the meaning of PII; it is a moving target. It is a central concept in privacy law, and is often the trigger for when privacy law applies. Unfortunately, there is no consistent definition of, or approach to, PII in the law.

The three approaches to PII in the US

Tautological approach
PII is information that identifies a person. This is not particularly useful, as it is circular logic. Then there is the aspect of identified versus identifiable: the burden of proof is upon the claimant to show that the information clearly identified a person, not merely that it risks identifying one.

Non-public approach
The problem here is that there is no clear definition of what non-public actually means. There is a huge grey area, and this becomes an ineffective trigger.

Specific-types approach
This is a rule, as opposed to the prior two standards. It attempts to enumerate the specific PII types and list them; the US children's privacy statute does this. The problem is that this is a static and inflexible approach applied to a moving target, and many of these statutes become under-inclusive when it comes to information that actually could identify a person.

PIPEDA uses the term "identifiable data" and is fairly broad in its application to PII. The problem becomes less the definition and more the approach, which is all-or-nothing under Canadian legislation. This is reflective of EU legislation.

Problems of de-identification.
Case in point is the Netflix dataset, where supposedly anonymous data was readily re-identified by a third-party research group by cross-correlating it against data publicly available on IMDb.

We are seeing more and more data about people out there, and the ability to link it up to create correlations is becoming easier. The more information there is about you on the Internet, the harder it is to remain anonymous. The claim is that the combination of a zip code, birthdate, and gender can identify 80% of the US population; my seatmate, a seasoned privacy expert, quietly calls BS on that claim at our table.
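
Whatever the true percentage, the mechanics of such linkage attacks are simple. Here is a toy pandas sketch (with entirely made-up data) of how a join on a few quasi-identifiers re-attaches names to "anonymous" records:

```python
import pandas as pd

# "De-identified" dataset: direct identifiers removed, quasi-identifiers kept
deidentified = pd.DataFrame({
    "zip":       ["V6T1Z4", "V8W2Y2"],
    "birthdate": ["1970-03-01", "1982-11-15"],
    "gender":    ["F", "M"],
    "diagnosis": ["asthma", "diabetes"],
})

# Public dataset containing names alongside the same quasi-identifiers
public = pd.DataFrame({
    "name":      ["Alice Example", "Bob Example"],
    "zip":       ["V6T1Z4", "V8W2Y2"],
    "birthdate": ["1970-03-01", "1982-11-15"],
    "gender":    ["F", "M"],
})

# A simple join on the quasi-identifiers re-attaches names to "anonymous" rows
reidentified = deidentified.merge(public, on=["zip", "birthdate", "gender"])
print(reidentified[["name", "diagnosis"]])
```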

The scholars provide us with a spectrum of risk of identification based on their theory.

A U of Colorado professor is quoted comparing PII to a game of whack-a-mole, proposing that we should instead regulate the flow of information. However, without some concept of PII, privacy law would have to regulate all data, not just the sensitive data.

Google Flu Trends is cited as an example of the use of de-identified PII in the medical field as a public service.

PII 2.0 is the proposed solution to these dilemmas, based on three tenets:
Identifiability is a continuum of risk.
The approach should be a standard, not a rule.
Privacy should not be a hard on/off switch, but a tailored solution.

There are three categories of PII in this theory, moving from the current two (a sketch of how this might be encoded follows the list).
Identified - a specific person has been ascertained, and the information must be protected; this includes identifiable data where a significant probability of linkage to a specific person exists.
Identifiable - specific identification is possible but has not yet occurred; this data must also be protected and audited.
Non-identifiable - only a remote risk of identification exists, so the need for protection of the data is minor.
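
As referenced above, here is a minimal sketch of how an organisation might encode the three categories and their handling obligations; the mapping is my own reading of the proposal, not the professors' specification:

```python
from enum import Enum

class PIICategory(Enum):
    IDENTIFIED = "identified"              # a specific person has been ascertained
    IDENTIFIABLE = "identifiable"          # identification possible, not yet occurred
    NON_IDENTIFIABLE = "non_identifiable"  # only a remote risk of identification

# Hypothetical obligations per category, following the continuum-of-risk idea
# rather than the current all-or-nothing switch.
OBLIGATIONS = {
    PIICategory.IDENTIFIED:       {"protect": True,  "track_and_audit": True},
    PIICategory.IDENTIFIABLE:     {"protect": True,  "track_and_audit": True},
    PIICategory.NON_IDENTIFIABLE: {"protect": False, "track_and_audit": False},
}

def obligations_for(category: PIICategory) -> dict:
    """Look up the handling obligations for data in the given category."""
    return OBLIGATIONS[category]

print(obligations_for(PIICategory.IDENTIFIABLE))
```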

The speakers cite the dangers of the "release and forget" approach, and agree that there is a need for a track and audit approach coupled with risk assessments for identified and identifiable data.

This approach is compatible with the methodology of privacy by design, embedding privacy constraints and models into technological design and business practices.

Summing up, the presenters state that there is still great legal uncertainty about the concept of PII on a world-wide basis, and it is hard to predict the impact of privacy law on business, and therefore it is a source of business risk.

In the end, the PII 2.0 concept is about the taxonomy of PII, intended to help organisations to understand if they are subject to privacy laws or not per geo-political boundaries and constraints.

One delegate challenged that creating these categories in a vacuum from practical application is of limited value. The response was that the first two categories put the onus on the regulatory regimes and business to be responsible about how they classify data.

Questions were raised on the practicality of data moving from one category to another over time, and how this could be managed for track-and-audit purposes. The response was that de-identification should be the rule, not an option, for organisations holding data that is to be published. I'm uncertain this really answered the question.

The discussion was fairly esoteric, and likely provides something of use within legal circles, but moderate to low value in practical application in the technology world until legislation applies clearer boundaries to the PII containers, which is what these gents are trying to encourage.




- Posted using BlogPress from my iPad

Location:13th Privacy & Security Conference

Thursday, February 16, 2012

20 in 2012: The Top Privacy Issues to Watch

Trevor Hughes, President and CEO, International Association of Privacy Professionals
(Salon AB)

20 in 2012: The Top Privacy Issues to Watch
Privacy has long been an important part of any information protection program; however, new potential laws and shifts in the landscape are creating new challenges and business imperatives for privacy, security, IT and legal professionals. Organizations and companies are under more pressure than ever to develop and explain strong privacy practices. From calls for a "Do Not Track" tool to requiring concepts of Privacy by Design and potential new data breach notification rules, there are many new priorities to consider. J. Trevor Hughes, president and CEO of the world's largest association of privacy professionals, will cover the top privacy policy and technical developments to watch in the coming year.

EU proposed regulations
First major review of the existing regulations.
The first aspect of this is the right to be forgotten, including data portability: the ability to take your data from a vendor and move it with you. This is a challenge to implement. There is also the right to delete, or expunge, your data from any place it might be stored.
Europe is looking for streamlined jurisdiction; the European main site of your business will be authoritative for the regulations you must comply with.

E-Privacy Directive
This EU directive (the "cookie directive") says that if you set or read information on a client device, you need to get consent to do so. This will be untenable for the end user with today's web-browsing technologies. The regulators are still debating what this will actually mean; browser controls permitting cookies may be the loophole.

FTC Staff Report
The US has been struggling with its privacy regulator, and an analysis of privacy issues (including but not limited to online privacy) has resulted in a draft framework report. It should be released in the next six weeks, and is expected to include the idea of operational privacy: privacy becomes a business concern, baked into business controls in each enterprise or organisation. This accepts that there are implied-consent items within the boundaries of reasonable privacy expectations between the consumer and the enterprise.

Do Not Track is hugely accepted: switching off online tracking is becoming an option for all browsers. Browser manufacturers are already on this, and we will see more of it available later this year.
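
Mechanically, Do Not Track is just a request header (DNT: 1) that browsers send and servers choose to honour. A minimal sketch of a server-side check:

```python
# Minimal sketch of honouring the Do Not Track request header on the server.
from http.server import BaseHTTPRequestHandler, HTTPServer

class DNTAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Browsers with Do Not Track enabled send the header "DNT: 1"
        tracking_allowed = self.headers.get("DNT") != "1"
        body = ("tracking enabled" if tracking_allowed else "tracking disabled").encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), DNTAwareHandler).serve_forever()
```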

The FTC accepts that there is a new type of data called consumer data: data that relates to a particular consumer but is not identifiable. The definition of this will be in the paper.

The US Dept of Commerce has a white paper report coming (called a green paper until it is released in six weeks) and is playing chicken with the FTC over who will release first. The Obama administration is willing to consider a privacy bill of rights, and recognises that law cannot answer every question; therefore industry needs a code of conduct.

Notice of security breach has been catching on like wildfire since it started in California. The current state provides a patchwork quilt of responses because each state's legislation is unique, and industry is pushing for a standardised approach to simplify compliance. The strong aspect of this policy is that it is consequential rather than prescriptive, and has therefore increased the use of encryption, for example.
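
The consequential design is easy to see in miniature: notification turns on the outcome (was unencrypted personal data exposed?) rather than prescribing controls, which is what nudges organisations toward encryption. A toy sketch, simplified well beyond any actual statute:

```python
# Toy sketch of the "consequential, not prescriptive" idea behind
# California-style breach notification laws. Real statutes have many
# more conditions; this is for illustration only.
def must_notify(records_exposed: bool, data_encrypted: bool) -> bool:
    """Notification hinges on whether unencrypted personal data was exposed."""
    return records_exposed and not data_encrypted

print(must_notify(records_exposed=True, data_encrypted=True))   # False: encryption safe harbour
print(must_notify(records_exposed=True, data_encrypted=False))  # True: notification required
```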

Art called FaceOff, by an Italian artist, illustrates the layers of persona that social media encourages of the populace.

Facebook's IPO filing listed privacy more times than any other risk, showing that the social media giants recognise the risks, but they aren't yet really doing anything, because we are not voting with our fingers.

Online behavioural advertising is where, via cookies, you are tracked across sites for your interests and behaviours. Self-regulatory efforts are starting to see some traction, and the Digital Advertising Alliance is starting to show some maturity.
Consumers value privacy, but we have trouble setting that value at more than 50 cents off a cheeseburger.

Mobile devices, and the privacy considerations for mobile apps: industry must accept and respect privacy, because people are beginning to vote with their fingers, and it is easy to delete an app that violates our trust.

Geo-data sensitivity is an awareness that is growing within the consumer marketplace. Most devices that deliver your geo-data to other parties do so without the knowledge of the device user.

Cloud computing continues to be a controversial topic, because the information economy knows no jurisdictional boundaries. The jurisdictional issues are not de facto compatible with the data-transfer and privacy expectations needed to make functional use of the cloud concept.

Emerging markets are introducing privacy laws: Mexico, Brazil, Argentina, and India are all creating privacy laws that face outwards more than inwards, to protect the outsourced business processing industry.

Regulatory risk is where the rubber hits the road for privacy and security. Regulators around the world are seeking and obtaining more powers than they have ever had to enforce data protection. The FTC is becoming more aggressive in going after privacy violating organisations.

Class action risk also grows: Netflix settled for $9M in the US this week over its data collection practices. The barrier has been the issue of harm, but a number of judges are starting to show a willingness to let cases progress to the point where a settlement occurs. Watch the US market and the reactions to these lawsuits.

Brand risk is more amorphous, but it is growing in awareness, as most major publications are establishing beat reporters specifically for privacy topics. As many as 500 stories per day are published globally with respect to privacy issues, so the brand risk is growing as media slavers for the next big story.

Privacy by design and by default is necessary because of these risks. Privacy cannot be an option, or an afterthought placed on the infrastructure to hold responsibility for.

Accountability is necessary through metrics, audits, controls, and generally taking information and managing the data in your enterprise seriously.

Everyone is talking about big data because it is solidly in place, and every role dealing with big data is in some respect a privacy role. Privacy needs to have complete oversight over big data collection, storage, use, and management; big data is driving big jobs that require privacy knowledge and awareness.

We are all privacy professionals: if you touch data, information security, or systems that touch data, you need to understand privacy well enough to react correctly when an issue arises.

Stay aware, track the EU framework, FTC report, and the risk environment. Build privacy before launch, operationalise privacy into your organisation. Build response plans, and train your organisation.


- Posted using BlogPress from my iPad

Location:13th Privacy & Security Conference