UBC's CIO Oliver Grüter-Andrews emceed the event this year, and began by introducing the chair of BCNET, Michael Hrybyk. Michael noted that this is the 13th annual event, with over 500 registrants. We were also given some key growth metrics this year:
1. All public post-secondary institutions in BC are now members of BCNET
2. The BC Libraries Cooperative has just joined BCNET
As an interesting note for those who might wish to speak at BCNET, there is no CFP (Call for Papers) for the conference; the BCNET sub-committees each get a slot and determine the topics and speakers from within their groups.
Lastly, Michael announced that the Data Safe service hosted at TRU is now available to all BCNET members.
With that, the keynote speaker, Jer Thorp was introduced. His topic was "Making Data More Human."
Jer starts by asking us to consider: "What is the human experience of technology; what is the subjective experience we are increasingly facing?"
We're told that at any given time, there are more than a million people in the air. Jer then shared an animated visualisation of the global air traffic system that impressively put that number in context, helping us grasp a ridiculously large quantity in terms of things we can easily picture.
Jer recommends reading the book "Infinite Jest" by David Foster Wallace.
We're given a definition of data: measurements of something. Every datum contains an act of measurement, and data remains continually tethered to the something that was measured.
Example: the Kepler satellite continually returns digital photographs of the Cygnus–Lyra region. Watching these images for a planetary transit is akin to watching a light bulb 20 km away to see if a mosquito flies in front of the bulb. Pleiades is the supercomputer crunching this data.
~4,000 potential orbiting planets have been identified. The key consideration is how to represent this data in a meaningful way.
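The transit idea above can be sketched in a few lines. This is a toy illustration, not the actual Kepler pipeline: a transit shows up as a small periodic dip in a star's brightness, so at its simplest, detection means flagging samples where the flux drops below a fraction of the baseline. The light curve and threshold here are invented for illustration.

```python
def find_dips(flux, threshold=0.99):
    """Return indices where brightness drops below a fraction of the median."""
    median = sorted(flux)[len(flux) // 2]
    return [i for i, f in enumerate(flux) if f < threshold * median]

# Simulated light curve: steady brightness, with a planet blocking
# ~1.5% of the star's light at samples 40-42 (the "mosquito").
flux = [1.0] * 100
for i in (40, 41, 42):
    flux[i] = 0.985

print(find_dips(flux))  # -> [40, 41, 42]
```

Real pipelines must separate such dips from noise and instrument artifacts across billions of samples, which is why a supercomputer is involved at all.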
Data has character; every data set has a unique character.
How we can visualise that data is related to its character.
Jer shifts gears and talks about manipulating visualisations of data in three and four dimensions. Consider the "Minority Report"-style interface: Oblong is the company that built the Minority Report interface, and it is actively in use today. It's not yet as widely accepted as it could be, although it now has collaborative features that let people grab the data relevant to them, pull it out, and work with it without affecting the whole.
"Collaborative systems usually don't work because one person is driving, and the other people are jerks."
Use the Ooh/Aah methodology: draw them in with the Ooh factor, and keep them interested with the Aah.
The patterns of people's lives are highly predictable, given the data available for tracking their movements via cell phone location or social media postings.
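To see why movement data is so predictive, consider that for most people a trivial "most frequent place at this hour" baseline is right most of the time. This is a hedged sketch of that idea; the observation log and place names are invented for illustration.

```python
from collections import Counter

# Invented (hour, place) observations over several days of phone tracking.
log = [
    (9, "office"), (9, "office"), (9, "office"), (9, "cafe"),
    (20, "home"), (20, "home"), (20, "home"), (20, "gym"),
]

def top_place_by_hour(log):
    """Predict the most frequently seen place for each hour of the day."""
    by_hour = {}
    for hour, place in log:
        by_hour.setdefault(hour, Counter())[place] += 1
    return {hour: counts.most_common(1)[0][0] for hour, counts in by_hour.items()}

model = top_place_by_hour(log)
hits = sum(model[h] == p for h, p in log)
print(model)            # {9: 'office', 20: 'home'}
print(hits / len(log))  # 0.75 -- the naive baseline is already right 75% of the time
```

Even this crude model captures most of a routine; richer models on real cell-tower data do far better, which is the point of the keynote's warning.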
How can we model how people are sharing data on the web?
Dramatic and effective examples were given of how people's data can be used without them being aware ("opportunistic sensing"). OpenPaths is an open project for your phone that sends your location data somewhere you and others can see it, which gives people the experience of data ownership and first-party access to their data.
People who generate data should have access to that data.
It is a reality that cellular phones have become the virtual biographer of our lives.
If we remind ourselves of our personal relationship with our data, we will take conversations with those who want that data more seriously.
We should always ask ourselves "What experiences have we had that resulted in this data being produced?"
Bringing data into public spaces removes the "choice" not to view it.
Distant reading is the new paradigm for data analysis: new systems bring us to the idea of distant reading, where viewing our data at a higher level, from a further distance, lets new patterns emerge.
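The idea of distant reading can be sketched concretely: instead of reading each text closely, aggregate across the whole collection and look at the pattern that only appears from a distance. The tiny corpus below is invented for illustration.

```python
from collections import Counter

# An invented mini-corpus; each string stands in for a whole document.
corpus = [
    "data has character",
    "every data set has unique character",
    "character emerges at a distance",
]

# Close reading looks at one text; distant reading counts across all of them.
counts = Counter(word for text in corpus for word in text.split())
print(counts.most_common(1))  # -> [('character', 3)]
```

At corpus scale, the same aggregation over millions of documents surfaces trends no individual reading could reveal.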
Examples cited are:
3 things we need to be considering
1. Data ethics - we need this conversation outside of just privacy
2. Data ownership - the coming central issue over data
3. Data possibilities - rapid change gives us the opportunity to consider where data usage might be in 5 years
Jer's primary tool for data visualisation is "Processing," open-source visualisation software that originated at the MIT Media Lab.