Sound and Vision was the theme for the latest Multimedia and Information Technology (MmIT) national conference, run by myself and colleagues from the MmIT Committee and hosted at The Edge in Endcliffe Village on the 11th and 12th of September. The MmIT Group is a special interest group of The Chartered Institute of Library and Information Professionals and, as the name suggests, focuses on topics around multimedia and technology.
This year’s conference focused simply on ‘Sound & Vision’ and hosted a diverse selection of high-quality talks on everything from augmented reality to building sound and vision archives.
The conference ran over two days and began with the traditional welcome by MmIT Chair Leo Appleton. Leo then introduced The University of Sheffield’s new Pro-Vice-Chancellor for Learning & Teaching, Professor Anne Peat. Anne spoke about the various ways the University was working to roll out new platforms for delivering learning and sharing research, from iTunes U to MOOCs. The iTunes U theme continued as The University of Sheffield’s Senior Learning Technologist, Dr Graham McElearney, delivered the first plenary of the conference, explaining the motives for, and benefits of, creating and hosting academic content on Apple’s education sharing platform. Graham gave evidence of the far-reaching impact podcasts and videos can have when hosted on a platform that is available in parts of the world where others are not, and shared impressive usage and download statistics. He then opened up the presentation to group discussion, asking delegates how they could apply something like iTunes U in their own organisations.
In the afternoon, four workshops were run. The first, delivered by Helen Fitton, was a very useful session on Box of Broadcasts, which allows users to record any TV programme from the last 30 days across more than 60 channels, create clips and compilations, and embed them in their teaching materials. Meanwhile, in the other room, Penny Andrews showcased the brilliant LibraryBox, an inventive private wireless hub for hosting all kinds of media. By connecting to the wi-fi signal generated by LibraryBox, users can browse the files hosted on the USB stick plugged into it, stream films, and read and save documents, amongst other uses. LibraryBox has real potential for on-the-fly teaching, conferences, working in poor or rural areas and much, much more; we’ll certainly look to invest in one for ScHARR.
Later there were two parallel sessions looking at augmented reality: one from Peter Beaumont of Edge Hill University, and one from Farzana Latif and Pete Mella of the Learning Technologies Team at the University of Sheffield. Both sessions gave delegates a real chance to play with augmented reality and look at everything from a 3D role-playing game to an interactive periodic table of elements.
After a hearty breakfast, day two of the conference started where day one had finished: with a second plenary on augmented reality, from Liz McGettigan, Director of Digital Library Experiences at SOLAS. Liz showcased the augmented reality work that had been done during her time as head of Edinburgh Libraries, showing delegates the children’s reading initiative Mythical Maze. She talked about the possibilities for AR, with a strong message that the age of passive learning was now over.
Four more workshops took place in the morning. Chris Clow and Tommy Wilson from The University of Sheffield ran a useful session on the various hardware devices that can be used to capture sound and vision, showcasing the work they had done building a creative media team and suite that gives staff and students 24-hour access to create, edit and publish videos and sound recordings. In the other workshop, Stephen McConnachie delivered a session on embedded metadata mapping and automated extraction. Claire Beecroft and I showed delegates what they could do with very little money to produce good-quality, edited and hosted video and audio. Valerie Stevenson from Liverpool John Moores University ran a session on archiving British culture and showed the diverse collection that her institution holds, as well as their work on converting content from analogue to digital and the creation of a small sound studio and digitisation suite to support the transition.
After lunch, Richard Ranft, Head of Sound and Vision at The British Library, gave a tour of the library and its audio and visual archives, which number in the millions of items. Richard talked about the complexities of trying to build systems that help users navigate the databases and libraries to find what they are after. He explained the importance of this, as many of the materials are significant historical artefacts, and that the solution lay in a combination of human and machine-driven enrichment and visualisation tools.
In the afternoon, John Hardisty delivered a workshop on how technology has helped improve library services for people with sight loss. John spoke about various interventions that had been created to help visually impaired people and how new technologies such as smart devices were being used to help print-disabled people. His session covered the inventions and ideas being applied right now, and what is on the horizon to help people with sight loss access the materials that many of us take for granted. The other parallel session was run by Iain Logie Baird, Associate Curator at the National Media Museum, and looked at vision and sound collections in science museums. You may recognise his surname: he is the grandson of John Logie Baird, who invented the first mechanical television. He talked about the three museums in the group, the London Science Museum, the National Media Museum and the Manchester Museum of Science and Industry, and their extensive sound and vision collections.