<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xml:base="http://thomas.kiehnefamily.us"  xmlns:dc="http://purl.org/dc/elements/1.1/">
<channel>
 <title>infoSpace - SAA</title>
 <link>http://thomas.kiehnefamily.us/taxonomy/term/18/0</link>
 <description></description>
 <language>en</language>
<item>
 <title>Reflections on the SAA 2006 Annual Conference - Part II</title>
 <link>http://thomas.kiehnefamily.us/reflections_on_the_saa_2006_annual_conference_part_ii</link>
 <description>&lt;p&gt;This entry is a continuation of my observations on this year&#039;s SAA annual conference.  For more, see &lt;a href=&quot;http://thomas.kiehnefamily.us/reflections_on_the_saa_2006_annual_conference_part_i&quot;&gt;Part I&lt;/a&gt;.&lt;/p&gt;
&lt;!--break--&gt;&lt;!--break--&gt;&lt;p&gt;&lt;a href=&quot;http://www.archivists.org/conference/dc2006/dc2006prog-Session.asp?event=1738&quot;&gt;&lt;b&gt;Plenary Session II: &quot;Technology&quot;&lt;/b&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Each of the three plenary sessions was hosted by one of the three joint conference organizations, the second one headlined by SAA.  This year, SAA president Richard Pearce-Moses opened the session with a talk summarizing his work over the last year in exploring the &quot;new skills&quot; needed by archivists for the digital era.  Between his writings in Archival Outlook (&lt;a href=&quot;http://www.archivists.org/periodicals/ao_backissues/AO-Sept05.pdf&quot;&gt;here&lt;/a&gt; and &lt;a href=&quot;http://www.archivists.org/periodicals/ao_backissues/AO-Jan06.pdf&quot;&gt;here&lt;/a&gt;) and the &lt;a href=&quot;http://rpm.lib.az.us/newskills&quot;&gt;New Skills Colloquium&lt;/a&gt; in June, I had already heard much of what he had to say, but it was nice to see it presented so succinctly to a room full of archivists &amp;ndash; a name drop didn&#039;t hurt, either!&lt;/p&gt;
&lt;p&gt;There were a couple of key points that he made which validate opinions I have had and expressed in the past.  One is that traditional archivists have a tendency to avoid the challenges presented by digital records &amp;ndash; paraphrasing Pearce-Moses: to hope that someone else will deal with it instead.  Second, he essentially stated that if archivists do not rise to the challenge, other professions will.  I have &lt;a href=&quot;/musings_on_a_systems_view_of_digital_archives&quot;&gt;previously expressed&lt;/a&gt; my concern over how the terminology and practices that technology vendors use come into direct conflict with those that archivists use, so it was encouraging to hear it put to the audience.&lt;/p&gt;
&lt;p&gt;Following Pearce-Moses was a talk by Brewster Kahle of Internet Archive fame, which was a pleasant surprise, mainly because I was curious to see how he would present the &quot;save everything&quot; argument in this venue.  Kahle&#039;s presentation was decidedly oriented towards a lay audience, being rather shallow in scope and simple in terms of technical detail, but I can understand his trepidation over being inaccessible to a decidedly non-technical audience &amp;ndash; in fact, I have seen this happen on numerous occasions when tech industry professionals or computer science academics are asked to speak to librarians or archivists.  &lt;/p&gt;
&lt;p&gt;Aside from this lapse, however, Kahle definitely had a couple of key points to make and drove them home.  One main point can be paraphrased as: we can save everything digitally because, in the grand scheme of things, it&#039;s not really that expensive.  He threw out some general figures based on estimated amounts of data found in print, film, etc., and showed that the associated costs are modest in an institutional or government context.  Kahle didn&#039;t address appraisal and selection, which I am certain many in the audience would have loved to bring up, but I believe that addressing such concerns would have made for a significantly longer presentation.  He also said very little about preservation and preservation strategies and how they might impact the costs and requirements for storage and management.  The main point he made about preservation was to reiterate the &lt;a href=&quot;http://www.lockss.org&quot;&gt;LOCKSS&lt;/a&gt; principle, saying essentially that the only proven way to keep information safe is to make lots of copies.  But I can understand why he would not delve too deeply into this topic, as it brings into play discussion of formats, technological obsolescence, and, of course, increased storage and costs.  In summary, I appreciated his presentation as a general position statement, but I can easily imagine that few skeptics in the audience were turned.&lt;/p&gt;
&lt;p&gt;The plenary session was wrapped up with a star appearance by &quot;Cokie&quot; Roberts, writer and ABC News correspondent.  Her speech was quite entertaining, mostly focused on her experiences in researching her various books and on how her experience in advocating for breast cancer research could apply to helping fund archives and archival research.  The most interesting part of her presentation was most likely an unintended argument for &quot;save it all.&quot;  &lt;/p&gt;
&lt;p&gt;During her speech, Roberts discussed how difficult it was for her to find source documents regarding or by the wives and women related to the &quot;founding fathers&quot; for her book &lt;i&gt;Founding Mothers&lt;/i&gt;.  Some of the difficulty was due to the usual mismanagement of documents, including deliberate destruction by the creators, but more of a problem was the fact that the perspectives of the women of the subject period were considered to be inferior to those of the men &amp;ndash; in other words, there was a conscious selection judgment made on the part of archivists not to keep such records.  These decisions could be waved off as sexist or as some related conspiratorial power struggle, and no doubt some of it is, but the issue I keyed in on is that no one can know with certainty what will be of interest to future researchers.  This is perhaps the strongest argument for &quot;save it all,&quot; not only because of the value to users, but because it is not a technological reason.  It is this one thought that wove Roberts&#039;s speech seamlessly into the previous two, a feat that is tempting to attribute to her renowned brilliance, but then again may just as likely be due to the latent inertia behind the notion to &quot;save it all.&quot;&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;http://www.archivists.org/conference/dc2006/dc2006posterPresentations.asp&quot;&gt;&lt;b&gt;Exhibit Hall and Student Poster Sessions&lt;/b&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Having presented a &lt;a href=&quot;http://www.archivists.org/conference/neworleans2005/no2005prog-Detail.asp?event=1556&quot;&gt;poster&lt;/a&gt; at last year&#039;s exhibit, I felt a responsibility to take a look at this year&#039;s presentations.  Two posters caught my attention this year.  The first was &quot;Search and Preserve: Collecting the Punk and Hardcore Communities&quot; by Debi Griffith of the University of Wisconsin at Madison.  I found this poster to be of personal interest for many reasons.  First, it embodies a core argument behind my desire to save everything: that relying on conventional institutions and their selection and appraisal practices can introduce bias against &quot;fringe&quot; or unpopular communities, thus ensuring a biased or incomplete cultural record.  Another reason is that I have been a participant in some of these communities, from punk and alternative music to industrial, techno, and experimental music.  I am a semi-avid collector of DIY-style zines and publications put forth by these communities, a habit that started well before I had any idea about archives and such.  My participation in these communities has taught me how the &quot;mainstream&quot; can easily, if not deliberately, misrepresent such movements and how important it is to ensure that the record includes the perspectives and views of the communities in question.&lt;/p&gt;
&lt;p&gt;The second poster that caught my interest was &quot;Digital Object Identifiers and Resource Identifiers in Archival Description&quot; by Krista Ferrante of Simmons College.  This poster was fairly simple, presenting DOI and handle servers as a means of providing persistent identification of electronic records, but it did remind me that I need to finally get something together on the &lt;a href=&quot;http://www.xdi.org&quot;&gt;XRI/XDI specification&lt;/a&gt; in the archival context.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;http://www.archivists.org/conference/dc2006/dc2006prog-Session.asp?event=1759&quot;&gt;&lt;b&gt;Session #508: &quot;Future Shock: Saving the Signals of Audio-visual Records&quot;&lt;/b&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;I attended this session for much the same reasons I attended session #208, that is, to validate the decisions that were made in formulating the &lt;a href=&quot;/digital_preservation_plan_for_the_texas_legacy_project&quot;&gt;CHAT preservation plan&lt;/a&gt; and see what new work had been done in digital video preservation and access since early last year.  The difference between this session and #208 is that this session covered projects specifically dealing with audio, video, and audiovisual research rather than TV.&lt;/p&gt;
&lt;p&gt;The first presentation was by Steve Weiss of the University of North Carolina at Chapel Hill, who presented his work with restoring and preserving African American cultural audio works.  His presentation was heavy on demonstrations of the various music and voice recordings, but fairly light on process and lessons learned.  One idea that I took from his presentation had to do with software for testing CD recordable media prior to use.  During the CHAT research I had not come across such software, but it struck me as not only plausible but desirable to confirm recordable media before attempting to write data, in order to avoid having to troubleshoot bad recordings after the fact.  No specific software was mentioned, but knowing that such software exists should make it easy to find &amp;ndash; more research is necessary here.&lt;/p&gt;
&lt;p&gt;The second presentation was given by Joanne Rudof of the &lt;a href=&quot;http://www.library.yale.edu/testimonies&quot;&gt;Fortunoff Video Archive&lt;/a&gt; for WW-II Holocaust Testimonies.  Rudof described in detail the process used to migrate and preserve a large number of Beta-SP cassettes of oral histories and testimonies.  Much of the initial process she described sounded similar in concept to the CHAT plan: surveying and inventorying existing media, developing a &quot;triage&quot; plan to prioritize preservation efforts, etc.  The major portion of the effort centered on the implementation of an experimental robotic system called SAMMA, a semi-automated system for copying the existing cassettes to newer media and creating MPEG-2 digital surrogates.  It was difficult to tell from the information presented how much material (in hours) was actually migrated &amp;ndash; one figure I heard was about 250 hours or 10 TB of MPEG-2 &amp;ndash; but the final number of cassettes migrated came out to over 2000.  The mini-DV cassettes used by CHAT are newer and at less risk than those of the Fortunoff archive, but if the number of hours was correct, then we managed to develop a plan that took more time and individual work effort, but only a fraction of the cost of this project &amp;ndash; several hundred thousand dollars versus a few thousand.  I&#039;ve been meaning to revisit the CHAT project in terms of results, and I think the low-budget aspect may be the tack to take.&lt;/p&gt;
&lt;p&gt;Some research findings were presented, one set by Virginia Danielson of Harvard University, who gave an overview of her work with the &lt;a href=&quot;http://www.dlib.indiana.edu/projects/sounddirections&quot;&gt;&quot;Sound Directions&quot; project&lt;/a&gt;, and the other a short update by Jim Reilly, who was brought in by the session chair to discuss some of his work.  Danielson&#039;s presentation focused on some of the best practices determinations made by her project, or as she put it, &quot;not bad practices.&quot;  One thing I noted to research from her presentation is the &lt;a href=&quot;http://www.iasa-web.org/tc04&quot;&gt;IASA TC-04 preservation manual&lt;/a&gt;.  The main takeaway from Reilly&#039;s presentation was, paraphrased, that there is no single or simple cause of physical degradation of magnetic media.  This reinforces my doubt, stated in the CHAT plan, over the long-term efficacy of tape media as an archival solution.  As optical and disc-based magnetic media overtake magnetic tape in storage capacity, the days of tape media certainly seem numbered.&lt;/p&gt;
</description>
 <comments>http://thomas.kiehnefamily.us/reflections_on_the_saa_2006_annual_conference_part_ii#comments</comments>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/conferences">Conferences</category>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/digital_archives">Digital Archives</category>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/saa">SAA</category>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/save_everything">Save Everything</category>
 <pubDate>Thu, 24 Aug 2006 03:25:50 +0000</pubDate>
 <dc:creator>tkiehne</dc:creator>
 <guid isPermaLink="false">33 at http://thomas.kiehnefamily.us</guid>
</item>
<item>
 <title>Reflections on the SAA 2006 Annual Conference - Part I</title>
 <link>http://thomas.kiehnefamily.us/reflections_on_the_saa_2006_annual_conference_part_i</link>
 <description>&lt;p&gt;Last week I breezed through Washington, DC to attend the &lt;a href=&quot;http://www.archivists.org/conference/dc2006/&quot;&gt;SAA/NAGARA/CoSA Joint Conference&lt;/a&gt;.  Last year at this time, I attended the SAA conference as a new, student member and, as it was my first ever professional conference, I spent most of the time trying to acclimate myself to the conference ebb and flow.  This year I&#039;ve committed to taking better notes, talking a bit more, and, of course, sharing my observations here.&lt;/p&gt;
&lt;!--break--&gt;&lt;!--break--&gt;&lt;p&gt;First off, these notes are my attempt to forge meaning from the shards of information that reached me.  They are not meant to be comprehensive in their coverage of the sessions I attended, but merely document my thoughts and observations which, predictably, are skewed towards my own research interests.  These observations are very raw and are meant to suggest areas of further research or verification.  As clearly as possible I will try to indicate what was directly expressed versus what I interpreted or generated.&lt;/p&gt;
&lt;p&gt;Second, I consciously entered each of these sessions with some overarching personal question or intent, not only to help me decide which sessions to attend but to ensure that my mind remained focused on the topics and issues that are of interest to me.  I will state these for each session&#039;s notes which should help the reader understand my mindset and the subsequent observations.&lt;/p&gt;
&lt;p&gt;In this episode, the first day:  Thursday, 3 August, 2006.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;a href=&quot;http://www.archivists.org/conference/dc2006/dc2006prog-Session.asp?event=1708&quot;&gt;Session #103: &amp;ldquo;&#039;X&#039; Marks the Spot: Archiving GIS Databases&amp;rdquo;&lt;/a&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;I attended this session because I hoped to gain some insight into preservation efforts focused on what I will call &amp;ldquo;non-linear&amp;rdquo; records &amp;ndash; things like data sets, Web applications, and other &amp;ldquo;New Media&amp;rdquo; information.  It has long puzzled me how to apply the best practices of digital document preservation to digital forms that span application domains, physical locations, networks, and so on.  My concern arose during the processing of the &lt;a href=&quot;/from_floppies_to_repository_a_transition_of_bits&quot;&gt;Joyce papers&lt;/a&gt;, where hypertext was salient to many of the underlying works, but it also haunts me regularly in my capacity as a Web applications developer.  My working theory here is that geospatial data sets and the applications used to access them present generally the same preservation challenges as software, multimedia &amp;amp; games, relational databases, and so on.&lt;/p&gt;
&lt;p&gt;Three presentations were given, each with distinctive backgrounds and approaches.  Helen Wong Smith of the Kamehameha Schools of Hawaii presented a geospatial cultural / historical database project used to document and maintain land holdings in Hawaii.  Next, Richard Marciano of the San Diego Supercomputer Center presented briefs about several ongoing projects with GIS and geospatial aspects.  Among these were the &lt;a href=&quot;http://www.interpares.org/ip2/ip2_case_studies.cfm?study=23&quot;&gt;InterPARES VanMap project&lt;/a&gt;, the &lt;a href=&quot;http://www.sdsc.edu/PAT/&quot;&gt;Persistent Archival Testbed (PAT) project&lt;/a&gt;, &lt;a href=&quot;http://www.sdsc.edu/ICAP&quot;&gt;ICAP&lt;/a&gt;, and a new project called eLegacy.  Finally, James Henderson of the Maine State Archives presented some of his perspectives and challenges in preserving geospatial data as state government records.&lt;/p&gt;
&lt;p&gt;Geospatial data refers to data sets that link some sort of information (text, image, etc.) to a fixed location or area at a specified time period.  In the case of the Kamehameha Schools, diverse media such as songs, images, and historical accounts are linked to specific locations within the School&#039;s land holdings.  Localities in the state of Maine maintain road and property data in GIS systems to support applications such as E911.  The most salient aspect of these data sets is that they change over time &amp;ndash; notable historical events happen periodically, roads are re-routed or built, and so on &amp;ndash; much as any other database changes when updated, which suggests that preservation efforts for one can be applied to the other and to other similarly structured applications.&lt;/p&gt;
&lt;p&gt;The three presentations did not flow seamlessly, but did manage to expose some overarching themes.  Perhaps the most significant theme that I observed is the relationship between data sets that change over time and versioning in unitary documents.  The key difference between these two concepts is that examining versions of a document reveals the thought process involved in achieving a final or published work, while examining geospatial data shows how things were at various points in time.  Additionally, the time between discrete versions of a document is usually much shorter than that between geospatial updates, usually days versus years, and documents often have a terminal form after which changes cease, whereas geospatial data is usually open-ended or otherwise arbitrarily bounded.  Aside from these differences, the approach to preserving and accessing versions and geospatial data seems very similar.  Data sets that change over time lend themselves to access via temporal queries, where a date or date range becomes part of the query criteria.  For a suitably large number of versions, an access mechanism based on date queries would work just as well as it would for geospatial data.  Further, for any body of records that spans a period of time, temporal queries can be an immensely useful tool for narrowing query results to relevant time periods.&lt;/p&gt;
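&lt;p&gt;As a rough sketch of what such a temporal query might look like &amp;ndash; the table, column names, and dates below are entirely hypothetical, chosen only to illustrate the idea of validity periods as query criteria:&lt;/p&gt;

```python
# Minimal sketch: each row carries a validity period, and a temporal
# query asks which version of the record was in effect on a given date.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE road_segment ("
    " segment_id TEXT, name TEXT,"
    " valid_from TEXT, valid_to TEXT)"  # ISO dates compare correctly as text
)
rows = [
    ("R1", "Old Mill Road", "1990-01-01", "2001-06-30"),
    ("R1", "Mill Parkway",  "2001-07-01", "9999-12-31"),  # current version
]
conn.executemany("INSERT INTO road_segment VALUES (?,?,?,?)", rows)

# Temporal query: which version of segment R1 was in effect on 1995-05-01?
cur = conn.execute(
    "SELECT name FROM road_segment"
    " WHERE ? BETWEEN valid_from AND valid_to",
    ("1995-05-01",),
)
print(cur.fetchone()[0])  # Old Mill Road
```

&lt;p&gt;The same pattern serves document versions equally well: swap the validity dates for revision dates and the query narrows a body of records to a relevant time period.&lt;/p&gt;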
&lt;p&gt;When I thought about these ideas in terms of Web applications (such as CRM, sales support, inventory management, etc. &amp;ndash; putting aside the question of why save them) some of the analogies with GIS data break down.  For one, GIS data works in &amp;ldquo;layers,&amp;rdquo; where types of data can be segregated like unitary documents.  Unfortunately, relational databases have no such abstraction &amp;ndash; they are built to store data efficiently, not in ways that can be easily separated.  &lt;/p&gt;
&lt;p&gt;Another problem is that even though Web application data can be captured by taking snapshots, in much the same way as GIS data, the rate of change within the data set can often be much faster &amp;ndash; on the order of seconds &amp;ndash; than the slower changes in things such as historical events and roads.  Further, as the snapshot horizon nears the immediate, the storage and processing requirements become untenable &amp;ndash; it is impossible to take snapshots of a database at an interval that is at or below the time required to make each snapshot.   As an aside, I wonder what solutions might be suggested by data warehousing techniques.&lt;/p&gt;
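&lt;p&gt;That snapshot-interval limit can be made concrete with a back-of-the-envelope calculation &amp;ndash; all of the numbers below are assumed, purely for illustration:&lt;/p&gt;

```python
# Illustrative only: the interval between database snapshots cannot be
# shorter than the time one snapshot takes to produce, so the dump
# throughput puts a floor on how fine-grained capture can be.
db_size_gb = 50            # assumed size of the database dump
dump_rate_gb_per_min = 2   # assumed throughput of the snapshot process

minutes_per_snapshot = db_size_gb / dump_rate_gb_per_min
print(minutes_per_snapshot)  # 25.0 -- snapshots can be no closer than ~25 min

# A requested interval finer than that floor is infeasible:
requested_interval_min = 5
print(requested_interval_min >= minutes_per_snapshot)  # False
```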
&lt;p&gt;Beyond capturing the state of the data, Web applications require that not only the data be maintained, but the application code itself.  Seldom does an application remain unchanged over its service life &amp;ndash; bugs are repaired, features are added and removed, and so on.  These changes can affect the way that the underlying data is represented to the user.  Additionally, such changes are often accompanied by changes to the database structure itself.  As a result, snapshots should be acquired after such changes are applied.  Although not enough detail was given for each of these projects, I wonder whether some of the same issues manifested in work with GIS data sets. &lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;a href=&quot;http://www.archivists.org/conference/dc2006/dc2006prog-Session.asp?event=1724&quot;&gt;Session #208: &amp;ldquo;Big Bird&#039;s Digital Future: Appraisal and Selection of Public Television Programming&amp;rdquo;&lt;/a&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;I attended this session in order to revisit my work on the &lt;a href=&quot;/digital_preservation_plan_for_the_texas_legacy_project&quot;&gt;CHAT digital video preservation plan&lt;/a&gt; in the context of similar video preservation projects.  I hoped to validate the decisions that were made in formulating the plan and see what new work, if any, had been done in digital video preservation and access since early last year.  As the title of the session suggests, the subject area focused on TV broadcasts, but I anticipated that the overarching preservation concerns would be indistinguishable from any other video preservation effort. &lt;/p&gt;
&lt;p&gt;The three presentations fit together well, despite differences in scope.  Thomas Connors of the National Public Broadcasting Archives and the University of Maryland gave the first presentation.  Connors led us through a brief presentation that started with mention of a &lt;a href=&quot;http://www.itconversations.com/shows/detail400.html&quot;&gt;podcast&lt;/a&gt; by Brewster Kahle of &lt;a href=&quot;http://www.archive.org&quot;&gt;Internet Archive&lt;/a&gt; fame, which invokes the contentious &amp;ldquo;save everything&amp;rdquo; debate.  Connors invoked the scarcity argument, which allowed him to move into a discussion of the lack of literature treating video appraisal criteria.  The remainder of his presentation described Danielle Dumerer&#039;s ranking system, which I interpreted as a risk assessment matrix, for appraising video collections and prioritizing preservation efforts.  This system operationalizes criteria such as the current condition of the assets, cost of retention, intellectual rights, use potential, and perceived production value &amp;ndash; a more formalized version of the same process I used for the CHAT plan.  He then showed how this system mirrors &lt;a href=&quot;http://www.rlg.org/legacy/preserv/joint/gertz.html&quot;&gt;guidelines&lt;/a&gt; described by the RLG and NPO.&lt;/p&gt;
&lt;p&gt;Next in the session was Lisa Carter of the University of Kentucky.  Carter shared her observations in working with television archives, mostly those based on magnetic analog media.  Among these observations were the importance of proper storage of media, the frailty of tape-based media, and the importance of keeping the original media even upon conversion to more stable media or digital versions &amp;ndash; all of which were expressed in the CHAT plan.  Much of her talk focused on the importance of metadata for both access and preservation, most notably the need to work metadata collection into formal workflows.  I found the concept of &amp;ldquo;shutdown procedures&amp;rdquo; to be most interesting: the creators of a video execute a series of steps to describe, document, and otherwise properly close out a production, as a means of combating the ad hoc procedures that producers often use for the sake of brevity, which leave archivists in the dark.&lt;/p&gt;
&lt;p&gt;Leah Weisse of the WGBH (Boston) Media Archives and Preservation Center presented some of her observations in working with the significant back catalog of WGBH broadcasts, reaching all the way back to the 1950s.  One important issue that she presented is the challenges that new direct-to-drive and flash memory systems present to preservation.  In these cases, there is no original media to work with in the future, since the impetus of the users of these devices is to move the digital file off of the memory device and reuse it for subsequent productions.  This is identical to the behavior of digital camera users, but I had never thought of this in terms of full video capture.  Perhaps the greatest challenge presented in this situation is the need for more rigorous descriptive procedures to ensure that the digital files can be identified, and thus managed, after they have been moved from the capture device.  During her presentation I also noted the same versioning issue that I observed during the GIS session.  In this case, the versioning is not only in terms of initial or draft productions (think director&#039;s cut versus theatrical release in film), but also reformatted versions (letterbox, etc.) and display formats (HD, streaming, etc.).  Weisse had to deal with many of these for many of the works, which implies that the versioning issue is really a genre- and form-crossing concern.  I need to see what has been said about versioning in the archival literature and how it translates to other forms.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;&lt;a href=&quot;http://www.archivists.org/conference/dc2006/dc2006prog-Session.asp?event=1737&quot;&gt;Session #310: &amp;ldquo;The Current State of Electronic Records Preservation&amp;rdquo;&lt;/a&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;Despite its comprehensive title, I knew that this session would likely offer only a high-level review of some of the major projects.  With this understanding, I approached this session as a brief update to material I had received while in classes a year or so prior.&lt;/p&gt;
&lt;p&gt;David Lake of NARA and Lee Stout of Penn State University addressed ongoing work on the &lt;a href=&quot;http://www.archives.gov/era&quot;&gt;Electronic Records Archives (ERA)&lt;/a&gt; for the National Archives.  The ERA seems to be the flagship project in North America, at least judging by the amount of information about it that I have encountered of late.  At this point, the ERA has a developer &amp;ndash; Lockheed-Martin &amp;ndash; and is slated for an initial, though not comprehensive, release in Fall of 2007.  Many of the questions about the ERA focused on the potential for using the resulting products in venues outside of the National Archives and whether it would be available as an open-source or similar product.  The response emphasized that this project is not only a set of software, but an instantiation of NARA&#039;s workflow processes.  The message seemed to be that while some products that do specific tasks may be portable to other environments, the core of ERA is specific to NARA and its practices.&lt;/p&gt;
&lt;p&gt;Next, Hans Hofman from the National Archives of the Netherlands presented a general overview of three current European projects: &lt;a href=&quot;http://www.digitalpreservationeurope.eu/&quot;&gt;Digital Preservation Europe (DPE)&lt;/a&gt;, &lt;a href=&quot;http://www.dl-forum.de/englisch/projekte/projekte_eng_2711_ENG_HTML.htm&quot;&gt;PLANETS&lt;/a&gt; &amp;ndash; a research project, and &lt;a href=&quot;http://www.casparpreserves.eu/&quot;&gt;CASPAR&lt;/a&gt;.  Much of what Hofman presented was very high-level conceptually, but he did take care to place these projects into the context of previous research and efforts upon which they build.&lt;/p&gt;
&lt;p&gt;Finally, Kenneth Thibodeau of NARA wrapped up the session, providing a bit of thought that transcended the specifics of the previous presenters.  One thought that I took away from his remarks is, paraphrased, that the ERA has shown that preservation has to be attacked as an organizational problem, not a process in isolation &amp;ndash; something that mirrors what I have said before in terms of archival thought infiltrating the process of creation and the tools used by the creators.  One other take-away was his emphasis on the need for digital format repositories of the type that &lt;a href=&quot;http://hul.harvard.edu/gdfr/&quot;&gt;Harvard&lt;/a&gt; is developing.  I interpreted these not merely as reference databases, but as living applications that can provide a supporting framework for preservation software platforms and applications &amp;ndash; think Web services for digital format preservation information.&lt;/p&gt;
&lt;p&gt;&lt;b&gt;General Observations&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;I had one meta-observation concerning the conference as a whole.  Each session was recorded by the conference staff using each room&#039;s audio setup.  The inputs usually consisted of three microphones, one at the podium and two on the panel table.  In virtually every session I attended, the panel participants had to consciously remind themselves to repeat questions from the audience into the microphone so that they would be recorded in addition to the responses given.  This process strikes me as a visceral metaphor for the function of archivists and the frustrations they feel when working with their various constituents.  I often hear the refrain that archival thought needs to happen early in the creation of records, if not before, and given that the recording of these sessions is an inherently future-focused activity &amp;ndash; an attempt to create a complete record of the proceedings &amp;ndash; the panel&#039;s self-reminding process seems apropos.  I have said it before in this venue in different ways, but if we are to capture a more complete cultural record for the future, archival thought in the form of deliberately future-minded actions must be insinuated into our information management &amp;ndash; by not only archivists, but everyone who creates information and, especially in the digital realm, by the tools that we use.  I envision this as a sort of repurposing of the &lt;a href=&quot;http://en.wikipedia.org/wiki/Seventh_Generation&quot;&gt;seventh generation&lt;/a&gt; concept for our cultural memory as it is represented in our information objects.&lt;/p&gt;
</description>
 <comments>http://thomas.kiehnefamily.us/reflections_on_the_saa_2006_annual_conference_part_i#comments</comments>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/conferences">Conferences</category>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/digital_archives">Digital Archives</category>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/digital_preservation">Digital Preservation</category>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/saa">SAA</category>
 <category domain="http://thomas.kiehnefamily.us/blog_topics/video_preservation">Video Preservation</category>
 <pubDate>Tue, 15 Aug 2006 01:10:50 +0000</pubDate>
 <dc:creator>tkiehne</dc:creator>
 <guid isPermaLink="false">32 at http://thomas.kiehnefamily.us</guid>
</item>
</channel>
</rss>
