Friday, December 08, 2006

5 Key Points for Implementing a Web Accessibility Policy for a Large Library

I have created five key points to explain the need for a web accessibility policy in a large library. Web accessibility for users with disabilities is crucial because there has been a major shift from services “performed in person, over the phone, or by mail” to services offered on the web, and yet “the area that this debris has piled up the most is the information superhighway” (Hensley, p 124). The sad truth is that, according to BBC News, “most of the leading websites around the world are failing to provide the most basic accessibility standards for people with disabilities” (BBC News, “Most websites failing disabled”). As a large library, our mission is to provide access to the public, in accordance with the Americans with Disabilities Act (ADA) and in the spirit of the Library Bill of Rights.


Image: Universal Web Design http://images.amazon.com/images/P/1562057383.01._AA240_SCLZZZZZZZ_.jpg



1. Universal Design for web accessibility will be applied to the library website, with no or extremely limited use of Java.

According to a recent survey on web accessibility, “ninety seven percent of websites did not provide even minimum levels of accessibility” and “93% failed to provide adequate text descriptions for graphics” (BBC). As with a physical library building, the library’s website must take the ADA principles of access, “which mandated disability access to public facilities,” and put them into practice by ensuring access to the library’s virtual site and the services offered therein (ALA, Universal Design).

Therefore, the library’s website must follow these ALA recommendations on web accessibility:
• Use written explanations, captioning and/or transcripts to explain all visual and audio content such as pictures and animations;
• Employ a simple design that is easy to navigate through headings, lists, a consistent structure, and cascading style sheets for layout and style where possible;
• Make sure that text and graphics can be understood without color;
• Clarify natural language and use text labels for all links so that they make sense when read out of context. For example, avoid phrases such as "click here";
• Summarize graphs and charts;
• Avoid embedding textual information in graphics;
• Make line-by-line reading sensible in tables; and
• Use tools, checklists, and guidelines at www.w3c.org/wai to validate your web site.

(ALA, Web Accessibility)
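Several of these recommendations can be spot-checked mechanically. As a rough sketch of my own (not part of the ALA tutorial), a short script using Python’s standard-library HTML parser can flag images that lack the text descriptions the guidelines call for:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect <img> tags that lack an alt attribute entirely.
    (Decorative images may legitimately use alt="", so only a
    missing attribute is flagged here.)"""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "alt" not in attr_map:
                self.missing_alt.append(attr_map.get("src", "(no src)"))

page = '<img src="logo.gif" alt="Library logo"><img src="chart.gif">'
checker = AltTextChecker()
checker.feed(page)
print(checker.missing_alt)  # -> ['chart.gif']
```

A report like this is no substitute for the W3C WAI checklists mentioned above, but it cheaply catches the most common failure (the 93% figure from the BBC survey).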


In addition, “universally designed technology may reduce the need for a separate assistive technology device. For example, a program with scalable digital content that permits the user to change the font size, layout and background color may be an adequate substitute for a special magnifying device.” (ALA, Universal Design).


Image: No Java www.evergamer.com/image/noJava.png


The reason for avoiding the use of Java on the library’s website is that, for users with disabilities, “items such as Java applets and completely graphical interfaces remain impenetrable” (Hensley, p 124). The library must ensure that its website is accessible to all users. The same caution applies to JavaScript: according to the web accessibility study, “73% relied on JavaScript for important functionality. JavaScript does not work with some screen readers used by those with impaired vision” (BBC). Therefore, although Java may be used in certain items the library offers, our organization’s website must avoid it.
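Reliance on scripting can also be screened for. As a hedged sketch of my own (again using only the standard-library parser), the following flags links that work only when scripting is enabled, and would therefore be dead ends for the screen readers the BBC study describes:

```python
from html.parser import HTMLParser

class ScriptOnlyLinkFinder(HTMLParser):
    """Flag <a> tags that depend entirely on scripting:
    javascript: URLs, or onclick handlers with no real href."""
    def __init__(self):
        super().__init__()
        self.script_only = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr_map = dict(attrs)
        href = attr_map.get("href", "")
        if href.startswith("javascript:") or (not href and "onclick" in attr_map):
            self.script_only.append(attr_map)

page = ('<a href="catalog.html">Catalog</a>'
        '<a href="javascript:openSearch()">Search</a>')
finder = ScriptOnlyLinkFinder()
finder.feed(page)
print(len(finder.script_only))  # -> 1
```

Running a scan like this over the site would give staff a concrete list of pages to repair with plain HTML fallbacks.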

2. The library will create and strictly follow an accessible library materials purchasing policy.
The ALA’s Considerations in Purchasing Accessible Library Materials must be instituted as policy for any purchasing of library materials to ensure accessibility. These considerations include:
• Are there accessible digital versions of the print materials available?
• Is there an accessible online version of the print materials?
• Is the print material available in large print?
• Will the publisher grant the library permission to transform the print material into an accessible digital format?
• For database and online services, is the service compatible with screen readers and does it offer non-visual access (command prompts)?
• For video materials, are there captioned and/or video described versions of the materials available for purchase?
• For digital materials, can you confirm that none of the text is embedded as graphics?

(ALA, Considerations in Purchasing Accessible Library Materials).
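To make the checklist enforceable at acquisition time, the considerations could be encoded as data and each candidate purchase reviewed against them. This is a minimal sketch of my own devising; the question wording is condensed from the ALA list above:

```python
# Condensed versions of the ALA purchasing considerations
CONSIDERATIONS = [
    "accessible digital version of the print material",
    "accessible online version of the print material",
    "large print edition available",
    "publisher permits transformation to accessible digital format",
    "screen-reader compatible / non-visual access (databases)",
    "captioned or video-described version (video)",
    "no text embedded as graphics (digital)",
]

def unmet(answers):
    """Return the considerations a candidate purchase fails.
    `answers` maps each consideration to True/False."""
    return [c for c in CONSIDERATIONS if not answers.get(c, False)]

# Example: an e-book that passes everything except the embedded-text check
answers = {c: True for c in CONSIDERATIONS}
answers["no text embedded as graphics (digital)"] = False
print(unmet(answers))  # -> ['no text embedded as graphics (digital)']
```

A selector could be required to attach this kind of completed checklist to every purchase request, making the policy auditable rather than aspirational.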



Image: Digital Rights Management www.mediaculture-online.de/.../cdroms160.jpg


3. Training for all librarians on Digital Rights Management and Accessibility.
Although “the Chafee Amendment has provided the right to authorized entities to make copies of works in alternate formats to meet the needs of people with print disabilities” as part of the fair use doctrine,
“the promise of broader accessibility has been greatly compromised by digital rights management technologies that are increasingly "wrapped around" digital content to protect from unauthorized use. Those technologies often interfere, either inadvertently or more often intentionally, with a variety of special technologies employed by people with print disabilities to access digital content.”
(ALA, Digital Rights Management and Accessibility).


According to a study done by the American Association of the Blind, “more than 50% of the electronic book titles offered for digital sale were "locked" and therefore not available via a common screen reader interface. What is worse, the lack of accessibility of the e-book was not apparent until the book was purchased and downloaded.” This means that librarians have a responsibility to determine, prior to purchasing such digital resources, that they are “fully accessible, and if they are not, demand that the technological "locks" that shut out people with visual and print disabilities be deactivated” (ALA, Digital Rights Management and Accessibility). Training must be implemented so that librarians and other library professionals fully understand the copyright and digital rights management legal issues and how they conflict with assistive technologies and accessibility.

For example, the Digital Millennium Copyright Act (DMCA) had until recently
“prohibited ‘circumvention’ of a technological protection measure in order to obtain access to a protected work”, making it “illegal both for a savvy user of screen readers to figure out how to break the lock in order to gain access and for a software developer to develop a program to engineer around the DRM in order to allow assistive technology devices to work.” However, the United States Copyright Office has recently
“granted a limited exemption from the prohibition to ‘literary works distributed in e-book format where all existing e-book editions of the work…contain access controls that prevent the enabling of the e-book's read-aloud function and that prevent the enabling of screen readers.’”
(ALA, Digital Rights Management and Accessibility).


Therefore, librarians must be trained and ready, because “the full impact of this exemption is not yet clear since few people with print disabilities will have the know-how to circumvent the DRM to enable read-aloud functions or screen reader software” (ALA, Digital Rights Management and Accessibility).

4. The library must provide at least minimal assistive or adaptive technologies.
On the advice of Dawn Hunziker, Assistive Technology Coordinator at the University of Arizona’s Disability Resource Center, the library must provide, at a minimum, the following assistive technologies: screen reader, text-to-speech, and screen magnification programs. According to Hunziker, these three resources will allow anyone with a disability to use the library computers. From the Hunziker handout, text-to-speech “allows you to listen to [class] materials you have scanned, text you have written, email or internet material.” Screen magnification allows users to “enlarge the screen for visually impaired individuals and has limited speech reading capabilities.” And a screen reader will “read all information shown on the computer screen and allows access to the computer for visually impaired individuals.”

Furthermore, all library professionals are required to be trained in their use in order to ensure that any library professional can provide assistance and seamless service to all users either with or without disabilities.

5. Building accessibility costs into the budget.
Accessibility doesn’t come cheap; therefore budgets must reflect the priority of equal access. This includes budgeting for
• Library staff training for using assistive and adaptive technologies as well as training on legal issues involved with the DRM
• Website evaluation and restructuring, if needed, to comply with “the Web Accessibility Initiative (WAI) of the World Wide Web Consortium (W3C)” and “Section 508 of the Rehabilitation Act of 1973” as recommended by Dunlap
• Adaptive or accessible materials purchases such as e-books and other accessible technologies such as screen readers, text to speech and screen magnification programs.
• Upgrades or “investing in a maintenance agreement” (Hensley, p 132) for the assistive/adaptive equipment

Beyond budgeting for training and equipment, Hensley recommends looking for additional resources by “contacting state and federal agencies for help in locating any grants or one-time funding vehicles to help get technology established” (p 132). He also recommends that libraries “investigate special pricing or promotions” (p 132).

References:

ALA, “Considerations in Purchasing Accessible Library Materials.”
http://www.ala.org/ala/washoff/oitp/emailtutorials/accessibilitya/16.htm

ALA, “Digital Rights Management and Accessibility.”
http://www.ala.org/ala/washoff/oitp/emailtutorials/accessibilitya/10.htm

ALA, “Universal Design.”
http://www.ala.org/ala/washoff/oitp/emailtutorials/accessibilitya/20.htm

ALA, “Web Accessibility.”
http://www.ala.org/ala/washoff/oitp/emailtutorials/accessibilitya/18.htm

Dunlap, I.H. “How Database-Driven Web Sites Enhance Accessibility”, Library Hi Tech
News 23:8 (2006): 34-38. http://www.emeraldinsight.com.ezproxy.library.arizona.edu/Insight/ViewContentServlet?Filename=Published/EmeraldFullTextArticle/Articles/2390230809.html (accessed December 07, 2006).

Hensley, “Adaptive Technologies” in Technology for the Rest of Us: A Primer on
Computer Technologies for the Low-tech Librarian, ed. Nancy Courtney, 15-22 (Westport, Connecticut: Libraries Unlimited, 2005).

Hunziker, D. Assistive Technology lecture and handout, University of Arizona, December 6, 2006.

“Most websites failing disabled”, BBC News, December 5, 2006,
http://news.bbc.co.uk/2/hi/technology/6210068.stm

Digitization Topics in Recent News:

I ran across two articles in my RSS feeds that deal with a few of our digitization topics.

One deals with the new Microsoft Zune music player, a competitor to Apple’s iPod. The marketing hook with Zune is that it has the following features: “Wi-Fi connectivity, video playback, Xbox integration and some sort of community feature for letting music lovers interact with each other” (Buskirk, “Microsoft IPod 'Killer' Is Doomed”). But best of all, Zune was to “offer music at a variety of download and subscription prices, rather than the flat $1 per-song standard inaugurated with Apple's iTunes Music Store” (Buskirk, “Microsoft IPod 'Killer' Is Doomed”).


Image: http://www.wired.com/news/columns/0,72172-0.html?tw=rss.technology


However, in “Zune, Creative Commons Don't Mix”, the latest news is that “the Zune only frees up tunes for a limited free sampling period -- a policy that actually interferes with the rights of artists who want people to share their works freely” (Buskirk, “Zune, Creative Commons Don't Mix”). This is where Creative Commons licensing comes in: artists who release their work under it are being blocked by “the Zune's blanket hardwired sharing limitations -- a compromise hammered out to appease the record labels” (Buskirk, “Zune, Creative Commons Don't Mix”). If you read anything in the digitizationblog, I recommend the Ariadne, Issue 49 article by Naomi Korn and Charles Oppenheim, “Creative Commons Licences in Higher and Further Education: Do We Care?”, for a good understanding of the Creative Commons issue, and especially how it pertains to education.

The second article I read is David Pogue’s November 21st post, “The Truth About Digital Cameras”. It describes an interesting experiment he did to test the effect of megapixel count on digital image quality.


Image: http://pogue.blogs.nytimes.com/2006/11/21/21pogues-posts-2/


The point is that the common assumption holds that the higher a camera’s megapixel count, the higher the quality of an enlarged digital image should be. The results really surprised me; take a look if you’re interested in either digitization or (digital) photography.
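For context on why the assumption seems plausible, the usual back-of-the-envelope arithmetic (my own illustration, not from Pogue’s post) converts a sensor’s pixel dimensions into a maximum print size at a given resolution:

```python
# An 8-megapixel sensor commonly produces 3264 x 2448 images.
# At 300 dots per inch (a typical photo-print resolution), the
# largest "full quality" print is simply pixels divided by dpi.
width_px, height_px, dpi = 3264, 2448, 300

print_width_in = width_px / dpi    # ~10.9 inches
print_height_in = height_px / dpi  # ~8.2 inches
print(round(print_width_in, 1), round(print_height_in, 1))  # -> 10.9 8.2
```

Pogue’s experiment tested whether viewers can actually perceive the difference this arithmetic predicts, which is what made the results surprising.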

Digiprojects and the Cornell Digital Imaging Tutorial

Three of the projects that I learned about from reading the blog entries in digitizationblog include the Fedora and the Preservation of University Records Project, the File Format and Media Migration Pilot Service (FFMM), and the CIHR Policy in Development. I will discuss what the projects are, identify the issues, and discuss them within the context of the Cornell tutorial.

1. Fedora and the Preservation of University Records Project

This project is a research grant project of the Digital Collections and Archives of Tufts University and Manuscripts and Archives of Yale University that sought “to combine electronic records preservation research and theory with digital library research and practice.” Specifically, this project looked at using the Fedora digital repository system to determine whether it has “the ability to serve as an electronic records preservation system.”

Findings were that
“Even though some preservation policies may be articulated and managed through Fedora, an institution still must formulate these policies—they are not pre-set in Fedora. Rather than an out-of-the-box, limited repository solution, Fedora is a repository architecture upon which an institution can shape a repository in many different ways. Thus, the suitability of Fedora as the basis of a preservation system depends significantly on its implementation.”
(Glick, Wilczek and Dockins).


Therefore, the Tufts-Yale project created “three main products” to provide baseline guides: “the requirements for recordkeeping systems and preservation activities, the Ingest Guide, and the Maintain Guide”. However, the project results acknowledge that these baseline guides “all suggest areas of further work.”

Issues that I identified in this project, and that were also conceptually discussed in the Cornell digital imaging tutorial, include: preservation of digital records and content, technical infrastructure management, and management policies.

From the Cornell digital imaging tutorial, I learned that the goal of digital preservation “is to maintain the ability to display, retrieve, and use digital collections in the face of rapidly changing technological and organizational infrastructures and elements.” Preservation decisions are considered an “integral part” of digitization projects, because they coincide with any “long-term retention plans.” There are many technical, organizational, and administrative challenges associated with preservation.

The first challenge is technical infrastructure management, or file management, which is accomplished “through careful evaluation, and the avoidance of unique, proprietary solutions.”

According to the Cornell tutorial, file management “consists of a set of interrelated steps designed to ensure that files can be readily identified, organized, accessed, and maintained.”
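One concrete piece of such file management is a fixity inventory: recording each file’s path, size, and checksum so later audits can detect corruption or silent loss. Below is a minimal sketch using only Python’s standard library; the Cornell tutorial describes the practice, not this code, so treat it as my own illustration:

```python
import hashlib
import os

def fixity_inventory(root):
    """Walk a collection directory and return (path, size, sha256)
    for every file -- the raw material of a fixity audit."""
    records = []
    for dirpath, _, filenames in os.walk(root):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                # Read in chunks so large master files don't exhaust memory
                for chunk in iter(lambda: f.read(8192), b""):
                    digest.update(chunk)
            records.append((path, os.path.getsize(path), digest.hexdigest()))
    return records
```

Re-running the inventory on a schedule and comparing checksums against the stored records identifies files that can no longer be “readily identified, organized, accessed, and maintained.”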

Finally, to address the organizational/administrative challenges, management policies are crucial because they
“boil down to correlating resources and processes with project goals. Project goals, such as enhancing access or promoting efficiencies must be translated into project deliverables, such as digital image files, accompanying metadata, and Web-accessible databases.”


According to Cornell, part of this management policy is to find a “holistic approach that recognizes the interdependencies between technical and organizational components”, one example of which is OAIS (Open Archival Information System).

I learned about OAIS which, according to Cornell, is a “reference model [that] provides a framework for long-term digital preservation and access, including terminology and concepts for describing and comparing archival architectures.”

The OAIS framework was built upon such holistic management principles, in order to find “practical approaches to digital preservation”. In the Fedora Tufts-Yale project, OAIS provided the framework. Furthermore, “the OAIS Reference Model, the requirements, the Ingest and Maintain guides, the resources and services that support the guides, and the implementation of the guides should be viewed as a tightly related set of steps that build on each other” (Glick, Wilczek and Dockins).

2. File Format and Media Migration Pilot Service (FFMM)

This project, a free service offered by the Cornell University Library Research and Assessment Services unit to “Cornell’s scholarly community”, has the goal of getting “a better read on the scope and seriousness of digital obsolescence in the unmanaged digital holdings.”

Issues that I identified in this project, similar to the Tufts-Yale project, include: digital preservation, technical infrastructure management, and project planning and management.

From the Cornell tutorial, I learned about digital preservation in both Technical Infrastructure and Strategies of Preservation.

However, the technical infrastructure, or file management, in this project was unique in that the goal was to find old or obsolete technical components (hardware and software) to preserve digital data from obsolete media (like old floppy disks).

floppies.jpg

Source: http://www.rlg.org/en/page.php?Page_ID=20987#article1


This illustrates one of the technical challenges in a digital preservation technical infrastructure: the most up-to-date tools, which institutions usually look to invest in, have “limited backward compatibility.” This project demanded the reverse, which posed its own challenges: the project team resorted to going through their scrap heap and purchasing equipment from eBay. The focus of the digital preservation technical strategies was performing migrations.

3. CIHR (Canadian Institutes of Health Research) Policy in Development - Access to Products of Research

This project’s focus is creating “a process to develop a research policy that will promote access to the knowledge and resources generated from CIHR-funded research” by implementing a formal institutional policy that requires, rather than only encourages, researchers to make their results “publicly available.” The goal of this policy “is to position Canada as a world leader in the creation and use of knowledge for health benefits” by making sure this information is “disseminated as widely as possible so that all parties benefit from these research outcomes.” Unlike the other projects, this digitization project involves outsourcing. However, it still raises some of the same issues as in-house projects, and digitization in general has allowed information to be widely disseminated, which is the ultimate goal of this policy.

Issues that I identified in this project include legal restrictions and project management.

Similar to any digitization selection process, legal restrictions must be identified at the project onset. CIHR has identified open access eligible materials as:
1. physical products of research (i.e., cell lines, DNA libraries, PCR primers);
2. structural and functional data typically deposited in public databases (i.e., genomic data, DNA sequences, protein sequences);
3. peer-reviewed published results.

The project management consists of a governing Advisory Committee, whose first step is an online survey concerning “the general scope and content of a proposed policy”, the results of which are to be considered by the committee in its policy development.

Furthermore, the project management approach is to outsource the information to any “Open Access Initiative compliant digital archive”, which is CIHR’s most effective “publishing cost-recovery model” (Harnad).

From the Cornell tutorial, I learned about the advantages and disadvantages of the outsourcing management approach. Specifically,
“outsourcing is viable if an institution has a good understanding of the near- and long-term goals of an imaging initiative and can fully specify imaging, metadata, and derivative requirements; locate reliable vendors; evaluate products and services; adopt policies and procedures for various functions; and define institutional and vendor responsibilities.”
(Cornell University Library/ Research Department)


Although the OAI model may be the management approach chosen by institutions like CIHR, there is a major debate over the fiscal soundness of OA publishers. Statistics have shown that “41 percent of OA journals are losing money, 24 percent are breaking even and only 35 percent are in profit” (Data Conversion Laboratory). This could have a chilling effect on the open access management approach, because OA publishers may not prove to be “reliable” publishing vendors. It will be an interesting case study to see how well the new CIHR policy functions once developed.

References:
Entlich, R. and Buckley, E. “Digging Up Bits of the Past: Hands-on With Obsolescence.” RLG DigiNews 10(5), October 15, 2006. Accessed 25 November 2006, online: http://www.rlg.org/en/page.php?Page_ID=20987#article1

Glick, K., Wilczek, E., and Dockins, R. “Fedora and the Preservation of University Records Project.” RLG DigiNews 10(5), October 15, 2006. Accessed 25 November 2006, online: http://www.rlg.org/en/page.php?Page_ID=20987

Harnad, S. “CIHR Proposes 99.99% Optimal OA Self-Archiving Mandate." Open Access Archivangelism, October 12, 2006. Accessed 28 November 2006, online: http://openaccess.eprints.org/index.php?/archives/144-CIHR-Proposes-99.99-Optimal-OA-Self-Archiving-Mandate.html

“Moving Theory into Practice: Digital Imaging Tutorial.” Cornell University Library/ Research Department. Accessed 23 November 2006, online: http://www.library.cornell.edu/preservation/tutorial/contents.html

“Open Access debate still rages.” Data Conversion Laboratory. Accessed 29 November 2006, online: http://www.dclab.com/open_access_debate_still_rages.asp

Podcast A5: Societal Issues

Heather Hawley's Podcast Script:
This podcast is on the societal issues module. I talk about the numerous ways I have found that librarians are implementing social software. I delve into how librarians use these social technologies, including blogs and wikis created for sharing ideas, and I include some websites where libraries are using these same tools. I also discuss the reasons for, and benefits of, using social software tools.
Download file

Thursday, November 02, 2006

Podcast A4: Security & Privacy

Heather Hawley's Podcast Script:
This podcast is on the security and privacy module. I talk about the current state of email spam, computer viruses and worms, spyware, adware, and phishing. I delve into the definitions of these malicious programs and provide examples of what they are from Technology for the Rest of Us, the BBC News, and my own personal experience. I also list the University of Arizona Security Incident Response Team's recommendations for users to protect their computers.
Download file

Hold the Phone: VoIP Style

Remember our Course Introduction module, setting up and using Skype? Well, I thought this article was interesting and ties into our previous learning exercise using VoIP. From yesterday’s RSS feed in the New York Times Technology Section, “Phones for that Other System”:


This article talks about how “until recently, the best you could hope for was an ‘operators standing by’ headset and microphone, which had to be tethered to a computer, which was itself tethered to a router.” It shows how the computer headset is now morphing into new, familiar-looking VoIP phones that are more user-friendly. The idea is that offering a familiar-looking phone will open up the market and let Skype compete with landlines and cell phones.

Krug’s Trunk Test: My Assessments and Test Outcomes

Krug’s trunk test is an exercise to assess the usability, or specifically the “good navigation”, of a web site by having the user start in “the bowels” of a site and assess from there whether the following questions can be answered “without hesitation”:

What site is this?
What page are you on?
What are the major sections of this site?
What are your options at this level?
Where are you in the scheme of things?
How can you search?
How can you get home?
(p 85).

I performed this “trunk test” on the following websites and will discuss my final assessment of the answers to the above questions for each of these sites: Drugs & Alcohol page, Food Network's Party Ideas page, and the University of Arizona, School of Information Resources & Library Science’s About SIRLS page.

1. TeensHealth.org's Drugs & Alcohol page is, overall, nicely designed, with a clean and uncluttered layout. However, there are some issues that, if addressed, would make this a better and more effective informational website.

The first problem I noticed is that, although the Site ID is locatable and properly placed in the upper left corner (Krug, p 64), the website’s sponsor, the Nemours Foundation, has an equally prominent listing on the right side of the page, making the Site ID a bit confusing. This design of the Site ID and sponsor information forces the user to pause and think about which one leads to the homepage, a clear violation of Krug’s mantra “Don’t Make Me Think!” (p 11).

It’s much easier to tell what page the user is on, because the page name is on a prominent banner located in the top middle of the page. When I navigated to other areas of the site, I could see that the use of these page name banners was consistent which is “one of the most powerful usability principles” (Nielsen, Top Ten Mistakes in Web Design).

The major sections of the site are clearly laid out by the tab navigation bar on the left-hand side of the page. One issue here is that the section tabs are not highlighted in some way to show the reader which tab has been selected. According to Krug, “if there’s no tab selected when I enter a site…I lose the impact of the tabs in the crucial first few seconds when it counts the most” (p 84). Therefore, highlighting the selected tab would make the tab navigation scheme effective.

The local navigation of this webpage is fairly well done. However, one issue I encountered is that the links change to a very similar color after the user has visited them (from blue-gray to blue). At first I thought they didn’t change color at all, but after trying different browsers, I noticed there was a slight change. This may be an issue for other users as well, and the end result is likely to be confusion for users who need cues to know which links have been previously visited. According to Jakob Nielsen, in Top Ten Mistakes in Web Design, “knowing which pages they've already visited frees users from unintentionally revisiting the same pages over and over again.”

The “you are here” indicators are the page name banners and the use of breadcrumbs to show “the path from the homepage to where you are” (Krug, p 76). Again, highlighting the tabs to cue the user on exactly which section they are in would make these indicators more effective.

Users can search from any page, and the search engine is well laid out, with the search box at the top and a clearly defined “search” button. However, the search capabilities are fairly weak. Bad search is the number one mistake in Jakob Nielsen’s Top Ten Mistakes in Web Design, because “overly literal search engines reduce usability in that they're unable to handle typos, plurals, hyphens, and other variants of the query terms.” I experimented with the search by purposely misspelling words like “alcahol” and “maryjuana”, and even searching for ambiguous terms like “drinking”, to see what the site’s search capabilities were. I came away disappointed, since misspelled words resulted in “no results were found for your search”.

Another problem is that the search criteria are not displayed on the search results page, which is a “Web Blooper” according to Jeff Johnson in his book GUI Bloopers (p 342). Johnson points out that “when search services do not show the users’ search criteria on the results page, they are hindering their own effectiveness” (p 342). Also, the results displayed for ambiguous search terms should be tightened up for more effective searching. For example, a search for the term “drinking” returned matches about water and even sports nutrition. A search engine this weak could cost the site user interest, because it is so difficult to find results when searches are poorly matched to the content of the page. This element of web design is especially important for a site whose target audience is teens, who may not be sophisticated searchers. The site’s search engine needs to be configured to allow matches for descriptive terms and misspellings.
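The misspelling problem in particular has a cheap fix: fuzzy matching against the site’s index terms. Here is a sketch using Python’s standard-library difflib; it is my own illustration, and the index terms are hypothetical stand-ins for whatever vocabulary the site actually indexes:

```python
import difflib

# Hypothetical index terms for a teen health site
INDEX = ["alcohol", "marijuana", "tobacco", "caffeine", "steroids"]

def forgiving_search(query, index=INDEX, cutoff=0.6):
    """Exact matches win; otherwise suggest the closest spellings,
    so 'alcahol' still finds 'alcohol' instead of 'no results'."""
    q = query.lower()
    if q in index:
        return [q]
    return difflib.get_close_matches(q, index, n=3, cutoff=cutoff)

print(forgiving_search("alcahol"))    # -> ['alcohol']
print(forgiving_search("maryjuana"))  # -> ['marijuana']
```

Even a “did you mean…?” suggestion built this way would be a large improvement over returning “no results were found for your search” to a teenage audience.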

Navigating “Home” should be easy, because there is a link for it at the top left of the page. However, there is another violation of the “Don’t Make Me Think!” rule: right next to the “Home” link is another link for “KidsHealth Home”. This is overly confusing, especially for users in the bowels of a site, who must figure out which one is the “right” home navigation link to use. Also, when users click Home, the homepage opens in a new browser window, which is mistake number nine of the Top Ten Mistakes in Web Design. Nielsen explains why this constitutes bad design:
“The strategy is self-defeating since it disables the Back button which is the normal way users return to previous sites. Users often don't notice that a new window has opened, especially if they are using a small monitor where the windows are maximized to fill up the screen. So a user who tries to return to the origin will be confused by a grayed out Back button.”


2. The Food Network's Party Ideas page looks overwhelming, with its rich media graphics, Flash advertisements, and the moving ticker in the heading; there are also four drop-down menus and three search boxes, as well as numerous section headings and text boxes. Since my initial reaction is “Whoa! Where do I start?”, there is obviously not a good use of “clear visual hierarchy” on this page (Krug, p 31), nor an effort to “keep the noise down to a dull roar” (Krug, p 41). My first impression of the “busy-ness” [sic] and “visual noise” (Krug, p 38) going on in this page is summed up by Krug: “When everything on the page is clamoring for my attention the effect can be overwhelming.” The use of moving banners and Flash is mistake number seven of Nielsen’s Top Ten Mistakes in Web Design because “Web users have learned to stop paying attention to any ads that get in the way of their goal-driven navigation.”

The site ID is a locatable icon properly placed in the upper left corner.

The page you are on relies on the tab navigation, where the tab of the current page is reverse highlighted; these tabs also double as the major sections of the website. There is also a very small breadcrumb at the top of the page. However, Krug warns against making the breadcrumb “do double duty” by using it in lieu of a page name because “it seems like it should work, but it doesn’t, probably because it fights our expectation that headings are flush left or centered not dangling in the middle of the page at the end of a list” (p 79). Krug also suggests that the last item in a breadcrumb be boldface, which is not done here, making page identification more difficult. Another issue I noticed right away is the URL: http://www.foodnetwork.com/food/entertaining. This URL is confusing because it contains the “food/entertaining” tag, which is different from the page navigation tab for “Party Ideas” (and there is no tab for food and entertaining either).

The major sections of the site are clearly laid out by the tabs navigation bar across the top of the page and are nicely reverse highlighted to show which section the user is on (note that for this website, the sections correspond to pages).

The local navigation is simply a mess because it is spread out all over the page, with three additional bulleted links in small type on the left side. The links in the middle of the page and those on the left side take the user to the same places, making the design not only messy but redundant. According to Jakob Nielsen in Reduce Redundancy: Decrease Duplicated Design Decisions,
“too many cross-references will create an overly complex interface and prevent users from understanding where they are and what options they have at that location. It's thus essential to limit cross-references to those alternatives that are both most important to users at their current location and most likely to help them overcome navigational dislocation.”


Also, the featured items look similar to advertisements, which users tend to ignore (Nielsen, Top 10 Mistakes in Web Design). Furthermore, these featured items ask users to “click here,” ignoring what Nielsen calls “the oldest web design rule”: to avoid using “Click Here” as the anchor text for a hypertext link (p55). Instead, Nielsen recommends placing hyperlinks on “text that provides a short summary of what kind of additional information is available” (p55).

The “you are here indicators” again rely on the reverse highlighted tab and small breadcrumb (as do both the page and section identifiers).

You can search by putting your search term into one of the three search boxes on the page. One search box is labeled “Find an Episode,” but users may not know whether they should search for an episode or a topic. The other two search boxes seem to be the same, because they both gave me identical results; one is located in the top middle of the page while the other is in the top left margin. Figuring out which of the three search boxes to use is confusing and makes the user think far too much. Users must also select whether to search by “Recipes” or by “Topics” in the two general boxes, which again makes the user think too much. However, the search results properly show the search term the user entered (Johnson, p342).

You can get home by clicking the Site ID graphic in the upper left hand corner of the page, by searching hard for the tiny first breadcrumb labeled “Home,” or by finding the equally tiny “Home” link at the bottom of the page. This is yet another example of redundant links.

3. I chose to critique the University of Arizona, School of Information Resources & Library Science (SIRLS) web page because I must rely on this website for crucial departmental information, and it drives me crazy because the site is so user-unfriendly that it makes finding information difficult. For this exercise I looked at the About SIRLS page; frankly, it is a relief to be past the homepage, which manages to make most of Nielsen’s Top 10 Mistakes in Web Design, including the use of crowded popup menus and link redundancy in triplicate. The About SIRLS page is okay, but with some corrections it could be made better (while the homepage needs a complete overhaul).

The Site ID is locatable and properly placed in the upper left corner inside a banner with the school logo.

The page you are on is also easy to locate because the page name is bolded and located in the top left of the body of the page, and the navigation tab is reverse highlighted.

The major sections are fairly well done, but I would suggest that the subsections, which are currently located on the left side of the page in table form, be included in the tab menu at the top instead. Moving the section navigation links to tab subsections would be a more effective use of the tab navigation scheme already in place. At the very least, the existing subsections should be highlighted when the user clicks on them.

The local navigation is nicely laid out on the page, with a bulleted list and underlined links. However, it could be made more user-friendly by reducing the number of redundant links, such as the calendar, which is also located in a table on the right side of the page, and the Search & Index link, which should be converted to a “simple search box” (Nielsen, Top 10 Mistakes in Web Design). The duplicate calendar links are also bad

“because [if] users don't know for sure when a feature is duplicated, they'll have to spend additional time figuring out whether the duplicate is a new feature or an old feature”
(Nielsen, Reduce Redundancy: Decrease Duplicated Design Decisions).


Another problem is that the links do not change color which may cause users to unknowingly revisit the same sites repeatedly (Nielsen, Top 10 Mistakes in Web Design).

The “you are here indicators” are the page name heading in bold at the top left of the page, the use of breadcrumbs to show “the path from the homepage to where you are” (Krug, p76) and the use of reverse highlighted tabs to cue the user.

You can search by clicking on a link in the subsection menu located on the left side of the page or by clicking on the link in the middle of the page. This both duplicates links and requires the user to click through to a search page rather than providing a search box (Nielsen, Top 10 Mistakes in Web Design). However, since this site incorporates a Google search box, the search capabilities are robust, with a proper listing of the user’s search terms (Johnson, p342).

You can get home by clicking on the “Home” tab in the section menu, the first breadcrumb labeled “Home,” the “Home” link at the bottom of the page, or the SIRLS icon. Having this many links is again redundant; a better option would be to make one “Home” option “more prominent rather than duplicate it” (Nielsen, Reduce Redundancy: Decrease Duplicated Design Decisions).

References:
Johnson, Jeff. GUI Bloopers: Don’ts and Do’s for Software Developers and Web Designers. CA: Academic Press, 2000.

Krug, Steve. Don’t Make Me Think: A Common Sense Approach to Web Usability, 2nd Ed. IN: New Riders, 2006.

Nielsen, Jakob. Designing Web Usability. IN: New Riders, 2000.

Nielsen, Jakob. (2002). Reduce Redundancy: Decrease Duplicated Design Decisions. Jakob Nielsen’s Alertbox. Accessed online on 10/15/06 at http://www.useit.com/alertbox/20020609.html.

Nielsen, Jakob. (Updated 2006). Top 10 Mistakes in Web Design. Jakob Nielsen’s Alertbox. Accessed online on 10/15/06 at http://www.useit.com/alertbox/9605.html.

Risks involved in the YouTube buyout by Google

With all the recent media fanfare and speculation over the purchase of YouTube by Google, I wanted to share an excellent article I read about the ramifications of this buyout: "The Battle Over YouTube," found in the October 9th issue of Newsweek.

The first major issue discussed is the well-known controversy over YouTube’s airing of copyrighted material, such as music owned by the music company giant Warner, and the tentative profit-sharing agreement that has been reached between those two companies. But not all the music giants want to settle; the biggest giant of all, Vivendi, is threatening lawsuits. According to the article, now that Google, a famously rich company (unlike the startup that was YouTube), has bought out YouTube, it is “a target for anyone who wants to sue."

The second major issue discussed, which has not been widely publicized as far as I’m aware, is the enormous operating expense involved. The author quotes a source who estimates it at “more than $2 million a month.”

Finally, the threat of startup competition is discussed. A popular YouTube video creator tells the author that he has “started experimenting elsewhere, advertising on his own Web site and trying out Revver.com”, even after being courted by YouTube.

I recommend reading this article because I think it provides more insight into the challenges ahead for this highly touted merger.

Podcast A3: Rich Media

Heather Hawley's Podcast Script:
This podcast is on the rich media module and I talk about some websites I have found that are good examples of rich media use. Sites include The New York Times, Zillow, and the Tucson Association of Realtors Multiple Listing Service. I delve into what these websites are and how they are using rich media, how they measure up to the Interactive Advertising Bureau standards and guidelines if applicable, and how rich media is used as an interactive informational tool in their products or services.

The Role of XML

My understanding of XML
My understanding of the role of Extensible Markup Language (XML) is that it is the next-generation markup language, following (but not replacing) the original Hypertext Markup Language (HTML) on which the internet as we know it is based. Unlike HTML, however, it allows data to be described. XML builds on the existing internet building blocks: the protocols that provide reliable global communication and file delivery, the language specifying how data should be displayed, and the graphical interface for displaying HTML data on the web. XML aims to improve on these building blocks by being “well formed” to provide consistency of data element assignment, by providing tags that separate content from presentation, and by being verifiable (Rhyno, pp72-73).

Problems with HTML
The internet is based on TCP/IP, an innovation that began the internet phenomenon by allowing global communication through easy file and information sharing. However, there are drawbacks to HTML. Namely, HTML is mainly concerned with the “content presentation and arrangement” (Rhyno, p72) of data in a webpage, not with data description. The example author Art Rhyno gives is wanting to

“extract subject information, or even if you just want consistency in subject assignment, it becomes difficult without a commonly used tag like <subject> to mark, or delimit, where this information is contained…HTML is limited to a fixed set of tags. In this case, <subject> is not considered a valid HTML tag….” (p 73).

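Rhyno's `<subject>` example is easy to demonstrate: in XML you simply invent the tag you need and extract it again with any XML parser. A minimal Python sketch (the record below is a made-up example, not a real bibliographic format):

```python
import xml.etree.ElementTree as ET

# A hypothetical bibliographic record: XML lets us define a <subject>
# tag of our own, which HTML's fixed tag set does not allow.
record = """
<record>
  <title>Don't Make Me Think</title>
  <subject>Web usability</subject>
</record>
"""

root = ET.fromstring(record)
subject = root.find("subject").text
print(subject)  # Web usability
```

Because the tag delimits the data, extracting every subject from thousands of such records is a one-line query rather than a guessing game over presentation markup.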

Besides the description issue, additional problems with HTML are that its “graphics rich content” is hard to display on the screens of most small wireless devices, and that even with graphics removed, the HTML text is overwhelming and not very readable for the user (Coyle, p135).

XML: What it is
Coyle describes XML as a “meta-language” because it describes “how others may define their own data languages” (p 137); or more simply, a language to describe language. Therefore, XML is not a language like HTML, because it only sets out a framework to allow “users and industry groups to define their own domain-specific data definition languages” (Coyle, p139). According to Wikipedia, “languages based on XML are defined in a formal way, allowing programs to modify and validate documents in these languages without prior knowledge of their particular form.”

Relationship%20between%20XMLand%20HTML.jpg

Image: http://www.idealliance.org/papers/xml2001/papers/html/images/03-03-05/xml-html-venn.jpg


A Brief History of XML and the W3C Influence
XML is derived from Standard Generalized Markup Language (SGML), which was itself based on Generalized Markup Language; GML began the idea of a “formally defined document type” that can be “described with a set of rules” (Rhyno, p72). Some twenty years later, in 1996, the World Wide Web Consortium (W3C) became involved in developing XML (Rhyno, p72) because it was a less costly, more user-friendly version of SGML and could provide more meaningful data than HTML, yet wouldn’t replace it entirely (Desmarais, pp1-2).

timbl.jpg

Image: http://www.xml.com/2000/12/xml2000/timbl.jpg


XML and Libraries
XML is very useful for libraries due to the following factors:

XML is well formed (Rhyno, p73): the structure is sound because it contains proper nesting, with both an opening and a closing tag for each element (Desmarais, p7). This is crucial for libraries because they must have quality control over potentially “tens of thousands” of documents (Rhyno, p73).

XML can be validated and can ensure consistency (Rhyno, p73): This requires an XML parser or “validation mechanism…of a document type definition (DTD)” (Rhyno, p73) to “check incoming data against the rules defined in the DTD to verify that the data were structured correctly” (Desmarais, p3). Rhyno calls this validation step “one of the most important steps in managing a library’s digital collection” because it maintains consistency that is needed for “sharing the content with others” and for any future “migrations” of this information to a new system (Rhyno, p73).

XML separates content from presentation (Rhyno, p73): This goes back to the initial problem with HTML: namely that it doesn’t separate content from presentation, a task “which is fundamental in managing large collections of documents...” (Rhyno, p73).
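The first two factors above, well-formedness and validation, are mechanical checks that an XML parser performs for you. As a minimal sketch (using Python's standard library parser, which checks well-formedness only, not validation against a DTD):

```python
import xml.etree.ElementTree as ET

def is_well_formed(doc: str) -> bool:
    """True if the document parses as well-formed XML:
    every opening tag is properly nested and closed."""
    try:
        ET.fromstring(doc)
        return True
    except ET.ParseError:
        return False

print(is_well_formed("<book><title>XML</title></book>"))  # True
print(is_well_formed("<book><title>XML</book></title>"))  # False: tags overlap
```

A validating parser goes one step further, checking the same document against the rules in a DTD, which is the consistency guarantee Rhyno emphasizes for sharing and migrating collections.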


seperating%20content%20from%20format.gif

Image: http://ils.unc.edu/~viles/xml/slides/img020.gif


Obviously the issue of separating content from presentation is important to libraries because their job is to provide and organize content from numerous sources. XML also allows meaning to be embedded into data that can be presented “in a format that is independent of device, programming language, operating system, or network platform” (Coyle, p138).

As noted, XML is useful for libraries and was considered for replacing the "aging" MARC format as their cataloging system (Desmarais, p3). The Library of Congress initially considered this shift by doing a feasibility study in 1995, then in 1998 they released a MARC document type definition (DTD) and software to convert MARC to XML (Desmarais, pp3-4). “The objective is to make machine-readable bibliographic data more open and interchangeable in the Internet environment” (Desmarais, p4).

XML Applications
Besides solving data content issues for the library, XML is useful for integrating technologies and delivering critical information in other industries. For example, I found an article in Wired News titled “XML Zooms onto Gov’t Tech Agenda” about how “declining sales among U.S. automakers have clinched government support for XML standards.” This refers to the Enterprise Integration Act of 2002, which was made into law due to the declining profits of U.S. carmakers (Ford, GM, and DaimlerChrysler) after a report by the National Institute of Standards and Technology found that data-quality errors caused by poor interoperability resulted in financial losses of $1 billion per year (Steakly, 2002). The idea here is that XML can help U.S. industries save billions of dollars by streamlining and integrating their manufacturing and business processes so that they can compete in the global marketplace.

Another article I located, in the New York Times Technology section, entitled “Software Out There,” also addresses the interoperability issue. The idea here is that “blocks of interchangeable software components are proliferating on the Web and developers are joining them together to create a potentially infinite array of useful new programs” (Markoff, 2006). According to this article, the main reason for the shift from proprietary systems to interoperability is open source software and XML, which make it “simple and efficient to exchange digital data over the Internet” (Markoff, 2006). Markoff quotes Microsoft’s Chief Technical Officer, Ray Ozzie, as saying: "I'm pretty pumped up with the potential for R.S.S. to be the DNA for wiring the Web." Since RSS is based on XML, this is a great example of XML extending the enterprise of the web.
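Since RSS is just an XML vocabulary, the same parsing tools apply to feeds. A hedged sketch, using a made-up, minimal RSS 2.0 feed (real feeds carry more elements, but the principle is the same):

```python
import xml.etree.ElementTree as ET

# A made-up, minimal RSS 2.0 feed: just XML with an agreed-upon vocabulary.
feed = """
<rss version="2.0">
  <channel>
    <title>Technology News</title>
    <item><title>Software Out There</title></item>
    <item><title>XML Zooms onto Gov't Tech Agenda</title></item>
  </channel>
</rss>
"""

root = ET.fromstring(feed)
# Pull out every headline, regardless of what program produced the feed.
headlines = [item.findtext("title") for item in root.iter("item")]
print(headlines)
```

Because the `<item>` and `<title>` tags are standardized, any aggregator, browser live bookmark, or script can consume the same feed, which is the "wiring the Web" point Ozzie is making.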

todays%20vs%20tomorrows%20web.jpg

Image: http://www.tiresias.org/cost219ter/florence/images/dardailler_fig01.jpg


My Thoughts on the Impact of XML on the Digital Divide Issue
The Markoff article got me thinking about XML and the impact it has on the web and how this in turn, and more importantly, impacts the digital divide. Last night I read in The Economist, the article Splitting the Digital Difference about new ideas for narrowing this divide like giving children laptops, or hard wiring multiple users to a central computer they all share, or cell phones that can carry data and link shared PC's to the internet. This just makes me realize how important the XML standards are for truly “sharing” information. These XML standards will enable data to be shared and understood regardless of the device used or needs of the individual or entity using it to access, provide, or manage information.

References:
Coyle, Frank P. Wireless Web: A Manager’s Guide. NJ: Addison-Wesley, 2001.

Desmarais, Norman. The ABC’s of XML: The Librarian’s Guide to the eXtensible Markup Language. TX: New Technology Press, 2000.

Markoff, John. “Software Out There.” New York Times 5 April 2006. Accessed 26 Sept 2006, online: http://www.nytimes.com/2006/04/05/technology/techspecial4/05lego.html?ex=1159416000&en=ef2d38c1eebed9bf&ei=5070

Rhyno, Art. “Introduction to XML.” In Technology for the Rest of Us, ed Nancy Courney. CT: Libraries Unlimited, 2005.

Steakly, Lia. “XML Zooms onto Gov’t Tech Agenda” Wired News 11 Nov 2002. Accessed 26 Sept 2006, online: http://www.wired.com/news/politics/0,56287-0.html

Web 2.0 Winners & Losers Poll Results

Professor Glogoff talked about what Web 2.0 means in his most recent podcast on Core Web Technologies, "Web 2.0," so when I came across this reader poll in my RSS feed today in the Wired News Technology section, I thought I’d share it. If you’re interested, check out which websites got voted the best and worst in The Web 2.0: Winners and Losers.

So I’m bringing gossip back to the blog, since it’s almost like the best & worst dressed at the Emmys, except about the highly touted internet 2.0. Enjoy!

Podcast A2: Podcast on Podcasts

Heather Hawley's Podcast Script:
I talk about the podcasts that I have subscribed to, how long they seem to have been in existence, whether there are corresponding websites, the length of each of the podcasts that I listened to and a summary of what was discussed. I also note what I found interesting about them, why I decided to subscribe to that particular podcast, how I listened to it, and who I would recommend these podcasts to.

My Experience using Live Bookmarks

My Live Bookmarks:
I chose two Live Bookmarks for Activity #4 in the Course Introduction: the Wired Technology section and the New York Times Technology section. I used to check these only sporadically, but since setting up the live bookmarks on every computer I use, including my work computer, my home desktop, and my laptop, I now check these feeds daily.

Effect of Live Bookmarks on my Online News Browsing:
Setting up Live Bookmarks has definitely heightened my use of the web for daily news because I am following headlines from multiple sources now instead of only browsing and reading selected articles from the New York Times. And I used to never read technology news, but now this is a major part of what I follow, so I’ve learned a lot. It has been beneficial for me to read this since it helps me relate to what we are discussing or learning about in our Introduction to Technology class. It’s also a great way to supplement the information we get from Professor Glogoff.

My Other Live Bookmarks:
Since completing this Live Bookmark activity, I have since added the following live bookmarks: the New York Times Science and Health sections, the Technician which is the student newspaper from North Carolina State University, Democracy Now, and the Christian Science Monitor.

I chose to add the New York Times Science and Health sections because I am taking a SIRLS Medical Online Searching class, and a major part of this is following current news about medical issues. My project in this class will be on type 2 diabetes, so it’s beneficial for me to monitor any breaking or current news about this disease.

I chose to add The Technician because it is the student paper for North Carolina State University, where I plan to apply for a Library Fellow position after graduation. Since I’m not overly familiar with this school, I have subscribed to the paper’s feed so that I can learn more about the school and stay informed on its issues. I think this is a good way to gather information and to think about how a librarian could be involved in supporting these students, especially how library technologies could help.

I chose to add both Democracy Now! and the Christian Science Monitor because their coverage comes from outside the mainstream media and offers a more balanced view of current issues. Reading these news sources along with the New York Times keeps me informed about breaking news as well as news analysis.

Live Bookmarks and a RSS Feed Aggregator:
Because I have so many Live Bookmark feeds now, I am interested in experimenting with an RSS aggregator. I first learned about this kind of service after reading about it in Fagan Finder. I think it could be an excellent resource for monitoring news that is relevant to my classes, so I have signed up for a free aggregator from FeedReader. I look forward to experimenting with FeedReader to add specific resources, like my current Live Bookmarks. This would allow me not only to view the feeds I currently subscribe to, but also to get an overview of all these other news sources.

A Report on Radio Frequency Identification (RFID) Technology

RFID Technology: What it is
RFID (radio frequency identification) is an automated system that allows data collection and transfer, as well as computer identification of specific objects.

RFID technology uses a combination of radio frequency and microchip technologies (Schnell p43). The tags, which are both readable and writable, contain memory microchips and antennas (Schnell p44). Yet they can be so tiny as to be “paper thin” (Schnell p44) or even “smaller than a grain of rice” (Garfinkel & Rosenberg, pxxvii). These microchip tags are attached to objects so that the object’s data can be read remotely using the radio frequency waves (Schnell p43).

How exactly does this technology work? The tag’s microchip contains a “unique identifier and other information”; “the reader extracts the information on the tag,” while the antenna sends radio waves between the tag and the reader to “excite the microchip allowing the stored information to be read” (Schnell p44).

rfid.jpg

Image from: www.tagnology.com/rfid.htm


There are also two different types of radio frequency (RF) tags: passive and active. The difference between the two is that an active tag contains a battery-powered system, while a passive tag is powered solely by the radio signals it receives. Active tags have the advantage of being readable from greater distances because they don’t rely on an RF signal for power, while passive tags have the advantage of being much smaller and cheaper, with a longer “shelf life” because they don’t need battery replacements (Garfinkel & Holtzman, p17).

Origins and Current Use
RFID technology was originally used by the British government in World War II “to identify their own aircraft returning from sorties over occupied Europe” (Mullen & Moore, p5). It was later used by the U.S. government for monitoring nuclear or hazardous materials in the late 1960s (Mullen & Moore, p5). Then in 1977, this technology was released for public use (Mullen & Moore, p5).

Today this technology is applied to various products that most people recognize such as: the E-ZPass electronic toll collection system, vehicle remote keyless entry systems, and in retail for debit payment or for identifying products (Schnell p44).

RFID Systems in Libraries
Libraries also employ this technology which can offer more benefits than the older barcode technology. In the chapter, “Radio Frequency Identification (RFID)”, author Eric Schnell lists the ways in which RFID technology is superior to barcodes (incidentally, barcodes are the technology employed by the UA Main library). According to Schnell, the advantages are that: visible tags are not required, tags can be placed anywhere, multiple tags can be read at once, they “are more tamper resistant than barcodes”, and they can provide both security and identification at once (Schnell, p44).

Schnell also lists the possible components of RFID systems used in libraries and explains their functions.

Circulation Stations: updates circulation and collection records

Staff Processing Stations: processes new materials, writes data to the tag, and updates collection records

Shelf-Management Readers: handheld devices that handle inventory and data searching (for “retrieval or weeding” purposes)

Theft Deterrence Gates: an alarm system that checks “security status of materials”

Self-Checkout Stations: checks patron identification, checks out items, deactivates security and prints receipt

Return Drops: contains scanners for automatic check-in of materials and updates circulation and collection records

Sorting Stations: handles automatic materials sorting (combined with book drop) (Schnell, p44).


rfid.jpg

Image from: http://www.d-techdirect.com/images/RFID/main.jpg


Costs
Associated costs of RFID technology for libraries can be large, with the RF tags representing “about half the overall cost” (Schnell p48). Schnell gives the following cost estimates as of mid-2004: tags, $0.80 each; tags for DVDs, $1.50; tags for VHS tapes, about $2.00; anti-theft security device components, $4,000; a server with interfaces for RFID hardware and an integrated library system, $15,000; and a self-checkout machine, $20,000 (p48). As a rough cost breakdown for libraries implementing an RFID system: for a library with 40,000 items the estimated cost is $70,000, and for a library with 100,000 items it is $166,000 (Bowen Ayre, p235).
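Schnell's mid-2004 unit prices make for a quick back-of-the-envelope model. The hardware list below is my own simplifying assumption (one of each fixed component, ordinary $0.80 tags for every item); it comes close to Bowen Ayre's 40,000-item figure, though her estimates presumably fold in other costs as collections grow:

```python
# Schnell's mid-2004 estimates; the fixed hardware set is an assumption
# for illustration, since actual configurations vary by library.
TAG_COST = 0.80          # per ordinary item
HARDWARE = {
    "anti-theft gate components": 4_000,
    "server with RFID/ILS interfaces": 15_000,
    "self-checkout machine": 20_000,
}

def estimated_cost(items: int) -> float:
    """One RF tag per item plus a single set of fixed hardware."""
    return items * TAG_COST + sum(HARDWARE.values())

print(estimated_cost(40_000))  # 71000.0, close to Bowen Ayre's $70,000
```

The sketch also shows why tags dominate the budget: at 40,000 items the tags alone are $32,000, nearly half the total, matching Schnell's "about half the overall cost" observation.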

Impact and Implications: Positives
RFID technology may have a positive impact on libraries because it can “offer potential cost savings in the operation and management of resources” by increasing the speed of processing materials, improving management of collections, improving sorting, allowing real-time updating of records, and reducing theft (Schnell, p45).

The impact and implications for businesses are similar. RFID technology is currently used in business for manufacturing, distribution and inventory, retail, document tracking, security, food supplies, and healthcare (Mullen & Moore, pp8-10).

I recently did a news search for RFID and came across several articles about businesses using this technology. One was “IBM to Provide Network to Monitor Cattle,” about IBM creating a “remote system to transmit the body temperature of cattle to ranchers, dairy farmers, feedlot owners and government regulators.” In the op-ed piece “It Tracks Your Every Move...at the Water Park,” the author describes his vacation at a water park in Pennsylvania that used bracelets with RFID technology to allow hotel access and the purchase of food and tokens.

Impact and Implications: Negatives
However, RFID technology also has its critics. From today’s news, “Wal-Mart Expands Use of RFID Tracking”:

“Despite the best efforts of privacy advocates, Wal-Mart pressed forward with its plans to use RFID, saying it planned to roll out the technology to another 500 stores during this fiscal year. The expansion would mean over a quarter of the company's 3,900-plus stores, including its Sam's Club subsidiary, would use RFID to manage its inventory.”


As noted in the Wal-Mart article, RFID is a controversial technology because of privacy concerns. The major issues are that RFID could be used for profiling and surveillance (Weinberg, p89). In the Wal-Mart article, privacy advocates are concerned about surveillance, claiming that RFID tags could “possibly [be] allowing Wal-Mart to track its customers without their knowledge.”

Privacy issues also remain a top library ethics concern. RFID can have a negative impact on libraries’ integrity if patrons’ “movements and reading habits” are tracked, or if patrons have their “information gathered and used to market unwanted products and services” (Schnell, pp49-50).

However, Schnell posits that RFID use in libraries is no more of a threat to patrons’ privacy than traditional circulation databases, or any other debit card or toll pass technology (Schnell, p50). First, the short “read range” of most library RFID systems makes such tracking technically impossible without installing “an array of RFID readers,” which is not economically feasible; second, most libraries purge the records kept on patrons once items are returned, in order to protect their privacy and ward off USA Patriot Act intrusions (Schnell, p50).

To help libraries navigate the privacy issue, while utilizing this technology, author Lori Bowen Ayre surveys and lists some best practice guidelines for libraries preparing to use or currently using RFID technology. Among these are:

Libraries need to be open about using RFID, which includes providing literature to inform patrons about their rationale for using RFID, its objective, and the policies and procedures of its use.

Signs should be posted informing patrons that this technology is in use and describing what type of information is being collected, and a privacy statement should be provided, along with a description of how this technology differs from the older technologies used.

No personal information should be stored on the tags; nor should any information describing the tagged item, in order to prevent this information being read by unauthorized users.

“All communications between tag and reader should be encrypted via a unique encryption key” (not unlike the 802.11 standard for LANs) (Bowen Ayre, p240).


References:
Bowen Ayre, Lori. “Wireless Tracking in the Library: Benefits, Threats, and Responsibilities.” In RFID: Applications, Security, and Privacy, eds. Simson Garfinkel and Beth Rosenberg. NJ: Addison-Wesley, 2005.

Foy, Paul. “IBM to Provide Network to Monitor Cattle” Forbes. 24 Aug. 2006. 9 Sept. 2006 http://www.forbes.com/business/energy/feeds/ap/2006/08/24/ap2972207.html

Garfinkel, Simson and Holtzman, Henry. “Understanding RFID Technology.” In RFID: Applications, Security, and Privacy, eds. Simson Garfinkel and Beth Rosenberg. NJ: Addison-Wesley, 2005.

Garfinkel, Simson and Rosenberg, Beth, Eds.“Preface.” In RFID: Applications, Security, and Privacy, eds. Simson Garfinkel and Beth Rosenberg. NJ: Addison-Wesley, 2005.

Mullen, Dan and Moore, Bert. “Automatic Identification and Data Collection: What the Future Holds.” In RFID: Applications, Security, and Privacy, eds. Simson Garfinkel and Beth Rosenberg. NJ: Addison-Wesley, 2005.

Oswald, Ed. “Wal-Mart Expands Use of RFID Tracking.” Beta News 12 Sept. 2006. 12 Sept. 2006 http://www.betanews.com/article/WalMart_Expands_Use_of_RFID_Tracking/1158092179

Pogue, David. “It Tracks Your Every Move...at the Water Park.” New York Times 30 May 2006. 9 Sept. 2006 http://www.nytimes.com/2006/05/30/technology/poguesposts/30pogues-posts.html?ex=1158206400&en=34c50456bffa9703&ei=5070

Schnell, Eric H. “Radio Frequency Identification (RFID).” In Technology for the Rest of Us, ed Nancy Courney. CT: Libraries Unlimited, 2005.

Weinberg, Jonathan. “RFID, Privacy, and Regulation.” In RFID: Applications, Security, and Privacy, eds. Simson Garfinkel and Beth Rosenberg. NJ: Addison-Wesley, 2005.

What I learned from Molyneux and Drew in the book "Technology for the Rest of Us"

I will discuss two new topics I learned after reading Molyneux’s “Computer Networks” and Drew’s “Wireless Local Area Networks” chapters from the book Technology for the Rest of Us. From Molyneux, I learned about partial and full mesh network architectures and the OSI Reference Model concept. From Drew, I learned about the 802.11 standard for LANs and about WLAN security protocols.
From Molyneux, I learned that in a full mesh network every device is directly connected to every other device, while in a partial mesh network, devices do not need to be connected to every other device (p 4). To show the benefits of a partial mesh network, Molyneux gives the illustration of a phone network in a full mesh scenario: every time a new phone is added to the network, it would require a connection “to all other phones in the country” (p 4). Because of this, Molyneux points out that full mesh network designs are usually constrained to small applications (p 4). Therefore, besides feasibility, a major benefit of a partial mesh network is cost reduction, because not all devices require connections (p 5).
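The cost argument can be checked with a little arithmetic: a full mesh of n devices needs a dedicated link between every pair, which works out to n(n-1)/2 links. A quick sketch (my own illustration, not from the chapter):

```python
def full_mesh_links(n: int) -> int:
    """Number of dedicated links a full mesh of n devices requires:
    one link per pair of devices, i.e. n choose 2."""
    return n * (n - 1) // 2

# Adding one more phone to a full mesh of n phones requires n brand-new
# connections, which is why full mesh designs stay small.
print(full_mesh_links(10))     # 45 links for just 10 phones
print(full_mesh_links(1000))   # 499500 links for a small town's phones
```

A partial mesh avoids this quadratic growth by connecting each device to only a few others, which is exactly the cost reduction Molyneux describes.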

Molyneux also introduced me to the concept of the OSI Reference Model of network design. This model provides the basis “for comparison and teaching purposes” for all other networking protocols (p 9). The model is multilayered and includes the physical, data link, network, transport, session, presentation, and application layers. Because of this, most information professionals specialize in one layer, because “each involves its own technologies, is complex, and requires different skills” (p 9). Molyneux has also proposed a “Layer 8” for libraries, where decision making is

“about what the networks will be used for [which] is a function decided outside the OSI Model- it is related to what the mission of the institution is and how networks can advance that mission” (p 10).
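The layered model can be sketched as a quick lookup table, with Molyneux’s proposed “Layer 8” on top. The example technology for each layer is my own pick for illustration, not taken from the chapter:

```python
# The seven OSI layers plus Molyneux's proposed library "Layer 8",
# each with one illustrative technology.
OSI_LAYERS = {
    1: ("Physical", "cables, radio signals"),
    2: ("Data Link", "Ethernet frames, 802.11 wireless"),
    3: ("Network", "IP addressing and routing"),
    4: ("Transport", "TCP"),
    5: ("Session", "connection setup and teardown"),
    6: ("Presentation", "character encoding, encryption"),
    7: ("Application", "HTTP, email"),
    8: ("Mission", "what the library uses the network for"),
}

for number, (name, example) in sorted(OSI_LAYERS.items()):
    print(f"Layer {number}: {name} ({example})")
```

Seeing the layers stacked this way makes it clear why a professional can spend a whole career inside just one of them.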


From Drew I learned about the 802.11 standard for LANs, which Drew characterizes as “the most important event in the growth of WLANs” (p 16). This standard was adopted in 1997 by the Institute of Electrical and Electronics Engineers (IEEE), and it “describes the protocols and sets a common framework for all developers” (p 16). The standard guarantees purchasers “greater bandwidth” and manufacturers’ compliance with the standard, so that devices can “talk with each other no matter what the brand” (p 16).

Drew also taught me about WLAN security, including the VPN, WEP, WPA and TKIP protocols. VPN stands for virtual private network; this protocol requires network authentication before accepting users onto the network, plus data encryption during transmission (p 18). WEP stands for Wired Equivalent Privacy; this protocol uses an algorithm or “key” for data encryption that is “built into” most access points, and it is the most widely used security protocol (p 18). WPA stands for Wi-Fi Protected Access; this protocol was “designed to overcome the basic weaknesses of WEP by providing improved encryption of data and by improving authorization routines” (p 19). It requires a password or “preshared key” that the “access point and client use…to generate new keys on other devices on the network” (p 19). And finally, TKIP stands for Temporal Key Integrity Protocol, an algorithm that “provides improved security recognition of the origin of the data being transmitted or received, and greatly improved authentication” (p 19). Drew taught me that security protocols are important for a secure WLAN so that data can’t be captured, and that new protocols are continually being developed for even greater security.

References:
Robert E. Molyneux, "Computer Networks," in Technology for the Rest of Us: A Primer on Computer Technologies for the Low-tech Librarian, ed. Nancy Courtney, 1-14 (Westport, Connecticut: Libraries Unlimited, 2005).

Wilfred Drew, Jr., "Wireless Local Area Networks," in Technology for the Rest of Us: A Primer on Computer Technologies for the Low-tech Librarian, ed. Nancy Courtney, 15-22 (Westport, Connecticut: Libraries Unlimited, 2005).

Saturday, September 09, 2006

My Personal Computers Specifications and Recommendations

I find it interesting that the University of Arizona’s College of Law (COL) does not require its students to own a personal computer, while the School of Information Resources & Library Sciences (SIRLS) does. Although the SIRLS program specifies only that students need computers with web access, and makes no further recommendations about computer hardware, it does, however, require “basic software: a word processor, a spreadsheet, a database, and a web browser.” Meanwhile, the College of Law provides recommendations for exactly what hardware students purchasing a new computer should buy. I think it’s a good idea to provide students who may want to buy a computer with some specifications on what to look for before they make such a big purchase.

My personal computers consist of a desktop that meets the COL’s specification for a 1 GHz Pentium central processing unit (CPU). It basically meets the other COL requirements; however, it’s a few years old now, so it doesn’t have all of the most advanced technologies. For instance, it doesn’t have a DVD player (it’s only CD-ROM compatible), and it can’t burn CDs. It does contain a lot of RAM; however, since this computer was “built” by my dad and not purchased through a retail store, I’m not sure of the specifics. When I contacted my dad last night he couldn’t remember the specs either, except that it is a Pentium-type processor. He works at a hospital as the systems administrator in the lab, so he gets to buy old computers (and other miscellaneous computer equipment) that the company wants to dispose of because they have been replaced with newer equipment. He then soups up the CPUs so that the shells contain updated hardware. This hobby of his is way too technical for me, but I can say that I was elated to receive this computer as a “going back to college” gift from him! It has Windows XP installed.

I did buy my own laptop from an online retail store, so I can provide a little more on the specifications of this unit. After borrowing my husband’s laptop for class a few times, I definitely knew I wanted the smallest and lightest unit that I could afford. My husband’s laptop has an extra-wide screen, so it’s great for graphics, like watching movies or playing games, but it’s not so good to have to lug around. I won’t go into its specs, since I rarely use it. I really wanted to buy an Apple notebook, but it was too expensive, so I shopped around by checking out reviews of cheaper laptops. I came across a very affordable laptop, the Averatec 3200 Series, which I decided to buy due to overall favorable reviews in Mobile Tech Review and the Notebook Review forum.

This computer does not conform to all of the COL requirements, because the CPU is not a Pentium M or Centrino and it only has 256 MB of memory (COL recommends 512 MB). However, the hard drive is 40 GB, so it measures up to the COL’s recommendation of a 30 GB hard drive. It also has some nice features: it’s very portable (4.5 lbs), has three USB ports, an optical drive with DVD player and CD burner, and a wireless card, and it came with Windows XP. I wasn’t too worried about the lower memory capacity, since I already own the desktop with its 1 GHz processor.

Both my laptop and desktop easily meet the SIRLS requirements, which aren’t too rigorous. However, I would recommend that anyone wanting to take SIRLS distance courses get a laptop over a desktop if they can afford it. The main reason is the convenience of taking your laptop with you wherever you need to go. I got a lot of use out of mine when I did an internship: I worked from home, so all the data I compiled was on my laptop. I had weekly meetings with my advisor, so I just brought my laptop and a jump drive with me to share the information. Also, SIRLS has a residency requirement, so you might want a computer to use while attending class.

Regardless of what type of computer you purchase (laptop vs. desktop), I would strongly recommend a minimum two-year warranty on any hardware, since you don’t want any problems with your machine while you are still in school. It should also have a wireless/ethernet card, at least two USB ports (to allow multiple devices, like an iPod and a jump drive), a CD burner/CD-ROM drive, a reliable printer (black and white is fine), an 802.11g router (so you and your roommate, spouse, or family can all be online at one time), and an external hard drive. The external hard drive is ESSENTIAL for backing up your hard drive. You do not want to lose any important work that you have done in your classes (both current and former). It’s cheap to buy and easy to do, but you can’t replace your coursework files once they have been erased.

A home high-speed internet connection is essential, and rates are fairly reasonable through your local cable company; DSL is comparably priced too.

Regarding software, I would recommend any word processing program, including Microsoft Office (especially if it’s free with your computer) or OpenOffice freeware, plus anything else you know you want, like an HTML editor (I bought FrontPage since I use it at work and it’s cheaper than Dreamweaver). I would also recommend installing Adobe Reader, since it’s free and you’ll need or want to view a PDF at some point.

Be aware that a laptop loan service has recently been made available through the main library. This is a great way to offer students (especially anyone completing the residency requirement) the opportunity to borrow a laptop for their classes or other use.

Advertising's Evolution to Adapt to People's New Technology Driven Behaviors

“EBay Gambles on Google Partnership for Success of Skype, the Internet Phone Service”

I ran across this headline today in my NY Times Technology RSS feed and thought I should post about it in my blog: as future librarians, we already know (and love?) Google; eBay has been around forever; and my fellow Intro to Tech students are getting to know (and love?) Skype.

EBay is hoping its new partnership with Google will help it find new ways to make money from Skype, its Internet calling service. But experts wonder if enough people are willing to make the switch from traditional phones to talking through their computers.

The main issue at play seems to be that people prefer to use landline telephone systems and “that only 10 to 15 percent of people choose to talk using their computers, and that this proportion is not increasing.”

I just don’t understand the insistence on a landline, since I haven’t had one for years. I switched to a cell phone long ago; an added bonus was stopping the telemarketing calls. Now that I’ve learned how Skype works, I plan to share this technology with my parents and friends, who all live out of state, so we can make calls for free (or for the cost of a set of headphones, anyway). I really think it’s only a matter of time before people become more comfortable with PC-based calling, and the landline goes the way of the dodo.

Anyway, the gimmick behind this partnership is to increase company profits by increasing advertising dollars. To do this, Google will provide a feature to

allow users to talk to advertisers by way of Skype, instead of just clicking through to the advertisers’ Web sites. Users of this feature, called click-to-call, would also have the option of using Google’s own Google Talk system or standard telephones.

Apparently eBay is also experimenting with Skype by including it in its online auctions via the “Skype Me” feature, so that buyers can talk directly to sellers. Ironically, eBay also plans to increase its advertising revenue by branching out to feature Yahoo ads for products that auction sellers are competing to sell.

This leads me to another interesting and related article I found in the NY Times Technology RSS feed: “In a TiVo World, Television Turns Marketing Efforts to New Media”.

Basically, advertisers are waking up to the fact that traditional outlets for commercial spots are being bypassed by current technologies, and that they need to develop new marketing strategies adapted to people’s new technology-driven behaviors.

It took us more than six months to get our TiVo hooked up because we didn’t have a landline. I wonder if TiVo would consider a “Skype Me” option for setting up their service, instead of making customers go through the hassle and expense of setting up landline service. It’s probably only a matter of time.

My Questions for Tim Berners-Lee

My blog on Tim Berners-Lee will consist of questions written as if I’m asking them directly to the man. I would like to ask him the following questions:

Q1. I’m struck by the allusion to books in the 1999 Times article (the naming of your program Enquire after the encyclopedia "Enquire Within Upon Everything", and the journalist describing your internet invention as “almost Gutenbergian”). So what books do you recommend reading, and why? Also, what are you currently reading?

Q2. Why did you decide against selling out and instead decide to work to keep the internet "open, nonproprietary and free"? Also, please tell me more about your ideals in relation to the internet as we now know it (any future plans or “wish lists”)?

Q3. What is your opinion on collective intelligence and folksonomies? Do you think this is a growing field, or do you think that something else will replace it? How about an evolution (or in your case, a return to your original idea) in which a web browser is also an editor; do you think this will happen in the next 25 years? If so, will that make blogs and wikis obsolete, or do you see those as separate and distinct categories?

Q4: How do you reconcile the findability of quality resources with your description of the internet as “a garbage dump”? Also, if browsers facilitated writeability, how would certain internet sites, for example medical information websites, be able to retain their credibility? In your opinion, are there levels of who should be an internet “authority”?

Q5: Do you think it’s fair for journalists to ask you about your culpability in the undesirable aspects of the internet (for example child porn or stolen identities)? Do you think that other product inventors receive this same treatment or are you held to a different standard?

Internet Activities

The numeric IP address I retrieved from the whatismyip website in Activity 2 was 68.228.53.242. After I retrieved this number, I copied and pasted it into ARIN's WHOIS webpage. This only gave me the information that my IP address is attached to the physical address of Cox Communications Inc., at 1400 Lake Hearn Dr., in Atlanta, Georgia. Because Cox is my cable and high-speed internet provider, this IP address is attached to the company’s location. However, my search results for the IP address 66.253.148.213 turned up some very different results.

Searching for this IP address on the WHOIS webpage provided me with not only the service provider’s (Distributed Management Information Systems, Inc.) physical address, but also the physical location of the Royal Entrada Real Oeste Apartments on University. I was curious why my IP address gave only the Cox address while the DMIS one gave the location of the IP address, so I did what anyone would do when they want some information: I googled it! This led me to a Property Solutions industry news article where I learned exactly what this company is:


Noment Networks and Distributed Management Information Systems, Inc., providers of broadband Internet services to multifamily properties and planned communities, have merged to form Fusion Broadband. Fusion’s leaders said that the combination will allow them to offer more support hours and greater local representation in customer markets.


Obviously, this provider links IP addresses to each multifamily property or community location, making the location where a user’s PC is housed apparent to anyone who searches for the IP address. Not very private, is it?

When I did the trace route activity, I observed how the internet is set up; it demonstrated for me the path taken to reach a desired location. When I first ran the tracert command, I tried to trace the route my computer takes to get to The Nest, because it’s a website I frequent. Of the maximum 30 hops that the trace route command allows, it took 16 hops to reach The Nest. From the command prompt, I could see that my request first went to my internet provider, where the request timed out; it automatically retried the request. After about 8 more hops, I located the IP address for The Nest. However, when I tried another trace route to the University of Arizona, it timed out from the 14th hop until the trace completed at the 30th hop. I tried another trace with the SIRLS URL, but it timed out again from the 14th hop until it completed.

Next, I looked up some of the “cryptic numeric IP addresses.” The first hop listed on my successful trace route to The Nest was IP address 192.168.2.1, and according to WHOIS this is registered to an organization called the Internet Assigned Numbers Authority, located in Marina Del Rey, California. A Google search on this organization tells me that its mission is “Dedicated to preserving the central coordinating functions of the global Internet for the public good.” Who knew? Obviously, the last IP address was registered to The Knot, Inc., which was the website I was searching for.
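There is a wrinkle worth noting here: WHOIS points 192.168.2.1 at IANA because the entire 192.168.0.0/16 block is address space that IANA reserves for private networks, so that first hop was almost certainly my own home router rather than a machine in California. Python's standard `ipaddress` module can check this directly; a quick sketch using the addresses from my exercises:

```python
import ipaddress

# Reserved private blocks like 192.168.0.0/16 never appear on the public
# internet, which is why WHOIS attributes them to IANA itself.
for addr in ["192.168.2.1", "68.228.53.242", "66.253.148.213"]:
    ip = ipaddress.ip_address(addr)
    print(addr, "private" if ip.is_private else "public")
```

Running this flags 192.168.2.1 as private and the two provider addresses as public, which matches what the WHOIS lookups showed.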

I had not previously heard of or knowingly used any of the network commands found on the Computer Hope list. However, after doing the WHOIS exercises to find IP addresses and the trace route exercise to find IP address locations, I recognized the WHOIS and TRACERT commands.

Anyone who has been using email, forum discussions (like internet message boards), and listservs is bound to come across some netiquette violations, ranging from the minor to the flagrant. I have witnessed numerous communications that violated the Number One Rule of Netiquette: THEY WERE SHOUTING AT ME FOR NO REASON!

And of course anyone with an email account, or who belongs to a chat forum, is vulnerable to those pesky spam messages, a violation of Netiquette Rule Number 5. Isn’t there always a stock opportunity that you just can’t pass up? And for some reason every message board has a few people posting totally unrelated comments. Also covered in Netiquette Rule Number 5: I got flamed on a message board for accidentally posting my message twice. It happened when I refreshed my browser, which somehow caused the message to be posted again. Whoops!

All the text messagers out there, like myself, already know the common abbreviations in Netiquette Rule Number 7, so I won’t go into that.

And finally, as a personal aside, I think a new rule should be added to the netiquette list. I can’t help but be annoyed at the overly frequent use of emoticons in emails. Do I really need to see all those smiley faces? I get your point after the first one, so PLEASE people, do your part to stop smiley face abuse! :-<