Constant Luminance
Since 1953, we have been using a block diagram for color video that is
different from the one that a color scientist would prefer to use.
The principles of color science dictate that we mix linear RGB (tristimulus signals)
to make true
luminance, denoted Y. If a video system were to operate in this way, it
would adhere to the Principle of Constant
Luminance. But in video we depart from that principle, and implement
an engineering approximation: We mix nonlinear ("gamma corrected")
RGB to make what I call luma, denoted Y'. (Many video engineers
carelessly call this luminance.) To form luma, we use the
coefficients that a color scientist would use to form luminance, but we
use them in a different block diagram than the color scientist expects:
We apply gamma correction before the mixing, instead of after. This alteration
in the block diagram introduces a few image artifacts that are usually fairly minor.
The departure from the theoretically correct order of operations is apparent in the dark band seen between the
green and magenta color bars of the standard video test pattern.
Details are available in Chapter 8 of
Digital Video and HDTV Algorithms and
Interfaces.
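To make the distinction concrete, here is a minimal numerical sketch (my illustration, not drawn from the book): it forms true relative luminance by mixing linear RGB, and luma by mixing gamma-corrected R'G'B', using the same Rec. 709 coefficients in both cases. The transfer function is simplified to a pure 1/2.2 power; real encoding functions also have a linear segment near black.

    # Sketch contrasting the color-science path (mix linear RGB, then
    # gamma-correct) with the video path (gamma-correct, then mix).
    # Rec. 709 coefficients; the 1/2.2 power is a simplifying assumption.

    REC709 = (0.2126, 0.7152, 0.0722)   # luminance/luma coefficients
    GAMMA = 1 / 2.2                      # simplified encoding exponent

    def encode(c):
        """Approximate gamma correction of a linear component in [0, 1]."""
        return c ** GAMMA

    def relative_luminance(r, g, b):
        """True relative luminance Y: mix *linear* RGB (constant-luminance path)."""
        kr, kg, kb = REC709
        return kr * r + kg * g + kb * b

    def luma(r, g, b):
        """Video luma Y': mix *gamma-corrected* R'G'B' with the same coefficients."""
        kr, kg, kb = REC709
        return kr * encode(r) + kg * encode(g) + kb * encode(b)

    if __name__ == "__main__":
        # Fully saturated blue makes the discrepancy obvious.
        r, g, b = 0.0, 0.0, 1.0
        y = relative_luminance(r, g, b)
        print(f"true luminance Y               : {y:.4f}")
        print(f"gamma-corrected true luminance : {encode(y):.4f}")      # about 0.30
        print(f"luma Y' (video approximation)  : {luma(r, g, b):.4f}")  # 0.0722

For saturated colors such as this blue, luma falls well below the gamma-corrected true luminance; that shortfall in the luma channel is what surfaces as visible artifacts once the color-difference signals are filtered or subsampled.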
The issue of constant luminance (or lack of it) is intimately intertwined
with gamma correction. Gamma has unjustifiably acquired a bad reputation.
I presented a paper on the topic,
The rehabilitation of
gamma, at a SPIE/IS&T
conference in 1998. That paper outlines the Principle of Constant Luminance.
As you can deduce from its title, that
paper concentrates on the reproduction of lightness (which is related to
luminance, which is related to luma). It merely outlines the color issues.
I addressed the related issue of choosing luma coefficients for
conventional video, DTV, ATV, and HDTV in a SMPTE paper presented in 1998:
Luminance,
luma, and the migration to DTV.
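For orientation, the coefficient sets at issue in that migration are those of conventional (Rec. 601) video and of HDTV (Rec. 709). The fragment below is merely an illustrative tabulation of those standard values, not material from the paper:

    # The two luma coefficient sets discussed in the migration from
    # conventional video to DTV/HDTV; the helper is just a convenience.

    LUMA_COEFFICIENTS = {
        "Rec. 601 (conventional video)": (0.299, 0.587, 0.114),
        "Rec. 709 (HDTV)": (0.2126, 0.7152, 0.0722),
    }

    def luma(rgb_prime, coeffs):
        """Form luma Y' as a weighted sum of gamma-corrected R'G'B' components."""
        return sum(k * c for k, c in zip(coeffs, rgb_prime))

    if __name__ == "__main__":
        sample = (0.5, 0.25, 0.75)  # arbitrary gamma-corrected R'G'B' triple
        for name, coeffs in LUMA_COEFFICIENTS.items():
            print(f"{name}: Y' = {luma(sample, coeffs):.4f}")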
That SMPTE paper is virtual at this moment, not yet actually having
been finished in that medium! However, the abstract of the presentation
is available.
For the truly courageous, an audiotape of the session is available
through SMPTE. The opening paragraph of this note is the first paragraph of
that paper's abstract.
Some fragments of the paper-in-progress are available. Start with the
brief technical note
Errors due to nonconstant luminance.
If you STILL want to keep going, access the links at the bottom of that
page.
All of this will be tied together within a month or two, and then
(eventually) released as the written version of the SMPTE paper.
Related documents, typeset, available in Acrobat PDF format:
Charles Poynton - Video engineering
2004-02-25