Possibly final notes on magazine scanning

I spent some time this past weekend experimenting with scanning settings and eventually got one full year of Northwest Runner magazine scanned. I chose 2003 to scan because I think this is a year for which digital copies exist – this meant if something went horribly wrong and I physically ruined an issue or two, I probably wouldn’t get my kneecaps bashed in at an upcoming Winter Grand Prix race series. I had already done extensive experimenting with my own copies of the magazine anyway, so that wasn’t too likely, but I wanted to play it safe.

Here are the key notes:

  1. Scanning in greyscale images+text at 300DPI is dramatically faster than scanning in color.  However, for recent issues this isn’t a great option: some pages are B&W, but many are full color.
  2. Output image size for greyscale vs. color at 300DPI is pretty similar.  The only reason this might matter is the behavior I previously mentioned: the only way I can do this with our scanners at work is to have the scanner email the output to me (I can’t scan directly to a network share), and my mailserver rejects messages above a certain size.
  3. My mailserver seems to reject messages when they cross a threshold somewhere between ~12-15MB in size.  In practice, this means I can scan about 5 ledger-sized, double-sided sheets, or about 10 pages of the magazine at a time.
  4. It is important to separate the pages and invert the fold along the spine before sending them through the auto-feeder.  I didn’t do this with one of the first magazines and I wound up with some paper jams and some slightly mangled pages (not really destroyed or anything, but like what you get from a printer auto-feed after something’s gotten jammed).  I mentioned I scanned the entire year of 2003 – the jams only happened in the first issue or two.  After I started this separating-and-fold-inverting process, the pages did not get “stuck” along the spine and they all fed cleanly.
  5. Sometimes 5 sheets just barely crosses the “too big” threshold for a single scan-to-email job. If this happens, I need to do something like “scan 3, then scan 2.”  This is rare, but it happens.
  6. Some of the magazines are missing pages or have single pages torn out.  This screws up pagination and might make later post-processing / assembly into PDFs a pain (or I might just ignore it).
  7. The printers at work require me to log in, and after some time they will log me out.  If I’m logged out, I need to re-enter the scan settings (2 sided, color images + text, scan as JPG not PDF, 300DPI), which is tedious.  If I stay attentive during the scan process (feed 5 sheets, wait for them to scan, put the next 5 sheets in the feed tray, wait for confirmation that it sent the email, then press “Scan” again), I won’t get logged out.  This also ensures that the scanner (which is the critical path in this assembly line) is always “busy.”
  8. As this process is happening, I’m getting email after email with 10 attached images (scan01.jpg, scan02.jpg, etc. for both sides) that I need to pull out of my inbox and archive in folders.  Because the image names conflict (scan01.jpg will be the cover and also page 11 and page 21, etc.), I need to batch these up, too.  My post-processing jpg rotator, cutter, etc. script will handle these – a rough sketch of the renaming piece follows this list.
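
Here’s a minimal sketch of that renaming step, assuming I’ve already saved each email’s attachments into one folder per batch (batch01, batch02, and so on) under a folder per issue – all of these folder and file names are just placeholders for the idea, not my actual script:

    import shutil
    from pathlib import Path

    def flatten_batches(issue_dir: Path) -> None:
        """Copy scanNN.jpg files from each emailed batch into one folder per
        issue, renaming them to a single running sequence so the names from
        different batches no longer collide."""
        out_dir = issue_dir / "flattened"
        out_dir.mkdir(exist_ok=True)
        counter = 0
        for batch_dir in sorted(issue_dir.glob("batch*")):
            for src in sorted(batch_dir.glob("scan*.jpg")):
                counter += 1
                shutil.copy2(src, out_dir / f"img{counter:03d}.jpg")

    # e.g. flatten_batches(Path("incoming/2003-01"))

Using a running counter (rather than batch number × 10) also covers the occasional short “scan 3, then scan 2” batches from note 5.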

That’s about it.  Scanning the full 2003 year of magazines took almost exactly 2 hours. During this time I am constantly busy with: de-stapling issues, preparing 5-sheet batches for the scanner (de-“sticking” the spine), running the scanner, adding/removing sheets from the feed tray, processing my inbox (which will fill up if I don’t pull the files out), reassembling scanned magazines, and trying to re-staple.  I think I can make this a little more efficient and bet I’ll trim a decent amount of time off that 2 hour baseline, but the process seems pretty close to optimized for doing this job well and keeping the original issues intact.

Now I just need to sync up with Martin (or really probably Bill Roe, who I think actually owns these issues) and confirm that they’re OK with me plowing ahead with all the back issues.


Next notes on scanning

After the initial scanning research last week, I’ve concluded that trying to scan the back issues of Northwest Runner with my home scanners is probably a job I would never finish. At 1 minute per page, plus some non-trivial amount of post-processing (orienting all pages properly, assembling the PDFs), the initial time to scan is just more of my life than I’m willing to dedicate to this project. I found that my home printer/scanners do offer a document feed feature, though, and it works pretty well.  I can put a stack of documents in the feed tray, start the scan function (indicating that the input documents are duplex, with the moire suppression option on), click “go,” come back half an hour later, flip the scanned stack to get the other side, and it’s pretty much done (and the post-processing is slightly lighter, too).

The problem with that is that (for my printer) it requires an ~8 1/2 x 11″ input. I tried this with one of my own back issues of the magazine after taking the staples out and cutting it down the spine, and the results were great!  Except for the original, which I had cut in half.  This is no big deal to me, but apparently the guys who actually own the magazines I offered to scan – and who have been involved in this sport for about as long as I’ve been alive – are not exactly thrilled at the idea that I’ll destroy all their original issues.  Time for plan C…

This involves the scanners at work.  The printers at my work are Ricoh Aficio MP 5000s, and with *these* I can scan ledger (11×17) sized inputs through the document feeder, they automatically do duplex scanning (no flipping required), and they are very fast.  They take about 6 minutes to scan an entire issue, front and back.  That leaves me with my last problem – how to get the scanned files off the printer.

It seems the Ricoh offers two functions – both of which present some problems.

  • Scan and send to email – this is kind of OK.  It will be inconvenient to need to pull the attachments out of hundreds of emails, but I could deal with it (there’s a sketch after this list of how that part could be automated).  The larger problem is that the generated bulk scan from an entire issue is apparently larger than my mailserver will allow.  So I scan the entire issue over 6 minutes only to have the printer tell me “sorry – couldn’t deliver your document,” at which point those scans seem to be lost and I’ve just wasted that time.
  • Scan and store on network share – this would be great, except that the interface to get these things to talk with a Windows network share is maddeningly hard to use, might just not work, and might need some administrative rights on the printer that I don’t have.  After much trial and error, I think this option is closed to me.
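
For what it’s worth, the pull-the-attachments-out-of-hundreds-of-emails part seems automatable. Here’s a rough sketch of one way it could work, using Python’s standard imaplib/email modules – the server name, sender address, and folder layout below are placeholders, not my real setup:

    import email
    import imaplib
    from pathlib import Path

    IMAP_HOST = "imap.example.com"        # placeholder mail server
    SCANNER_FROM = "scanner@example.com"  # placeholder sender used by the Ricoh
    OUT_DIR = Path("incoming")

    def save_scan_attachments(user: str, password: str) -> None:
        """Find messages from the scanner and save their attachments,
        one subfolder per emailed batch."""
        OUT_DIR.mkdir(exist_ok=True)
        with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
            imap.login(user, password)
            imap.select("INBOX")
            _, data = imap.search(None, f'(FROM "{SCANNER_FROM}")')
            for batch_num, msg_id in enumerate(data[0].split(), start=1):
                _, msg_data = imap.fetch(msg_id, "(RFC822)")
                msg = email.message_from_bytes(msg_data[0][1])
                batch_dir = OUT_DIR / f"batch{batch_num:02d}"
                batch_dir.mkdir(exist_ok=True)
                for part in msg.walk():
                    if part.get_content_disposition() == "attachment":
                        name = part.get_filename() or "unnamed.jpg"
                        (batch_dir / name).write_bytes(part.get_payload(decode=True))

That still wouldn’t solve the size limit, but it would at least keep the inbox wrangling from becoming the bottleneck.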

So my likely path forward will be to scan half an issue at a time (or so, if that’s possible) and do the post-processing on those.  To do this, I will need to remove the staples from the back issues and feed in half an issue at a time, but I think it will work and go pretty quickly.  One thing I didn’t mention is that even with this approach, it *seems* that there are scan settings that need to be re-entered every single time I start a job (color input, DPI, original orientation, and other settings).  Each of these is slow and tedious to enter on the Ricoh touch screen and I’m hoping I can simplify it, but it might be tolerable, and this will still be dramatically faster than working with my home scanner.

So – here are my next steps:

  1. Go back with a couple of my own copies of the magazine and do some trial and error to understand the maximum number of pages that can be scanned and emailed in one batch without my mailserver rejecting it, and get more confident that the document feeder will work smoothly / flawlessly before I send any of the originals through it. This will include testing color inputs as well as greyscale (the oldest issues are greyscale, then a single color is added on some covers, then there are full color, glossy covers over greyscale pages, and current issues are glossy and full color from cover to cover).
  2. Start scanning the actual issues, probably starting from most current and working back to oldest.  This way, again, if there are any problems with the process, or I damage some issues before I’m certain everything is going perfectly, I have some time to correct the process.
  3. Start post-processing.  This may take a while.
    1. Probably use imagemagick, since I know some of its functionality
    2. Cut the scanned images in half – I’ll have ledger sized scans.
    3. Do some math to figure out page numbering.  If the cover is page 1 and an issue is 60 numbered pages long (back cover is page 60), I should have 15 input sheets and 30 scanned images (front + back).  I think my picture batches will be: 1+60, 2+59, 3+58, etc.  Also, if I have to do this in two batches, the scanned image numbering will need to take some of this into account, too, in a cutting and renaming script, e.g. postprocess [yyyy-mm] [first_page] [last_page] (see the sketch after this list).
    4. Probably also do some image rotation magic
    5. The final output of this will be perfectly named, oriented, and numbered scans (e.g. 1998-12-p01, 1998-12-p02, 1998-12-p03, etc.)
    6. I could deliver those back to Northwest Runner (this is what I had volunteered to do) or I might do some additional post-processing to attempt to assemble them into searchable PDFs.
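
To convince myself the page-numbering math above actually works out, here’s a rough sketch of the cut-and-rename idea. I’ll probably do the real cutting with imagemagick; this uses Pillow only to keep the sketch self-contained, and it assumes a saddle-stitched issue with a page count divisible by 4, scanned outermost sheet first with the outer side before the inner side (rotation is skipped entirely) – all assumptions I’d verify against real scans:

    from pathlib import Path
    from PIL import Image  # Pillow stand-in; the real job may use imagemagick

    def page_pair(total_pages: int, side_index: int) -> tuple[int, int]:
        """Page numbers (left, right) printed on the side_index-th scanned side
        (0-based) of a saddle-stitched issue, scanned outer sheet first and
        outer side before inner side."""
        sheet = side_index // 2 + 1                        # 1-based sheet number
        if side_index % 2 == 0:                            # outer side of the sheet
            return total_pages - 2 * sheet + 2, 2 * sheet - 1
        return 2 * sheet, total_pages - 2 * sheet + 1      # inner side

    def cut_and_rename(issue: str, total_pages: int, scans: Path, out: Path) -> None:
        """Cut each ledger-sized scan in half and name the halves by page number,
        e.g. 1998-12-p01.jpg."""
        out.mkdir(parents=True, exist_ok=True)
        for i, scan in enumerate(sorted(scans.glob("*.jpg"))):
            left_page, right_page = page_pair(total_pages, i)
            img = Image.open(scan)
            w, h = img.size
            for page, half in ((left_page, img.crop((0, 0, w // 2, h))),
                               (right_page, img.crop((w // 2, 0, w, h)))):
                half.save(out / f"{issue}-p{page:02d}.jpg")

    # e.g. cut_and_rename("1998-12", 60, Path("scans/1998-12"), Path("out/1998-12"))

For a 60-page issue this pairs the scanned sides as 60+1, 2+59, 58+3, 4+57, and so on, which matches the 1+60, 2+59, 3+58 batches above.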

And that should just about ruin my summer!


Initial notes on scanning

So I’m starting to experiment with my scanning capabilities. I have two all-in-one printers: an old Canon MP530 and a newer Kodak ESP9250. I thought I would just use them both and cut my scanning time in half by swapping back and forth between them, but instead I’ve spent much of a lovely Memorial Day understanding their capabilities – what works, what doesn’t – and figuring out how I’ll actually scan 30 years of Northwest Runner magazine. Here’s what I’ve found.

First, the colors from the two scanners are very different. Using the default scan settings, here are some samples from the cover of the December 1998 issue.

[Image: Canon (left) vs. Kodak (right) scan comparison]

Obviously the picture is terrible – I’ll get to that in a minute – but the one on the left is the Canon and the one on the right the Kodak.  I ran a few more tests and the Canon gave me reliably more faithful looking scans of the original image than the Kodak, so I think I’m simply going to not use the Kodak.

Next – yeah, that image is terrible. How do I fix that?  That’s a moire pattern, and it commonly happens with scanned images. The secret to fixing it is to set an option in the manufacturer’s scanning software to “descreen” the image, which basically eliminates the interference:
[Image: the same scan without vs. with descreening]

Great!  Now I have pretty acceptable-looking scans – at least I have the basics of what I expect.  I’d also taken some stats earlier on scan time and file size when images are saved as JPG.  Here they are:

Test                 Kodak          Canon
600dpi scan speed    35s / page     1:07 / page
300dpi scan speed    15.5s / page   18s / page
200dpi scan speed    6s / page      18s / page (again)
150dpi scan speed    5s / page      10.5s / page
600dpi file size     not measured   5MB
300dpi file size     not measured   1.2MB
200dpi file size     not measured   600KB
150dpi file size     not measured   300KB

Well that’s discouraging, but maybe not surprising. The Kodak is *dramatically* faster.  Making matters worse – the above measures are for the Canon scanner when the moire interference pattern is *not* suppressed.  With the interference pattern suppressed (which is really the only acceptable way to do this), the scan speed is >1 minute per page every time.

Finally, I wanted to decide on a scan DPI.  With the moire suppression enabled, the scan time doesn’t really depend on DPI, so all I need to do is figure out what would be acceptable. For archival purposes it seems like the only reasonable thing would be to go as high as possible, but something tells me 600DPI (or higher – I think I could do 1200) is just not really going to benefit anyone, ever, and it would almost definitely make this take up even more of my time (in terms of initial processing and any post-processing), so I am planning on 300DPI or lower.  To make the call, I noticed that the Canon software is capable of taking some input files and generating a searchable PDF. I can’t stand PDF as a format, but there’s no denying that this would be cool and handy, so I don’t want to choose a scan option with fidelity so low that it would one day sacrifice that ability.  A couple of tests later: the PDFs I generate from 200DPI input files sometimes cannot find search strings I enter for people’s names in race results – names that are very clearly words on the printed page – but at 300DPI, in a handful of tests, I didn’t find any misses.  Therefore: 300DPI it is.

To summarize:

  1. Canon wins vs. my Kodak. Other scanners will probably yield different results.
  2. It is absolutely necessary to turn on the descreen operation to reduce moire interference (and this is only available in the printer’s driver / software, not as a generic TWAIN device, it seems)
  3. Super-high DPI isn’t worth my time. In fact, did I mention how stupid it is that I’m doing this?
  4. But 300DPI seems to be the minimum to be able to make text-searchable PDF files and have that work.

I have a few more things to research before I get going, but I’m well on my way with these findings!


Inventory of Northwest Runner


UPDATE 6/4: notes from Glenn Tachiyama on his back issues are now inline below. If I can pool Glenn’s issues with the issues I already have, this would be a complete collection from 1983 to present.

To start out the project, I took an inventory of the issues I gathered from Martin. I’m still collecting details on what formats the various back issues are in, but at a high level:

  • the oldest issues are only available in the print copies (which I have)
  • newer issues (from something like 2000ish on) are available in some digital format

So the right thing to do is to scan the oldest issues, and for the newer ones, work from the digital format to make directly consumable digital copies.

I don’t have access to the earliest volumes.  These date to the early 1970s, and if anyone has access to them, I would be very happy to digitize them, but I’m going to assume they are lost forever.  Here’s an accounting of what I HAVE or what is MISSING:

  • Volumes 1-3: all missing
  • Volume 4: HAVE issues 5, 6, 7, and 10
  • Volume 5: HAVE issues 2, 5
  • Volume 6: MISSING 3, 5, 7, 11
  • Volume 7: MISSING 2
  • …note – from here on, each volume / year lists the issues that are MISSING…
  • Volume 8: 7, 10+
  • Volume 9 / 1981 (volume numbering changed this year): 1, 2, 4, 8, 9, December
  • 1982: May –> Glenn is also missing May, issue never printed?
  • 1983-1984: complete 🙂
  • 1985: July –> Glenn has this 🙂
  • 1986: complete 🙂
  • 1987: January –> Glenn has this 🙂
  • 1988: February –> Glenn has this 🙂
  • 1989-1993: complete 🙂
  • 1994: June, November –> Glenn has this 🙂
  • 1995-1996: complete 🙂
  • 1997: February, June –> Glenn has this 🙂
  • 1998-1999: complete 🙂
  • 2000: December –> Glenn has this 🙂
  • 2001: complete 🙂
  • 2002: September, October –> Glenn has this 🙂
  • 2003-2005: complete 🙂
  • 2006: February
  • 2007 on: assume digital copies exist


Archiving Northwest Runner

A couple weeks ago I got this idea that seemed great at the time: “Northwest Runner is a really valuable resource for runners in Seattle and I am positive that there is a ton of great history in there that should be preserved and made more publicly available. I should scan all the back issues.”

It’s that last part where this may have taken a turn for the worse.  Anyway, I got in touch with long-time editor and publisher, Martin Rudow, and today I picked up a trunk full of back issues. I’m going to take some notes on this process and archive them on my blog, for posterity and for whatever questions come up that might be interesting for runners or hobby archivists.  This will probably start with a background of what data is available, go into technical questions / notes / challenges / discoveries, and hopefully just be kind of interesting.

I’ll try to remember to tag all the posts as “nwrunner” so that interested readers don’t have to wade through my extensive and deeply crazed rants on the current state of technology.  Wait…that’s the unabomber…not me…


Running clubs in Seattle

I’ve run all my life but only got “serious” about it in 2007, when I decided to run the Portland Marathon. That year I ran a lot of races, including CNW‘s Firecracker 5k (which I mention below), SRC‘s Cougar Mountain 13 miler (which is vaguely alluded to), and Portland. I also paid dues to join both SRC and CNW that year (though I didn’t understand these in the context of USATF club affiliation).  In 2008 my memberships to both lapsed and I asked myself which club I should stay affiliated with.  In the summer of 2008, I went to a Club Northwest board meeting and presented this letter.

Members of the board of Club Northwest,

To clarify: this is really about running and doesn’t touch on field sports.

My name is Patrick Niemeyer and early in 2007 I started getting much more involved in the local running community. I saw the 2007 Firecracker 5000 fliers which appeared to advertise a great deal for race registration for Club Northwest members:

  1. Club Northwest membership and the discounts and perks (Northwest Runner subscription, fashionable bright orange t-shirt) that go along with it
  2. race registration for one low, low price,

so I signed up.  I remember running the race and wondering “who is this ‘Shelly’ I’m trailing that everyone keeps cheering for?” – someone I tried, and failed, to catch, and who I later learned was board member Shelly Neal.  It wasn’t my first race, but it was the first where I self-identified as a runner, and over the past year I’ve incorporated running into my life to the point where it’s hard to imagine my life before or without it.

However, as we’ve just passed July 4, 2008 and with it the most recent Firecracker 5000, I’ve let my CNW membership lapse and started to wonder whether I should renew. Though I take part in many races, including those sponsored by CNW, I haven’t attended the All-Comers track meets, I don’t go to the front lines of races wearing a bright orange CNW singlet, and I’ve realized Northwest Runner subscriptions don’t require CNW membership. In talking with running friends about CNW, I’ve realized that many of them don’t even understand that membership in CNW is open. Their impression is that CNW begins and ends with the elite runners they see at races, and I’ve found myself correcting that perception.

Meanwhile, I think there is greater clarity about the role of some of the other running communities in Seattle to which I belong.  I’m thinking specifically of ChuckIt and Seattle Running Company/Club.  As a monthly dues club, ChuckIt feels more like it fills a specific niche and doesn’t make as much sense to discuss in this context, but SRC feels much closer to CNW – and I have to guess this is a topic which has come up before.  If I were to characterize SRC and CNW to someone unfamiliar with local running groups I’d say that SRC conducts and evangelizes trail running or ultra running and CNW conducts and evangelizes more traditional track running.  I’d say both have a significant roster of elite runners, but SRC events and participation feel, for whatever reason, like they appeal to a wider base while CNW feels like it more directly serves a specific elite base.

None of what I’ve said may be quite demonstrably true and I can’t speak for “runners of the northwest” but I feel the impression I’ve described is probably pretty accurate among runners who are familiar with, but not members of, either club.  So, finally, this leads me to my two questions.

First: I wonder whether CNW realizes or agrees with the impression that (again, speaking only about running) the club is focused on elite local runners?

Second: if this isn’t the case, I wonder whether the board has plans to assess the image of CNW to help someone like me who is maybe an occasional AG placer but not an elite runner understand why they should become or renew their membership?

To reiterate: my point is really not to ask why CNW isn’t more like SRC (or even ChuckIt), but to ask the board to help clarify what membership in CNW means and to help me understand who CNW wants as its members. I suspect these aren’t questions which will be answered in a short conversation, but I sincerely look forward to any conversation this generates.

Thanks very much for your time and consideration,
Patrick Niemeyer


Squak half recap

Today I ran the 2012 Squak 1/2 marathon trail run put on by RD Roger Michel’s Evergreen Trail Runs. Roger is probably still at the mountain while people from the full and 50k are crawling toward the 10 hour course cutoff.  Here’s my recap of the trail and race, which – despite the opening tone of this post – was excellent.

The course

The information on the site is, in my opinion, pretty lacking.  There’s a course map that shows the route, and the course description tells you that there is 3,650′ of gain and that there’s a prize for the first person to make it to the 1800′ peak on Squak, but I was still left with questions – the answers would look something like this…

The course starts near the trailhead / parking lot at the south side of Squak. You immediately start to climb on a main, wide access road at a pretty constant elevation gain. After a mile or so, you cut to the right for an out & back on what Roger calls a “lollipop” (where the beginning and end of the out & back are a shared / 2-way trail). There was a lead pack of four guys and I was in fifth through here, about a minute back. This comes early enough on the course that by the time you get back to the 2-way stick of the lollipop, you’re unlikely to see anyone else in your event, so the trail is clear / runnable.  The lollipop is not flat, but it has some moderate ups and downs.  A lot of this section of the trail was really overgrown – the only part of the course like this. At times I was running through heavy fern coverage and really couldn’t see the trail, making things maybe a little bit dangerous. Eventually, though, you get back to the road and start climbing, again at about the same rate of gain as from the start.  I was about 1 minute behind the lead pack on the road and trimmed that to ~45 seconds at the aid station; however, I didn’t see them again after this point.

This entire leg is (according to Roger) “4ish” miles until you get close to the summit where there is the only aid station on the course (I was out at 43:16). From here, you veer off the road onto what is probably the steepest, most technical part of the course for an ~1 mile downhill. This is very rough going and I knew going down it that I would probably not run back up when the course comes back and climbs along this leg (full/50k’ers were walking back up the ascent here, too).  During the descent I let two guys pass me who I didn’t see again.  After that descent, you wind around on some rolling trails for what feels like a long time before coming up on another 2-way stretch of course. This shared stretch felt long – close to a mile and it has a fair amount of climbing. Eventually you’re on 1-way trail again and this starts a descent until you arrive back at the same trail from which you left the aid station and start the difficult climb back to the aid station. Around this time, two more 1/2 runners passed me, putting me in 9th where I stayed for the rest of the race.  A stronger runner would probably run this uphill, but it is hard, steep, and modestly technical, so I walked and everyone around me (others in the half and by this time I was passing a few from the full and 50k) was, too.  This is another “4ish” mile leg until you get back to the aid station (my time out, split time: 45:59, total: 1:29:16).

As you leave the aid station, you complete the short run up the service road until reaching the radio towers on top of Squak Mountain. Then you start the penultimate descent along the 2-way trail where you were going upstream on the previous leg. This gets a little hairy because of the number of runners, but is manageable as long as you keep your head up and an eye on the terrain.  You’re heading downhill here, so most people are happy to yield the course to you.  After the 2-way descent, you veer off for the final climb, leading to the final descent on the course.  However, based on the start times for the events, at this time you’re coming up on a lot of the slower 12k runners.  This could be helpful or inspiring for some people, but it gave me an excuse to walk a little in a section that was perfectly runnable, and in hindsight I wish I’d run all or at least most of it. This uphill comes at a tough time in the course but it isn’t that hard. Eventually you crest, though, and start a fast, twisting downhill that continues basically all the way to the finish.  Along this stretch, you can see trail signs pointing back to the trailhead with distances, which is nice.  I don’t know how long this whole section was but I’d go with “4ish” again. I could still see 8th place ahead of me for much of this leg, but didn’t reel him in – I just passed a bunch of others who were out for longer (or maybe shorter and slower) days as I rolled down to the finish. You cross the service road with about 400m to go to the finish and end back near the parking lot.  My leg split was 35:07, finish time 2:04:22.

So in summary

  1. Short, modest service road climb
  2. Rolling “lollipop”
  3. Continue service road climb to AS1
  4. Fast, technical descent
  5. Long rolling leg leading to climb along 2-way trail
  6. Fast, easy descent leading back to a hard (but shorter) climb back up leg 4 to AS2 (formerly AS1)
  7. Short service road climb to radio towers, followed by longish easy descent but on 2-way trail
  8. 1-way trail climb to final peak,
  9. Long descent back to parking lot

One slightly frustrating thing about my race

I don’t have anybody to blame for this, but trail runs are just hard to calibrate and set goals for.  In any ultra I’ve done, my strategy is pretty simple: start slow, try not to lose much time on the uphills, try not to blow out my legs on the downhills, try to stay strong in the last miles (this is very hard).  At the end of one of these, I really don’t know if I’ve run well or accomplished much more than being out for a longer-than-normal long run. Each course is so different, and the fields that show up are so different, that it’s hard to tell anything based on time or placement.

Today I ran a 2:04:22. Going into the race, I didn’t know what to expect (elevation gain doesn’t give you a complete picture, either – at Chuckanut there was a lot of snow on the course and there were some really dangerous, slippery rocks). So I looked at times from the 2010 and 2011 races. My same time in either of those years would have put me in 3rd and had me finish 4 minutes ahead of the next runner, but today I was 9th.  I’m not disappointed with the time, but starting the race I thought “OK, I guess I could aim for 2:30, maybe 2:20, maybe faster” (2:30 would have been 9th last year – 9th 2 years ago was 2:50ish). Obviously that’s not what I should have aimed for, and maybe I should have aimed to be even faster today.  I’m not disappointed that I didn’t, but this fall my plan is to run the Chicago marathon and to break 3:00.  I have a pretty good idea of exactly what will be required to do that and I will know almost every step of the way whether I’m on track to meet that goal.  I also know what that goal will mean in terms of my fitness and capabilities as a runner.

So I definitely had a good experience today, I’m certainly happy with my time, and I don’t think I would have finished ahead of the guys in 8th, 7th, or faster had I just aimed for 2:00. I feel like I could say I pretty much crushed my goal, but I’m not super thrilled about that, because now I wonder whether I set a really easy target.  Anyway (or despite this), I would enthusiastically recommend this race to anyone who enjoys trail running. Oh – and I have to give some credit to Roger and the organizers for what I felt were terrific course markings.  Looking at the course map, I thought “this looks incredibly easy to get lost on,” but the entire race was very, very easy to follow.


Yakima skyline run 50k research

File under “should have been internal monolog.”  Here are some notes in preparation for the 2012 Yakima Skyline Rim 50k that I’m doing in two weeks.

If topos can be trusted…

  • Climb 1 is about 60% of Mt. Si’s gain, in a little over 2 miles. Sometimes steeper, sometimes shallower. The top of the ridge looks pretty rolly.
  • Climb 2 starts shortly after mile 8 (keep in mind, the round trip on Si is 8 miles with a lot more gain) and looks a lot like climb 1 in rate of gain, but with less total ascent – maybe 500′ less? There are some false summits along this ridge.
  • Climb 3 looks like a bitch at the start but gets easier after ~500′ to get to a false summit.  Obviously this starts ~15 miles in since the course is an out & back. Then the actual summit before a Si-like (fast) downhill.
  • Climb 4 is just after mile 22 and looks long and gradual – that same “60% of Si” but not many places quite as steep.

From the 2011 results, Adam Hewey demolished the field, finishing in 5:28, 40 minutes ahead of second place.  There were only 14 finishers under 7 hours, but then there were only 63 finishers total.  Shawna Tompkins ran 6:49.  Terry said he ran conservatively and finished in 7:06.

I just ran Chuckanut slowly (5:46ish) and ran the Cherry Blossom 10 mile about like I wanted (aimed for and hit 6:52 pace exactly for the first 5, aimed for 6:30 on the second half and was 9 seconds slow of that in total).  So I’m aiming for something in the 7:00 range and expect that should be doable.  After running at Si on Saturday, going up in a personal best 58, and calling it quits half-way through the second repeat, though, I need to remember it’s going to be a long day and start conservatively (but I definitely needn’t walk the first climbs at all).


Chuckanut 2012 recap

I had a pretty fantastic time at this year’s Chuckanut 50k. Most of my recent race reports have taken one of the following formats:

  • nearly exhaustive (and exhausting to read) details of the race
  • never documented / published

I’m going to try to bridge the gap and get this one out and just focus on some highlights. In approximate order through the race experience.

  • Registration in January was a success! I know what you’re thinking “Hey dumbass, didn’t you just say you were going to focus on highlights for a race you did YESTERDAY? ‘registration’ is not a ‘highlight’.” But this is kinda important. The 2010 race filled up in ~3 hours and I got shut out. The 2011 race filled up even faster and I got shut out again (then waitlisted, then in, then injured and had to miss).  So it *was* a highlight to make the first cut this year!
  • Sign-in and bib pickup went smoothly and were a great opportunity to hobnob with some of the ridiculously smoking field from this year’s race.  The patchouli washed over me as I opened the doors to the spa where checkin was held and I instantly knew at least one runner from Ashland was present who would finish well ahead of me. After being told my bib was the number of my all-time favorite TV show, I realize the women’s CR holder is right behind me at checkin. I instantly forget my number, ask for it again (stalling, hoping she’ll decide checkin isn’t worth it, bail on the race, and I can hope to finish one place higher – it doesn’t work) and move on. This expo has free chocolates, Clif bars, chomps, and way more useful stuff than any marathon expo I’ve been to.
  • Dinner at the Olive Garden is exemplary, as usual.
  • Race morning I’m a wreck and terrible company for Katie on the way to the race.  She won’t concede that I’m being a jerk, which only makes me more frustrated. 30 MINUTES TO SHOWTIME!!!
  • 8:00 I’m through the portapotties, notice a breathright strip and crooked hat combo and the burrito guy in the green wave start corral.  I do some quick mental math and figure them + the people I saw from last night + one Joseph Roosevelt Creighton = I’m probably finishing 6th, tops.  Should be a shoo-in for top 10.  I call my bookie and we’re off!
  • 8:05 Top 20, for sure.
  • 8:55:20 I roll into Aid Station 1 and am feeling good (despite the rain) after running comfortably on the first 10k – not going out too fast at all.  My cheering leprechaun is there, I’m committed to having a good time today, not destroying my body, and pushing harder toward the end of the race (if I’ve got it).  The trail leaving the aid station is a nice, easy, runnable trail with a gentle climb – some single track, some bridges around Fragrance Lake. Some people are already starting to walk, so I cruise by.  There are beautiful sections of this run in the quiet snow. Also: it’s getting cold.
  • 9:36:48 I get to aid station 2.  This is surreal – first, I’m not positive I know who Eric Barnes is, but I think I just saw his leprechaun-doppelganger on the course.  The entire crew at this aid station has outdone all reasonable expectations for celebrating St. Patrick’s Day (I’m personally flattered!) and my personal support crew made it through the sloppy rain, too. I am lucky.  I hobnob again, get passed by dozens of runners who don’t bother waiting around as long as I do, and I’m off again.
  • 9:55 many of us are nearly run off the road by some lunatic in his Subaru.
  • 9:57 another runner and I are helping rock the Subaru out of a ditch – but not before signing forms guaranteeing us free digital prints for life.
  • 10:11:12 I reach aid station 3, no real idea what my pace is or what I’m on track for in the race (note to self: may have slipped to top 25 by this time?), but still having a good time, though I am definitely losing sensation in my fingers. I meet Terry’s wife, who is part of the crew and very nicely cautions us that it’s a ways to the next station, so stock up now. I stuff some extra cookies in the pocket in my bottle and I’m off.  This next section of the course could be described as “technical,” “hard,” or “totally awesome.”  I was having a blast moving as fast and crazy as possible while barely on the right side of “safe.”  I probably wasn’t going very fast vs. the leaders who were here, oh, an hour ago, but I did pass a lot of people and I don’t think anyone passed me.  This was the case for a lot of the course – I got passed a ton at aid stations but not a lot while I was moving, and I passed a lot of people while I was moving. At least this is the polite lie I’ve built up in my head to console myself over my finish time (which I promise is coming – remember, these are the highlights!).
  • 11:20ish My gloves are soaked, my hands are freezing, and I can barely get an electrolyte capsule out around here.  This is kind of a drag and I’m feeling it and it’s showing in my performance.  Also, this is just boring, snowy “slog” – not fun/dangerous/technical.  Eventually, we reach a descent that goes into aid station 4, which is back below the snow line (translation: raining), and I’m feeling a lot better.
  • 11:43:51 I’m at the base of Chinscraper. My good luck charm is there to greet me and saw Joe go through, too, but doesn’t tell me how far behind I am (“a lot”).  I snack, notice a bottle of Bushmills that has been getting way too little attention, do my part on the bottle, and I’m off – feeling GREAT!
  • 11:59ish I see Glenn and Win and have been TEARING up Chinscraper and figure this is the Sun Top equivalent of Chuckanut and am stoked to have made it and to be feeling so great!
  • 12:20ish Damn you, Bushmills, how much more of this climb is there???  After finally finishing the climb and starting the descent, I slow to say hi to Terry (who’s filming), and Kevin (who’s tearing down the aid station) and continue the descent on toward Fragrance Lake Road as fast as my quads will allow (translation: appallingly slowly).
  • 12:37PM I’ve made the descent down the road and make it to the 5th and final aid station, where I chillax for over 5 minutes, snack on something that I still can’t believe was a vegan candy bar, decide I want a red gummy bear – then change my mind (and my support crew helps me not let it go to waste) – and eventually start back along the interurban trail.
  • 1:44:00PM after some struggling, a little stopping to visit, seeing one runner collapsed on the side of the road (and already getting the assistance he needed), passing and being passed by other runners and ultimately finding the strength to run basically all of the final 4 miles moderately respectably – I cross the finish at Fairhaven Park in barely under 5:44. And – get this – they are reviewing runner bibs, calling us out as we come to the finish!  And pronounce my name right!  This is all unprecedented, to me.

So that’s my slowest 50k to date.  I might have been in the top 200 – just barely fast enough not to be a disgrace to the green wave.  The course was fantastic.  The support on the course from the organizers and O’Katie was as good as or better than at any race I’ve known.  Freezing in many parts, ridiculously muddy for ~5 total miles, 10k of fast trails, very technical for ~3-5 total miles, and super, super fun throughout.  Krissy and all the volunteers put on an excellent, excellent race – I’d love to go back and run it harder, or just go back and do it again the same way.


Daniels Running Formula and easy run pace

Daniels’ Running Formula is an excellent book. It contains tons of detailed information on the physiology of running that I think can benefit anyone who is serious about understanding the sport and understanding and training to the best of their abilities.

I own a print copy of the first edition and recently bought the Kindle edition of the second edition. I was doing some comparisons between the two, though, and was surprised to find one significant difference.  Ultimately the book’s value comes from its tables, which help you do three things:

  1. Identify your fitness level using a metric he refers to as VDOT. This is something that can be measured using some complicated sports medicine assessments that most non-elite runners would never do, but which can also be approximated by taking measures of fitness from events of different distances and projecting from there.
  2. Recommend performances to aim for in the workouts you should conduct given that fitness level.  If you run a 5k in XX:YY minutes and seconds, how fast should your typical easy or long runs be?  How fast should you do 400/800/mile workouts?
  3. Provide workout plans to train for the marathon and some other races.  How far in advance should you train?  What should you do 10 weeks before the race?  8 weeks before?  4 weeks before?  How should you taper? And given your goals and fitness level, how fast should all those workouts be (this comes from the recommended workouts in 2)?

Daniels is not the only authority on any of these things – there are other guides, and if I’ve learned anything from running, it’s that there is no single textbook solution that every single person can apply and expect the same results.  But within some margin of error, I also believe it’s fair to say that if most people want to run a 3 hour marathon, they will probably be performing at similar levels in some races of other distances, probably be putting in pretty similar total weekly miles, and probably be working out at pretty similar levels of intensity in the workouts preparing for that stab at 3:00.

Daniels has made one interesting revision between the editions that I had not noticed until today, though, and it’s in his recommended running plans for people at specific VDOT values.  Between the editions he has not changed his assessment criteria for measuring VDOT.  Here is an abridged table of how he assesses some VDOT values given performance at the 2 mile and 5k distances (there is a lot more detail in the book – go buy it). My current 2 mile and 5k times put me in the ballpark of this range, which is why I chose these examples:
VDOT   2 mile   5k      Marathon
52     12:02    19:17   3:04:36
55     11:28    18:22   2:56:01
58     10:56    17:33   2:48:14

That hasn’t changed between the editions.  However, I think a lot of people (including myself) would argue that these equivalences overestimate the marathon result you would get based on those performances at the shorter distances (and, vice versa, a runner completing a marathon in those times might run a faster 5k and 2 mile race than the table suggests).  What has changed are his recommended workouts for athletes at those VDOT values:

VDOT   easy/long pace (1st edition)   easy/long pace (2nd edition)   MP / T / I / R pace
52     7:59                           8:16                           Unchanged between editions
55     7:38                           7:44                           Unchanged between editions
58     7:19                           7:34                           Unchanged between editions
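
Just to get a feel for the in-between values, here’s a tiny sketch that linearly interpolates the 2nd-edition easy/long pace for a VDOT that falls between the abridged rows above. It’s purely illustrative – it uses only the three data points shown here, not Daniels’ actual formulas:

    # Easy/long pace per mile (2nd edition values from the table above), "m:ss".
    EASY_PACE_2ND_ED = {52: "8:16", 55: "7:44", 58: "7:34"}

    def to_seconds(pace: str) -> int:
        minutes, seconds = pace.split(":")
        return int(minutes) * 60 + int(seconds)

    def to_pace(total_seconds: float) -> str:
        minutes, seconds = divmod(round(total_seconds), 60)
        return f"{minutes}:{seconds:02d}"

    def easy_pace(vdot: float) -> str:
        """Linearly interpolate between the two nearest table rows."""
        points = sorted(EASY_PACE_2ND_ED.items())
        for (v0, p0), (v1, p1) in zip(points, points[1:]):
            if v0 <= vdot <= v1:
                frac = (vdot - v0) / (v1 - v0)
                return to_pace(to_seconds(p0) + frac * (to_seconds(p1) - to_seconds(p0)))
        raise ValueError("VDOT outside the abridged table")

    print(easy_pace(54))  # about 7:55 per mile under these assumptions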

I haven’t taken the time to think about this change or to try it in my own training.  The preparation I’ve done for marathons over the past 4 years has led to me consistently falling short of my marathon goals, and my easy and long run pace has more closely matched the recommendations in the second edition (I tend to run my easy runs at 7:30-8:00, and my long runs have almost exclusively been 8:00 or slower when I run with people from my running club).  It’s possible that for me, sticking closer to the old edition’s recommended paces would have kept me from bombing in the marathons I’ve done, but I don’t know.  I emailed my coach for his input on this and might try to change this up a little in my training and see what happens.  One thing that’s usually pretty clear is that harder / faster training will lead to faster performances in races, though there is obviously also the potential for earlier burnout.

If anyone reading this has consulted with this book or has any thoughts on any of these projections, I’d be really interested to hear your experiences, too.

