Crewing for a team is an important, fun, and inexpensive way to get into Rally

You want to drive a rally car? Welcome to the party; we all do. Driving a rally car involves a lot of money and time, and it is completely worth it. But just because you don't have the money to build a car or enter an event doesn't mean you can't have an awesome time at events as an important part of a rally team. This year at the 100 Acre Wood rally I went as part of a two-man crew for two cars from Colorado. Competing in a rally is a complicated task that is made much easier with a crew. As crew we took the cars to tech, ran the service stops, and did many small things to make our teams successful.

Rally involves several logistical problems before the race even starts. First, for 100 Acre Wood, we towed two rally cars roughly 850 miles. After arriving you need to sort out lodging, registration, recce, and tech inspection. With a large group of people it is often more convenient and less expensive to rent a house than to stay in hotel rooms. This year we stayed in a comfortable house, The Lodge At Fair Winds, with wifi (important because there is no cell coverage). Ideally you want a place with a garage you can use in case you need to make overnight repairs to the cars. This year we were fortunate not to need any repairs at the lodge, as both of our drivers did a good job of keeping their cars on the road. Lodging should be coordinated before you arrive, but there is still the task of getting all the vehicles to the location, checking in, unloading gear, and finding the closest place to buy beer and food.


Recce (short for reconnaissance) is where the driver and codriver drive the stage roads at posted speeds in a non-race vehicle while reviewing the stage notes. This can only be done by the driver and codriver, and it is the first task on their minds before the race. Our team used rental cars because they present the same perspective as the race car (versus a truck), and the roads are oftentimes rough back roads that would really beat up the truck. Having a couple of crew members makes getting a rental and completing registration simpler because you have more drivers. Registration is the simple task of checking in with the organizers and showing them that you have all the required items (proof of insurance and registration on race and crew vehicles).

Often on the same day as recce, a tech inspection needs to be done. Doing recce alone is a really taxing activity: you are driving hundreds of miles slowly while checking the notes you received at registration, and if you are experienced you are also writing pace notes (extra reminders that help you know where you can go faster). Working as a crew member you can make recce more successful by handling tech inspection. That means driving the race car from the house or hotel to the place where the organizers look over the cars and inspect the safety gear to make sure you are following all the safety rules. This is fun because you get to drive the race car and hang out by all the other race cars. In our case we got to see the Escrot build being teched. If you weren't at this race, you should go to the next one it enters, because that build is awesome; pictures and video do not do the car justice. At tech inspection you will also register for service areas, and at this point it is advantageous to try to get a good spot surrounded by people you know so you can help each other out.

One of the big downsides to working as crew is that there is often limited opportunity to spectate on stage: you have to be in the service areas and need to be somewhere with good cell coverage, so your team has the best chance of reaching you if they need to. However, you do get to go to the shakedown stage (a short practice before the race starts on the first day), which can be a fun opportunity to see the cars launch. The organizers did a good job of marking out places where crew could spectate, but know that as crew you are just as much on a schedule as your team, and they are depending on you being in service when they get there. When you go out to spectate you take the risk that you get lost, get caught behind traffic, or your vehicle has a problem, leaving your team without a crew when they need one. We attempted to spectate on stage 2 and managed to take a wrong turn that resulted in us seeing only the second half of the field. Thankfully, on the way back to service we had no trouble and arrived before our teams did.

At service you have a limited time to fix anything that is wrong with the car and prepare it to continue racing. Because both of our drivers did an awesome job keeping their cars on the road, there were no major repairs for us to do. To keep them going at pace we did small things while they got a chance to stretch their legs outside the car: the cars went up on jack stands and we looked underneath for damage, checked and adjusted tire pressures, checked the torque on all the lug nuts, refilled water and other drinks for the driver and codriver, and got them food to eat. Lastly, we looked up their stage times and encouraged them to keep pushing for the win.

When you are competing in a rally you have many things to think about and keep track of as a driver/codriver team. One thing you have very little time for is taking pictures and keeping up on how the race is going for everyone else. As crew you should be taking pictures to share online for friends and family and watching for updates on the competitors. This is fun, and with you handling the media side of racing and keeping them informed of their times and how their competitors are doing, the driver/codriver team can focus on driving fast. You can see more of my pictures and follow me on instagram at tyler_128_. To find more pictures and videos of 100 Acre Wood, check out tags like #100aw, #100awrally, and #100acrewoodrally.


Working on a crew is an awesome way to participate in rally at a very low cost, build great friendships, and help your friends be more competitive. It is completely worth it and much more involved than you would initially think. Special thanks to Steve, Scott, Sam, Joe, and Grant for letting me be a part of their team and participate in such an awesome event.

In the end both teams did well. Steve and Scott pushed hard and drove the wheels off the car for a 3rd overall finish, 2nd in national class (OL and SP), and 1st in OL for both days regionally. Sam and Joe were successful in completing Sam's first event as a driver and had a blast doing it. Pictures don't do this event justice: the roads are amazing, the local people are inviting, the competition was fierce, the organizers and volunteers ran a smooth event, and the weather was perfect. If you want to get into rally, find a team you can crew with; it will not only be a great time, but you will learn so much about the sport that will help you if you plan on building a car and competing yourself.

Python Flashcards Game

This is part of my poker math game, as there are topics from the book that work well for the flashcard format. It is a really simple program that demonstrates how to do several interesting things in python.

The first couple of functions show a simple way to wait for the user to input some data. The reason I'm assigning the output to __ is to show that I'm not going to use that data; an underscore variable is the standard way of expressing that in python.

At the beginning of main (line 16) I call clear to clear out the terminal window and give myself a fresh-looking space. This shows a simple way to call system commands from python.

On line 20 we use the with ... as syntax to open the file. This is a special kind of statement for resources like files that need to be cleaned up when you finish using them. The advantage of with is that when execution leaves the block, the file is cleaned up and closed; that means even if the code inside throws an exception, the file will be closed.

Line 21 uses the pythonic line-reading syntax to read from the file. The variable cards refers to the file, and for line in file loops through a file one line at a time in python.

Line 24 is the next interesting piece of syntax. To understand it we need to break it down. First, split separates the line into a list of strings, splitting every time it sees a comma. Second is the list comprehension: a syntax that efficiently does something to each item in a list, of the form [action for item in list]. strip() is a function that removes whitespace at the beginning and end of a string, and I'm using it specifically to clean up any extra whitespace in the csv file lines.
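Putting those pieces together, here is a minimal sketch of the kind of program described above. The file name cards.csv, the prompts, and the two-fields-per-line card format are my assumptions, not necessarily what the repo uses:

    import os
    import random

    def wait_for_user(prompt):
        # Wait for the user to press enter; the input itself is unused,
        # so it is assigned to a throwaway underscore variable.
        __ = input(prompt)

    def clear():
        # Call the system clear command ('cls' on Windows).
        os.system('cls' if os.name == 'nt' else 'clear')

    def main():
        clear()
        cards = []
        # 'with ... as' guarantees the file is closed, even on an exception.
        with open('cards.csv') as file:
            for line in file:
                # Split on commas, then strip stray whitespace from each
                # field with a list comprehension.
                # Assumes two fields per line: question,answer.
                cards.append([field.strip() for field in line.split(',')])
        random.shuffle(cards)
        for question, answer in cards:
            wait_for_user('Q: {} (press enter for the answer) '.format(question))
            print('A: {}\n'.format(answer))

    if __name__ == '__main__':
        main()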

So there you have it, a simple flashcards python app you can use for studying. To follow the advancements in my python game, go check out my github repo and let me know what you think.

Poker Math Game – Pot Odds

I just started reading the book Essential Poker Math, Expanded Edition and decided to write some short python games to help train myself. The first game is called Pot Odds and is based on chapter 6 of the book. It deals only in whole numbers and assumes you have only one other opponent; this simplification lets you focus on becoming very quick at doing the math in your head. Since this is written in python, I'm requiring pyenv for managing python installs and dependencies.

Pyenv install:
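One common way is to check out pyenv straight from github (this assumes git is installed; the pyenv README also offers an installer script):

    git clone https://github.com/pyenv/pyenv.git ~/.pyenv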

Then place this in your .bashrc:
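The standard pyenv shell setup looks like this (paths assume the default install location above):

    export PYENV_ROOT="$HOME/.pyenv"
    export PATH="$PYENV_ROOT/bin:$PATH"
    eval "$(pyenv init -)"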

And lastly you will need to run this:
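Most likely a shell restart so pyenv is picked up, followed by installing a python; treat the version number below as a placeholder:

    exec "$SHELL"
    pyenv install 3.6.4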

Game setup

Download the game using git or click on this link:
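With git that looks like the following; substitute the actual repository url from the link:

    git clone <repository-url>
    cd <repository-directory>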

Then setup the environment and install the dependencies:
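Assuming the repo lists its dependencies in a requirements.txt (my assumption), something like:

    pyenv local 3.6.4              # use the python installed above
    pip install -r requirements.txt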

Lastly, you can start this first game with this command:
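If the entry script is named after the game, the command would look something like this (file name is my guess):

    python pot_odds.py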

Searching Realtor.com with python

In this tutorial I will show you how I used python to get a list of addresses and prices from the website realtor.com in a given area.

Disclaimer

Before we start I need to point out that realtor.com does not want web scraping against their site. They have gone so far as to use a system built by Distil Networks to identify and block bots. Their main concern is that someone will scrape all the data off their site and build a competing site, earning ad revenue off their data. Here is a link describing what they are doing. While I am technically scraping (reading webpages with something other than a browser), I am not going to republish any information I gather, and I am only doing this to use the information they publish more efficiently. I believe that while the advice below may violate their view of the letter of the law, it does not violate the spirit of the law.

Source

The source for this can be found on my github gist. This was done for the rent-scout project. Over time rent-scout will be updated, and some of what I post below will become out of date.

The first url

Before you can begin parsing any website you need to find a url you can work with. The awesome thing about urls is that they are like folders that take arguments: you can string together different arguments to find the address you want to access. I discovered that if I went to http://www.realtor.com/local/Colorado I got a table of links showing the top cities in that state along with the number of listings available for rent and for purchase. I am mostly interested in the ones available for purchase because I want them as rental properties. The first thing you can do to read the url is to use the libraries lxml and requests; you can install them with pip install lxml requests. This will read that page and print the html data in text form:
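A minimal sketch of that first read (the session and User-Agent header are explained in the troubleshooting note below):

    import requests

    url = 'http://www.realtor.com/local/Colorado'

    # A browser-like User-Agent; without it the site redirects endlessly.
    session = requests.Session()
    session.headers.update({'User-Agent': 'Mozilla/5.0'})

    page = session.get(url)
    print(page.text)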

Troubleshooting Notes

I discovered really quickly that if I didn't create a session with a user agent header, the site would put me into infinite redirects. This is probably one of their tactics to dissuade beginners from writing scripts to scrape their site. Sessions also have other benefits, one of which is speed: a session reuses the TCP connection instead of setting up a new one for each request. If you don't set the User-Agent to something browser-like, it will look something like python-requests/1.2.0. That is a dead giveaway to the server that you are not a browser, but instead python's requests library trying to get the server to respond with an html page you can parse.
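You can see the default for yourself, and override it, like this:

    import requests

    session = requests.Session()
    print(session.headers['User-Agent'])   # something like 'python-requests/2.x'

    # Replace it with a browser-like string to avoid the redirect trap.
    session.headers.update({'User-Agent': 'Mozilla/5.0'})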

Parse that table

The steps for parsing the table of cities are: first create a tree from the html data, then find the XPaths to the various items we want, and then parse that data into a useful structure.

Build the tree

There is a tool in the html package from lxml that can build a tree from a string:
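Using the page fetched earlier:

    from lxml import html

    # page.text is the html string fetched above.
    tree = html.fromstring(page.text)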

Find the XPaths

XPath is a way to navigate through elements and attributes in an XML document. The easiest way to find an xpath to an item you are looking for is to use Google Chrome. First open the Developer Tools (View->Developer->Developer Tools), then right-click on an element you are interested in and select Inspect. That will highlight the line of html code that the link comes from. Right-click on that line of code and select Copy->Copy XPath. If you do that to the link for the city Aurora on the page http://www.realtor.com/local/Colorado you will get this result:
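The copied xpath is shaped roughly like this; the exact indices depend on where Aurora sits in the table:

    //*[@id="top-cities"]/table/tbody/tr[5]/td[1]/a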

Parse the data

You will notice that the table is inside a tag with the id top-cities, and that the table has tr and td tags: tr is a row, td is a column, and a is a link. The second column, you will notice, is a list of links to homes for sale in each area. I found this one particularly interesting, so I used this code to investigate:
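Something along these lines, with the copied xpath loosened so it matches the whole second column rather than a single cell:

    # Every link in the second column of the top-cities table:
    # homes for sale in each city.
    sale_links = tree.xpath('//*[@id="top-cities"]//tr/td[2]/a')
    print(sale_links)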

The output of this looks like this:
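Roughly a list of lxml link elements, printed as:

    [<Element a at 0x7f3c2a5b1688>, <Element a at 0x7f3c2a5b16d8>, ...]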

Now this is something I can work with. The next step was to get the names of the cities out of the title attribute, and to do that I used this code:
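Assuming the city name lives in each link's title attribute, a list comprehension does it:

    # Pull the city name out of each link's title attribute.
    cities = [link.get('title') for link in sale_links]
    print(cities)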

Next I wanted to grab the links, the number of properties listed for sale, and the number listed for rent for each location. This is the code I used to do that:
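A sketch under a few assumptions of mine: the second and third columns hold the for-sale and for-rent links, each link's text is the listing count, and the hrefs are relative to the site root:

    base = 'http://www.realtor.com'
    locations = {}
    for row in tree.xpath('//*[@id="top-cities"]//tr'):
        links = row.xpath('td/a')
        if len(links) < 3:
            continue                  # skip header or malformed rows
        city = links[1].get('title')
        for_sale = int(links[1].text_content().replace(',', ''))
        for_rent = int(links[2].text_content().replace(',', ''))
        # Appending ?pgsz=<count> makes the site return every listing
        # on one page (see the url argument note below).
        sale_url = '{}{}?pgsz={}'.format(base, links[1].get('href'), for_sale)
        locations[city] = (sale_url, for_sale, for_rent)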

Syntax Note

You will notice that I'm using a technique to build lists that you may not have used before. It is called a list comprehension, and here is a link that explains how to use it: List Comprehensions

URL argument Note

The second thing you should notice about the code above is the argument I'm adding to the end of the url. With urls you can add arguments by listing them after a question mark in key=value pairs, like this: ?key=value&key2=value2. I used this technique to set the variable pgsz to the number of houses available for sale in that area; I discovered that this causes the website to display all the listings for the area on one page. To figure this out I hovered my mouse over the next-page button at the bottom of the results page, saw this argument, changed it, and noted what it did. This makes the next step much easier.

Getting a link to the listing

The next step proved to be much harder. I wanted to find the links on the next page that went to listings, but I found that there were duplicate links and that grabbing the link was actually kind of difficult using xpath. Enter another tool: BeautifulSoup. It is a tool for searching through documents and finding what you are looking for, and it is considerably easier than XPath. To test, I picked just one city to continue with: Aurora. There are two steps to this part: getting the list of links to the listings, and parsing data on those pages.

Getting a list of links to listings

Below is how I got the list of URLs that went to addresses:
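A sketch, reusing the session from earlier and the sale_url stored above (variable names are mine):

    import re
    from bs4 import BeautifulSoup

    # Fetch the full Aurora results page and parse it.
    page = session.get(locations['Aurora'][0])
    soup = BeautifulSoup(page.text, 'lxml')

    listing_urls = set()
    # Every listing link contains the word 'detail' in its href.
    for tag in soup.find_all(href=re.compile('detail')):
        # Keep everything before the '?', so duplicate links that differ
        # only in tracking arguments collapse to one entry in the set.
        listing_urls.add(re.match('[^?]*', tag['href']).group())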

Making urls unique

You will notice a couple of things. First is the regular expression '[^?]*', which matches everything up to the first ? in the string. This is because the website has many copies of the same link with different arguments (probably so they can track which links get clicked on). I wanted a list of unique links, so I used a set: when you add an item to a set, it checks whether the item is already there and adds it only if it is not. The second thing you'll notice is that I'm searching through the soup object with a regular expression matching on the href parameter of find_all. This is complicated, so I'll break it down. find_all takes keyword arguments that it reads as a dictionary, and inside the function it uses that dictionary to help it search the tags in the soup. In our case we want all tags with an href parameter (a link) where that link contains the word detail. I discovered the format for the links by clicking on links in my browser and looking at the urls: all the listing links had the word detail in them.

On regular expressions

Regular expressions are very powerful for searching, yet they can also be very confusing. For a good primer on how to use them, go to the Python Regular expression operations page and read up. The trick to learning regular expressions is to use them: set up a program and try parsing text. The only real way to learn something like regular expressions is to play around, and once you understand the basics you can google for how to do something specific; you will often find many examples of the exact thing you are looking for.

Parsing data out of those links

Once you have the links, the next step is rather trivial. There is much more that could be done here, but for now I chose to just go to the links and get the address and asking price. Here is the code; there isn't anything new in it:
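In the spirit of the original; the itemprop attributes are my guess at how the page marks up its data, so inspect the live page and adjust:

    for url in listing_urls:
        # Prepend the site root here if the stored links are relative paths.
        page = session.get(url)
        soup = BeautifulSoup(page.text, 'lxml')
        address = soup.find(attrs={'itemprop': 'streetAddress'})
        price = soup.find(attrs={'itemprop': 'price'})
        if address and price:
            print(address.get_text(strip=True), price.get_text(strip=True))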

Next Steps

The next thing I'm going to do with this is gather more information about each of the listings and start building a local database that I can use with other scripts, without going out to the website each time to gather the same information. Doing this keeps me from being evil with the bandwidth on realtor.com's servers and making it harder for them to serve the same information to other users. If you want to learn how to do this, copy the code into a text file, run it, and then start playing around with it. Change it to work differently on your system and see what it does. And of course, I am not responsible for anything you do with this new power.

The Motivation Loop

I’m writing this because it is something I’m learning, not something I’ve mastered in any way. This is as much a message to myself as it is to anyone else. When I start a new project it is new and exciting, I have many ideas, and I become intensely focused and motivated. That last for a while until the next project comes along and the first one is harder to work on because I just don’t feel like it any more. That can be awfully frustrating with with many unfinished project piling up. I have several of those right now:

  • Lexan oven for making lexan race car windows
  • Rally car needing the motor pulled for some exhaust modification to raise the skid plate
  • Maintenance on my daily driver
  • A home security system
  • Countless small software projects

Right now there are a couple of projects I'm intensely focused on: this blog, an open source Dropbox-like tool (more on that later), and the real estate simulation tool. One of my major goals is to finish the projects I start. But how does one do that? The obvious answer is to just do it, yet when I'm unmotivated, getting started again seems nearly impossible. And this is because I have the same false view as everyone else of how motivation is created: excitement turns into inspiration, inspiration turns into motivation, and motivation turns into action.

What I propose in this post is that it is actually a loop, and that motivation is the reward rather than the reason for action. If you want to be motivated to do something, you first have to start doing that thing; you will become excited and inspired, and that will create motivation. Everyone can start a project, but only someone with the determination to work on a project when they are unmotivated can really finish one.

The author Mark Manson calls this the "do something principle." He explains it better than I can, and you should read his blog post: Do Something Principle.

Now if only I can figure out how to remember this and actually apply it to my own projects. If I don’t this blog will die just like all my other projects.

Rent Scout – Investing with Python

I started a new project on github, rent_scout, where I will be using python as a tool to explore real estate investing. My current plan goes as follows: use python to scrape the web for properties for sale and for rent in a given area, along with as much data as possible about the real costs of purchasing a property as a rental. The plan revolves around using readily available information to find markets and properties that would turn the largest, most dependable return on investment.

Real estate investing is a good hedge against inflation and can be done largely with the bank's money. Ideally every property would be cashflow positive from day one, but that might not be possible. The plan is to build a tool that models real estate investments and offers some metric to assist in buying the best the market has to offer at the best times. I know there are always risks in investing and that no model can be completely trusted, but I believe that by understanding as much as I can about the process, I can gain a real advantage over other small-time real estate investors.

The plan would then be to find properties that could be purchased as long-term investments and handed over to a good property management company, so that in 15-30 years you would have a property you could sell. The reasoning is that there are many costs involved in each real estate transaction, so unless I got into a really bad deal it would make more sense to ride out whatever the market does, as long as I can generate enough rent to pay off the property. Once the property is paid off I would have real passive income.

The first things to model are house prices and rent prices, and the next step is building a tool that can scrape the web for this information. To start, I've written a simple python script to calculate the monthly payment of a loan and the total cost of that loan (a sketch of it appears after the list below). Once I have rent and housing prices I need to do some digging to find all the real costs of owning and renting out a home (including what good property management companies are available in the market I'm researching). I need to find the tax codes in each state and area I'm interested in and build them into the model, model the tax benefits of paying interest, and so forth. Here are my next steps:

  1. Costs of owning a home (ask friends, read books, google, etc.)
  2. Hidden costs of mortgages and how to get the best mortgage
  3. Web scraper for houses listed in an area and their attributes
  4. Web scraper for rental prices and availability (supply) in a market
  5. Research what makes rentals more desirable
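For reference, here is a minimal sketch of the loan payment calculation mentioned above; the function names are mine, and the actual script in the repo may differ:

    def monthly_payment(principal, annual_rate, years):
        """Standard amortization formula: P * r(1+r)^n / ((1+r)^n - 1)."""
        r = annual_rate / 12          # monthly interest rate
        n = years * 12                # number of monthly payments
        if r == 0:
            return principal / n      # interest-free edge case
        factor = (1 + r) ** n
        return principal * r * factor / (factor - 1)

    def total_cost(principal, annual_rate, years):
        # Total paid over the life of the loan.
        return monthly_payment(principal, annual_rate, years) * years * 12

    # Example: a $250,000 loan at 4.5% over 30 years.
    print('Monthly: ${:,.2f}'.format(monthly_payment(250000, 0.045, 30)))
    print('Total:   ${:,.2f}'.format(total_cost(250000, 0.045, 30)))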

Run from your problems

This may sound crazy, but running may be good for you in more ways than the physical. Here is some evidence:

  • Running makes you high. Specifically, long distance running has been shown to produce mood-altering, drug-like effects.
  • Running makes you healthier. Specifically: a 40% lower risk of heart disease and high blood pressure, a 20% lower risk of osteoarthritis and hip replacement, a 35% reduced risk of developing cataracts, and a 40% to 50% reduced risk of macular degeneration. Fear isn't the greatest of motivators, but I'm not one who wants any of these problems.
  • Running improves your social standing and has odd health benefits. Studies show running improves your sex drive, improves your memory and ability to learn, makes you less likely to develop cancer, and even improves your hearing. The idea presented here is that we come from hunter/gatherers: those who ran more were better hunters, were better people to have children with, learned more quickly how to catch prey, and could hear better because of the increased running for hunting.
  • Running slows the aging process. Contrary to popular belief, exercise for older people is not harmful to their mobility, but actually benefits them in many ways. This study shows that if you are concerned about the effects of getting older, maybe what you are feeling are the effects of sitting on your couch.

You probably didn't want to hear that the way to help yourself is to do something hard. However, it is often the hard things that carry the most benefits. Now if only I could take my own advice and run more.