It’s not unusual for writers to need to fact-check the stories they write, and when your “stories” consist of stock recommendations, it’s even more important to keep your ducks in a row.
Such was the situation at the publishing company where I worked early in my career. We had a smart, visionary “idea guy” who came up with the basic philosophy for the stocks we picked, and a group of editors who took his big ideas and turned them into weekly articles and advice that we sent to our subscribers. Some were paying customers, some only signed up for the free newsletter, but all were interested in our perspective on investing. We had to make sure that we kept our credibility intact.
The only problem: there were so many stocks, so many opinions issued, and such a volatile market that it often turned into a major chore just to determine our historical opinion of a given stock…had we, in fact, previously recommended a stock that we were now panning? If we claimed gains of X% for a given stock, what share price was that based upon?
Each editor had his or her own system, consisting of various Excel spreadsheets, Word docs, text files and Post-It notes. As a young go-getter with a background in journalism, I was already close with the editorial department—I even helped them to proofread their articles when they were short-staffed—and so I quickly volunteered to help them come up with a better solution.
Being young and needing some good portfolio pieces, I decided to hand-code a database-driven mini-website. I believe I called it EMERALD, which stood for “Editorial-something-something-Database”; editors could log in, enter a stock ticker, a date, and a number of note fields regarding their current opinion on that stock.
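EMERALD’s actual design is long gone, but the heart of a tool like it is just one table keyed by ticker and date. Here’s a minimal sketch using SQLite; the table and column names are my invention, not the original schema:

```python
import sqlite3

# Hypothetical reconstruction of an EMERALD-style opinions table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE opinions (
        id        INTEGER PRIMARY KEY,
        ticker    TEXT NOT NULL,   -- stock symbol, e.g. 'MSFT'
        opined_on DATE NOT NULL,   -- date the opinion was issued
        editor    TEXT NOT NULL,   -- who entered the record
        rating    TEXT,            -- e.g. 'buy', 'hold', 'sell'
        notes     TEXT             -- free-form commentary
    )
""")
conn.execute(
    "INSERT INTO opinions (ticker, opined_on, editor, rating, notes) "
    "VALUES (?, ?, ?, ?, ?)",
    ("MSFT", "1999-04-12", "jdoe", "buy", "Strong fundamentals."),
)

# 'What did we previously say about this stock?' becomes one query:
history = conn.execute(
    "SELECT opined_on, rating, notes FROM opinions "
    "WHERE ticker = ? ORDER BY opined_on DESC",
    ("MSFT",),
).fetchall()
```

The whole value of the system over scattered spreadsheets and Post-It notes is in that last query: one lookup replaces an afternoon of digging through each editor’s private files.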
I wasn’t savvy enough to get real metrics on the time savings from my new web application, but I’d estimate that it saved somewhere between 5 and 10 hours a week for the already overworked editors—probably at least a $10K yearly cost savings, depending on how you did the math. Not bad for a few days of programming work.
In this case, the project was a success, despite its technical shortcomings—the main screen could get very long since it lacked basic pagination options, for example, and the table structure resembled nothing so much as a large Excel spreadsheet. Which brings me to the first lesson, a real classic: the perfect is the enemy of the good. All my friends and coworkers really needed was a system that was better than what they were using, which was no system at all.
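For what it’s worth, the pagination I skipped is a small amount of code; a sketch of the standard LIMIT/OFFSET approach (again with hypothetical table and column names):

```python
import sqlite3

# Toy table standing in for EMERALD's main listing.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE opinions (ticker TEXT, opined_on DATE)")
conn.executemany(
    "INSERT INTO opinions VALUES (?, ?)",
    [(f"TICK{i}", f"1999-01-{i + 1:02d}") for i in range(25)],
)

PAGE_SIZE = 10

def page(conn, page_num):
    """Return one page of opinions, newest first (page_num starts at 1)."""
    return conn.execute(
        "SELECT ticker, opined_on FROM opinions "
        "ORDER BY opined_on DESC LIMIT ? OFFSET ?",
        (PAGE_SIZE, (page_num - 1) * PAGE_SIZE),
    ).fetchall()

first = page(conn, 1)   # full page of PAGE_SIZE rows
last = page(conn, 3)    # final page holds the remaining 5 rows
```

Of course, “good enough” meant shipping without it, which is exactly the point of the lesson.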
But your clients are the ones you need to convince that your solution is “good enough.” That brings me to my second lesson: if you’re going to do consulting work, you really need to develop a good rapport, and that means walking the walk and talking the talk. It wasn’t hard for me to gain the trust of the editorial team because of my own experience in journalism, and once they understood that I understood them and was on their side, they depended on me to help solve their issue.
In retrospect, however, I could have built a simpler, better system in even less time and with potentially better results—a centralized Access database, or even a shared Excel file might have met their needs. And that’s the third lesson: there’s a complex relationship, for developers, between challenge and efficacy. I wanted to do the project with the languages, database and framework I used because it would benefit me, as a developer, to a) gain more experience and b) have evidence of that experience—just as, if I were trying to get a TV job, my blog would rapidly become a vlog. It’s important to understand the needs of your problem solvers—as well as the needs of the problem owner—when coming up with a solution.
That’s true not just for software developers, but for anybody who is a worker bee in your organization: useful experience is its own reward.