Chapter Seven: Concluding Thoughts

Topic-Example-Discussion. Context-Action-Result. Clarify-Structure-Analyze-Conclude. All are frameworks I’ve learned in school to help me write paragraphs, structure a resume, and tackle a case interview, respectively. In general, most of my analytically driven classes—economics, finance, strategy, etc.—seem to want all relevant analysis tied up in a neat little bow. Take, for example, the consulting case interview. Success in a consulting case interview relies heavily on your ability to evaluate a problem through a mutually exclusive, collectively exhaustive (MECE) framework. In other words, your structure allows you to explore all relevant issues without any overlapping content. They usually look like this:

(Example framework diagram)

Now, don’t get me wrong: I’m extremely grateful for the analytical thinking I’ve been exposed to throughout my education. But the repetition of “here’s problem X, use this tool to analytically solve it” in my business classes often leads to a few problems:

 

  1. It conditions people to assume that all problems can be solved analytically through the right combination of regression, benchmarking, and industry research

  2. It never really includes the step that occurs before the analysis, which involves deciding whether the problem you’re addressing even needs addressing

 

I find the second point more troublesome. In most of my business classes, exercises involve a narrowly defined scope: calculate the cost of X using Y, calculate the total present value of cash flows using Z, and so on. But defining the scope seems just as important as solving the problem once that scope has been set. And defining the scope of an investigation is hard, as evidenced by my attempt in this project, which fell flat on its face.

 

So what have I learned from this project? It would be ironic to cap a project full of oversimplified assumptions with a grandiose, sweeping conclusion about what I've discovered about myself. So I'll leave you with two key takeaways:

 

Of News: I've learned most of all that the news is a really slippery topic. Content can't really be quantified effectively (despite what Julia Fox thinks), and delivery is even harder to evaluate. Furthermore, you can't really evaluate programs in a vacuum, because they coexist in a complex ecosystem. As for my initial inquiry into whether I should keep watching more satire than mainstream news, I'm honestly not sure. This project has opened my eyes to the complexity of evaluating news in isolation, so I think the best approach moving forward is to embrace diversification and reflect on my experiences to determine what's best for me.

 

Of Myself: I have a tendency to bite off way more than I can chew. That isn't disastrous for a college class, but if something similar happened while working for a future consulting client, they probably wouldn't be too happy. When it comes to analyzing very broad problems, I need to be realistic about my scope. I've also learned that criticizing my own thought process can lead to more insight than the initial thinking itself.

 

All in all, I'm very happy with how this project turned out. I probably won't fully understand the key takeaways from it until I mature and reflect more on the experience. However, as a whole, I found this opportunity to fall flat on my face very constructive. Thank you for reading!

 

 

 
