It is very rare that one can get sound training in any of the sport sciences without having to take at least one research methods class. Perhaps it is human nature or some sort of cultural creation, but too often such a class is seen as a requirement to pass rather than a true asset to someone who will work in the “real world” of sports. Unwittingly, this keeps the sport scientist/coach from truly reaching excellence. It produces a poor consumer of science and a poor executor of focused player development plans. With this in mind, here are five basic research ideas that I have found important to quality sport science practice:
1. Research is not about proving yourself right.
Too often young researchers decide that study findings that contradict a study’s hypothesis are a bad thing. This is absolutely wrong. If the study was well designed and well thought out, these findings are valuable. Open-mindedness and healthy skepticism are required when looking at all data. There are risks to viewing oneself as “right” too quickly (e.g., imagine implementing a training plan that seems to make sense but in reality only leads to injured athletes). Similarly, there are risks to dismissing findings that do not support a hypothesis without sufficient thought (e.g., the nuances of the solutions to many of life’s complex questions can lie in the contradictions).
Understanding these concepts matters beyond naïve graduate students; it matters to the public as a whole. It was surprising to stumble upon the following quote from Bill James, the Red Sox’s famed sabermetrician: “Random data proves nothing, and it cannot be used as proof of nothingness. Why? Because whenever you do a study, if your study completely fails you will get random data. Therefore, when you get random data, all you may conclude is that your study failed.” He is getting at something with this quote, but the willingness to label a study a failure simply because its data are unclear is shortsighted.
Research is about keeping an open mind and gaining information through well-regulated examination.
2. Manipulate only one variable at a time if you want precise understanding of the impacts of coaching interventions.
Coaching approaches and philosophies can change quickly. There is nothing wrong with this; however, if performances dramatically decrease or increase, it will be tough to determine what the cause of these costs or benefits was. Single-subject research design tells us to add only a single new variable at a time and then observe for a while. If the outcome being examined changes in any significant manner, you can say the change was most likely due to the variable you recently added. By contrast, if you add two or more variables at a time, you are left confused as to what actually led to the change in behavior.
This concept can be clearly seen in the coaching of Michael Boyle. While he might have a bit of a “shock and awe” style to his writing and presentations, his coaching is quite disciplined. It is always impressive to hear him talk about refining the strength and conditioning programs of athletes and how religiously he adheres to the “manipulate one variable at a time” principle. A sports medicine colleague commented to me the other day, “Heck, if Mike added bananas into an athlete’s diet, he wouldn’t mess with anything else for a few weeks until he determined if the banana eating had any significant impact.” The question for those working in athletics is, “Can you stay this disciplined when refining your player development programs?”
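The one-variable discipline described above is, in essence, a single-subject A-B comparison: record a stable baseline, change exactly one thing, then compare. A minimal sketch of that logic is below; the measure (sprint times), the intervention, and every number are hypothetical illustrations, not data from any actual program.

```python
# Minimal sketch of a single-subject A-B design: establish a baseline
# (phase A), introduce ONE new variable, then compare the intervention
# phase (phase B) against it. All numbers are hypothetical.

def phase_mean(observations):
    """Average of the performance measure across one phase."""
    return sum(observations) / len(observations)

# Phase A: sprint times (seconds) before any change to the program.
baseline = [4.52, 4.49, 4.55, 4.51, 4.50]

# Phase B: sprint times after adding a single new variable
# (say, one extra plyometric session per week; nothing else changes).
intervention = [4.44, 4.41, 4.43, 4.40, 4.42]

change = phase_mean(intervention) - phase_mean(baseline)
print(f"Mean change from baseline: {change:+.2f} s")

# Because only one variable was added, a consistent shift can be
# attributed to it with far more confidence than if several parts of
# the program had changed at once.
```

Had two variables been added together, the same arithmetic would run, but the attribution in the final comment would no longer hold; that is the whole point of the principle.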
3. Establish your “baseline” before you change the game plan.
This concept is closely related to #2. Human beings (at least Westerners) tend to be an impatient population. When something does not appear to be going right, we push to change something… anything. Returning to single-subject research: prior to initiating any intervention, establish a substantial baseline period. Highs and lows of behavior early on may just be artifacts rather than the “truth.” Giving actions and performances a fair test of time truly allows someone to see “what is what.”
A good example of this is the baseball player calculating his batting average after the first two games of the season. That average is likely to be quite high or quite low at this point, and likely a false measure of the player’s true ability. After 20 games or so, things begin to come into focus. The batter who panics and begins to make swing changes after the second game certainly lacks 20/20 vision for his current status as a hitter. It takes a bit of time to establish a baseline, but it is worth it because it creates a true foundation from which one can be coached and can learn.
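The batting average example can be made concrete with a small simulation: give a hitter a fixed “true” ability and watch how wildly 2-game averages scatter compared with 20-game averages. The hit probability and at-bats per game below are hypothetical round numbers chosen only for illustration.

```python
# Sketch of why a short baseline misleads: simulate a true .300 hitter
# and compare observed batting averages after 2 games vs. 20 games.
# The at-bats-per-game and hit-probability figures are hypothetical.
import random

random.seed(1)  # fixed seed so the sketch is repeatable

def observed_average(games, at_bats_per_game=4, true_avg=0.300):
    """Batting average observed over a given number of games."""
    at_bats = games * at_bats_per_game
    hits = sum(1 for _ in range(at_bats) if random.random() < true_avg)
    return hits / at_bats

# Simulate 1,000 independent "seasons" at each baseline length.
early = [observed_average(2) for _ in range(1000)]
later = [observed_average(20) for _ in range(1000)]

def spread(samples):
    """Gap between the best and worst observed average."""
    return max(samples) - min(samples)

print(f"Spread of averages after  2 games: {spread(early):.3f}")
print(f"Spread of averages after 20 games: {spread(later):.3f}")

# The 2-game averages swing wildly around .300; the 20-game averages
# cluster far more tightly. The longer baseline is the honest one.
```

The same .300 hitter can look like an All-Star or a liability after two games; only the longer sample reveals the ability that was there all along.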
4. Appreciate the normal curve.
“Outliers” has become a hot term with the publication of Malcolm Gladwell’s most recent book. That said, focusing on them can sometimes get us into a bit of trouble. Perhaps one of the biggest lessons I learned early in my career was from Bob Dallis, currently the Dartmouth College women’s tennis coach. After I concluded a workshop that went okay but seemed to miss a few of the players, Bob pulled me aside and said, “Think about the normal curve when considering how and if you reached a team.” What he meant was that there are likely to be outliers. A small section of every group will love what you say regardless of what you say. Conversely, there is likely to be a small section of the group that will not appreciate your efforts regardless of how good they are. The job of a good educator is to make sure the middle attends, learns, and embraces the ideas being shared. Trying too hard to sway the negative outliers leads to a failure to attend sufficiently to the others. Likewise, basking in the glow of the positive outliers only builds the teacher’s ego and does little for the students. In a lot of ways, you can measure the quality of your work by the growth of the “normal” athletes in front of you.
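To put rough numbers on Bob’s advice, the standard normal curve itself shows how large the “middle of the room” is compared with the extreme tails. The sketch below uses Python’s built-in standard normal distribution; treating an audience as exactly normal is, of course, a simplifying assumption.

```python
# Sketch of the normal-curve point: in a roughly normal audience, most
# people sit near the middle, and only thin tails are extreme.
from statistics import NormalDist

nd = NormalDist()  # standard normal: mean 0, standard deviation 1

middle = nd.cdf(1) - nd.cdf(-1)  # proportion within 1 SD of the mean
each_tail = 1 - nd.cdf(2)        # proportion beyond +2 SD (one tail)

print(f"Middle of the room (within 1 SD): {middle:.0%}")
print(f"Each extreme tail (beyond 2 SD):  {each_tail:.1%}")

# Roughly 68% of a normal group sits within one SD of the mean, which
# is the audience an educator should design for, while each 2-SD tail
# (the adoring fans, the immovable skeptics) is only about 2.3%.
```

Chasing a 2.3% tail at the expense of the 68% in the middle is exactly the trade Bob was warning against.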
5. Embrace evidence-based practice.
“Evidence-based” is a hot term these days… yet it is an old idea. Considered closely, it simply means being an effective and ethical practitioner of your craft. Part of such quality practice is having the stomach and patience to read primary research and quality reviews of up-to-date studies in the sport sciences. Appreciating recent publications in refereed journals can help one refine one’s craft. The common criticism of this concept is that such sources are too slow to publish on current trends in sport science; knowledge moves too fast for them to keep up. This is a cop-out, and at times it can lead to reckless practice (not to mention a waste of an athlete’s valuable training time).
It is true that sometimes coaches and practitioners “in the trenches” are ahead of the scientists. This does not mean one should abandon evidence-based practice. In actuality, the wise practitioner realizes that this is an opportunity to create evidence by being thoughtful, focused, and organized in coaching practices. Evidence-based practice is about both learning from the quality work that has preceded you and objectively creating evidence from which to make educated coaching decisions when relevant studies do not seem to exist. That said, I have found that too few people take a fair crack at the first step of quality practice: taking a good look at the literature. Understanding the nuances of everything read leads to great practices on and around the playing field. If you want to build great athletes, lay a solid foundation by using scientific evidence.
Did you pay attention in your research methods class? In many regards it was about making good professional decisions and making athletes great…
Dr. Adam Naylor, AASP-CC, is the Director of the Boston University Athletic Enhancement Center (www.bu.edu/aec). He serves as a mental conditioning and player development resource for players at all stages of their sports careers. More reflections on player development and sport psychology can be found at http://prosportpsychsym.wordpress.com. Dr. Naylor can be reached at email@example.com and followed on Twitter @ahnaylor.