
Week 10 Insights

Our team has made further changes to our prototype based on feedback from last week. This week we received feedback on what should be included when considering holistic performance. The visuals are well received, but our client wants to know more about how the data is put together, where our golden ratio numbers come from, and so on. Our CPC interface is looking good, but we need to work out what the research psychologists and the CG should be able to see regarding performance data.


  • BENEFICIARIES


Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologists

Other Beneficiaries

  1. Instructors (Cadre)
  2. SWCS Commander


  • INTERVIEWS & KEY TAKEAWAYS


Trevor O’Brien | Information Technology

trevor.obrien@socom.mil

  • Comparison timeline. Weekly is good, but monthly or even quarterly could be useful for visualization.
  • General feedback was good; he compared it to the spear interface.
  • Maybe too many tabs at the moment. Navigation may get too complicated.
  • Interested in looking at the tactical training.
Shawn Zeplin | Director of Behavioral Health at Duke University Athletics

  • Mental health and athletic performance are not separate; he works with both aspects of student athletes to improve sports performance.
  • Athletes, mostly male, do not trust him because they have been socialized to believe that showing emotion is weakness.
    • He spends time at their practices to build trust.
Phil Williams | CEO of Phil Williams LLC

  • The focus on holistic performance comes from the pressure and psychological stress placed on trainees.
    • Soldiers who were going to therapy didn’t want people to know, which is why the focus is holistic.
  • Compare soldier performance with Marine or Navy counterparts: what information can you correlate across branches (push-ups, metabolic rate)?
  • The platform addresses two things they need: 1) real-time data and 2) comparisons of cadre vs. cadre, soldier vs. soldier, and service vs. service (Army, Navy, Air Force).
Oscar Gonzalez | Research Psychologist

oscar.gonzalez@usuhs.edu

  • Holistic performance: the radar chart needs to include both domains (see the sketch after this list).
    • Physical domain
    • Psychological domain
  • Exercise & performance: what other factors contribute to performance differences?
    • Diet
  • Percentage representation is vague; we need to specify the unit of time.
  • Each graph needs to be clearly labeled, and each measurement needs a unit.
  • We need to define what “ideal” means.
  • The baseline graph shouldn’t be a continuous line graph; use a bar chart instead.
  • The position of the sliding bar is confusing.
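One way to read Oscar's radar-chart feedback is sketched below: a single chart spanning both domains, with every axis labeled and an explicit "ideal" reference line. The metric names and scores are purely illustrative assumptions, not our actual data model.

    # A minimal radar ("spider") chart sketch in Python/matplotlib, assuming
    # hypothetical metrics normalized to a 0-100 scale.
    import numpy as np
    import matplotlib.pyplot as plt

    # Physical and psychological domains on one chart, per Oscar's feedback.
    metrics = ["Push-ups", "Run time", "HRV", "Stress resilience", "Sleep quality", "Focus"]
    scores = [78, 65, 82, 70, 60, 74]   # current scores (illustrative)
    ideal = [90, 85, 85, 80, 80, 85]    # "ideal" reference line (definition still TBD)

    angles = np.linspace(0, 2 * np.pi, len(metrics), endpoint=False).tolist()
    angles += angles[:1]                # close the polygon
    scores_closed = scores + scores[:1]
    ideal_closed = ideal + ideal[:1]

    fig, ax = plt.subplots(subplot_kw={"polar": True})
    ax.plot(angles, scores_closed, label="Current")
    ax.fill(angles, scores_closed, alpha=0.25)
    ax.plot(angles, ideal_closed, linestyle="--", label="Ideal (reference)")
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(metrics)         # every axis clearly labeled
    ax.set_ylim(0, 100)                 # unit: percent of maximum score
    ax.legend(loc="upper right")
    plt.show()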
Constance Garcia | Data Manager

constance.garcia.ctr@socom.mil

  • Heart rate variability monitors and eye tracking are the only sensors CPCs use right now.
  • Surveys are handed out to the entire cohort.
  • Surveys change from year to year; data analysts can upload surveys into the platform to check whether they are comparable.
Alexandra Hanson | Research Analyst

  • We don’t want to single out specific instructors’ graduation rates because it would set up a competitive field between instructors.
  • For CGs, we want to create four quadrants: CPC, cadre, physical, and academic/survey data.
    • CGs don’t need to know information on specific individuals.
Aspen Ankney | CPC USSOCOM USASOC SWEG

aspen.ankney.ctr@socom.mil

  • There are currently 6 cohorts per year; this will change to 4 cohorts next year.
  • CPCs don’t care much about cohort performance, and organizing information by cohort won’t help them; what they want is organization by course. (However, the research psychologists really do want the information organized by cohort; this is what Oscar cares about.) It all depends on who the target user is.
  • Cohort sorting would be the least-used feature, since few soldiers who make it through the entire pipeline stay within their original cohort.
    • Organizing by course makes much more sense.
  • Wants data visualization for each course.
  • Wants comparisons among the different pipelines: Green Beret, Civil Affairs, Psychological Operations.
  • There are a total of 12 CPCs on the team; each coaches between 80 and 300 students.
  • According to Aspen, CPCs don’t care about individual profiles and overall are not that interested in individual performance; they care about the course.
    • They do not meet with individual students that frequently (sometimes not at all, sometimes 10-20 per week), and usually only with students who are struggling or who outperform.
    • Instead of displaying individual performance, they want course performance.
  • There are 7 phases, and a phase breaks into many sub-courses. For example, there are five courses within officer-specific training.
  • According to Aspen, the data team and research psychologists don’t have the training to run the analyses the CPCs want, so CPCs input data and run analyses themselves; the product should therefore let them input data, organize data, run analyses, and output results.
  • Whatever information is accessible to the research psychologists should also be made accessible to CPCs.
  • Overall, Aspen loves the prototype and gave very positive feedback: incredible, easy to use, highly developed, awesome job.
Daniel Gajewski | Performance Integrator

Daniel.gajewski@socom.mil

  • Visuals are good; good incorporation of feedback from the last conversation.
  • Showing that we are able to take multiple files from soldiers and integrate them into aggregate data is promising.
  • Based on the visualization we have shown, if we have, say, 8 soldiers in a particular cohort and 7 exercises, that would mean aggregating 56 files to look at holistic, aggregate performance, and that is a simplification: in practice there would likely be more than one file per training exercise per soldier. (A rough sketch of this roll-up appears below.)
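The roll-up Daniel describes might look something like the sketch below. The directory layout (data/<soldier>/<exercise>.csv) and the single "score" column are assumptions for illustration only; they are not the real data schema.

    # Hypothetical layout: data/<soldier_id>/<exercise>.csv, each with a "score" column.
    # 8 soldiers x 7 exercises -> 56 files rolled up into one cohort-level table.
    from pathlib import Path
    import pandas as pd

    def aggregate_cohort(data_dir: str) -> pd.DataFrame:
        rows = []
        for csv_path in Path(data_dir).glob("*/*.csv"):
            soldier = csv_path.parent.name         # folder name = soldier ID
            exercise = csv_path.stem               # file name = exercise name
            df = pd.read_csv(csv_path)
            rows.append({
                "soldier": soldier,
                "exercise": exercise,
                "mean_score": df["score"].mean(),  # collapse multi-day files to one number
            })
        long_form = pd.DataFrame(rows)
        # Pivot to a soldiers-by-exercises grid for the aggregate/holistic view.
        return long_form.pivot(index="soldier", columns="exercise", values="mean_score")

    if __name__ == "__main__":
        print(aggregate_cohort("data"))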
Mark Manturo | Research Psychologist

mark.monturo@gmail.com

  • Holistic Performance
    • Leadership ability
    • Physical fitness test score
    • Ability to learn
    • Tactical proficiency
    • Deploymacy
    • Problem solving
  • What is the standard for “ideal”? In the Navy they have the NSS navy grading score.
    • Put something in terms of historical scores (minimum score, average score, a reference line).
    • Standard deviation.
  • What he wants to see on the commander’s page:
    • Peer evaluations
    • Students’ hobbies and what activities they do in their spare time
      • Looking for endurance athletes and team players; attract intelligent, well-rounded people
    • Answers to questions such as: Am I producing the next generation? Do I have enough people? Do I need a policy change?
Seth Spradley | Data Analyst

seth.c.spradley.ctr@socom.mil

  • Says it looks good visually; the interface is intuitive.
  • Offered to help debug the software system when we develop it further.
  • On the Progression tab, the overlay box should be moved to the upper left-hand corner rather than sitting in the middle of the graph.
Maj Arth | Joint Special Operations Command

mjarth@gmail.com

  • Asked if there is an operational psychologist.
  • Have several “mini-traits”.
  • Lay out the overall situation. Tell your sponsor to pick a specific problem to go after; you can’t solve all of the military’s problems.
Dillon Buckner | Military Intelligence Observer, United States Army

dbuck22@gmail.com

  • Pitch to companies that have a similar solution and try to get buy-in from them.
  • Get rid of the red text on the cohort home page; red means bad in the military.
  • It would be better to show a passing rate for each class.


  • KEY INSIGHTS


  1. The mental health component is a large part of performance, and soldiers tend to be skeptical of mental health professionals (a cultural/socialization issue).
  2. While tracking specific instructors’ graduation rates would yield useful data, it would also create a moral hazard through competition to graduate the most students.
  3. Cross-service comparisons of special forces performance metrics could be useful, if we can get the data.


  • KEY PROBLEMS


  1. We need to make sure that each type of user (CPC, data analyst, research psychologist, CG) has what they need on their view of our software platform.
  2. When looking at ideal performance (the golden ratio), we need to be more specific about what that means for different training exercises and metrics.
  3. For self-reported data, surveys change from year to year; if so, how do we accurately compare survey data over time? (One possible alignment approach is sketched below.)
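One possible way to keep year-over-year survey data comparable is sketched below. It assumes the data analysts can maintain a mapping from each year's question wording to a shared question ID; the labels, years, and column names are hypothetical.

    # Hypothetical: each year's survey exports a table with its own question labels.
    # A hand-maintained mapping to canonical question IDs lets us compare only the
    # questions that exist (in some form) in every year.
    import pandas as pd

    # Year-specific label -> canonical question ID (illustrative mapping).
    QUESTION_MAP = {
        2023: {"How stressed were you this week?": "stress_weekly",
               "Hours of sleep per night": "sleep_hours"},
        2024: {"Rate your stress over the past week": "stress_weekly",
               "Average nightly sleep (hours)": "sleep_hours"},
    }

    def normalize_survey(df: pd.DataFrame, year: int) -> pd.DataFrame:
        """Keep only mapped questions and rename them to canonical IDs."""
        mapping = QUESTION_MAP[year]
        kept = df[[c for c in df.columns if c in mapping]].rename(columns=mapping)
        kept["year"] = year
        return kept

    # Usage: concatenate the normalized years, then compare on the shared questions.
    # combined = pd.concat([normalize_survey(df_2023, 2023), normalize_survey(df_2024, 2024)])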


  • KEY DECISIONS


  1. We need to think about what the SWCS CG wants to see on his interface. Ideally, it should be simple yet robust enough for him to see trends and make policy decisions.
  2. We need to add more granularity to the analysis. Even though there are 7 main exercises, each has multiple sub-exercises, and these will likely involve multi-day measurements as well, meaning we will need to integrate more files.
  3. Do we want students to have login access? We have received conflicting feedback on this and need to make a decision soon.
