Week 11 Insights

Our team edited the CPC interface based on feedback from last week, and we created the CG interface. This week, we received our first round of feedback on the CG interface. Both the CPC and CG interfaces are aesthetically appealing, but we need to home in on which visuals are needed and which are not. For the CG interface, we need to identify whether CGs want to compare their cohorts’ performance with other CGs’ cohorts, and what must-have information they need to see quickly. For the CPC interface, we now allow soldier performance to be viewed by class and cohort, which has been well received.

 

 

  • BENEFICIARIES

 

Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologist

Other Beneficiaries

  1. Instructors (Cadre)
  2. SWCS Commander

 

 

  • INTERVIEWS & KEY TAKEAWAYS

 

 

Phil Williams | CEO of Phil Williams LLC

  • Agrees that organizing by classes provides a better picture than cohorts, because soldiers who drop out of one cohort are recategorized into a different cohort
  • For the normal distribution chart, Phil is interested in the specific characteristics that differentiate the top 5% from the top 50% and the lowest 5%
  • A “Cohort Comparison Chart” could show how my cohorts compare with those of CG John M, CG X, and CG Y; this gives a CG a comparison against past CGs
Bruce MacDowell Maggs | Professor of Duke Computer Science

bmm@cs.duke.edu

  • DARPA is interested in blockchain for storing soldier demographic information
  • DARPA wants additional research or case studies of how blockchain could work to store information and has convened blockchain researchers to learn more
Michael Jelen | Berkeley Research Group

michaeljelen@gmail.com

  • Try combining the macro and micro-level data so that only the most important information is present.
Adam Beauregard | United States Navy

adambeauregard@gmail.com

  • Everything looks fine.
  • Doubtful that CPCs will go through 12 pages of graphs. Pick out the most important ones.
  • Consider making a dashboard.
Mitch Heath | CEO at Teamworks

mheath17@gmail.com

  • Try figuring out which graphs are the most important.
  • Everything seems aesthetically good.
  • Also try to identify filter categories.
  • Willing to invite people to the location to meet the team.
  • Wants us to send info about H4D demo day on April 18th.
Joe Blanton | Colonel United States Army

joeblanton12@gmail.com

  • The platform’s ability to tailor to the preferences of different commanders is very important
  • Present the so-what and the purpose before explaining the why; assume a busy and impatient commander has less than a minute
  • An algorithm that takes in key indicators to show weekly changes in key results
  • Have evidence and data to back up what drives attrition
  • Regression and projection would help a lot
Anubhav Mehrotra | VP of Product Management at Live Nation

Factors used to test level of engagement and emotion change:

  • Galvanic skin response
  • Position shift in alpha brainwave
  • Movement synchrony
Lieutenant Colonel Thomas | Academic Instruction Director, SWEG

phillip.thomas2@socom.mil

  • Agrees that trainees don’t need access. Reports can be generated for them
  • Make sure our efforts are coordinated with Oscar
  • Still concerns over data storage

 

 

  • KEY INSIGHTS

 

  1. Classes, rather than cohorts, are a better way to categorize soldier performance, because soldiers drop out of cohorts.
  2. CGs may be interested in having more options: filtering by certain soldiers/cohorts or comparing their performance to that of other CGs.
  3. We still need to drill down on which specific graphs are useful for CPCs, and start removing the less useful ones or making them filterable.

 

 

  • KEY PROBLEMS

 

  1. We need to talk to cadre, but there has been pushback on why speaking to cadre is relevant to our problem.
  2. Instead of guessing what the CG wants, we need to interview a CG to truly tailor the platform to his/her needs.
  3. We would like to understand better whether the CG wants more data on the selection or training aspects of the high attrition rate.

 

 

  • KEY DECISIONS

 

  1. We have scheduled a meeting to speak with Commander Rice which will help us understand the 1) type of data 2) amount of data 3) types of insights that would be most useful for higher ups.
  2. We do not want students to have access to our platform as, from interviews, the CPC’s can work individually with students and show them the platform during meetings.
  3. If we cannot interview cadre within the next week, then we need to remove them as a beneficiary. We will continue reaching out to the contact our problem sponsor provided to get interviews with cadre.

 

Week 10 Insights

Our team has made further changes to our prototype based on feedback from last week. This week we received feedback on what should be included when considering holistic performance. The visuals were well received, but our client wants to know more about how the data is put together, where our golden-ratio numbers come from, and so on. Our CPC interface is looking good, but we need to work out what the research psychologist and the CG should be able to see regarding performance data.

 

 

  • BENEFICIARIES

 

Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologist

Other Beneficiaries

  1. Instructors (Cadre)
  2. SWCS Commander

 

 

  • INTERVIEWS & KEY TAKEAWAYS

 

 

Trevor O’Brien | Information Technology

trevor.obrien@socom.mil

  • Comparison timeline: weekly is good, but monthly or even quarterly could be useful for visualization.
  • General feedback was positive; he compared it to the Spear interface.
  • Maybe too many tabs at the moment; navigation may get too complicated.
  • Interested in looking at the tactical training.
Shawn Zeplin | Director of Behavioral Health at Duke University Athletics

  • Mental health and athletic performance are not separate; he works with both aspects of student athletes to improve sports performance
  • Athletes, mostly male, do not trust him, because they have been socialized to see showing emotion as weakness
    • He spends time at their practices to build trust
Phil Williams | CEO of Phil Williams LLC

  • The focus on holistic performance stems from the pressure and psychological stress placed on trainees
    • Soldiers going to therapy didn’t want people to know they were going, which is why there is a holistic focus
  • Compare soldier performance with Marine or Navy personnel: what information can you correlate across branches (pushups, metabolic rate)?
  • The platform addresses two things they need: 1) real-time data, and 2) comparisons of cadre vs. cadre, soldier vs. soldier, and branch vs. branch (e.g., Army, Navy, Air Force)
Oscar Gonzalez | Research Psychologist

oscar.gonzalez@usuhs.edu

  • Holistic performance: the radar chart needs to include both
    • Physical domain
    • Psychological domain
  • Exercise & performance: what other factors contribute to performance differences?
    • Diet
  • Percentage representation is vague; specify the unit of time
  • Clearly label each graph, and give every measurement a unit
  • Define what “ideal” means
  • The baseline graph shouldn’t be a continuous line graph; use a bar chart instead
  • The position of the sliding bar is confusing
Constance Garcia | Data Manager

constance.garcia.ctr@socom.mil

  • Heart rate variability monitors & eye tracking are the only ones that CPC’s use right now
  • Surveys are handed out to the entire cohort
  • Surveys change from year to year; data analysts can upload surveys into the platform so they can check whether the surveys are comparable
Alexandra Hanson | Research Analyst

  • We don’t want to target specific instructors’ graduation rates because it would set up a competitive field between instructors
  • For CGs, we want to create four quadrants: CPC, cadre, physical, and academic/survey data
    • CGs don’t need to know information on specific individuals
Aspen Ankney | CPC USSOCOM USASOC SWEG

aspen.ankney.ctr@socom.mil

  • Currently 6 cohorts per year; this will change to 4 cohorts next year
  • CPCs don’t care much about cohort performance, and organizing information by cohort won’t help them; they want information organized by course. (The research psychologists, however, really want information organized by cohort; that is what Oscar cares about.) It all depends on who the target user is
  • Cohort sorting would be the least-used feature, since few soldiers who make it through the entire pipeline are still in their original cohort
    • Organizing by course makes much more sense
  • Wants data visualization for each course
  • Wants comparisons among the different pipelines: Green Beret, Civil Affairs, Psychological Operations
  • There are 12 CPCs in total on the team; each coaches between 80 and 300 students
  • According to Aspen, CPCs don’t care about individual profiles and are not very interested in individual performance; they care about courses
    • They do not meet with individual students that often (sometimes not at all, sometimes 10–20 per week; usually only with students who are struggling or outperforming)
    • Instead of individual performance, they want course performance displayed
  • There are 7 phases, and each phase breaks into many sub-courses (e.g., five courses within officer-specific training)
  • According to Aspen, the data team and research psychologists don’t have the training to run the analyses they want, so CPCs input data and run the analyses themselves; the product should let them input data, organize it, run analyses, and output results
  • Whatever information is accessible to the research psychologists should also be made accessible to CPCs
  • Overall, Aspen loves the prototype and gave very positive feedback: incredible, easy to use, highly developed, awesome job
Daniel Gajewski | Performance Integrator

Daniel.gajewski@socom.mil

  • Visuals are good. Good incorporation of feedback from last conversation
  • Showing that we are able to take multiple files from soldiers and integrate them into aggregate data is promising.
  • Ideally, based on the visualizations we have shown, if we had, say, 8 soldiers in a particular cohort and 7 exercises, that would mean using 56 files to look at holistic and aggregate performance (and in practice there would likely be more than one file per training exercise per soldier).
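The file-count arithmetic above (8 soldiers × 7 exercises → 56 files) can be sketched in a few lines. This is only an illustration: the record format and helper names are hypothetical, not part of the sponsor's actual system.

```python
from statistics import mean

def expected_file_count(n_soldiers, n_exercises):
    """One data file per (soldier, exercise) pair: 8 x 7 = 56."""
    return n_soldiers * n_exercises

def aggregate_by_soldier(records):
    """records: (soldier_id, exercise, score) tuples, one per file.
    Returns each soldier's mean score across all exercises."""
    scores = {}
    for soldier, _exercise, score in records:
        scores.setdefault(soldier, []).append(score)
    return {soldier: mean(vals) for soldier, vals in scores.items()}
```

With more than one file per exercise per soldier, `expected_file_count` becomes a lower bound, but the aggregation step is unchanged: extra records simply feed into the same per-soldier mean.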
Mark Manturo | Research Psychologist

mark.monturo@gmail.com

  • Holistic Performance
    • Leadership ability
    • Physical fitness test score
    • Ability to learn
    • Tactical proficiency
    • Deploymacy
    • Problem solving
  • What is the standard for “ideal”? In the Navy they have NSS, a Navy grading score
    • Put something in reference terms: historical scores (minimum score, average score, a reference line)
    • Standard deviation
  • What he wants to see on the commander’s page:
    • Peer evaluations
    • Students’ hobbies and what activities they do in their spare time
      • Looking for endurance athletes and team players; attract intelligent, well-rounded people
    • Answers to questions such as: Am I producing the next generation? Do I have enough people? Do I need a policy change?
Seth Spradley | Data Analyst

seth.c.spradley.ctr@socom.mil

  • Says visually it looks good, interface is intuitive.
  • Offered to help debug the software system when we develop it further.
  • Progression tab, overlay box should be moved to upper left hand corner rather than in the middle of the graph.
Maj Arth | Joint Special Operations Command

mjarth@gmail.com

  • Ask if there is an operational psychologist
  • Have several “mini-traits”
  • Lay out the overall situation. Tell your sponsor to pick a specific problem to go after; we can’t solve all the military’s problems.
Dillon Buckner | Military Intelligence Observer, United States Army

dbuck22@gmail.com

  • Pitch to companies that have a similar solution and try to get buy-in from them.
  • Get rid of the red text on the cohort home page; red means bad in the military
  • It would be better to show a passing rate for each class.

 

 

  • KEY INSIGHTS

 

  1. The mental health component is a large part of performance, and soldiers tend to be skeptical of mental health professionals (a cultural/socialization issue).
  2. While targeting specific instructors’ graduation rates would produce useful data, it would also create a moral hazard due to competition to graduate the most students.
  3. Cross service comparisons for special forces performance metrics would potentially be useful, if we can get the data.

 

 

  • KEY PROBLEMS

 

  1. We need to make sure that different users have what they need for their profiles on our software platform.
  2. When looking at ideal performance (the golden ratio), we need to be more specific about what that means for different training exercises/metrics.
  3. For self-reported data, surveys change from year to year. If that is the case, how do we accurately compare survey data over time?
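One common way to address the survey-comparability problem above (not something the interviews specified) is within-year standardization: convert each year's responses to z-scores so soldiers are compared relative to their own cohort's survey, even when the instrument or scale changes.

```python
from statistics import mean, pstdev

def zscores(values):
    """Standardize one year's survey scores to mean 0, sd 1,
    so cohorts that answered different surveys can be compared."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]
```

For example, a 1–5 survey and a 0–100 survey that rank respondents the same way yield identical z-scores, which is what makes year-over-year comparison meaningful. This only works if the constructs measured are similar; it cannot rescue surveys that measure genuinely different things.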

 

 

  • KEY DECISIONS

 

  1. We need to think about what the SWCS CG wants to see on his interface. Ideally it should be simple and robust enough for him to see trends and make policy decisions.
  2. We need to add more granularity to the analysis. Even though there are 7 main exercises, there are multiple sub-exercises. These will also likely be multi-day measurements, meaning we will need to integrate more files.
  3. Do we want students to have login access? We have received conflicting feedback on this and need to make a decision soon.

 

Week 9 Insights

This week we used our interviews to get further feedback on our prototype. Based on last week’s feedback, we added more interfaces and visualization tools; this task was spearheaded by Bettie. The response this week has been positive. We discovered that when dealing with analytics, it is important to have benchmarks to compare against. These “golden references” are the highest-performing soldiers. If we can break these soldiers into groups based on characteristics, we may be able to help our client figure out what separates the highest performers from the rest of the pack.

 

 

  • BENEFICIARIES

 

Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologist

Other Beneficiaries

  1. Instructors (Cadre)
  2. Trainees
  3. SWCS Commander

 

 

  • INTERVIEWS & KEY TAKEAWAYS

 

 

Lieutenant Colonel Thomas | Academic Instruction Director, SWEG

phillip.thomas2@socom.mil

  • Briefly viewed prototype but had computer issues
  • Acquisition process for our product may not be too difficult if the product is relatively inexpensive, could likely be purchased through a simple contract/credit card.
  • The issue is who has the money. LTC Thomas would likely have to lobby SWCS, USASOC, or SOCOM; the process could take months to years depending on politics and who has the money.
Mitch Heath | CEO at Teamworks

  • These are the questions we should be asking:
    1. “What is the one thing you would want to see within 20 seconds?”
    2. “What do you think is the most important takeaway from this page?”
    3. “If you were presenting a page for your boss, then what would you include in there?”
  • Rank list of priorities for the person to minimize having too many features
Phil Williams | CEO of Phil Williams LLC

  • Compare trainees to the “golden references”
  • We need golden references by gender and age group, and should create golden references for other groupings as well
  • Figure out the average performance of high performers
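The group-specific benchmark idea above can be sketched as a small function. The group keys, scoring scale, and the 10% cutoff are illustrative assumptions; the interviews do not define how a golden reference would actually be computed.

```python
def golden_reference(scores_by_group, top_fraction=0.1):
    """Benchmark per group (gender, age band, ...): the mean score
    of the top `top_fraction` of performers in that group."""
    refs = {}
    for group, scores in scores_by_group.items():
        ranked = sorted(scores, reverse=True)
        # At least one soldier contributes to every group's benchmark.
        k = max(1, int(len(ranked) * top_fraction))
        refs[group] = sum(ranked[:k]) / k
    return refs
```

A trainee's progress could then be plotted against the reference for their own group rather than a single global standard, which is what the per-gender/per-age-group request implies.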
Morgan Hall | SOCEP CPC

morgan.hall.ctr@socom.mil

  • “It’s all about progression: what the baseline is, and how do I progress on these various indices from that baseline?”
  • Operational Green Berets who have been here many years and have had many deployments would be the golden reference
  • Would use it most likely when having a one-on-one with a soldier, to see which areas they think they’re struggling in
Constance Garcia | Data Manager

constance.garcia.ctr@socom.mil

  • Put the following into soldier info: age, rank, years of service, deployment & how many years deployed, highest education, prior & current MOS (military occupational specialty)
  • LOVES the holistic performance overview (spider-web graph) because it shows how each part plays into the others
  • Don’t use personally identifying information, because soldiers don’t want this to go on their record; take out “John A.”
Travis Nicks | Former Navy Submarine nuclear operator for 10 years

travis.nicks@duke.edu

  • Based on his 10 years of experience working on technology innovation in the Navy, it is very hard to get devices approved to connect to the internet due to cybersecurity risks
    • Even if approved, there will be many constraints on the use case
  • Metrics that can be used to measure anxiety besides HRV include:
    • Breathing rate
    • Sweat
    • Saliva (hydration level)
    • Standing still (range of motion)
Barbara Plotkin | CPT USSOCOM former instructor

barbara.j.plotkin@socom.mil

  • We need to be more hands off during captain’s career course training
  • Mission analysis is always taken in the form of a problem statement
  • Funding begins in October with 4 cycles of CCC, then ends at the end of September
  • Comptroller looks at historical data and decides if the needs of CCC are being met with the current level of funding.
Seth Spradley | Data Analyst

seth.c.spradley.ctr@socom.mil

  • Prototype holds up well visually. Graph presentation is very effective.
  • Interested in seeing more of the achievement history in the soldier performance tab
  • Was unsure whether we are solely focusing on Green Berets or also including Civil Affairs and PSYOP (regardless, this system should be able to scale to meet their needs)
Alexandra Hanson | Research Analyst

  • “There have been a lot of people pushing for this information, but it has been hard to get units to share information” (about developing golden references)
  • The quality of candidates almost twenty years ago is remarkably different from the quality of candidates today; seventeen years ago, after 9/11, we had Wall Street bankers and candidates with Master’s degrees, so the education people come in with is very different
  • Visually represent large amounts of data intuitively: CPCs need a platform that briefs them on data in an intuitive and self-explanatory way
Gabriella Shull | Duke Biomedical Engineering PhD  @ BX/NC

gabriella.shull@duke.edu

  • Visual attention EEG
  • Can refer us to PhDs in cognitive performance and neuroscience if needed
  • Look into cognitive brain science
    • What affects focus and attention

 

 

  • KEY INSIGHTS

 

  1. Having the prototype stand out visually is key to selling it as a solution.
  2. While there are many competing wants and needs, we should prioritize the most important features and not include everything.
  3. A “golden standard” for recruits needs to be established to provide benchmarks for analysis, e.g., for comparison against the group average.

 

 

  • KEY PROBLEMS

 

  1. Full deployment of the MVP may still involve a hardware component that is nowhere near development-ready.
  2. Rapid deployment is also linked to funding availability. At this point in time this would mean looking for unobligated funds in either SWCS, USASOC or SOCOM.
  3. Network security/device approval process is still a major hurdle.

 

 

  • KEY DECISIONS

 

  1. We need to start narrowing down metrics in order to build a robust data system that can run the statistical analysis that our client wants.
  2. Once we have our metrics figured out, how do we go about linking soldier strengths and weaknesses to possible training interventions?
  3. Who do we really need to get to in order to sell our product?