
Monthly Archives: February 2019

Week 8 Insights

We used our 10 interviews this week to accomplish two main goals: (1) collect feedback on our software prototype, a data management and visualization platform, and (2) validate our hypothesis that a data management platform that lets our beneficiaries visualize the output would help them make data-driven decisions. Betty spent her weekend prototyping and delivered an interactive prototype. The team walked through the prototype with project sponsor Lieutenant Colonel Thomas and interviewed key beneficiaries such as HDP director Jim, research psychologist Oscar, and Constance and Alexandra from the data team. We received both positive and constructive feedback from the interviews. This valuable feedback gives us a much clearer idea of which functions and product features they want and in which direction we should further develop the prototype.

 

 

  • BENEFICIARIES

 

Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologist

Other Beneficiaries

  1. Instructors (Cadre)
  2. Trainees
  3. SWIC Commander

 

 

  • INTERVIEWS & KEY TAKEAWAYS

 

 

Lieutenant Colonel Thomas, Academic Instruction Director | SWEG

phillip.thomas2@socom.mil

  • Wants the product to answer the following questions:
    1. How many people are going through the pipeline in a day?
    2. How satisfied and interested are the trainees?
    3. Which machines are used the most?
  • The bottom line is ROI: they are investing money and want to know how that money is creating results
Oscar Gonzalez, Research Psychologist | SWEG

oscar.gonzalez@usuhs.edu

  • Time series analysis: wants to see changes over time (see the sketch after this list)
  • Comparisons across different groups (age bracket, rank, years of experience, MOS/job description)
  • Wants the product to answer the following questions:
    1. What is the exercise that provides the biggest performance increase?
    2. What is the performance machine/training/process that creates the biggest change?
    3. Who are the people with the biggest and smallest changes?
    4. When do people start showing performance gains over time?
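To make the request concrete, here is a minimal sketch of the kind of change-over-time and group comparison Oscar described. It uses pandas with invented soldier IDs, MOS codes, weeks, and scores; none of the column names or numbers come from HDP data.

```python
import pandas as pd

# Hypothetical assessment records: one row per soldier per assessment week.
df = pd.DataFrame({
    "soldier_id": [1, 1, 2, 2, 3, 3],
    "week":       [1, 8, 1, 8, 1, 8],
    "mos":        ["18B", "18B", "18C", "18C", "18B", "18B"],
    "score":      [62, 74, 58, 61, 70, 88],
})

# Change over time: each soldier's last score minus first score.
change = (df.sort_values("week")
            .groupby("soldier_id")["score"]
            .agg(lambda s: s.iloc[-1] - s.iloc[0])
            .rename("change"))

# Comparison across groups: average change per MOS.
by_group = (change.reset_index()
                  .merge(df[["soldier_id", "mos"]].drop_duplicates(), on="soldier_id")
                  .groupby("mos")["change"].mean())
print(by_group)
```

The same grouping could be swapped to age bracket, rank, or years of experience once those fields exist in the dataset.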
Constance Garcia, Data Manager

constance.garcia.ctr@socom.mil

  • The ability to run statistical analyses
  • Really likes how the prototype shows who has “touched the data”
  • Uses Excel to run basic analyses with a set of metrics most of the time
  • SPSS isn’t on the network, so they must use the software on a standalone computer
Major Mike Williams, Special Operations Command Officer

mike.s.williams1@gmail.com

  • 75th Ranger Regiment: as new soldiers came in, they would place them relative to the baseline

– “The average soldier can do 65 pushups per minute” -> here is a regimen for getting to the average

  • We need to create a normal distribution curve for people who complete the training (see the sketch below)
  • Reach back out to Major Mike Williams when we need help with statistical analysis
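As a rough illustration of that baseline idea, one could fit a normal distribution to the scores of soldiers who completed training and place a new soldier on it. The pushup numbers below are invented for the sketch, not HDP data.

```python
from statistics import NormalDist

# Hypothetical pushups-per-minute scores for soldiers who completed training.
completed = [58, 65, 72, 61, 69, 75, 63, 66, 70, 64]
baseline = NormalDist.from_samples(completed)

new_soldier = 60  # a new soldier's score
percentile = baseline.cdf(new_soldier) * 100
print(f"Baseline: {baseline.mean:.1f} +/- {baseline.stdev:.1f} pushups/min")
print(f"A soldier at {new_soldier} sits around the {percentile:.0f}th percentile")
```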
Dr. Greg Dale, Sport Psychologist & Leadership Director | Duke

gdale@duke.edu

  • Three focus areas for Duke sports: leadership, culture, performance
  • Works with students on: expectations, evaluation, awareness of consequences
  • Teaches students how to be aware of their nervousness and physical changes so they can perform better
Daniel Gajewski, Performance Integrator

Daniel.gajewski@socom.mil

  • Issues still to tackle: data centralization from multiple sources, time stamping, aggregation… (see the sketch below)
  • Finding a way to get data from the field back to the lab is ideal. A hardware solution is still necessary.
  • Synchronizing data to an event during training is more important than having the program run advanced statistical analysis (for Dan at least).
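One plausible way to approach the centralization and time-stamping problem Dan raised is to align each device's readings on a shared timeline and then tag them against a training event. The sketch below uses pandas with invented device names, timestamps, and column names; it is not how HDP currently stores data.

```python
import pandas as pd

# Hypothetical readings from two devices, each with its own timestamps.
hrv = pd.DataFrame({
    "time": pd.to_datetime(["2019-02-20 09:00:01", "2019-02-20 09:00:06"]),
    "hrv_ms": [42, 47],
})
eye = pd.DataFrame({
    "time": pd.to_datetime(["2019-02-20 09:00:02", "2019-02-20 09:00:07"]),
    "fixation_ms": [310, 290],
})

# merge_asof pairs each HRV reading with the most recent eye-tracker reading.
combined = pd.merge_asof(hrv.sort_values("time"), eye.sort_values("time"), on="time")

# Tag readings that fall inside a known training-event window.
start, end = pd.Timestamp("2019-02-20 09:00:00"), pd.Timestamp("2019-02-20 09:05:00")
combined["during_event"] = combined["time"].between(start, end)
print(combined)
```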
Trevor O’Brien, Information Technology

trevor.obrien@socom.mil

  • Would like to see how the graphical user interface was developed, step by step (less useful right now since it is more of a visual representation).
  • Showing what different profiles for different users would look like would be useful.
  • Making sure the user interface is user friendly is key.
  • Trevor is more of a hands-on learner and could help us with development if we bring him in.
  • Provided some data on Spear that may be helpful.
  • Unrelated: Spear is likely going to get nixed, but could be helpful for us to use.
Rick Dietrich, SOCEP Director

Frederick.d.dietrich@socom.mil

  • Really likes how simple and intuitive the software is to use.
  • The Nexus-10 has BioTrace+ software that connects all sorts of devices:
    • EEG
    • ECG
    • Galvanic Skin Conductance
    • EMG
    • Blood Oxygen Level
    • Blood Volume Pulse
    • Extremity Temperature
    • SCP (brain score)
  • Kubios is the software that gathers information from the Nexus
  • Look at Mind Media
Alexandra Hanson, Data Analyst

alexandra.hanson.ctr@socom.mil

  • Concerned about the data storage capabilities of the platform – they had to move away from Excel because it couldn’t store the data
  • Merging, regression, and multiple analysis of variance are important
  • Emphasized that auto-population of data is pivotal to her job
Jim Arp, HDP Director

james.arp@socom.mil

  • Concerned about data storage and scalability

 

 

  • KEY INSIGHTS

 

  1. The main users of the software product are the data team and the research psychologist
  2. Excel is the current data analysis tool. SPSS is only on one standalone Mac that is not connected to the internet. Few complex analyses have been done
  3. Interviewees all like the design of the current prototype but would like to see more personalized pages with differentiated product features for each beneficiary. For example, the home pages for the CPCs and the data team would be different: CPCs mainly want to see results in a better data visualization form to facilitate their training, while the data team wants to run actual analyses
  4. Time series analysis is the most important analysis they want on the prototype, so that they can see changes over time
  5. While being able to run statistical analysis on our product would be a big plus, beneficiaries want us to position it as a data storage/management tool
  6. The bottom line is ROI: they are investing money and want to know how that money is creating results
  7. Being able to answer the following four questions from our prototype would be key:
  • What is the exercise that provides the biggest performance increase?
  • What is the performance machine/training/process that creates the biggest change?
  • Who are the people with the biggest and smallest changes?
  • When do people start showing performance gains over time?

 

 

  • KEY PROBLEMS

 

  1. The biggest problem is how we will get the data from the hardware to use as inputs for the software prototype
  • All current devices have Bluetooth and could transmit the data to computers if connected to the internet
  • But none of the current devices are connected to the internet, and there is little we can do about it
  2. Given the time constraints and skills available to the team, we think the best way to tackle this complex problem and deliver a prototype that solves both the input and the output problems is to leverage an existing commercial solution for the hardware and focus on software prototyping to create a data management and visualization platform. Nevertheless, the premise is that these devices are connected to the internet.
  3. The process of getting both software and hardware approved is very complicated and could take 6 months or even more.
  4. This is still true: the structure and procedure for data collection, analysis, and storage are changing week to week within HDP. Each week, we learn of new developments in many of our beneficiaries’ roles as they relate to data analysis, connection, and storage.
  5. The prototype could proceed in different directions; the feedback we received from interviewees comprises a wide range of product feature requests and contains some conflicting information.
  6. This is still true: we do not have the devices with us, so it is difficult to test our hardware MVP.

 

 

  • KEY DECISIONS

 

  1. We need to validate whether the existing devices can be connected to the internet, since that would change how we approach the problem
  2. The solution has to be a hardware + software product that solves both the input and the output problems
  3. While we are figuring out the hardware part, we should continue refining the software prototype
  4. We will commit to doing more research on other commercial offerings and how compatible they would be with HDP devices
  5. We will continue to commit to identifying companies that are willing to send new devices and data specs so that we can more easily gauge the feasibility of creating a device from scratch

 

Week 7 Insights

We used our interviews this week to (1) validate our assumption that an Arduino or Particle.io-based microcontroller system would pass the security requirements at Fort Bragg, (2) validate that the microcontroller system shown below is scalable and would fulfill the design constraints set by LTC Phillip Thomas and James Arp, and (3) discuss the feasibility of the proposed system, both in terms of the necessary skills and the time we had available to work on an original prototype. To be fully prepared, we also explored commercial solutions in case we are unable to deliver a proper prototype.

BENEFICIARIES

Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologist

Other Beneficiaries

  1. Instructors (Cadre)
  2. Trainees
  3. SWIC Commander

INTERVIEWS & KEY TAKEAWAYS

Vatrina Madre, Information Technology Director | SWEG

vatrina.mardre@socom.mil

  • We need to figure out if our device will be on the Lang or NIPR networks.
  • Requirements for approval: cannot create vulnerabilities, must be compatible with Windows 10, cannot create risk
  • Bluetooth can be approved, although it is hard.
Maj Arth, Director, Commander’s Action Group | Joint Special Operations Command

majarth@gmail.com

  • Military regulations are unclassified and open to the public -> ask the LTC if he can send us the regulations governing connectivity standards
  • Interview someone who pushed for new connectivity rules
  • Military bureaucracy creates more stringent rules: rules accumulate as you go down the chain of command
Rachel Feher, Congressional Research Service

rachfef@hotmail.com

  • Advisory Board – works specifically in healthcare consulting; look at what information or research exists on what healthcare is doing
  • Talk to people at the Duke Hospital rehabilitation department
  • Soldiers at Walter Reed National Military Medical Center meet with many doctors, and there is a central database with which to track each patient.
Dr. Lawrence Appelbaum, Director of the Human Performance Lab | Duke

greg@duke.edu

  • Different radio frequencies mean that we need to time-sync the devices
  • Proprietary information from different sensor makers will make syncing difficult.
  • An Arduino can sync by sending out an orientation pulse from each device, which then gets a timestamp
  • Tobii produces lots of data; HRV produces very little.
Yao Yuan, ECE Student | Duke

yiyao.yuan@duke.edu

  • Recommended Firefly hardware DIY platform
  • Recommended tutorial for learning circuit design
  • Willing to help with hardware prototyping if we needed it
Mitch Heath, CEO | Teamworks

mheath17@gmail.com

  • Conduct more MVP tests and really try to understand your problem sponsor’s prototyping problem
Kyle Janson, ECE/BME Student | Duke

kyle.janson@duke.edu

  • Willing to put us in contact with other ECE people working on data problems
  • Might not need to use a microcontroller. Consider looking at 3rd party companies that offer data integration but not necessarily collection
Mark Palmeri, MD, PhD, ECE Professor | Duke

mark.palmeri@duke.edu

  • Learning the skills to create this component/device might take more time than is allotted to you; it depends on how much time the team is willing to give.
  • Willing to give time to go over possible device companies that are doing data integration
Trevor O’Brien, SWEG IT | SWCS

  • Hardware solution (chip set) could work, but may only be a short term patch with high future sustainment costs.
  • A network/web app solution, ideally with a 3rd party vendor would be ideal.
  • A thin client solution, where the software is hosted on a server and the devices only need storage and RAM could save up to $15 million.
Rich Diviney, Retired Navy SEAL / SEAL Team 6 Instructor

  • SEAL school has a similar issue with attrition, with A/S at 87%.
  • We should know whether attrition is coming from people giving up or from failing out. Special operator recruitment has changed in part due to pop culture influence; this leads to recruits who want to be hot shots and get disillusioned early on (in that case, they are getting the wrong people).
  • A change in focus could change who is being recruited so that they get the right mental profile. Even though this seems like an obvious problem, the school is generally blind to it (an outsider perspective is useful).

KEY INSIGHTS

  1. The data analysts and the data managers would still be the main points of contact for receiving the raw data
  2. Bluetooth is now possible. From our interview with Vatrina Madre, we learned that there are two networks we can operate on: Lang and NIPR. NIPR is more “black and white,” and Lang is more lenient as far as connecting devices.
  3. The new facility will be up and running in 2-5 years.
  4. Creating a prototype from scratch seems infeasible given the time frame and the resources available. One commercial solution in particular, StelLife (introduced to us by Steve McClelland), is a strong candidate due to its high data integration capabilities and scalability.

KEY PROBLEMS

  1. The biggest problem is that, given the time constraints and skills available to the team, creating a prototype from scratch seems less feasible. The team is stretched with regard to managing workflow, given the high number of deliverables and the capabilities of the team.
  2. This is still true: the structure and procedure for data collection, analysis, and storage are changing week to week within HDP. Each week, we learn of new developments in many of our beneficiaries’ roles as they relate to data analysis, connection, and storage.
  3. This is still true: We do not have the devices with us so it is difficult to test the effectiveness of our MVP. We have tried to ask for dummy data, but we faced barriers regarding confidentiality.
    1. We also have not been able to get in contact with vendors to supply a possible solution.

KEY DECISIONS

  1. We have mapped out a new workflow in which half of the team works on interviews while the other half works on prototyping. We will reconsider this workflow at the end of the class if team members feel overworked.
  2. We will commit to identifying companies that are willing to send new devices and data specs so that we more easily gauge the feasibility of creating a device from scratch
  3. We will look more into StelLife and other possible commercial products as a means of providing a solution to our problem sponsor.

 

Week 6 Insights

We used our ten interviews this week to accomplish two goals: (1) gauge reactions to our MVP and (2) understand the security and training environments that constrain our MVP. Additionally, we talked to experts in the field, such as Phil Williams and Dr. Janson, to understand relevant research and tools for building our MVP. Through conversations with experts, we sketched an MVP as a team and tested its desirability during interviews. In this process, we naturally learned of more limitations that we must account for as we continue to iterate. At the end of this week, we have a clearer idea of how to modify our current MVP to address the additional pain points and limitations we learned about during interviews.

 

 

  • BENEFICIARIES

 

Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologist

Other Beneficiaries

  1. Instructors (Cadre)
  2. Trainees
  3. SWIC Commander

 

 

  • INTERVIEWS & KEY TAKEAWAYS

 

 

 

  • KEY INSIGHTS

 

  1. Due to recent procedural changes, the data analyst and data manager would be the main individuals receiving the raw data.
  2. An SD card cannot be directly inserted into the military computers, because an SD card is not an approved device. Instead, the data analyst would insert the SD card into a personal computer and send the dataset to his or her military computer.
  3. The biometric devices will be used within the new training facility under normal temperature and terrain conditions. Therefore, the solution we provide does not have to withstand extremely tough conditions.
  4. The new training facility will accommodate the needs of our MVP if it proves to be useful and relevant.

 

 

  • KEY PROBLEMS

 

  1. Our MVP must work within the security constraints. Many interviewees emphasized that our solution may need to work without Wi-Fi and/or Bluetooth, which limits our options for transferring data from our MVP to the data analysts’ computers.
  2. The structure and procedure for data collection, analysis, and storage are changing week to week within HDP. Each week, we learn of new developments in many of our beneficiaries’ roles as they relate to data analysis, connection, and storage.
  3. We do not have the devices with us so it is difficult to test the effectiveness of our MVP. We have tried to ask for dummy data, but we faced barriers regarding confidentiality.

 

 

  • KEY DECISIONS

 

  1. We need to learn about the specifications of our devices by speaking to the companies that created them. Only by understanding device specifications can we build an MVP that can receive info from all devices.
  2. We must understand what data each device stores and seek to create dummy data to test our MVP.
  3. We will research Zigbee to determine its storage capabilities, non-wireless communication capabilities, and costs.
  4. We need to identify other similar commercial solutions for our problem.

 

Name | Title | Email | Date | Interviewer | Takeaways
Phil Williams CEO of Phil Williams LLC; Advanced Research, RDT&E, Leap Ahead Technology, On the Move Communications phil.williams@LInkToPhil.com 2/9/19 AJ/BX
  1. Compatibility of the Raspberry Pi with the three current hardware devices
  2. Use case scenario of the product (environmental factors)
  3. Phil would be extremely helpful if we actually want to deliver a working product. (Even if, by the end, all we deliver is an MVP, it would be great to make detailed recommendations and plans for LTC Thomas and his team about how to carry this project further.)
LTC Jesse Marsalis Program Manager Special Programs jesse.r.marsalis.mil@mail.mil 2/11/19 AJ; TL
  1. Specific job: develop and acquire capabilities within Special Operations Command
  2. Might be a good person to speak to once we have an MVP
Alexandra Hanson Data Analyst alexandra.hanson.ctr@socom.mil 2/12 BX
  1. Need a system that can incorporate both the biometric data and the cognitive data
  2. Another unit is also using tablets for self-assessment; HDP is working on funding, and the fastest turnaround would be 6 months
  3. Would like us to research and provide a comparison of different tablet offerings, e.g., iPad vs. Samsung
  4. The military likes action items; end the report/presentation with a list of action items
  5. Present the solution as an incremental plan
  • Step 1:
  • Step 2:
  • Step 3: if you have $$$/enough resources, you could…
SFC Jeffery West Cadre, Language School, Non-Commissioned Officer jeffrey.l.west@socom.mil 2/12/19 NC
  1. 92% of trainees reach the +1 level after the 3-week training program
  2. Human feedback is the best; it can’t be beat by a computer. Trainees need to interact with people from that culture.
  3. The staff here are government employees, not contractors.
  4. Military personnel are constantly on rotational assignment (3 years)
Constance Garcia Data Manager constance.garcia.ctr@socom.mil AJ 2/12/19 – They would have to use personal devices: to get the data from the Zigbee, we have to plug the SD card into personal devices and then send the dataset to work laptops

– On a scale of 1–10, how helpful would this be? 10

– Alexandra and Constance will be using the SD cards from the Zigbee and doing analysis from time to time rather than constantly

LTC Phillip Thomas Director of Academic Instruction, SWEG phillip.thomas2@socom.mil All 2/12/19
  1. Identify other commercial solutions
  2. Ask for specs
  3. Bottom line: our proposed solution needs to be flexible enough to accommodate all kinds of data, including cognitive data, and to allow the data team to manually input self-assessment surveys
  4. LTC Thomas wonders where the repository of the data is going to end up

 

Brian Hackett Founder of the Learning Forum bhackett@thelearningforum.org TL 2/12/19 – Has experience working with the Navy on attempted data collection, but the process broke down.

– CPCs and other contractors working with the Navy SEALs were underpaid/undertrained, which led to friction.

– There was a lack of communication between the SEAL school and SWIC, even when they were working on the same thing.

– The nail in the coffin for the SEALs was privacy concerns, which killed any data sharing.

– He has connections to other people who did work with the SEALs (potential leads).

Haig Nalbantian Senior Partner, Mercer

Workforce Sciences Institute

haig.nalbantian@mercer.com TL 2/13/19 – A pioneer of Internal Labor Market (ILM) analysis.

– Worked with the Navy and used ILM to help identify key skills to improve upon.

– Didn’t work with Special Forces, but has contacts that may be useful.

Kyle Janson MEng Biomed kyle.janson@duke.edu NC 2/12/19
James Arp HDP Director james.arp@socom.mil BX/NC 2/13/19
  1. The facility will accommodate our proposed solution
  2. Get rid of the intermediate step of placing a micro-computer in soldiers’ jackets
  3. Talk to industry experts and learn how tech companies solve similar problems
Barbara Plotkin CPT USSOCOM former instructor barbara.j.plotkin@socom.mil NC 2/12/19
  1. A device that allows quick insights would be helpful for the soldiers.

Week 5 Insights

This week our hope was to better understand some of the reasons for attrition in the Special Forces pipeline and to see where we can add value, if possible. We found where a lot of the attrition is coming from: of last year’s class of 1,200 who applied, only around 500 made it all the way through. SWIC’s commanding general wanted a graduation rate of around 800, so he began an initiative called Performance Integrative Training (PIT) as a way for soldiers who don’t pass a module the first time to get help in reaching their learning goals. PIT is based on HDP models for cognitive performance, which demonstrates that there is optimism about HDP’s training methods and that there will be buy-in for a data-driven solution.

 

I. BENEFICIARIES

 

Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologist

Other Beneficiaries

  1. Instructors (Cadre)
  2. Trainees
  3. SWIC Commander
  4. Performance Integrative Training (PIT)

 

II. INTERVIEWS & KEY TAKEAWAYS

 

Name | Title | Contact | Key Takeaways | Interviewer
Major Chuck Schumacher Operations Officer, SWEG charles.schumacher@socom.mil Performance Integrative Training (PIT), SWIC Commander’s initiative to get trainees through failed modules. Based on HDP work.

Class size of 50-60, 50% success rate.

AJ
JC Crenshaw SOFCCC Course Manager john.crenshaw@socom.mil Right now there are 11 instructors.

Only 33% of enlistees and 40% of officers make it through Assessment and Selection.

The highest attrition comes from the physical/mental assessment and the culminating exercise.

AJ
Col. Joe Blanton Program Executive Officer, SOF Support Activity joseph.blanton@duke.edu The acquisition team works with the communications team to determine what devices can be put on the network. This is a necessary step for any new equipment we’d be bringing them. AJ
Captain Oscar Gonzalez HDP Research Psychologist oscar.gonzalez@usuhs.edu Concerns arising over scope. Are we still trying to solve the problem that was posed?

Is the data we are collecting useful for providing an MVP?

AJ,BX,TL
Ian Ankney Lead CPC
aspen.ankney.ctr@socom.mil
CPCs are not involved in assessment and selection. The new facility may be up to six years away.

Cadre operate fairly idiosyncratically. Since they don’t all measure the same things, this leads to confusion: soldiers are often told they are being measured one way when something else is actually being measured.

TL,BX
Michael Jelen H4D Course Advisor michaeljelen@gmail.com Rather than a central server, utilizing edge computing may be a better and less costly solution. This would ideally involve getting the biometric devices to sync to something the soldiers could wear. TL
Major Amar Mohamadou SWEG Executive Officer mohamadou.amar@socom.mil Each SWIC dropout costs the Army $30,000-$35,000.

The recruit class was 1,200 this year.

PIT saved $1 million this year.

AJ
Lieutenant Adam M. Beauregard Lieutenant, Navy adambeauregard@gmail.com We should think about what kind of data to collect and determine what format the solution should be based on.

Focus on really understanding quantitative metrics for Green Beret training

What insights do we want the instructors to be able to easily obtain from the data?

Focus on wearable tech.

BX,NC
Major Ben Spain Major, Air Force benjamin.spain@gmai There is a price threshold: below that price point, a commander can make purchasing decisions for software; above it, they need to go through the bidding process.

There is an entire unit working on the bidding process (it is highly complex).

BX
Lieutenant Colonel Brendan Ormond SWEG Deputy Commander, Language Group brendan.ormond@socom.mil How do we deal with measuring intangibles like leadership?

Problems with a shrinking recruiting class, both within the Army for SWIC and in the general US population.

Any solution is cost-prohibitive, not only in procurement but also in time spent implementing and maintaining any data system.

TL

 

III. KEY INSIGHTS

  1. Attrition is concentrated around the physical/psychological assessment, small unit tactics, and the culminating exercise. The cost to the Army for relocating a trainee is around $30k.
  2. Without a way to measure some of the intangible elements in selection, like leadership, it will be difficult to get buy-in for a data-driven solution.
  3. The idea of a central server for data is becoming less feasible. A better solution might involve wearable tech that syncs to the biometric devices via Bluetooth.

 

IV. KEY DECISIONS

  1. Now that we have a picture of where the data collection process is breaking down, as well as a general picture of where soldiers are experiencing difficulty in the pipeline, we need to begin developing a prototype. To get more insight, we need to find models of effective data management that we think, given SWEG’s constraints, can be applied to our problem.

Week 3/4 Insights

In the past two weeks, we interviewed 20 beneficiaries, including a mix of primary and secondary data users as well as higher-ups (generals and commanders). We also had the precious opportunity to meet with General Dempsey, the 18th Chairman of the Joint Chiefs of Staff and the 37th Chief of Staff of the Army, and General McChrystal, who led Joint Special Operations Command in the mid-2000s. From these two prestigious military leaders, we gained a broader view of how the military adopts technology. Moreover, we visited Fort Bragg on 1/29, where we gained hands-on experience with the tracking devices and data analytics tools currently in place.

 

  • BENEFICIARIES

 

We organized our beneficiaries into the three main groups listed below:

Primary Data Users

  1. Cognitive Performance Coaches
  2. Data Analysts
  3. Research Psychologists

Secondary Data Users

  1. Physical Therapists
  2. Spiritual Advisors
  3. Cognitive Performance Coaches
  4. Interpersonal Coaches
  5. Strength Trainers
  6. Dietitians
  7. Other Sports Medicine Specialists

Higher Ups (General and Support Staff at SWCS)

  1. General Santiago
  2. Support Staff/ Board that drives downstream policy changes.

 

  • INTERVIEWS & KEY TAKEAWAYS

 

Name | Title | Email | Takeaways | Interviewer
Hector Aguayo SWEG Command Sergeant Major aguayoh@socom.mil Lack of bandwidth among personnel to collect data
He looks for outliers or errors (and compares them with the actual situation)
This means he needs to know each soldier quite well; how would he be able to do that?
He goes through 6 months of CCC data in 2-3 hours (110 surveys)
NC
Greg Santiago Operations Specialist, SWEG gregorio.santiago@socom.mil Logistics & operations planning does not involve a lot of data related to our project

The same biometric data may mean different things, as each individual is unique. Assessing the data can be much more difficult than collecting it.

BX
MAJ Chuck Schumacher, Operations Officer, SWEG charles.schumacher@socom.mil Is SWIC measuring the right stuff? Why do many students who pass Assessment and Selection drop out later on?

Brass want more graduates; cadre want better quality graduates.

The new generation of soldiers is different to train, which causes friction.

Bob Jones Communications Language School Director HDP operates here to evaluate the trainees’ effectiveness at negotiation NC+BX
James Arp HDP Director james.arp@socom.mil Funding comes from:

USSOCOM, USASOC, Army

We have to justify the value of the training programs to the 2-, 3-, and 4-star general commands

NC
General McChrystal (x2), led Joint Special Operations Command in the mid-2000s: Must change the culture as well as the technology within the military. All
Mike Taylor HHC 1st Sergeant michael.s.taylo More robust data collection/analysis would help determine which training interventions work and which do not.

Cooperation/information sharing is generally facilitated by relationships; if those aren’t there, different units generally won’t communicate.

AJ
General Dempsey, the 18th Chairman of the Joint Chiefs of Staff and the 37th Chief of Staff of the Army: Need to focus on what problem we can solve

Technology is changing the landscape of the federal government and DoD

Focus on the cultural aspect of getting the military to adopt technology

All
COL Bill Rice SWEG Commander william.rice@socom.mil Higher-ups at SWEG don’t have much contact with the committees.

SWEG doesn’t have a committee that makes data-driven decisions.

Peer feedback, instructor feedback, and some data inform human performance results.

A 30-day innovation turnaround for new devices would be ideal. He would like to be able to test new hardware/software without jumping through regulation/permission hoops.

Use cases:

An immediate feedback mechanism is the top use-case priority from Commander Rice’s point of view. Have experts as a reference point for novices.

Use big data/a database to make key decisions such as the selection process and recruiting. Track each trainee’s performance/data throughout the training process.

BX

Stephen M. Mannino, EdD Human Performance Program Coordinator stephen.m.mannind@socom.mil Showed the database that strength and conditioning coaches use to advise

Strength and conditioning coaches have to download the database to share it with other data users (physical therapists, CPCs, data analysts, research psychologists)

All
Justin Jones Strength and Conditioning Coach justin.jones@socom.mil Names are declassified – only coaches have access to the names

The database’s real-time feedback has been helpful in readjusting physical training

All
Taylor McKinney Physical therapist Accessing the medical database always leads to problems

No protocol for how physical therapists make data-informed decisions

All
Sara Butler Physical Therapist Would really like a database that they can get to without it taking up so much time

Does not interact much with CPCs, data analysts, and research psychologists

All
Alexandra Hanson Data Analyst alexandra.hanson.ctr@socom.mil Getting buy-in from higher ups is necessary to help increase funding and implementation

Acquisition for different accounts at SWEG is siloed, which means that different solutions will not arrive at the same time.

Constance Garcia Data Manager constance.garcia.ctr@socom.mil Government shutdowns affect contractors.
Daniel Gajewski Performance Integrator Daniel.gajewski@socom.mil The Tobii eye-tracking device just arrived and they are still working on calibrating it

The current output of that eye-tracking device is just a video

Has another software platform for further analysis, but they haven’t started yet

All
Dawne Edmonds Process Improvement and Project Management

US Army Special Operations Command

Dawne.edmonds@socom.mil There’s no authoritative data source, meaning there is not a single data source that the analysts rely on and trust.

“CONSTANTLY. I CAN’T EXPLAIN TO YOU HOW OFTEN IT HAPPENS”

They tend to type all the data in all over again. This is a constant problem. The Army is multi-service.

NC
Oscar Gonzalez Research Psychologist oscar.gonzalez@usuhs.edu Is in the process of holding CPCs more accountable for inputting and collecting data

Working on concerns from data analysts about the large amount of time spent on data input

All

 

 

  • KEY INSIGHTS

 

  • Data utilization is actually quite low. Decision making at the committee level is mostly based on intuition and past experience
    • Much of the data (both biometric and cognitive) is not collected due to a lack of tracking devices
    • Even the data that is collected is mostly not utilized
  • Most data are self-reported in the form of surveys or self-assessments, which introduces bias
  • Most databases are not shared internally; many secondary data users work in silos and do not have access to each other’s databases

 

  • KEY PROBLEMS

 

  1. Secondary users have three or four different data systems for one function
  2. Secondary users have to get data manually
  3. There is no central database for the data from biometric devices
  4. Secondary data users work in silos
  • Hard for secondary data users to collaborate with each other
  • Hard for secondary data users to collaborate with primary data users
  5. Lack of a systematic workflow among primary data users
  • Some CPCs collect and scrub data, some don’t
  • CPCs do not have a standard process or expectations for data-related work
  6. Errors in data collection (e.g., the bear incident)
  7. No plan for biometric devices and what outcomes they should have
  8. Training feedback
  • Soldiers do not receive real-time feedback
  • Soldiers do not receive specific feedback on training performance
  9. Who gets access to which data
  10. Regular maintenance of the system & platform and periodic updates

 

  • KEY DECISIONS

 

  1. We need to define the scope of the problem and narrow down the problems
  2. Define the key pain point and focus on the corresponding problem
  3. Re-identify key beneficiaries based on the problem
  4. Identify use cases for the re-identified key beneficiaries
  5. Prioritize use cases
  6. Start brainstorming potential solutions for each use case