Week 11 Insights
Our team has edited the CPC interface based on feedback from last week, and we created the CG interface. This week, we received our first round of feedback on the CG interface. Both the CPC and CG interfaces are aesthetically appealing, but we need to home in on which visuals are needed and which are not. For the CG interface, we need to identify whether CGs want to compare their cohorts’ performance with other CGs’ cohorts’ performance, and what must-have information they need to see quickly. For the CPC interface, we now allow soldier performance to be viewed by class and cohort, which has been well received.
- BENEFICIARIES
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologist
Other Beneficiaries
- Instructors (Cadre)
- SWCS Commander
- INTERVIEWS & KEY TAKEAWAYS
Phil Williams | CEO of Phil Williams LLC
Bruce MacDowell Maggs | Professor of Computer Science, Duke | bmm@cs.duke.edu
Michael Jelen | Berkeley Research Group | michaeljelen@gmail.com
Adam Beauregard | United States Navy
Mitch Heath | CEO at Teamworks | mheath17@gmail.com
Joe Blanton | Colonel, United States Army | joeblanton12@gmail.com
Anubhav Mehrotra | VP of Product Management at Live Nation
– Factors used to test level of engagement and emotion change:
Lieutenant Colonel Thomas | Academic Instruction Director, SWEG | phillip.thomas2@socom.mil
- KEY INSIGHTS
- Classes, rather than cohorts, are a better way to categorize soldier performance, because soldiers drop out of cohorts.
- CGs may be interested in having more options, such as filtering by certain soldiers/cohorts or comparing their performance to other CGs’.
- We still need to drill down on which specific graphs are useful for CPCs, and start removing, or offering the option to filter out, the less useful ones.
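To illustrate the class-versus-cohort distinction discussed above, here is a minimal pandas sketch using synthetic scores and hypothetical column names (not the actual SWEG data schema):

```python
import pandas as pd

# Synthetic soldier performance records (hypothetical schema).
records = pd.DataFrame({
    "soldier": ["A", "B", "C", "D", "E", "F"],
    "cohort":  [1, 1, 1, 2, 2, 2],
    "class":   ["SUT", "SUT", "LANG", "SUT", "LANG", "LANG"],
    "score":   [82, 74, 90, 68, 88, 79],
})

# Grouping by class keeps a soldier's scores comparable even if
# they drop out of (or recycle through) a cohort.
by_class = records.groupby("class")["score"].mean()

# Cohort view, for CGs who want to compare cohorts side by side.
by_cohort = records.groupby("cohort")["score"].mean()

print(by_class)
print(by_cohort)
```

Either view is one `groupby` away from the same table, so supporting both in the interface costs little.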
- KEY PROBLEMS
- We need to talk to cadre, but there has been pushback on why speaking to cadre is relevant to our problem.
- Instead of guessing what the CG wants, we need to interview a CG to truly tailor the platform to his or her needs.
- We would like to better understand whether the CG wants more data on the selection or the training side of the high attrition rate.
- KEY DECISIONS
- We have scheduled a meeting with Commander Rice, which will help us understand 1) the type of data, 2) the amount of data, and 3) the types of insights that would be most useful for higher-ups.
- We do not want students to have access to our platform; interviews indicate that CPCs can work individually with students and show them the platform during meetings.
- If we cannot interview cadre within the next week, we need to remove them as a beneficiary. We will continue reaching out to the contact our problem sponsor has provided to arrange interviews with cadre.
Week 10 Insights
Our team has made further changes to our prototype based on feedback from last week. This week we received feedback on what should be included when considering holistic performance. The visuals are well received, but our client wants to know more about how the data is put together, where our “golden ratio” numbers come from, and so on. Our CPC interface is looking pretty good, but we need to work out what the research psychologist and the CG should be able to see regarding performance data.
- BENEFICIARIES
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologist
Other Beneficiaries
- Instructors (Cadre)
- SWCS Commander
- INTERVIEWS & KEY TAKEAWAYS
Trevor O’Brien | Information Technology | trevor.obrien@socom.mil
Shawn Zeplin | Director of Behavioral Health at Duke University Athletics
Phil Williams | CEO of Phil Williams LLC
Oscar Gonzalez | Research Psychologist | oscar.gonzalez@usuhs.edu
Constance Garcia | Data Manager | constance.garcia.ctr@socom.mil
Alexandra Hanson | Research Analyst
Aspen Ankney | CPC, USSOCOM USASOC SWEG | aspen.ankney.ctr@socom.mil
Daniel Gajewski | Performance Integrator | Daniel.gajewski@socom.mil
Mark Manturo | Research Psychologist | mark.monturo@gmail.com
Seth Spradley | Data Analyst | seth.c.spradley.ctr@socom.mil
Maj Arth | Joint Special Operations Command | mjarth@gmail.com
Dillon Buckner | Military Intelligence Observer, United States Army | dbuck22@gmail.com
- KEY INSIGHTS
- The mental health component is a large part of performance, and soldiers tend to be skeptical of mental health professionals (a cultural/socialization issue).
- While tracking instructor-specific graduation rates would yield useful data, it would also create a moral hazard: competition to graduate the most soldiers.
- Cross-service comparisons of special forces performance metrics would potentially be useful, if we can get the data.
- KEY PROBLEMS
- We need to make sure that different users have what they need in their profiles on our software platform.
- When looking at ideal performance (the “golden ratio”), we need to be more specific about what that means for different training exercises/metrics.
- Self-reported surveys change from year to year. If that is the case, how do we accurately compare survey data over time?
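One standard workaround when survey instruments change year to year (shown here purely as an illustration, not as the team’s chosen method) is to standardize scores within each year, so comparisons over time are made in relative terms rather than raw units:

```python
import pandas as pd

# Hypothetical self-report scores from two survey versions
# that happen to use different scales.
df = pd.DataFrame({
    "year":  [2018] * 3 + [2019] * 3,
    "score": [3.0, 4.0, 5.0, 60.0, 70.0, 80.0],
})

# Z-score within each year: removes scale differences while
# preserving each soldier's standing relative to that year's group.
df["z"] = df.groupby("year")["score"].transform(
    lambda s: (s - s.mean()) / s.std(ddof=0)
)
print(df)
```

A soldier one standard deviation above the 2018 mean is then directly comparable to one a standard deviation above the 2019 mean, even though the raw scales differ.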
- KEY DECISIONS
- We need to think about what the SWCS CG wants to see on his interface. Ideally it should be simple, yet robust enough for him to see trends and make policy decisions.
- We need to add more granularity to the analysis. Even though there are 7 main exercises, there are multiple sub-exercises, and many are likely multi-day measurements as well, meaning we will need to integrate more files.
- Do we want students to have login access? We have received conflicting feedback on this and need to make a decision soon.
Week 9 Insights
This week we used our interviews to get further feedback on our prototype. Based on last week’s feedback, we added more interfaces and visualization tools, a task spearheaded by Bettie. The response this week has been positive. We discovered that when dealing with analytics, it is important to have benchmarks for comparison. These “golden references” are the soldiers who are the highest performers. If we can break these soldiers into groups based on their characteristics, we may be able to help our client figure out what is separating the highest performers from the rest of the pack.
- BENEFICIARIES
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologist
Other Beneficiaries
- Instructors (Cadre)
- Trainees
- SWCS Commander
- INTERVIEWS & KEY TAKEAWAYS
Lieutenant Colonel Thomas | Academic Instruction Director, SWEG | phillip.thomas2@socom.mil
Mitch Heath | CEO at Teamworks
Phil Williams | CEO of Phil Williams LLC
Morgan Hall | SOCEP CPC | morgan.hall.ctr@socom.mil
Constance Garcia | Data Manager | constance.garcia.ctr@socom.mil
Travis Nicks | Former Navy submarine nuclear operator for 10 years | travis.nicks@duke.edu
Barbara Plotkin | CPT, USSOCOM, former instructor | barbara.j.plotkin@socom.mil
Seth Spradley | Data Analyst | seth.c.spradley.ctr@socom.mil
Alexandra Hanson | Research Analyst
Gabriella Shull | Duke Biomedical Engineering PhD | gabriella.shull@duke.edu | BX/NC
- KEY INSIGHTS
- Having the prototype stand out visually is key to selling it as a solution.
- There are many competing wants and needs; we should prioritize the most important features rather than include everything.
- A “Golden Standard” for recruits needs to be established to provide benchmarks for analysis that can be compared against measures such as the group average.
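As a rough illustration of the benchmark idea (synthetic numbers; the real definition of the “Golden Standard” is still open), one could define the benchmark as the mean score of the top performers and report the gap to the group average:

```python
import statistics

# Synthetic exercise scores for a training group.
scores = [55, 60, 62, 68, 70, 73, 75, 80, 88, 95]

# "Golden reference": here assumed to be the top 10% of performers.
scores_sorted = sorted(scores, reverse=True)
top_n = max(1, len(scores) // 10)
golden_benchmark = statistics.mean(scores_sorted[:top_n])

group_average = statistics.mean(scores)

# The gap tells a CPC how far the group sits from the benchmark.
gap = golden_benchmark - group_average
print(golden_benchmark, group_average, gap)
```

The same gap could be computed per exercise or per characteristic group to see what separates top performers from the pack.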
- KEY PROBLEMS
- Full deployment of the MVP may still involve a hardware component that is nowhere near development-ready.
- Rapid deployment is also linked to funding availability. At this point, that would mean looking for unobligated funds in SWCS, USASOC, or SOCOM.
- Network security/device approval process is still a major hurdle.
- KEY DECISIONS
- We need to start narrowing down metrics in order to build a robust data system that can run the statistical analyses our client wants.
- Once we have our metrics figured out, how do we go about linking soldier strengths and weaknesses to possible training interventions?
- Who do we really need to reach in order to sell our product?
Week 8 Insights
We utilized our 10 interviews this week to accomplish two main goals: (1) collect feedback on our software prototype, a data management and visualization platform, and (2) validate our hypothesis that a data management platform that lets our beneficiaries visualize the output would help them make data-driven decisions. Bettie spent her weekend prototyping and delivered an interactive prototype. The team walked through the prototype with project sponsor Lieutenant Colonel Thomas and interviewed key beneficiaries such as HDP director Jim, research psychologist Oscar, and Constance and Alexandra from the data team. We received both positive and constructive feedback from the interviews. This valuable feedback gives us a much clearer idea of what functions and product features they want and the direction in which we should further develop the prototype.
- BENEFICIARIES
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologist
Other Beneficiaries
- Instructors (Cadre)
- Trainees
- SWIC Commander
- INTERVIEWS & KEY TAKEAWAYS
Lieutenant Colonel Thomas | Academic Instruction Director, SWEG | phillip.thomas2@socom.mil
Oscar Gonzalez | Research Psychologist, SWEG | oscar.gonzalez@usuhs.edu
Constance Garcia | Data Manager | constance.garcia.ctr@socom.mil
Major Mike Williams | Special Operations Command Officer | mike.s.williams1@gmail.com
– “The average soldier can do 65 pushups per minute” → here’s a regimen for how you can get to the average pushup count
Dr. Greg Dale | Sport Psychologist & Leadership Director, Duke | gdale@duke.edu
Daniel Gajewski | Performance Integrator | Daniel.gajewski@socom.mil
Trevor O’Brien | Information Technology | trevor.obrien@socom.mil
Rick Dietrich | SOCEP Director | Frederick.d.dietrich@socom.mil
Alexandra Hanson | Data Analyst | alexandra.hanson.ctr@socom.mil
Jim Arp | HDP Director | james.arp@socom.mil
- KEY INSIGHTS
- The main users of the software product are the data team and the research psychologist.
- Excel is the current data analysis tool. SPSS is only on one standalone Mac that is not connected to the internet, so little complex analysis has been done.
- Interviewees all like the design of the current prototype but would like to see more personalized pages with differentiated features for each beneficiary. For example, the home pages for the CPCs and the data team would differ: CPCs mainly want to see results in a better visualization form to facilitate their training, while the data team wants to run actual analyses.
- Time-series analysis is the most important analysis they want in the prototype, so that they can see changes over time.
- While being able to run statistical analysis in our product would be a big plus, beneficiaries want us to position it as a data storage/management tool.
- The bottom line is ROI: they are investing money and want to know how that money is creating results.
- Being able to answer the following four questions from our prototype would be key:
- What is the exercise that provides the biggest performance increase?
- What is the performance machine/training/process that creates the biggest change?
- Who are the people with the biggest and smallest change?
- When do people start showing performance gains over time?
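The four questions above are essentially change-over-time queries. A minimal pandas sketch of the first and third, using synthetic pre/post scores and hypothetical column names (not the actual HDP data):

```python
import pandas as pd

# Synthetic pre/post scores per soldier per exercise.
df = pd.DataFrame({
    "soldier":  ["A", "A", "B", "B"],
    "exercise": ["ruck", "shoot", "ruck", "shoot"],
    "pre":      [60, 70, 55, 80],
    "post":     [75, 72, 70, 81],
})
df["delta"] = df["post"] - df["pre"]

# Q1: which exercise shows the biggest average improvement?
best_exercise = df.groupby("exercise")["delta"].mean().idxmax()

# Q3: who changed the most and the least overall?
per_soldier = df.groupby("soldier")["delta"].sum()
most, least = per_soldier.idxmax(), per_soldier.idxmin()
print(best_exercise, most, least)
```

With timestamped measurements instead of a single pre/post pair, the same grouping logic extends naturally to the time-series questions (Q2 and Q4).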
- KEY PROBLEMS
- The biggest problem is how we will get the data from the hardware that serves as input for the software prototype.
- All current devices have Bluetooth and could transmit data to the computers if connected to the internet.
- But none of the current devices are connected to the internet, and there is little we can do about it.
- Given the time constraints and the skills available to the team, we think the best way to tackle this complex problem, and to deliver a prototype that solves both the input and the output problem, is to leverage an existing commercial solution for the hardware and focus our prototyping on the software: a data management and visualization platform. The premise, however, is that these devices are connected to the internet.
- The process of getting both software and hardware approved is very complicated and could take six months or more.
- This is still true: the structure and procedure for data collection, analysis, and storage changes week to week within HDP. Each week, we learn of new developments in many of our beneficiaries’ roles as they relate to data analysis, connection, and storage.
- The prototype could proceed in different directions; the feedback we received from interviewees spans a wide range of product feature requests and contains some conflicting information.
- This is still true: we do not have the devices with us, so it is difficult to test our hardware MVP.
- KEY DECISIONS
- We need to validate whether the existing devices can be connected to the internet, since that would change how we approach the problem.
- The solution has to be a hardware + software product that solves both the input and the output problem.
- While we figure out the hardware part, we should continue refining the software prototype.
- We will commit to more research on other commercial offerings and how compatible they would be with HDP devices.
- We will continue to identify companies that are willing to send new devices and data specs so that we can more easily gauge the feasibility of creating a device from scratch.
Week 7 Insights
We optimized our interviews to (1) validate our assumption that an Arduino or Particle.io-based microcontroller system would pass the security requirements at Fort Bragg, (2) confirm that the microcontroller system shown below is scalable and would satisfy the design constraints set by LTC Phillip Thomas and James Arp, and (3) discuss the feasibility of the proposed system, both in terms of the necessary skills and the time we have available to work on an original prototype. To be fully prepared, we also explored commercial solutions in case we cannot deliver a proper prototype of our own.
BENEFICIARIES
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologist
Other Beneficiaries
- Instructors (Cadre)
- Trainees
- SWIC Commander
INTERVIEWS & KEY TAKEAWAYS
Vatrina Madre | Information Technology Director, SWEG | vatrina.mardre@socom.mil
Maj Arth | Director, Commander’s Action Group, Joint Special Operations Command | majarth@gmail.com
Rachel Feher | Congressional Research Service | rachfef@hotmail.com
Dr. Lawrence Appelbaum | Director of Human Performance Lab, Duke | greg@duke.edu
Yao Yuan | ECE Student, Duke | yiyao.yuan@duke.edu
Mitch Heath | CEO, Teamworks | mheath17@gmail.com
Kyle Janson | ECE/BME Student, Duke | kyle.janson@duke.edu
Mark Palmeri | MD, PhD, and ECE Professor, Duke | mark.palmeri@duke.edu
Trevor O’Brien | SWEG IT, SWCS
Rich Diviney | Retired Navy SEAL / SEAL Team 6 Instructor
KEY INSIGHTS
- The data analysts and data managers would still be the main points of contact for receiving the raw data.
- Bluetooth is now possible. From our interview with Vatrina Madre, we learned that there are two networks we can operate on: Lang and NIPR. NIPR is more “black and white,” and Lang is more lenient as far as which devices can connect.
- The new facility will be up and running in 2-5 years.
- Creating a prototype from scratch seems infeasible given the time frame and the resources available. One commercial solution in particular, StelLife (introduced to us by Steve McClelland), is a strong candidate due to its high data integration capability and scalability.
KEY PROBLEMS
- The biggest problem is that, given the time constraints and skills available to the team, creating a prototype from scratch seems less feasible. The team is stretched managing its workflow, given the high number of deliverables relative to the team’s capacity.
- This is still true: the structure and procedure for data collection, analysis, and storage changes week to week within HDP. Each week, we learn of new developments in many of our beneficiaries’ roles as they relate to data analysis, connection, and storage.
- This is still true: We do not have the devices with us so it is difficult to test the effectiveness of our MVP. We have tried to ask for dummy data, but we faced barriers regarding confidentiality.
- We also have not been able to get in contact with vendors to supply a possible solution.
KEY DECISIONS
- We have mapped out a new workflow in which half of the team works on interviews while the other half works on prototyping. We will reconsider this workflow at the end of the class in case team members feel overworked.
- We will commit to identifying companies that are willing to send new devices and data specs so that we can more easily gauge the feasibility of creating a device from scratch.
- We will look more into StelLife and other possible commercial products as a means of providing a solution to our problem sponsor.
Week 6 Insights
We utilized our ten interviews this week to accomplish two goals: 1) gauge reactions to our MVP, and 2) understand the security and training-environment constraints that limit our MVP. Additionally, we talked to experts in the field, like Phil Williams and Dr. Janson, to understand relevant research and tools for building our MVP. Informed by these conversations, we sketched an MVP as a team and tested its desirability during interviews. In the process, we naturally learned of more limitations that we must account for as we continue to iterate. At the end of this week, we have a clearer idea of how to modify our current MVP to address the additional pain points and limitations we learned about during interviews.
- BENEFICIARIES
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologist
Other Beneficiaries
- Instructors (Cadre)
- Trainees
- SWIC Commander
- INTERVIEWS & KEY TAKEAWAYS
- KEY INSIGHTS
- Due to recent procedural changes, the data analyst and data manager would be the main individuals receiving the raw data.
- An SD card cannot be inserted directly into the military computers, because an SD card is not an approved device. Instead, the data analyst would insert the SD card into a personal computer and send the dataset to his or her military computer.
- The biometric devices will be used within the new training facility under normal temperature and terrain conditions, so the solution we provide does not have to withstand extremely tough conditions.
- The new training facility will accommodate the needs of our MVP, if it proves to be useful and relevant.
- KEY PROBLEMS
- Our MVP must work within the security constraints. Many interviewees emphasized that our solution may need to work without Wi-Fi and/or Bluetooth, which limits the options for transferring data from our MVP to the data analysts’ computers.
- The structure and procedure for data collection, analysis, and storage changes week to week within HDP. Each week, we learn of new developments in many of our beneficiaries’ roles as they relate to data analysis, connection, and storage.
- We do not have the devices with us so it is difficult to test the effectiveness of our MVP. We have tried to ask for dummy data, but we faced barriers regarding confidentiality.
- KEY DECISIONS
- We need to learn the specifications of our devices by speaking to the companies that created them. Only by understanding device specifications can we build an MVP that can receive data from all devices.
- We must understand what data each device stores and seek to create dummy data to test our MVP.
- We will research Zigbee to determine its storage capabilities, non-wireless communication capabilities, and costs.
- We need to identify other, similar commercial solutions to our problem.
Name | Title | Contact | Date | Interviewer | Takeaways
- Phil Williams | CEO of Phil Williams LLC; Advanced Research, RDT&E, Leap Ahead Technology, On the Move Communications | phil.williams@LInkToPhil.com | 2/9/19 | AJ/BX
- LTC Jesse Marsalis | Program Manager, Special Programs | jesse.r.marsalis.mil@mail.mil | 2/11/19 | AJ, TL
- Alexandra Hanson | Data Analyst | alexandra.hanson.ctr@socom.mil | 2/12/19 | BX
- SFC Jeffery West | Cadre, Language School; Non-Commissioned Officer | jeffrey.l.west@socom.mil | 2/12/19 | NC
- Constance Garcia | Data Manager | constance.garcia.ctr@socom.mil | 2/12/19 | AJ
  – They would have to use personal devices: to get the data from the Zigbee, we have to plug the SD card into a personal device and then send the dataset to work laptops.
  – On a scale of 1-10, how helpful would this be? 10.
  – Alexandra and Constance will be using the SD cards from the Zigbee and doing analysis from time to time rather than constantly.
- LTC Phillip Thomas | Director of Academic Instruction, SWEG | phillip.thomas2@socom.mil | 2/12/19 | All
- Brian Hackett | Founder of the Learning Forum | bhackett@thelearningforum.org | 2/12/19 | TL
  – Has experience working with the Navy attempting data collection, but the process broke down.
  – CPCs and other contractors working with the Navy SEALs were underpaid/undertrained, which led to friction.
  – Lack of communication between the SEAL school and SWIC, even when they were working on the same thing.
  – The nail in the coffin for the SEALs was privacy concerns, which killed any data sharing.
  – He has connections to other people who worked with the SEALs (potential leads).
- Haig Nalbantian | Senior Partner, Mercer Workforce Sciences Institute | haig.nalbantian@mercer.com | 2/13/19 | TL
  – Pioneer of Internal Labor Market (ILM) analysis.
  – Worked with the Navy and used ILM to help identify key skills to improve.
  – Didn’t work with special forces, but has contacts that may be useful.
- Kyle Janson | MEng Biomed | kyle.janson@duke.edu | 2/12/19 | NC
- James Arp | HDP Director | james.arp@socom.mil | 2/13/19 | BX/NC
- Barbara Plotkin | CPT, USSOCOM, former instructor | barbara.j.plotkin@socom.mil | 2/12/19 | NC
Week 5 Insights
This week our hope was to better understand some of the reasons for attrition through the special forces pipeline and to see where we can add value, if possible. We found where much of the attrition is coming from: of last year’s class of 1,200 applicants, only around 500 made it all the way through. SWIC’s CG wanted a graduation rate of around 800, so he began an initiative called Performance Integrative Training (PIT) as a way for soldiers who don’t pass a module the first time to get help achieving the learning goals. PIT is based on HDP models for cognitive performance. This demonstrates that there is optimism about HDP’s training methods and that there will be buy-in for a data-driven solution.
- BENEFICIARIES
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologist
Other Beneficiaries
- Instructors (Cadre)
- Trainees
- SWIC Commander
- Performance Integrated Training (PIT)
- INTERVIEWS & KEY TAKEAWAYS
Name | Title | Contact | Key Takeaways | Interviewer
- Major Chuck Schumacher | Operations Officer, SWEG | charles.schumacher@socom.mil | AJ
  – Performance Integrative Training (PIT) is the SWIC Commander’s initiative to get trainees through failed modules; it is based on HDP work.
  – Class size of 50-60, 50% success rate.
- JC Crenshaw | SOFCCC Course Manager | john.crenshaw@socom.mil | AJ
  – Right now there are 11 instructors.
  – Only 33% of enlistees and 40% of officers make it through assessment and selection. The highest attrition comes from the physical/mental assessment and the culminating exercise.
- Col. Joe Blanton | Program Executive Officer, SOF Support Activity | joseph.blanton@duke.edu | AJ
  – The acquisition team works with the communication team to determine what devices can be put on the network, a necessary step for any new equipment we’d be bringing them.
- Captain Oscar Gonzalez | HDP Research Psychologist | oscar.gonzalez@usuhs.edu | AJ, BX, TL
  – Concerns arising over scope: are we still trying to solve the problem that was posed? Is the data we are collecting useful for delivering an MVP?
- Ian Ankney | Lead CPC | aspen.ankney.ctr@socom.mil | TL, BX
  – CPCs are not involved in assessment and selection. The new facility may be up to six years away.
  – Cadre operate fairly idiosyncratically. Since they don’t all measure the same things, this leads to confusion: soldiers are often told they are being measured one way when something else is actually being measured.
- Michael Jelen | H4D Course Advisor | michaeljelen@gmail.com | TL
  – Rather than a central server, edge computing may be a better and less costly solution. This would ideally involve syncing the biometric devices to something the soldiers could wear.
- Major Amar Mohamadou | SWEG Executive Officer | mohamadou.amar@socom.mil | AJ
  – Each SWIC dropout costs the Army $30,000-$35,000.
  – The recruit class was 1,200 this year. PIT saved $1 million this year.
- Lieutenant Adam M. Beauregard | Lieutenant, Navy | adambeauregard@gmail.com | BX, NC
  – We should think about what kind of data to collect and determine what format the solution should take. Focus on really understanding quantitative metrics for Green Beret training. What insights do we want the instructors to be able to easily obtain from the data? Focus on wearable tech.
- Major Ben Spain | Major, Air Force | benjamin.spain@gmai | BX
  – There is a price threshold: below that price point, a commander can make purchasing decisions for software; above it, purchases must go through the (highly complex) bidding process, which an entire unit works on.
- Lieutenant Colonel Ormond Brendan | SWEG Deputy Commander, Language Group | brendan.ormond@socom.mil | TL
  – How do we deal with measuring intangibles like leadership?
  – There are problems with the shrinking recruiting class, both within the Army for SWIC and in the general US population.
  – Any solution is cost-prohibitive, not only in procurement but also in the time spent implementing and maintaining any data system.
- KEY INSIGHTS
- Attrition is concentrated around the physical/psychological assessment, small unit tactics, and the culminating exercise. The cost to the Army to relocate trainees is around $30k.
- Without a way to measure some of the intangible elements in selection, like leadership, it will be difficult to get buy-in for a data-driven solution.
- The idea of a central server for data is becoming less feasible. A better solution might involve wearable tech that syncs to the biometric devices via Bluetooth.
- KEY DECISIONS
- Now that we have a picture of where the data collection process is breaking down, as well as a general picture of where soldiers experience difficulty in the pipeline, we need to begin developing a prototype. To inform this, we need to find models of effective data management that we think, given SWEG’s constraints, can be applied to our problem.
Week 3/4 Insights
In the past two weeks, we interviewed 20 beneficiaries, a mix of primary and secondary data users as well as higher-ups (generals and commanders). We also had the precious opportunity to meet with General Dempsey, the 18th Chairman of the Joint Chiefs of Staff and 37th Chief of Staff of the Army, and General McChrystal, who led Joint Special Operations Command in the mid-2000s. From these two prestigious military leaders we gained a broader view of how the military adopts technology. Moreover, we visited Fort Bragg on 1/29, where we gained hands-on experience with the tracking devices and data analytics tools currently in place.
- BENEFICIARIES
We organized our beneficiaries in three main groups listed below:
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologists
Secondary Data Users
- Physical Therapists
- Spiritual Advisors
- Cognitive Performance Coaches
- Interpersonal Coaches
- Strength Trainers
- Dietitians
- Other Sports Medicine Specialists
Higher Ups (General and Support Staff at SWCS)
- General Santiago
- Support Staff/ Board that drives downstream policy changes.
- INTERVIEWS & KEY TAKEAWAYS
Name | Title | Takeaway | Interviewer | |
Hector Agayuo | SWEG Command Sergeant Major | aguayoh@socom.mil | Lack of bandwidth of personnel to collect data He looks outliers or errors (compare that with the actual situation) Which means he need to know each soldiers quite well, how would he be able to do that? He goes through 6 months of CCC data in 2-3 hours (110 surveys) |
NC |
Greg Santiago | Operations Specialist, SWEG | gregorio.santiago@socom.mil | Logistic & operation planning does not involve a lot of data related to our project
Same biometrics data may means different things as each individual is unique. How to assess data can be much more difficult than how to collect them |
BX |
MAJ Chuck Schemacher; | Operations Officer, SWEG | charles.schumacher@socom.mil | Is SWIC measuring the right stuff? Why do many students who pass Assessment and Selection drop out later on?
Brass want more graduates, cadre want better quality graduates. New generation of soldiers are different to train, causes friction. |
|
Bob Jones | Communications Language School Director | HDP operates here to evaluate the trainees effectiveness at negotiation | NC+BX | |
James Arp | HDP Director | james.arp@socom.mil | Funding comes from:
USSOCOM, USASOC, Army We have to justify the value of the training programs to the 2,3, and 4 star general commands |
NC |
General McChyrstal (x2) | he joint Special Operations Command in the mid-2000s | Must change the culture as well as the technology within the military | All | |
Mike Taylor | HHC 1st Sergeant | michael.s.taylo | More robust data collection/analyis would help determine what training interventions work and what does not.
cooperation/information sharing is generally facilitated by relationships, if they aren’t there, then generally different units won’t communicate. |
AJ |
General Dempsey | the 18th Chairman of the Joint Chiefs of Staff and the 37th Chief of Staff of the Army | Need to focus on what problem we can solve
Technology is changing the landscape of the federal gov and DoD Focus on the cultural aspect of getting military |
All | |
COL Bill Rice | SWEG Commander | william.rice@socom.mil | Higher-ups at SWEG don’t have much contact with the committees.
SWEG doesn’t have a committee that makes data-driven decisions. Peer feedback, instructor feedback, and some data informs human performance results. 30 day ideal innovation turnover for new devices. He would like to be able to test new hardware/software without jumping through the regulation/ permission hoops. Use case: Immediate feedback mechanism as the top use case priority from Commander Rice point of view. Have expert as a reference point for novices using big data/database to make key decision such as selection process/recruiting. Track each trainee’s performance/data throughout the training process |
BX |
Major Chuck | Is SWIC measuring the right stuff? Why do many students who pass Assessment and Selection drop out later on?
Brass want more graduates, cadre want better quality graduates. New generation of soldiers are different to train, causes friction. |
|||
Stephen M. Mannino, Edd | Human performance program coordinator | stephen.m.mannind@socom.mil | Showed the database that strength and conditioning coaches use to advise
Strength and conditioning coaches have to download to share the database with other data users (physical therapists, CPC’s, data analysts, research psychologists) |
All |
Justin Jones | Strengthen conditioning coach | justin,jones@socom.mil | Names are declassified – only coaches have access to the names
Database’s real time feedback has been helpful in readjusting physical training |
All |
Taylor McKinney | Physical therapist | Accessing the medical database always leads to problems
No protocol for how physical therapists make data-informed decisions |
All | |
Sara Butler | Physical therapist | Would really like a database they can access without it taking up so much time.
Did not interact much with CPCs, data analysts, or research psychologists |
All | |
Alexandra Hanson | Data Analyst | alexandra.hanson.ctr@socom.mil | Getting buy-in from higher-ups is necessary to help increase funding and implementation.
Acquisition for different accounts at SWEG is siloed, which means that different solutions will not arrive at the same time. |
|
Constance Garcia | Data Manager | constance.garcia.ctr@socom.mil | Government shutdowns affect contractors. | |
Daniel Gajewski | Performance Integrator | Daniel.gajewski@socom.mil | The Tobii eye-tracking device just arrived, and they are still working on calibrating it.
The current output of the eye-tracking device is just a video. They have another software platform for further analysis but haven’t started using it yet. |
All |
Dawne Edmonds | Process Improvement and Project Management
US Army Special Operations Command |
Dawne.edmonds@socom.mil | There’s no authoritative data source – no single data source that the analysts rely on and trust.
“CONSTANTLY. I CAN’T EXPLAIN TO YOU HOW OFTEN IT HAPPENS.” They tend to retype all the data from scratch. This is a constant problem. The Army is multi-service. |
NC |
Oscar Gonzalez | Research Psychologist | oscar.gonzalez@usuhs.edu | Is in the process of holding CPCs more accountable for inputting and collecting data.
Working on concerns from data analysts about the large amount of time spent on data input |
All |
- KEY INSIGHTS
- Data utilization is actually quite low. Decision-making at the committee level is based mostly on intuition and past experience.
- Much of the data (both biometric and cognitive) is never collected due to a lack of tracking devices.
- Even the data that is collected mostly goes unused.
- Most data is self-reported via surveys or self-assessments, which introduces bias.
- Most databases are not shared internally; many secondary data users work in silos and do not have access to each other’s databases.
- KEY PROBLEMS
- Secondary users have three or four different data systems for one function
- Secondary users have to retrieve data manually
- There is no central database for the data from biometric devices
- Secondary data users work in silos
- Hard for secondary data users to collaborate with each other
- Hard for secondary data users to collaborate with primary data users
- Lack of a systematic workflow among primary data users
- Some CPCs collect and scrub data; some don’t
- CPCs have no standard process or expectations for data-related work
- Errors in data collection (ex: bear incident)
- No plan for biometric devices and what outcomes they should produce
- Training feedback
- Soldiers do not receive real-time feedback
- Soldiers do not receive specific feedback on training performance
- Who gets access to which data
- Regular maintenance and periodic updates of the system and platform
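Several of the problems above come back to every biometric device keeping its own silo. As a minimal sketch of what a shared store could look like, the snippet below puts readings from any device into one table keyed by soldier, device, metric, and timestamp. All table, column, and device names here are our own invented illustrations, not anything SWEG currently runs.

```python
import sqlite3

# Hypothetical central store: one table for every biometric device's readings.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE readings (
        soldier_id TEXT NOT NULL,
        device     TEXT NOT NULL,   -- e.g. 'eye_tracker', 'hrv_monitor'
        metric     TEXT NOT NULL,   -- e.g. 'fixation_ms', 'rmssd'
        ts         TEXT NOT NULL,   -- ISO-8601 timestamp
        value      REAL NOT NULL,
        PRIMARY KEY (soldier_id, device, metric, ts)
    )
""")

# Each device export becomes rows in the same table instead of its own silo.
rows = [
    ("S001", "eye_tracker", "fixation_ms", "2019-01-16T09:00:00", 312.0),
    ("S001", "hrv_monitor", "rmssd",       "2019-01-16T09:00:00", 42.5),
]
conn.executemany("INSERT INTO readings VALUES (?, ?, ?, ?, ?)", rows)
conn.commit()

# Any data user (CPC, analyst, physical therapist) queries the same place.
count = conn.execute(
    "SELECT COUNT(DISTINCT device) FROM readings WHERE soldier_id = ?",
    ("S001",),
).fetchone()[0]
print(count)  # prints 2: two distinct devices reporting for soldier S001
```

The point of the sketch is the shape, not the technology: a single shared schema is what lets secondary users stop maintaining three or four systems per function.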
- KEY DECISIONS
- We need to define the scope of the problem and narrow down the problems
- Define the key pain point and focus on the corresponding problem
- Re-identify key beneficiaries based on the problem
- Identify use cases for the re-identified key beneficiaries
- Prioritize use cases
- Start brainstorming potential solutions for each use case
Week 2 Insights
BENEFICIARIES
To simplify our business model canvas and arrange for a more direct interview strategy, we organized our beneficiaries into the three main groups listed below:
Primary Data Users
- Cognitive Performance Coaches
- Data Analysts
- Research Psychologists
Secondary Data Users
- Physical Therapists
- Spiritual Advisors
- Cognitive Performance Coaches
- Interpersonal Coaches
- Strength Trainers
- Dietitians
- Other Sports Medicine Specialists
In identifying this group, we also learned that a representative from each of the seven secondary data user groups is weighing in on the facility design and architecture so that their work needs are met.
Higher Ups (General and Support Staff at SWCS)
- General Santiago
- Support Staff/ Board that drives downstream policy changes.
# | Name | Position | Dep. | Note Take, Interviewer | Date | Notes | |
11 | Vatrina Madre | Information Technology Director | SWEG | vatrina.mardre@socom.mil | NC/BX | 1/16 |
|
12 | Aspen Ankney | SOCEP CPC | SWEG | aspen.ankney.ctr@socom.mil | AJ/BX | 1/14/19 | -the data manager needs to collaborate with CPCs to make sense of the data or else the data goes to waste (the current collaboration isn’t efficient/smooth)
-CPC -> data analyst -> research psychologist is the ideal flow, rather than what is happening now -CPC time breakdown: ~70% working with data (cleaning/adjusting spreadsheets/running analyses), ~20% actual data collection (in hard copy), 10–15% spent with other CPCs learning what they are working on, ~5% research (finding norms and other methods) -only 40–50% of CPCs collect performance enhancement data |
13 | Ian Ankney | SOCEP CPC | SWEG | ian.t.ankney.ctr@socom.mil | BX | MOVED TO NEXT WEEK – SICK | |
14 | Constance Garcia | Data Manager | SWEG | constance.garcia.ctr@socom.mil | BX,AJ | -lack of communication between CPCs and data analysts
-CPCs drop off data in person, and data analysts have to input it manually (data input takes up 70% of her time) -sometimes CPCs want data to be inputted but not analyzed -Besides the lack of an integrated data platform (the tech side), human factors (lack of communication and collaboration) and inefficient processing (manual input, no standard protocol for data format) also contribute to the problem. |
|
15 | Curtis Price | Deputy to the Commander | SWEG | curtis.price@socom.mil | AJ; NC | 1/15/19 |
|
16 | Dr. Tom Duncan | Performance Integrator | SWEG | tommy.duncan.ctr@socom.mil
tommy.duncan@ptp-llc.com |
BX; AJ | 1/16/19 | -Oscar has a budget
-Oscar has been sitting in on meetings about religiousness for the performance integrator (moving toward the ideal model now) -discrepancies in how many CPCs collect data – this person estimates 50% collect data |
17 | Alexandra Hanson | Research Analyst | SWEG | alexandra.hanson.ctr@socom.mil | TL | -We need to analyze the problem from a military context. Data gets lost due to personnel turnover; there are no SOPs to stop this from happening
-Security is the primary concern, more so than sharing. Data being abused is already a problem. Countermeasures against cyber attacks and EMPs are also a consideration. -Any solution we provide should be evaluated on whether or not more Spec Ops soldiers come out of the program. Ineffective data collection is actually hurting some recruits, washing them out on technicalities |
|
18 | Dr. Morgan Hall | SOCEP CPC | SWEG | Morgan.hall.ctr@socom.mil | TL, AJ | 1/16/19 | -some devices are “shiny” but aren’t effective or necessarily research-based
-CPCs are well integrated with each other but not with the data analysts and research psychologists -CPCs send all the raw data to the data analysts -> there is no standard procedure for what CPCs should do with data |
19 | LTC (Dr.) Mike Devries | Command Psychologist | SWCS | michael.r.devries@socom.mil | AJ/BX | 1/13/19 |
|
20 | Dr. Megan Brunnelle | Head Physical Therapist | SWEG | NC | 1/17/19 | -Three systems: Army medical systems, SPEAR, and a medical imaging + profile system; cognitive and conditioning notes are manual
-Accessing the server is hard – “not a day goes by that I don’t have trouble with the system” -People to talk to: sports medicine, physical therapists, strength and conditioning, dietitians, CPCs, interpersonal coaches, spiritual advisors |
|
21 | Kelvin Bronson S6 | Information Technology | US Army JFK Special Warfare Center and School
(SWCS) |
kelvin.bronson@socom.mil | NC | 1/16/19 |
|
KEY INSIGHTS
- The relationship between the CPCs, research analysts, and psychologists is more nuanced than previously expected. Half of CPCs don’t complete data collection, and CPCs spend most of their time (~70%) scrubbing data.
- Another beneficiary group has been identified: the secondary data users. These are the physical therapists, spiritual advisors, dietitians, etc. – people who aren’t directly present for data collection during training but still use the collected data.
- PAIN POINT: “Not a day goes by where we don’t run into problems with the Citrix server. It’s usually hard to log on.” It’s clear that the Citrix gateway to all the other data servers is inefficient and inconsistent. Targeting other secondary data users would allow a deeper understanding of the issues at Human Dynamics and Performance (HDP).
KEY DECISIONS
- We need to consider a whole other world of potential beneficiaries: the secondary data users. These are people who interact with the data after training and use insights from training to inform their own functions. For instance, physical therapists use training data to gain more perspective on a patient’s physical well-being, recovery time, and supplementary exercises.
- There’s nothing to tell an incoming CPC what their specific function is within a data collection study. Our team should pursue interviews in areas that provide insight into transitioning incoming contractors.
Week 1 Insights
After interviewing ten customers, we know that the training facility will use various devices that track eye movement, heart rate variability, motion, and other human performance metrics. However, each device collects and stores data separately, and there currently exists no way to store the data in a single location. The military lacks an efficient way to store this data, which delays its analysis. More specifically, we discovered that the data is in the hands of the Cognitive Performance Coordinator (CPC), who receives the raw data and must spend a lot of time scrubbing it before handing it to the data analysts. The data analysts support the research psychologists in evaluating the training program. Our customers emphasized that the data should be continuously updated as new data arrives.
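To make the scrubbing bottleneck concrete, here is a minimal sketch of the kind of normalization a CPC does by hand today: each device exports its own CSV layout, and the scrub maps them all onto one shape the analysts can use. The file layouts, column names, and helper functions below are invented for illustration; they are not the actual device formats.

```python
import csv
import io

# Invented examples of two incompatible device exports.
EYE_CSV = "soldier,time,fixation_ms\nS001,09:00,312\nS002,09:00,287\n"
HRV_CSV = "id;timestamp;rmssd\nS001;09:00;42.5\n"

def scrub_eye(text):
    """Normalize a (hypothetical) eye-tracker export."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"soldier_id": row["soldier"], "ts": row["time"],
               "metric": "fixation_ms", "value": float(row["fixation_ms"])}

def scrub_hrv(text):
    """Normalize a (hypothetical) semicolon-delimited HRV export."""
    for row in csv.DictReader(io.StringIO(text), delimiter=";"):
        yield {"soldier_id": row["id"], "ts": row["timestamp"],
               "metric": "rmssd", "value": float(row["rmssd"])}

# One normalized record list instead of per-device spreadsheets on a laptop.
records = list(scrub_eye(EYE_CSV)) + list(scrub_hrv(HRV_CSV))
print(len(records))  # prints 3
```

Automating even this mapping step would shift CPC time from the reported ~70% data handling back toward coaching, which is the pain the interviews keep surfacing.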
Next week, we plan on interviewing potential trainees. All of our interviews this week were with individuals who would be facilitating or leading the training program; speaking with trainees will give us a more holistic picture of it. Additionally, we plan on speaking to more data analysts, who can address our questions about their daily technical functions as well as their inability to work with raw data. Apart from more diverse interviewees, we plan to clarify the timeline for training-site construction, the types of biometric devices currently in use and planned for the future, and why the data analysts cannot scrub the raw data themselves.
Name | Job Description/ Role | Designation | Key Takeaways | |
Oscar Gonzalez | Research Psych | -training needs a military person and a CPC -facility not yet built -psychologists would like to analyze rather than clean data |
oscar.gonzalez@usuhs.edu | |
SSG Trevor Obrien S6 | Information Technology | US Army Special Operations Command (USASOC) | -technologies are on different versions -device-to-device communication is important
-SPEAR: both a hardware and software solution |
trevor.obrien@socom.mil |
Rick Dietrich | SOCEP DIR | -the Cognitive Performance Coordinators (both Dan Sproles & Dan Gajewski) get the raw data -> scrub the data -> give the data to the data analysts -raw data lives on the CPC’s laptop, so CPCs have to scrub it into something the data analysts can use – scrubbing takes a lot of CPC time | Frederick.d.dietrich@socom.mil | |
Phillip Thomas | Director of Academic Instruction | -equipment must be approved for continuous software updates (otherwise, manual updates on an unconnected computer) -SOCOM initiative – all data is inputted manually right now | phillip.thomas2@socom.mil | |
Jim Arp | HDP Director | US Army Special Operations Command(USASOC)
Special Warfare Education Group (SWEG) |
-a smart system automatically collects and uploads data -problem: measuring the impact of the training programs | james.arp@socom.mil |
Kelvin Bronson S6 | Information Technology | US Army JFK Special Warfare Center and School (SWCS) | -high demand for coaches (want to serve more students); currently coaches/CPCs (especially before Oscar came) need to take time away from training students to process data -thus, one of the main goals of our solution should be to save their time and energy so that they can focus on their main task | kelvin.bronson@socom.mil |
Steve Mannino | THOR3 | US Army JFK Special Warfare Center and School (SWCS) | -training facility can house 5,000 -the number of staff available is a weakness for training
-SPEAR: not user-friendly enough |
stephen.m.mannino@socom.mil |
Seth | Data Analyst | -Dealing with silos: either the SOCOM or USASOC commander could initiate changes for information sharing. | |
Dan Gajewski | SOCEP CPC | US Army Special Operations Command(USASOC)
Special Warfare Education Group (SWEG) |
-Event-based data tracking during an activity -Find a way to relate data to each other in real time
-Come back to him after a few interviews. |
daniel.gajewski.ctr@socom.mil |
Dan Sproles | SOCEP CPC | US Army JFK Special Warfare Center and School (SWCS) | -Limitations: soldier-to-coach ratio (1:40 or even 1:200); even new tech solutions need to work at scale -Scaling up coaches will likely take more time, putting extra strain on coaches in the beginning | DANIEL.J.SPROLES.ctr@socom.mil |