
Evaluate

The teams have been testing their solutions. Receiving feedback is key to iterating and co-creating a solution that responds to users’ needs. Yet for all the importance of testing, being challenged on an idea they have been working on for weeks is not easy to handle. The students reflect on this process and on their strategies for incorporating feedback while staying true to the essence of their solution and the problem they are solving.

“The evaluate phase has taught me the importance of asking insightful and intentional questions.”

By: Sanya Uppal

The evaluation phase has been one of uncertainty and unpredictability. Finding a method of testing and creating a prototype has involved a deeper examination of our core purpose and introduced the question of feasibility. It has also involved a feeling of responsibility: how do we create a prototype, test, and refine it in a way that is meaningful for us and our stakeholders? This has had a real impact on my design thinking mindset and my approach to our prototype iteration process.

The evaluate phase has taught me the importance of asking insightful and intentional questions. When interviewing individuals or asking for feedback, such questions help drive the conversation and surface meaningful observations. Furthermore, I have learned how crucial it is to maintain a constant dialogue within our team as we receive feedback. Each of us reacts to critique in our own way and extracts different information from it. Therefore, we have to consolidate what we hear and ensure that we are on the same page.

Responses to our idea have been varied, particularly as each individual highlights a different aspect of our testing prototype that they consider valuable. Co-creation is our intention, but it has been difficult to weigh contrasting opinions that pull us in different directions. This phase has helped me understand which judgements belong in our feedback circle and how my team must make that decision collectively.

 

“The most difficult part about our interview processes has been selling our idea.”

By: Justing Koga

As we are nearing the end of our Evaluate phase, I’m most proud of the work we’ve done as a team. I think chemistry is often overlooked when considering project success — one person has the ability to derail the momentum of a workflow. I’m so excited that going into our final week in OpenDesign+, my team is moving at full speed. 

During the Evaluate phase of the open design process, the most difficult part of our interview process has been selling our idea. We are developing a “read the room” gauge for students and professors in the online learning setting (as well as a couple of other community tools). Our problem has been that we lack a cohesive, easily digestible summary of our ideation process and of our solution. Going forward, we will have to experiment with different storytelling platforms to best present our work.

For me, the most memorable learnings from the Evaluate phase of open design have been more social than anything: how can I best present my ideas and thoughts without disrupting the general flow of the team’s ideation process? This question is especially difficult in the online setting, but I truly feel that something like our “read the room” gauge might allow group/class members to be more cognizant of the state of the room. 

 

“The evaluate phase has taught me the importance of falling in love with the problem and not the solution”

By: Marcus Ortiz

The evaluation phase has definitely been a roller coaster of emotions. One second we were on a roll, redesigning and iterating; then, within an hour, the magic would wear off. Despite this, the evaluate phase has easily had the most impact on my design mindset.

It can be hard to let go of an idea. However, the evaluate phase has taught me the importance of falling in love with the problem and not the solution. In order to design and create “with” others and not “for” others, it is essential to include their opinions not only when defining the problem space, but also when creating and iterating. Oftentimes this means setting aside your pride. Although an idea may seem golden, if users say otherwise, it is your responsibility to step up and change the idea you thought was perfect.

Sometimes the hardest part is realizing you even have a biased connection to an idea. Subconsciously, you may end up asking interviewees questions that imply the answer you want to hear. It is only natural to want the original design to be perfect. Yet this narrow-mindedness severely hinders innovation. The goal of feedback is to see the solution from a different perspective, not to force your perspective on others.

However, despite the difficulties that come with fostering and using feedback, our design has improved tremendously through the evaluate phase. Although we are still working out some kinks, I cannot wait to present our design now that we have improved our idea with some much-needed feedback from professors and students. Now we just have to make sure we can communicate it right!

 

“I am now in a place where I welcome critique with open arms”

By: Caroline Surret

The evaluate phase of the open design process has made me excited to iterate towards a final solution. As part of the evaluation process, my team will be running a series of workshops with incoming first-year students around purpose-finding. I am eagerly anticipating those sessions, as taking the time to reflect on my goals and pathway before I got to Duke would likely have been a very fulfilling experience, one that would have made me a bit less anxious coming into my first year.

Besides those workshops, my team has been collecting feedback on our prototype pages via Qualtrics surveys. While it has taken some getting used to throughout the open design process, I am now in a place where I welcome critique with open arms and am very much enjoying reading through feedback and thinking critically about how to address the underlying assumptions of our project. 

Ultimately, the evaluation phase of the open design process feels very impactful because it represents the value of co-creation. As we move into presenting our work to stakeholders next week, I hope to communicate this moment of co-creation in a way that clearly demonstrates its impact on our solution and our team.

 

“With my engineering background, I find it difficult to think big without thinking about the achievable “baby steps” to get there.”

By: Zsofia Walter

The evaluation and testing phase has been difficult. The thought of creating low-fidelity prototypes to test different aspects of the platform, and of planning phases for that testing, made me excited because I could finally see a pathway to actually reaching our goal. The difficulty came in reconciling these plans with the short time we have left. We had to accept this time limitation and decide on the best way to exhibit our idea. With my engineering background, I find it difficult to think big without thinking about the achievable “baby steps” to get there.

My biggest “aha” moment of this entire program came this week. A big aspect of our platform that we love is recreating the “purposeful wandering” we would normally be able to do. This week we were able to speak to Michael Faber, a senior manager from OIT, and not only was he able to give us insight into the technological feasibility of our platform, but he also forced me to take a step back from our vision of this interactive map. One thing that he said really stuck with me. He asked whether it was the act of physically wandering around Duke that we were trying to recreate, or the feeling students get when they stumble onto something they didn’t know about and find connection. This reframing of the problem made it easier for me to envision low-fidelity testing for our platform.

 

“It has been difficult to decide which testing method to pursue because of the sheer number of ideas we want to incorporate.”

By: Kaelyn Griffiths

The evaluate phase has been the most rewarding phase, yet the one that makes me feel the most anxious. It has been really nice to continue building our vision and sharing it with our stakeholders, especially students, but knowing that others may not see the need for our vision has been a scary, recurring thought. As my team has discussed the best ways to test and evaluate our idea, it has been difficult to decide which testing method to pursue because of the sheer number of ideas we want to incorporate. Hearing critiques of our ideas so far has been extremely helpful, and we have worked hard to build a team dynamic that invites constructive criticism, because we understand that feedback is how we design something that best fits the needs of the intended user.

Our ideas have evolved immensely: from highlighting key features we wanted to include, to putting those ideas into practice by designing a prototype with our stakeholders, to continuing to revise those designs through workshops and feedback. This process has taught me so much about problem-solving in the real world, including that, while discussing and designing concepts, it is imperative to keep the intended user at the center of the discussion and make sure their needs are met. My team has done an amazing job of holding each other accountable for that, and it has shown in all the feedback we have received thus far.

I think a recurring challenge has been balancing time and the feeling of responsibility to do our project justice. My team has very big ideas and is looking for a culture shift that prioritizes meaningful and intentional decision-making at Duke, but we understand that time constraints may prevent us from seeing this project completely through. Even the time constraints of the evaluate phase have felt a little overwhelming, but we have been working on scheduling to ensure that we can get enough feedback to prove our concept useful. 

 

“It’s been hard for me to separate criticism of the product from criticism of myself.”

By: Jonathan Browning

This design process has felt like a sprint since the beginning; perhaps that’s the point. But even though I may have put more hours into the Evaluate phase than any other, something feels less rushed. For me, there’s been a release of tension that I didn’t realize I was carrying. Since our initial design idea (which I guess was only a week ago but feels like months), everything has been curated by my team and me. That’s not to say we haven’t been trying to get authentic feedback, but in my sessions I’ve been trying to present our solution in the best light possible, really focusing on the good and emphasizing changes when the bad (or at least, less polished) comes up. Switching to evaluation has allowed us to put our idea into the world and see how it comes across, without us explaining it step by step.

Part of this is freeing, but part of it also pains me. I can no longer defend or elaborate on the idea when presented with criticism. It’s been hard for me to separate criticism of the product from criticism of myself. I want this concept to succeed, but I know that solutions only get better when you face their weaknesses. For this reason, I have wholeheartedly embraced iteration, because I know it gets us ever closer to the point where we can say, “This could actually make a difference in someone’s life.”

I’ve learned a lot about using criticism, crafting questions, and evaluation in general. But, most surprisingly, I’ve also learned hard skills involving creative outlets I’ve never used before, including Canva and Photoshop. Open Design has opened me up to learning new skills on the fly and I believe my greatest impact has been in readily embracing these skills.

 

“After you put so much time and effort into designing a solution that you truly believe will work, it becomes hard to see outside of your perspective.”

By: Florence Wang

Although the “create” stage was difficult because we had to figure out a way to consolidate all of our information and ideas into one solution, the “evaluate” stage was difficult for a completely different reason. For the entirety of this program, we have been pushed to think outside of the box, and really let the design thinking process and open source methodology guide our creativity. However, when tasked with coming up with a tangible testing method, we soon realized that this was much easier said than done. 

We started out by really embodying this idea of co-creation, reaching out to as many individuals as possible to hear their feedback and then using it to fuel our iterations. This was an extremely valuable process, and it was both refreshing and somewhat uncomfortable to hear critiques of our design. After you put so much time and effort into designing a solution that you truly believe will work, it becomes hard to see outside of your own perspective. In a sense, you become desensitized to the possible pitfalls of your prototype, and you start to treat positive feedback as confirmation rather than simply another perspective that can help improve your design.

However, these conversations with our stakeholders also surfaced a new problem for our team. We realized that we actually had something: an idea that could potentially be useful in the lives of many individuals. But with that something comes great responsibility, and we had to show that it is not only necessary and helpful but also feasible. To do so, we needed a solid testing method, and nothing we came up with seemed to encompass what we were trying to express with our solution or represent all aspects of our vision. For the first time, I actually started to feel small and incapable: how were we going to do this? We don’t possess the power or skills to create a working prototype, so how are we going to get the buy-in, overcome the technological difficulties, and actually make the “tangible impact” we have been talking about since the beginning of the program?

I think we started to narrow our scope too much and considered changing direction to make it easier to convince people that ours is a “good” solution. Although it was beneficial to think about our design from a different perspective, ultimately we needed to remind ourselves of the bigger picture and that it’s okay to have a lower-fidelity prototype. No matter what, we shouldn’t let the fear of not having a fully working example of our design take away from elements of our vision.

And finally, through our conversations with those around us, I was able to learn more about human behavior and the thought processes of those I was designing for. I think that for me and my team, it was a reminder that in the design thinking process, everything is a prototype. Nothing is ever fixed. There is always room for improvement because humans are ever-changing creatures, and ultimately, humans lie at the heart of our “solutions.”

 

“User-centered design requires this open dialogue.”

By: Drew Flanagan

The “Evaluate” phase has been tricky because it is very difficult to effectively assess your own design. 

When you share your design, many people respond very positively, especially those you are close with. Others, perhaps colleagues or peers you are less close with, tend to focus heavily on the positives of the design so as not to come across as “too negative” or unwilling.

To combat this cycle, my group has, interestingly, started soliciting anonymous feedback on our work from all members of the Duke community. This method of evaluation is helpful because community members are invested in improving the community and, via the online form we made, can be constructive with their feedback without feeling that they are being overly critical.

While the anonymous survey helped eliminate potential bias, another challenge of “Evaluation” has been making sure to ask the right questions. How do we develop questions for testers that get at our “how might we” statement (the crux of our design) rather than getting distracted by some of the accessories (such as the technology used for implementation)? Because of this challenge, we have geared our survey to focus specifically on our design and whether it meets an individual and community need, rather than on whether it is aesthetically or technically sufficient.

Hearing critiques of our idea has not been easy, but our team has continued to welcome them. User-centered design requires this open dialogue. Though it can feel excessive or unnecessary at first, we always learn something from user testing, even if it’s just an affirmation of existing features or a critique of an aspect we already know needs to be improved.

I think I’ve made the most impact in helping my team adapt to feedback and suggested changes. Often, we can get stuck when we are forced to consider a new aspect of our design or modify a feature completely. However, I am proud of my ability to continue to bring the team together to think thoroughly as we adapt to suggestions from users.

 

“I realized it was the genuine feedback that was allowing us to create truly better iterations of our program.”

By: Arya Patel

The evaluation phase of the design process has really helped me understand how the feedback loop works to improve ideas, though it may be uncomfortable at times. 

I realized that it can be difficult to hear feedback that might interfere with our favorite parts of the idea. I feel that at this stage it is easy to get tunnel vision and to avoid asking the real questions or allowing the evaluator to give their honest opinion. I often found myself wanting to cut people off, or correct them, or steer them away from weak points and toward our stronger areas. However, as I learned to resist this mindset during our evaluation tests, I realized it was the genuine feedback that was allowing us to create truly better iterations of our program.

Over time, our idea has contracted and expanded as we took different stakeholder perspectives, pain points, and challenges into account. The evolution of our project design, in parallel with the growth of our learning and knowledge, was intense and drawn out. However, the outcomes are rewarding. It makes me proud to think we were able to overcome all kinds of challenges and thoughts that were pulling us in every direction, and to whittle everything down to a cohesive, interesting, and innovative design.

The most memorable parts of this experience were the days when it felt like we were talking in circles around the same idea, and then all of a sudden things would click and we would all come together excitedly, on the same page. I think these moments say a lot about the team’s persistence, hard work, and willingness to work through uncomfortable or frustrating conversations. I am so happy that I got to experience this program with this cohort and my team, all of whom I’ve learned quite a lot from!

 
