Why the phrase "Qualitative vs. Quantitative" is all wrong - in pictures!

I need to go out on a limb. The phrase “qualitative vs. quantitative” is all wrong. Just plain wrong. Stated that way, it misleads us into believing that qualitative and quantitative information are completely different categories of things. First: they’re not. Second:

Premises

Set Theory

Zoom out with me for 30 seconds. Remember Set Theory? Don’t worry, nobody does. But it takes 30 seconds to re-learn. It’s Venn diagrams.


Flux Goes to Collision Conference 2018!

THINK FAST! What do Al Gore, Wyclef Jean, Brad Smith (the Pres. of Microsoft), Sophia Bush, and Flux’s own Katie Hillman and Bridgette Ortiz have in common?!

WE WERE ALL AT COLLISION FEST 2018 IN OUR HOMETOWN OF NEW ORLEANS! (Along with Arcade Fire and many other esteemed guests…)

Full list here: https://collisionconf.com/speakers


If you haven’t heard of Collision yet, it is a massive tech conference that has been held in New Orleans for the past three years and was named by Forbes “North America’s Fastest Growing Tech Conference!”

This year was my first year attending-- and my first tech conference in general. The event was definitely not what I had expected (I guess I had envisioned a narrower range of topics and more hands-on technical demonstrations? Just less... hip? Exciting? Shiny?), but it was a really great experience! Firstly, Collision attracts people from all over the world-- attendees travel from India, New Zealand, England, and beyond-- which is great both for in-conference conversation and for the New Orleans economy!

As an evaluator, I was a rare breed compared to other attendees. Most people I talked to were unsure of what evaluation was, and even less sure of how it operates in the realm of social impact. So it was an exciting opportunity to share what we do with a whole new array of people.

I also got to learn about the exciting things happening in the techier realm of the world-- everything from apps people are creating to tackle medical problems, to advances (and emerging ethical issues) in AI, to the increasing presence of women in tech (and in tech leadership roles!!), to the pros and cons of utilizing remote workers in your business!

On top of it all, Tinder was there! (It’s everywhere, really...) Tinder’s Elie Seidman gave an interesting final talk on the role of dating apps in igniting a cultural revolution. Before sitting in on it, I had never thought about the parallels between the new methods we use for dating (super simple swiping) and the ways we approach other aspects of our lives (finding jobs, ordering food, studying for tests, and banking-- just to name a few), or about how these similarities mirror our priorities. The busier we get and the more options we have, the more the technology we use reflects our choices and preferences in each sphere... whether that means streamlined and efficient, or more in-depth and time-consuming. (Examples: OkCupid vs. Tinder, Uber Eats vs. going to a restaurant.)


While I could talk about the things I learned and the opinions I heard all day, I’ll bring it back to evaluation. As we’ve mentioned before in these blog posts, evaluation is a somewhat smaller professional field, and its presence is even smaller in New Orleans. Still, our heavy use of data and reliance on technology made me think I would encounter more evaluators than I did. (Which was none, if you were wondering.) Granted, I did not meet everyone at the conference... but none of the speakers were identified as evaluators. I think our field has much to offer to-- and to gain from-- this techier realm, and more overlap would be an awesome goal!

Maybe next year’s Collision Conference (in Toronto, Canada) will boast some evaluators on its speaker list. But until then, I encourage you all to explore the options tech provides for us!

If you have any questions or comments about Collision Conference (your experience or mine), let me know at katherine@fluxrme.com!

Principles-Focused Evaluation

In December I was lucky enough to be able to go to the Principles Focused Evaluation Workshop facilitated by TerraLuna’s Nora Murphy. (If you haven’t heard of her, check her out. She’s amazing. TerraLuna Bio is here: http://www.terralunacollaborative.com/nora-murphy/)


Coming from New Orleans, traveling to Minneapolis during December was a treat...a very frozen treat.

One of my personal principles is “to be a proponent of daily, continuous learning, both to myself and others…” So, in spite of the cold, I needed to get on that plane and engage in what would turn out to be one of the most interesting professional development experiences I have had thus far. I scrounged together layers of warm clothes the day before my flight, calling on my friends from the Northeast to add to my stores of layering pieces, hats, and coats. "Can I pleassseee borrow your winter coat?" my text message to Elizabeth read sheepishly-- if text messages can be sheepish. (Wood chimes.) Elizabeth: "Sure. Just don’t leave it in Boston like you did with your purse last time ;)" She provided the final piece of the puzzle: an ankle-length, hooded, puffy jacket. But even then, I questioned my survival.


 

But the workshop was worth the cold.

Okay, all jokes about the cold aside (even if I did biff it on the ice walking there the first day), this workshop really helped me reorient my approach as an evaluator and repositioned my beliefs about what is possible within the scope of an evaluation. While it has taken me long enough to do, I am excited to finally write my personal reflection on the workshop, principles-focused evaluation, and its potential use in New Orleans.

(Mural on the outside of IntermediaArts, the gallery where TLC used to have co-working space)

The Workshop:

The workshop ran for two days at the beginning of December in Minneapolis. We worked out of a delightful space called Anahata Studio, which was a little difficult to find at first. But after winding through a series of businesses and yoga studios, I stumbled upon our home for the next two days-- and luckily walked right in for breakfast! There were roughly 12 people in attendance from a mix of backgrounds-- some were studying evaluation, others were professionals moving into the field, and others were longtime evaluators. And we were all there to learn the new method of principles-focused evaluation.

The content we went over was informed by the work of Nora and Michael Quinn Patton. Our materials for the workshop consisted of a Principles workbook created by Nora, and Principles-Focused Evaluation: The GUIDE by Michael.

I’m a pretty reserved person. Without a doubt, the emotional and intellectual depth expected in this workshop was challenging. There was no escaping the close scrutiny of our biases; nothing was left un-problematized. I had to stretch myself and focus in a way I never had before in a professional setting.

Trust me... I'm much more excited than I look. This is my "focused" face!

Now, a little more about Principles-Focused Evaluation:

First and foremost: the field of evaluation is ever-evolving, and methods like principles-focused evaluation are evidence of that. In fact, principles-focused evaluation was created out of the need for differentiation among evaluation methods. Among other things, it is an intriguing way to address complex, dynamic interventions-- cases where the thing you’re measuring won't hold still; for example, when a program will inevitably change over its duration. This makes it an increasingly human way to approach the field of evaluation-- both in theory and in practice. Some of the other nontraditional, new directions evaluation is taking are in mission fulfillment, strategy, advocacy campaigns, policy change, and systems change-- and principles can be applied to all of these areas. An important factor in creating a principles-focused evaluation is that it be conducted through the lens that things are interrelated and that hurdles you did not anticipate will emerge, no matter how well you plan.


As you can guess, principles are at the root of this method of evaluation.

Michael Quinn Patton and Charmagne E. Campbell-Patton define effectiveness principles as,

 “...a statement that provides guidance about how to think or behave toward some desired result (either explicit or implicit), based on norms, values, beliefs, experience, and knowledge. The statement is a hypothesis until evaluated within some context to determine its relative meaningfulness, truth, feasibility, and utility for those attempting to follow it” (http://www.cehd.umn.edu/OLPD/MESI/spring/2017/Patton-Principles.pdf).

These principles are created by looking in depth at the organization and taking into account what fundamentally matters. This is important because an evaluation is more likely to be used if intended users find the evaluation meaningful, find the questions relevant, and care about the findings. Principles can be created using the GUIDE framework and rubric, which is available in Patton’s new book.

"But...aren’t they just like rules?"

NO. 

No, they're not. 

“A principle is prescriptive. It provides advice and guidance on what to do, how to think, what to value, and how to act to be effective. It offers direction. The wording is imperative: Do this. The guidance is sufficiently distinct that it can be distinguished from contrary or alternative guidance” (http://www.cehd.umn.edu/OLPD/MESI/spring/2017/Patton-Principles.pdf).

A great example of the difference between the two (from the Patton source listed above) is this:

Rule: “30 minutes of aerobic exercise each day.”

Principle: “Exercise regularly at a level that supports health and is sustainable given your health, life style, age, and capacity.”

One example of this method in practice comes from Nora Murphy’s work creating “9 evidence-based, guiding principles to help youth overcome homelessness.”


The Principles-focused evaluation...

  1. Identified principles in draft form
  2. Collaboratively identified fourteen youth
  3. Interviewed youth, reviewed their case file, interviewed a nominated staff person
  4. Synthesized information and wrote case stories
  5. Reviewed stories with the youth
  6. Analyzed stories, looking for principles and emergent themes

The resulting principles were:

  1. Journey oriented: Interact with youth to help them understand the interconnectedness of past, present, and future as they decide where they want to go and how to get there.
  2. Trauma-informed: Recognize that most homeless youth have experienced trauma; build relationships, responses, and services on that knowledge.
  3. Non-judgemental: Interact with youth without labeling or judging them on the basis of background, experiences, choices, or behaviors.
  4. Harm reduction: Contain the effects of risky behavior in the short-term and seek to reduce its effects in the long-term.
  5. Trusting youth-adult relationships: Build relationships by interacting with youth in an honest, dependable, authentic, caring, and supportive way.
  6. Strengths-based:  Start with and build upon the skills, strengths, and positive characteristics of each youth.
  7. Positive youth development: Provide opportunities for youth to build a sense of competency, usefulness, belonging, and power.
  8. Holistic: Support youth in a manner that recognizes the interconnectedness of their mental, physical, spiritual, and social health.
  9. Collaboration: Establish a principles-based, youth-focused system of support that integrates practices, procedures, and services within and across agencies, systems, and policies.

A method such as principles-focused evaluation was so necessary here because of the uniqueness of each young person the evaluation would come into contact with-- each homeless youth has their own story, with their own adversity and trauma. Each is on a personal journey and “has unique needs, experiences, abilities, and aspirations.” Principles-focused evaluation allowed for the creation of principles that “provide guidance and direction to those working with homeless youth...” as well as a framework for how the youth are approached, interacted with, and supported.

The full report can be found here: https://www.terralunacollaborative.com/wp-content/uploads/2014/03/9-Evidence-Based-Principles-to-Help-Youth-Overcome-Homelessness-Webpublish.pdf

 

If you are as inspired as I was after hearing how influential principles can be... here are some reflective words from Nora Murphy to help you begin to formulate your own principles:

  • Who do I need to be?
  • How do I live it?
  • What is the experience?
  • How do I recognize?
  • What is the outcome?

How does this affect Flux and evaluation here in NOLA?

This experience was a perfect example of how evaluation is evolving to the needs of the world.

With the field of evaluation still relatively new to New Orleans, these new techniques are one way to meet some of our local organizations’ needs. The principles-focused method is especially effective because it looks at the heart of the organization-- the culture, the passions, the goals-- in order to guide the evaluation. New Orleans is a city overflowing with heart, culture, passions, and goals... so what could be a more perfect pair than that!

We here at Flux look forward to the opportunity to apply Principles-Focused Evaluation with our present and future partners. We are grateful to the TerraLuna Collaborative for giving us the tools to incorporate this method into our work. Interested? Contact me (katherine@fluxrme.com) for further materials and a consult on whether this approach might fit your organization!

Good Luck and Happy Reflection!

New to interviewing?

Welcome to part 3 of the series #tipsfornewevaluators!

The rush of recognizing a good quote is addictive! When I worked as a reporter for my university’s newspaper, quotes were my favorite part of getting a story. Something about talking to a new person and getting a feel for how they communicate with their words, bodies, and expressions-- it’s like immersing yourself in a whole new world.

Conducting fieldwork is often an early part of an emerging evaluator's journey: on the ground, in the mix or whatever you call it. You'll be interacting with the real people affecting - and affected by - the program you’re evaluating.

Interviews are one of the tried-and-true workhorses of data collection during fieldwork for both journalism and evaluation, but there are some subtle differences. Both disciplines seek to tell a compelling story, but evaluations have the added goal of being useful for guiding future decision making. Depending on the evaluation methodology, the interview might also take on a range of specific formats with different degrees of structure, probing, and exploration (e.g., closed, semi-structured, and open; more on these in a later post).

Interviewing for Evaluation: Getting started

Here are three tips to make sure you make the most of your time with your interviewees. Ready when you are!

1. Prep a script. And practice it.

You don't have to say exactly what you wrote, but it's important to plan so you give yourself the best chance of getting the information you want. Never forget that no matter how friendly the person or how willing they are to talk, you're taking up their time, and it's important to know when you've got what you need. Put another way, you never know exactly where the winding road of a conversation will take you, but you want to have directions to guide you.

Beyond the questions you want to ask, a script should include an introduction of yourself and your organization/company, an estimate of how long the interview will take, and a brief overview of the project you're working on.

You also need to establish 'informed consent,' and to get it, your respondents MUST be well-informed! As interviews are almost never the only source of information in an evaluation, the interviewer needs to know beforehand how the interview data will be stored, transmitted, analyzed and interpreted alongside other sources of evidence. In other words, you have to be clear about what you'll be doing with the information they give you. If you're recording, don't forget to get their permission to record. Be explicit and succinct. 

Finally, don't let the first time you use your script be on your interviewee. Practice your script on a co-worker or friend first. Practice makes progress!

 

2. Mark your time.

Plain and simple. If you're recording, put timestamps at the beginning of each new question or section as you go. This way, you aren't wasting valuable time searching through the entire recording looking for that one good sentence. Uhhhhh, no... I definitely haven't done that before.
 

3. Get personal.

It takes a lot of courage and tact, but many times you'll have to learn to dig a little deeper. There are many, much less costly ways than an interview to tell whether someone likes or doesn't like the program. In the interview, you're there to understand WHY. I like to describe it as getting water from a well. No matter how long it takes you to lower the bucket, you'll hit water eventually.

Wrapping Up.

I hope you’ll join me again for the next part in the series, which features some ideas on how to make meetings fun, effective AND interesting. If you have any ideas for me, please don’t hesitate to email me at tawanda@fluxrme.com. See y’all soon!