Tool Evaluation Criteria

The main criterion for a tool to be featured in the Workforce EdTech Tools repository is proven effectiveness in increasing the education, employment, or economic mobility outcomes of adults with lower-level skills. The profile of each tech tool includes information on how its effectiveness was demonstrated. The list below details some of the criteria, presented as questions to ask when evaluating the pros and cons of a tool for use in this context. No tool would rate the highest on every criterion, so the weight to give each question depends greatly on the kind of tool and on the context and goals for its implementation. We readily welcome suggestions for improving and updating the criteria as technologies continue to evolve. We will continue to add effective tools once we have the chance to evaluate them. Please also share with us tools you see having a positive impact in the field.

This Online Tool/Resource Evaluation Rubric provides a checklist to guide your determination of whether a technology tool or resource aligns with your program needs. The full Tool Evaluation Criteria list is available as a PDF download or an editable Google Sheet. The rubric was created for the EdTech Center’s Technology Testing for Adult Learning and Employment field tests.

Criteria for Evaluating Workforce EdTech Tools:

To Increase Reach and Impact of Adult Learning & Employment Initiatives

Proven Effectiveness

Is there evidence that this technology can help meet your goals?

Effectiveness
  • What evidence exists to prove the effectiveness of the tool? The Guide For Educators describes the types of evidence used in evaluating educational technologies.
  • To what extent does this evidence suggest that the tool could help meet your specific goals?
  • What other organizations have leveraged the tool effectively, and what factors are different or similar to yours (setting, types of users, goals for using product, etc.)?
  • Given past performance, what outcomes can you expect, and over what period of time?

Accessibility

Is the technology accessible and easy to use for all learners?

Ease of Use
  • Is the tool built with best practices in usability to maximize ease of use, learnability, effectiveness, efficiency, and user satisfaction?
  • How much training, if any, will the user need to be able to use the tool effectively? Do user testing to answer this rather than making assumptions!
  • Is the tool built with best practices in Universal Design for technology so that it can be accessed, understood and used to the greatest extent possible by all people regardless of their age, ability or disability?
  • Are instructions easy to find and follow, and is the layout formatted in a clear way to allow for easy navigation?
  • Can users backtrack or review content on previous pages if helpful or needed?
  • If they log out, can they easily return to where they want to be?
  • Is the tool easy to obtain and use? Steps like having to download an app, install new (or old) technologies, purchase a license, or enter a PIN code sent by email can be significant barriers to onboarding users.
  • Can the tool be easily accessed on multiple platforms including desktop and mobile, and on various operating systems (Android, Windows, iOS, etc.)?
  • Is an internet connection required at all times, or can the tool (or some of its features) be used offline?
Digital Literacy Required
  • Can users with lower digital literacy use the program successfully? Consider also how much up-front training and ongoing support may be needed for effective use.
  • Does a user need to register before starting to use the tool, and if so, is an email address, PIN code, or other authentication required?
  • If a user gets stuck in navigation, are there audio, visual, text, or other clear cues provided to instruct a user on what they should do?
  • Does the tool integrate with other tools or interfaces that the users already use?
Language & Culture
  • To what extent does the tool offer translation in the intended users’ first languages?
  • Is the content culturally diverse and free of bias?
  • Is language consistent and free from grammatical, spelling, and syntax errors?
  • Is the text complexity appropriate for the literacy level(s) of intended users? The Automatic Readability Checker uses seven readability formulas to indicate how difficult a passage in English is to understand and gives tips for simplifying language; a small sketch of one such formula follows this list.
  • For users with lower literacy, is audio provided and/or are there sufficient visual cues?
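To make the readability question more concrete, here is a minimal Python sketch of one common readability formula, the Flesch-Kincaid grade level. The syllable counter is a rough heuristic and the sample sentences are purely illustrative, so treat this as an approximation rather than a substitute for the Automatic Readability Checker’s seven formulas.

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level: higher scores mean harder-to-read text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

# Illustrative comparison: denser wording scores a higher grade level.
print(round(flesch_kincaid_grade("Fill in the form. Press send."), 1))
print(round(flesch_kincaid_grade(
    "Complete the registration documentation and submit your application."), 1))
```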

Affordability

What is the cost, and is it affordable now, during scaling, and in the long term?

Affordability Over Time
  • Is the tool free, and if so, does it use a ‘freemium’ model where it charges for upgrades or premium versions?
  • If there is a cost, does the tool charge a one-time fee, per license, or by usage?
  • How much staff time, if any, is required to host and maintain the tool?
  • Is the tool affordable now, and will it remain affordable as you scale its use and over the long term?
  • How much data and device memory does the tool use, and what is the cost of this usage?

User Experience

Is the tool enjoyable and used effectively by the intended audience to meet their goals?

Onboarding, Hook & Engagement
  • Is the first log-in an easy and seamless experience?
  • Is the first introduction/view of the tool engaging, and does it “hook” users into trying it?
  • What user engagement strategies are used?
  • Does the tool provide or allow for creation of engaging content?
  • Do interactions provide multiple means of action, expression and engagement?
Quality and Effective Content
  • Is the content up-to-date, accurate, comprehensive, and relevant?
  • Will the content be effective towards users’ achieving their goals for using the tool?
  • Is there a multimedia combination of content forms such as text, audio, images, animations, video, and interactive content?
  • Is the content relatable and, if appropriate, memorable?
  • If the content includes a learning tool or resources, are the curriculum and instructional design aligned to users’ learning goals and sufficient to be effective? See the TTALE Rubric for Evaluating Instructional Content, which also shares more detailed rubrics for evaluating learning resources.
Logical Flow
  • Is the content logically organized, and if appropriate, is the flow explained?
  • Is the content/experience chunked into digestible amounts both in terms of time blocks and cognitive load?
  • Are there clear and connecting transitions between activities, and if appropriate, do they build on one another?
User Centered
  • What do honest user testimonials and/or feedback say? If you don’t have these, search online for reviews and/or seek out other organizations using the tool.
  • Are the user experience and activities designed to focus on the needs of the primary users first (before the needs of secondary users such as program staff), and does the tool succeed in meeting those needs?
  • Does the tool respond or adapt differently to the user based on what the user does? In educational technology tools, adaptive learning can greatly increase learning outcomes and retention by personalizing and differentiating instruction and providing additional supports to users as they need them.
  • Does the tool avoid gender, social, and other biases as it personalizes the user experience?
  • Has the tool been created with the variables that users will need? An example is name fields that can accommodate hyphens or names longer than 20 characters; see the sketch after this list.
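As a hypothetical illustration of that last point, the short Python sketch below shows a permissive name-field check that accepts hyphens, apostrophes, spaces, accented letters, and long names rather than rejecting them. The length limit and sample names are assumptions for illustration, not requirements drawn from any particular tool.

```python
import re

# Hypothetical, permissive name validation: one or more letters (including
# accented characters), optionally joined by single spaces, hyphens, or
# apostrophes. The 100-character cap is an assumed, generous limit.
NAME_PATTERN = re.compile(r"^[^\W\d_]+(?:[ '-][^\W\d_]+)*$")

def is_valid_name(name, max_length=100):
    name = name.strip()
    return 0 < len(name) <= max_length and bool(NAME_PATTERN.match(name))

# Names that overly strict forms often reject:
for name in ["Anna-Marie Okonkwo-Adeyemi", "O'Connell", "María José"]:
    print(name, "->", is_valid_name(name))
```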

User Support & Communication

Does the tool provide adequate and accessible user support and communication?

Supports
  • What process and resources does the tool provide for training new users?
  • Does the tool provide support for users, and is the tech support easily accessible and sufficient?
  • Are there accessible and effective communication methods for updating users and keeping them engaged?

Data, Privacy & Security

Is the solution set up to give you the data you need in a sustainable way, and are you clear on and comfortable with the privacy policies?

Data
  • Does the tool provide the data you need for your intended purposes, including measuring its use and impact?
  • What specific user analytics and other data does the tool gather?
  • Do the tool’s reporting functions meet your needs and the needs of the end users?
  • Can the data be easily visualized and aligned to user progress against goals?
  • Can you access the data directly, or if the third-party tool developer/vendor must send it, do they charge for the data reporting?
  • Who owns the content that you might add to the tool?
  • Is the data collected and exported in a way that enables data interoperability with other tools and systems? Tapping Data for Frontline Talent Development explains the importance of this; a brief export sketch follows this list.
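To make the interoperability question more concrete, here is a minimal Python sketch that exports hypothetical learner-progress records to CSV and JSON with explicit field names so other systems can consume them. The field names and records are illustrative assumptions, not a standard schema or any specific tool’s reporting API.

```python
import csv
import json

# Hypothetical learner-progress records as a tool's reporting feature might
# return them; the field names are illustrative, not a standard schema.
records = [
    {"learner_id": "L-001", "module": "Digital Basics", "minutes": 42, "completed": True},
    {"learner_id": "L-002", "module": "Resume Writing", "minutes": 18, "completed": False},
]

# CSV export: loadable by spreadsheets, case-management systems, or dashboards
# without depending on the vendor's own reporting screens.
with open("progress_export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["learner_id", "module", "minutes", "completed"])
    writer.writeheader()
    writer.writerows(records)

# JSON export of the same records, useful for API-based integrations.
with open("progress_export.json", "w") as f:
    json.dump(records, f, indent=2)
```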
Privacy & Security
  • Who has access to the data, who owns the data, and how is it protected?
  • What data is shared, how is it shared, and with whom?
  • Are users effectively and appropriately made aware of data policies and practices?
  • Does the tool or connected tools collect, track, or otherwise leverage personally identifiable information (PII)?

Longevity

Are you confident that the vendor will continue to exist and provide needed support and updates to the product for the period of time you may need it?

Vendor Longevity, Support & Improvements
  • How long has the tech solution/developer been in business, and at what development stage is the product (beta or fully developed)?
  • What is the tool’s business model, and are you confident the tool will stay available, updated, and supported for the length of time you need it?
  • What processes does the developer use for continuous quality assurance and improvement?
  • Are there frequent updates or versions that will need to be installed, and how?
