Friday, October 31, 2008

New Products at MAVCC

I was perusing the MAVCC (Multistate Academic and Vocational Curriculum Consortium) website during lunch, and I wondered how many of you have recently viewed their "New Products" webpage.

New products include:

Also, check out the online catalog or contact their customer service department at 1-800-654-3988 or (405) 743-5579, or e-mail them at customerservice@mavcc.com.


National Consortium on Health Science and Technology Education


The National Consortium on Health Science and Technology Education (NCHSTE) and the CareerTech Testing Center have teamed up to provide an online testing system that provides both pre- and post-testing opportunities and reporting services. The assessment is based on the National Health Care Foundation Skills Standards. Eligible students who successfully pass this online assessment will obtain a Certificate issued by NCHSTE.

NCHSTE is a national partnership of individuals and organizations with a vested interest in health science and technology education. The consortium was organized in 1991 to stimulate creative and innovative leadership for ensuring a well-prepared health care workforce.

Below you will find links to documents that you can download and print:
Portfolio content criteria
Portfolio rubric
National Healthcare Foundation Standards and Accountability Criteria

This is an exciting partnership with the CareerTech Testing Center! If you aren't aware of NCHSTE (who they are, what they do, etc.), click on the link in their name and check them out!!!

Thursday, October 30, 2008

Understanding the Numbers

I have been thinking about "numbers," "testing," and "test interpretation" after attending recent parent/teacher conferences, and I thought it might be a good time to begin an ongoing discussion of what some of those numbers mean. When attending conferences, I always flash back to a few years ago, when I heard an instructor tell a parent that a standard score of 70 was actually OK because it was like making a "C." In fact, that score is two standard deviations below the mean, and on an intelligence test it would clearly fall in the mentally retarded range of ability.
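To put that standard score in context, here is a minimal sketch, assuming the mean of 100 and standard deviation of 15 that most intelligence tests use (the particular test from that conference is not named above):

```python
from statistics import NormalDist

# Assumed norms: most intelligence tests report standard scores with a
# mean of 100 and a standard deviation of 15.
MEAN, SD = 100, 15

def describe_standard_score(score):
    """Return the z-score and approximate percentile rank for a standard score."""
    z = (score - MEAN) / SD
    percentile = NormalDist(mu=MEAN, sigma=SD).cdf(score) * 100
    return z, percentile

z, pct = describe_standard_score(70)
print(f"Standard score 70: z = {z:.1f}, percentile = {pct:.1f}")
# Standard score 70: z = -2.0, percentile = 2.3
# Two standard deviations below the mean -- nothing like earning a "C".
```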

Anyway, I know many of you interpret a wide range of assessments and measurement instruments, so this may seem like a review, but feel free to post and tell us about some of the tests you use and what you like or dislike about the administration, scoring, and interpretation procedures of each instrument.

Most of the statistics information you will read today and in future posts comes from Jerome Sattler's "Assessment of Children," and I thought it would be good to start with norm-referenced measurement (an indication of the average or typical performance of a specified group).

A norm group should be representative of the relevant demographic populations as a whole, the number of subjects in the norm group should be large enough to ensure the stability of the test scores, and it is always important to consider how relevant the norms are to the evaluation of the examinee.

The first derived scores we will look at are age-equivalent and grade-equivalent scores.

"Age-equivalent and grade-equivalent scores are derived by determining the average score obtained on a test by individuals of various ages or grade placements. For example, if the average score of a 17-year old student on a test is 15 items correct out of 25, then any other student obtaining a score of 15 receives an age-equivalent score of 17. An age-equivalent score is found by computing the mean raw score of a measure for a group of children with a specific age. Similarly, a grade-equivalent score is found by computing the mean raw score obtained by children in each grade. If the mean score of seventh graders on an arithmetic test is 30, then a child obtaining a score of 30 is said to have arithmetical knowledge at the seventh grade level. A grade-equivalent score is expressed in tenths of a grade (for example, 10.5 refers to average performance at the middle of the tenth grade). A grade-equivalent score, therefore, refers to the level of test performance of an average student at that grade level. It does not mean that the student is performing at a level consistent with curricular expectations at his or her particular school. (Other terms for age-equivalent scores are mental age (MA) and test age.)

Age- and grade-equivalent scores require careful interpretation, for the following reasons:

1. Within an age-equivalent (or grade-equivalent) distribution of scores, the scores may not represent equal units. The difference between second and third grade-equivalent scores may not be the same as the difference between eleventh and twelfth grade-equivalent scores.
2. Many grade equivalents are obtained by means of interpolation and extrapolation. Consequently, the scores may not actually have been obtained by children.
3. Grade equivalents encourage comparison with inappropriate groups. For example, a ninth grader who obtains a grade equivalent of 11.1 in arithmetic should not be said to be functioning like an eleventh grader at the beginning of the school year; this is the wrong comparison group. The ninth grade student shares with the average eleventh grader the number of items right on the test—not other attributes associated with eleventh grade mathematical skills. The grade equivalent of 11.1 should be thought of in reference to only the student’s ninth grade group, not any other group.
4. Identical grade-equivalent scores on different tests may mean different things.
5. Grade equivalents assume that growth is constant throughout the school year; this assumption may not be warranted.
6. At upper levels, grade or age equivalents have little meaning for school subjects that are not taught at those levels or for skills that reach their peak at an earlier age.
7. Grade equivalents exaggerate small differences in performance—a score slightly below the median may result in a grade level equivalent one or two years below grade level.
8. Grade equivalents vary from test to test, from subtest to subtest within the same battery, and from percentile to percentile, thereby greatly complicating any type of comparison.
9. Grade-equivalent scores depend on promotion practices and on the particular curriculum in different grades.
10. Age- and grade-equivalent scores tend to be based on ordinal scales that are too weak to support the computation of important statistical measures, such as the standard error of measurement.

The preceding discussion indicates that grade- and age-equivalent scores are psychometrically impure; however, they still may be useful on some occasions. Grade- and age-equivalent scores place performance in a developmental context, provide information that is easily understood by parents and the public, and reduce misinterpretations. (Percentile ranks, for example, are often misinterpreted as indicating the percentage of questions that the child answered correctly.) Instead of abandoning grade- and age-equivalent scores, we should better educate people in their use."
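As a rough illustration of the derivation described above (the mean raw score of each grade's norm group becomes that grade's equivalent), here is a minimal sketch using hypothetical norms; real norm tables are much denser and are interpolated and extrapolated to tenths of a grade:

```python
# Hypothetical norms: mean raw score (out of 25 items) earned by the norm
# group at each grade placement. These numbers are made up for illustration.
GRADE_NORMS = {5: 14.0, 6: 18.0, 7: 21.0, 8: 23.0}

def grade_equivalent(raw_score):
    """Return the grade whose norm-group mean raw score is closest to raw_score."""
    return min(GRADE_NORMS, key=lambda grade: abs(GRADE_NORMS[grade] - raw_score))

# A student who answers 21 of 25 items correctly gets a grade equivalent of 7,
# meaning only that the average seventh grader also got about 21 items right --
# not that the student has mastered the seventh-grade curriculum.
print(grade_equivalent(21))  # 7
```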

Monday, October 27, 2008

Wordle

Wordle is a toy for generating “word clouds” from text that you provide. The clouds give greater prominence to words that appear more frequently in the source text. You can tweak your clouds with different fonts, layouts, and color schemes. The images you create with Wordle are yours to use however you like.

You can print them out, or save them to the Wordle gallery to share with your friends. You can paste in a bunch of text, enter the URL of any blog, blog feed, or other web page that has an Atom or RSS feed, or enter a del.icio.us user name to see their tags.
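Wordle's own code is not shown here, but as a rough sketch of the frequency weighting that drives the cloud (stop-word removal, fonts, and layout are all omitted), counting words in the source text is the core idea:

```python
from collections import Counter
import re

# Count how often each word appears; a word cloud sizes each word by this count.
text = "testing testing center career tech testing liaison career"
frequencies = Counter(re.findall(r"[a-z']+", text.lower()))

for word, count in frequencies.most_common(2):
    print(word, count)
# testing 3
# career 2
```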

Create your own here.

I think Wordle can be an interesting tool to use when making a presentation as it shows the importance of key words used within your presentation. It may be fun to use as well if you have your own website or blog. I entered the URL for the CareerTech Testing Center Blog and the graphic above was created.

Tuesday, October 21, 2008

Testing Liaison Updates

As the testing season begins, please be aware of the following updates:

1. We are asking for the student's WAVE ID number on our competency tests. If the student does not have a WAVE ID number yet, they can still use a social security number, school ID, or another unique numerical identifier. If the number used is not 10 digits long, zeroes will need to be added to the FRONT of the number. For example, if the student's ID number is 445566, it should be entered in the WAVE ID field as 0000445566 (see the padding sketch after this list).

2. If you need certificates, you may use the certificate pdf file that you have used in the past. Starting January 2009, we will have a new competency certificate available that will be printed at ODCTE and mailed to the testing liaisons.

3. Check our website regularly for updates: http://www.careertechtesting.com/

4. Testing/Data policies (e.g., when to test, waiting period, etc.) can be accessed in the 2008 Testing Liaison Policies and Procedures.

5. The Passport to Financial Literacy Act requires instruction in financial literacy for all Oklahoma students, beginning with those who start 7th grade in the 2008-09 academic year. The CareerTech Testing Center has developed skills standards and a test that address these requirements, and CIMC (Curriculum and Instructional Materials Center) now offers "Life Skills: Financial Literacy Skills," which addresses all 14 content areas listed in the recently passed legislation.

6. Please forward the blog to anyone, on or off your campus, who may benefit from the information, and ask them to join. The membership continues to grow, and we hope you will provide your comments as well. It's also our hope that this will evolve into an exchange of ideas concerning testing and career and technology education. Please let us know if you happen to find something interesting that would be relevant to the blog, and we will post it as well.
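For item 1 above, here is a minimal sketch of the left-padding rule (the 10-digit field length and the 445566 example come from the update; the helper name is just for illustration):

```python
def to_wave_id_field(identifier):
    """Left-pad an identifier with zeroes so it fills the 10-digit WAVE ID field."""
    return str(identifier).zfill(10)

print(to_wave_id_field("445566"))     # 0000445566
print(to_wave_id_field("123456789"))  # 0123456789
```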

And remember, you can always check the Testing Liaison Updates link located in the left-hand column of the blog. J.T.

Monday, October 20, 2008

New Ways to Cheat


YouTube seems to have created "celebrities" out of high-tech cheaters. "Kiki" has recently appeared on Good Morning America (see link for story and video) and has had other articles written about her as well.

A quick YouTube search finds 3,000+ videos on cheating. I wonder how well these students would perform on exams if they were actually studying instead of making videos???

Kiki's video is currently off of YouTube, but will soon be back after her agreement with Good Morning America expires. However, there are a few others I might recommend in the future. J.T.

Friday, October 17, 2008

Visit us on Wikipedia

Visit the CareerTech Testing Center on Wikipedia.

Thursday, October 16, 2008

Recommended Time for Testing

Ideally, students should be tested as soon as they have completed training and passed all skills performance evaluations. It is NOT a recommended practice to wait until the end of the academic year to test if the student is ready to test earlier.

Testing statistics show that 70% of all certification exams are passed when students take them 3 to 7 days after course completion. This amount of time typically provides adequate study time and allows testing to take place while the information is still fresh. Conversely, the same statistics show a failure rate of over 80% for students who attempt their exam immediately after a class or who wait more than two weeks after course completion.

Please review our 2008 Testing Liaison Policies and Procedures guide for more information.

Tuesday, October 14, 2008

A Brave New World-Wide-Web

I just viewed a video by David Truss, shared via the Dangerously Irrelevant Blog, that does a great job showing the transformation and tools of an instructor in the 21st century and why these tools should be used in the classroom. The video is extremely well done and I like it a lot, but I also know that there are going to be LOTS of people whose reaction to David's video is: "I DON'T WANT TO BE THAT CONNECTED." Anyway, it's food for thought as we consider the ever-changing effects of technology on today's classroom. J.T.

Thursday, October 9, 2008

New Video Game Literally a Mind Game

MAKUHARI, Japan (AFP) - Willpower is set to replace fast fingers in a new video game in which players move characters through a headset that monitors their brain waves.

California-based NeuroSky Inc. showed off the new headset -- named Mindset -- at the Tokyo Game Show, the industry's biggest exhibition which opened near the Japanese capital Thursday. The Mindset monitors whether the player is focused or relaxed and accordingly moves the character on a personal computer. "We brought this to the game show as a new interface, a new platform for game creators," NeuroSky managing director Kikuo Ito told AFP. Children's games using the system will hit the US market next year, Ito said.

"We are exploring the use of brain waves in the game industry because games are fun and so close to people," he said. "Once people get used to the idea of using brain waves for various applications, I hope we will see various products using this technology," he said. In distance learning courses, for example, teachers could monitor whether students were attentive, Ito said.

Train drivers and motorists could use it to judge their stress levels and alertness, Ito added. Japan's Keio University put similar technology to use this year to let a paralysed man take a virtual stroll on the popular Second Life website, with the machine reading what he wanted to do with his immobile legs.

NeuroSky said the Mindset could help people with other types of disabilities. "For people with difficulty speaking, this can be a tool for communication," Ito said. Ito was hopeful that the technology would eventually go on sale outside the United States. Prices have not been announced.

It will be interesting to see how this new technology can affect special education and education in general. J.T.

What Happens When a Student Cheats?

Cheating on Tests

If any candidate is caught cheating during an examination, testing will stop immediately. The candidate will receive a failing result and the incident will be reported to the CareerTech Testing Center (CTTC).

Several factors to consider when cheating occurs:
• Widespread cheating (e.g., answer copying) jeopardizes the validity of results.
• Leaking of test items damages the credibility of the individuals involved, as well as the school and the CTTC.

Tips on how cheating can be prevented:
• Advise test takers that testing is monitored continuously for irregularities and cheating.
• Minimize testing attempts.
• Use the Coaching Report to see if participants are scoring consistently, as expected. Did some participants “ace” the test unexpectedly?

Wednesday, October 8, 2008

High-Stakes Testing Puts Pressure on Educators

By Diette Courrégé (Contact)
The Post and Courier
Sunday, September 14, 2008


So much rides on public school students’ test scores. They can make or break a principal’s career. Awards, money and promotions often accompany high scores. Low scores can mean state takeover or intense public scrutiny. They can lessen neighborhoods’ home values and desirability. The increasing pressure on educators to post strong results on high-stakes tests has created ripe conditions for cheating.

Cases of educator-led cheating are cropping up across the country, from Virginia to Texas to Ohio. An analysis of ......

As always, click on the title if you want to read the entire article. J.T.

Friday, October 3, 2008

ANIMOTO

Just found a website called Animoto.com. You can upload pictures and sync them to music, free for 15 seconds. What a great way to add something interesting to your website, presentation, or report. Longer versions, with CD quality, will cost you, but you can create a class assignment, place it on YouTube, or send it to your iPhone. I think there are lots of applications....

Here is a video that demos new products at CIMC. The best thing is that it only took a few minutes to make:

Thursday, October 2, 2008

Helpful Hints for Testing Liaisons

Here are a few helpful hints that will make the testing process easier and better for all of your students (view the complete 2008 Testing Liaison Policies and Procedures for more information).


The testing environment should be such that participants can concentrate on their assessments with minimal distractions. Considerations regarding the testing environment include:
o Consistent/adequate lighting levels.
o Temperature at a comfortable level with proper ventilation.
o Space is quiet with minimal distractions.
o Participants should be asked to behave consistently (no eating, getting up and moving about).
o Avoid/delay the test administration when a participant appears hurried, troubled, or ill.

Responsibilities of the Test Proctor include:
o Participant authentication: a picture ID should always be shown and login should be handled quickly and quietly by the proctor.
o Protection of the security of the online testing system. Username AND password should NEVER be revealed.
o Prohibiting the use of all communication devices (photos of test items and text messaging are common problems).
o Computer usage: Monitor whether participants are trying to access the internet or other programs.
o The proctor should be vigilant in observing the testing environment (note passing, hand gestures, etc.).
o Reference materials, texts, notes, etc., are not allowed in the testing area unless specifically allowed for in the exam or in a student’s Individualized Education Plan.
o If a candidate is caught cheating during an examination, testing will stop immediately. The candidate will receive a failing result and the incident will be reported to the CareerTech Testing Center (CTTC).
o Students may have special accommodations as specified in an IEP, IRP, 504 plan, or LEP/ELL documentation.


Find out more about test administration and the CareerTech Testing Center.