Elizabeth M. Boyd

Biography

Kennesaw State University - Management

Organizational Psychology PhD
Nonprofit Organization Management
Mount Pleasant, Michigan
Dr. Liz Boyd earned her PhD in Organizational Psychology, then spent eight years as a professor in psychology and business departments.
After experiencing a series of very serious health issues and consulting with dozens of doctors, she finally found relief and healing through the use of cannabis.
In 2019 Dr. Boyd founded Free Green, an organization currently seeking non-profit status, whose mission is to provide cannabis education, resources, and products to Veterans living in the State of Michigan.
Too many Veterans suffer from conditions that could be helped greatly through the use of cannabis (chronic pain, anxiety/depression/PTSD, seizure disorders, opiate addiction).
Veterans: Our mission is to “Dank you for your service” by providing everything you need to use medicinal cannabis responsibly, totally free of charge.


Experience

  • Free Cannabis for Veterans

    Self Employed

    Elizabeth worked at Free Cannabis for Veterans on a self-employed basis.

  • Indiana University Purdue University Indianapolis

    Assistant Professor

    Elizabeth worked at Indiana University Purdue University Indianapolis as an Assistant Professor.

  • Free Green Michigan

    Director Of Operations

    Elizabeth worked at Free Green Michigan as Director of Operations.

  • Kennesaw State University

    Assistant Professor and Research Director, Women's Leadership Center

    Elizabeth worked at Kennesaw State University as an Assistant Professor and as Research Director of the Women's Leadership Center.

Education

  • Michigan State University

    MA

    Industrial/Organizational Psychology

  • Michigan State University

    PhD

    Organizational Psychology

Publications

  • Detecting and deterring insufficient effort responding to surveys.

    Journal of Business and Psychology

    Purpose: Responses provided by unmotivated survey participants in a careless, haphazard, or random fashion can threaten the quality of data in psychological and organizational research. The purpose of this study was to summarize existing approaches to detect insufficient effort responding (IER) to low-stakes surveys and to comprehensively evaluate these approaches.
    Design/Methodology/Approach: In an experiment (Study 1) and a nonexperimental survey (Study 2), 725 undergraduates responded to a personality survey online.
    Findings: Study 1 examined the presentation of warnings to respondents as a means of deterrence and showed the relative effectiveness of four indices for detecting IE responses: response time, long string, psychometric antonyms, and individual reliability coefficients. Study 2 demonstrated that the detection indices measured the same underlying construct and showed the improvement of psychometric properties (item interrelatedness, facet dimensionality, and factor structure) after removing IE respondents identified by each index. Three approaches (response time, psychometric antonyms, and individual reliability) with high specificity and moderate sensitivity were recommended as candidates for future application in survey research.
    Implications: The identification of effective IER indices may help researchers ensure the quality of their low-stake survey data.
    Originality/Value: This study is a first attempt to comprehensively evaluate IER detection methods using both experimental and nonexperimental designs. Results from both studies corroborated each other in suggesting the three more effective approaches. This study also provided convergent validity evidence regarding various indices for IER.

  • A Qualitative Exploration of Reactions to Work-Life Conflict Events

    Research in Careers

  • It's All Relative: Social Comparison and Work-Family Conflict

    EJWOP

  • From "work-family" to "work-life": Broadening our conceptualization and measurement

    Journal of Vocational Behavior

  • Moving Beyond Work–Family: Establishing Domains Relevant to Work–Life Conflict

    SIOP 2012
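
The abstract of "Detecting and deterring insufficient effort responding to surveys" above names four screening indices: response time, long string, psychometric antonyms, and individual reliability coefficients. The sketch below is a minimal illustration, not code from the paper, of how three of those indices might be computed for a Likert-type survey; the column names, cutoffs, and antonym item pairs are assumptions made for the example.

import numpy as np
import pandas as pd


def flag_fast_responses(seconds, min_seconds_per_item, n_items):
    """Response-time screen: flag respondents who finished faster than a
    plausible per-item pace (the threshold is an assumption, not from the paper)."""
    return seconds < min_seconds_per_item * n_items


def long_string_index(items):
    """Long-string index: longest run of identical consecutive answers
    for each respondent (each row of `items`)."""
    def longest_run(row):
        values = row.to_numpy()
        best = current = 1
        for prev, cur in zip(values[:-1], values[1:]):
            current = current + 1 if cur == prev else 1
            best = max(best, current)
        return best
    return items.apply(longest_run, axis=1)


def psychometric_antonyms_index(items, antonym_pairs):
    """Psychometric antonyms: within-person correlation across item pairs that
    are strongly negatively correlated in the full sample. Attentive respondents
    should score near -1; careless ones drift toward 0 or above."""
    def person_corr(row):
        a = np.array([row[x] for x, _ in antonym_pairs], dtype=float)
        b = np.array([row[y] for _, y in antonym_pairs], dtype=float)
        if a.std() == 0 or b.std() == 0:
            return np.nan
        return float(np.corrcoef(a, b)[0, 1])
    return items.apply(person_corr, axis=1)


# Hypothetical usage: survey_df holds Likert items q1..q20 plus a completion
# time in seconds; the item pairs and cutoffs below are illustrative only.
# survey_df = pd.read_csv("survey.csv")
# item_cols = [f"q{i}" for i in range(1, 21)]
# flags = pd.DataFrame({
#     "too_fast": flag_fast_responses(survey_df["seconds"], 2.0, len(item_cols)),
#     "long_string": long_string_index(survey_df[item_cols]) >= 10,
#     "antonyms": psychometric_antonyms_index(
#         survey_df[item_cols], [("q1", "q4"), ("q2", "q7")]) > 0,
# })
# suspect = flags.any(axis=1)  # candidate IE respondents to inspect or remove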

Possible Matching Profiles

The following profiles may or may not be the same professor:

  • Megan Elizabeth Boyd (-40% Match)
    Limited Term Faculty
    Georgia State University

  • Elizabeth M Boyd (70% Match)
    Assistant Professor
    Kennesaw State University

  • Elizabeth Boyd (00% Match)
    Faculty
    University Of Baltimore - 360228