
Table 1 Assessment of inter-rater reliability and agreement between raters’ blinded judgement of included studies (n = 13) using the Testex screening tool

From: Effect of free-weight vs. machine-based strength training on maximal strength, hypertrophy and jump performance – a systematic review and meta-analysis

 

| Item | Criterion | Kappa | SE | P-value | Agreement (%) |
| --- | --- | --- | --- | --- | --- |
| 1 | Eligibility criteria included | 0.70^d | 0.18 | 0.005 | 86 |
| 2 | Randomization method stated | 1.00^e | 0.00 | < 0.001 | 100 |
| 3 | Allocation concealment | 1.00^e | 0.00 | < 0.001 | 100 |
| 4 | Groups similar at baseline | NA | - | - | 92 |
| 5 | Assessor blinded | Constant | - | - | 100 |
| 6a | Study withdrawals < 15% | 1.00^e | 0.00 | < 0.001 | 100 |
| 6b | Adverse events reported | 1.00^e | 0.00 | < 0.001 | 100 |
| 6c | Session attendance reported | 0.63^d | 0.33 | 0.011 | 92 |
| 7 | Intention-to-treat analysis | Constant | - | - | 100 |
| 8a | Between-group primary analysis | Constant | - | - | 100 |
| 8b | Between-group secondary analysis | Constant | - | - | 100 |
| 9 | Point measures for all outcomes | Constant | - | - | 100 |
| 10 | Activity monitoring controls | Constant | - | - | 100 |
| 11 | Relative exercise intensity adjusted | Constant | - | - | 100 |
| 12 | Exercise energy expenditure information reported | Constant | - | - | 100 |
|  | Total scores of intra-rater agreement | 0.78^d | 0.14 | < 0.001 | 85 |

Abbreviations: TESTEX, Tool for the assEssment of Study qualiTy and reporting in EXercise; SE, standard error; NA, not applicable because constant values prevent Kappa analysis; Constant, 100% agreement on the rating, preventing Kappa analysis. Superscript letters denote the level of agreement from the Kappa analysis: a = < 0.20: poor; b = [0.20, 0.40): fair; c = [0.40, 0.60): moderate; d = [0.60, 0.80): good; e = [0.80, 1.00]: very good [33]
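For readers who want to reproduce this type of agreement analysis, the minimal sketch below shows how percentage agreement and Cohen's kappa could be computed for two raters scoring a single binary TESTEX criterion across 13 studies. The rater data, function names, and printed values are hypothetical illustrations, not the authors' data or code; the sketch also shows why kappa cannot be computed when both raters give the same constant rating, which is the situation reported as "Constant" in the table.

```python
# Illustrative sketch only: percentage agreement and Cohen's kappa for two
# raters scoring one binary TESTEX criterion across 13 studies.
# The ratings below are hypothetical placeholders, not data from the review.
from collections import Counter


def percent_agreement(r1, r2):
    """Share of studies (in %) on which both raters gave the same rating."""
    matches = sum(a == b for a, b in zip(r1, r2))
    return 100.0 * matches / len(r1)


def cohens_kappa(r1, r2):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e).

    Returns None when expected chance agreement p_e equals 1, which happens
    when both raters give the same constant rating; this is the case the
    table reports as 'Constant', where kappa cannot be computed.
    """
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    counts1, counts2 = Counter(r1), Counter(r2)
    p_e = sum(counts1[k] * counts2[k] for k in set(r1) | set(r2)) / (n * n)
    if p_e == 1.0:
        return None
    return (p_o - p_e) / (1.0 - p_e)


# Hypothetical ratings: 1 = criterion met, 0 = criterion not met
rater_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
rater_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1]

print(f"Percentage agreement: {percent_agreement(rater_a, rater_b):.0f}%")
print(f"Cohen's kappa: {cohens_kappa(rater_a, rater_b):.2f}")
```

With the real item-level ratings, the same two quantities would correspond to the Kappa and percentage columns above; the standard errors and p-values in the table would come from the usual large-sample approximation for kappa, which is not reproduced in this sketch.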