Originally Posted by **bruxism**

Hi mathematics boffins. This is my first time here, and I'm posting on behalf of my very scientific and brainy girlfriend.

She's doing experiments with mice: grafting skin onto them, then watching whether they reject the graft and, if so, how long the rejection takes.

So there are two groups of mice: one a positive control, the other treated with something.

Here are the results (12 mice in each group). A 1 means the mouse rejected the graft on that day; a 0 means no rejection (those mice reached day 150 without rejecting).

+ve control

| Days post graft | Rejection |
|----------------:|:---------:|
| 90  | 1 |
| 90  | 1 |
| 103 | 1 |
| 112 | 1 |
| 125 | 1 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |

anti-CD4 after grafting

| Days post graft | Rejection |
|----------------:|:---------:|
| 45  | 1 |
| 45  | 1 |
| 56  | 1 |
| 56  | 1 |
| 61  | 1 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |
| 150 | 0 |

So, as you can see, rejection happened much faster in the second group.
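To put a quick number on that, here is a minimal sketch (plain Python) summarizing the median rejection day per group, using only the mice that actually rejected; the day-150 zeros are left out of this particular summary because those mice never rejected:

```python
from statistics import median

# Rejection days for the mice that actually rejected (the "1" rows above).
control_rejected = [90, 90, 103, 112, 125]
anti_cd4_rejected = [45, 45, 56, 56, 61]

print("median rejection day, +ve control:", median(control_rejected))   # -> 103
print("median rejection day, anti-CD4:  ", median(anti_cd4_rejected))   # -> 56
```

So among the rejecters, the typical rejection day roughly halves under anti-CD4, which matches the eyeball impression.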

So now: is the difference in the times taken to reject significant? I'm no scientist, but I remember 5% significance levels, confidence intervals, etc. from doing intro stats at uni.

How would you go about working out a time-based problem like this? The best my girlfriend can come up with is a test on the total number of rejections, which is obviously not significant at all, as the difference is only one mouse.
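This is exactly the kind of data survival analysis was designed for: each mouse contributes a time plus an event flag, and the day-150 zeros are "censored" observations (the mouse was still fine when observation stopped), not missing data. The standard two-group comparison is the log-rank test. Below is a minimal, self-contained sketch in plain Python; in practice you'd reach for a stats package (e.g. `survdiff` in R's survival package, or `logrank_test` in Python's lifelines) rather than hand-rolling it:

```python
import math

# (time, event) pairs: event=1 means rejected on that day,
# event=0 means still intact at day 150 (censored).
control = [(90, 1), (90, 1), (103, 1), (112, 1), (125, 1)] + [(150, 0)] * 7
treated = [(45, 1), (45, 1), (56, 1), (56, 1), (61, 1)] + [(150, 0)] * 7

def logrank(group1, group2):
    """Two-sample log-rank test; returns (chi-square statistic, p-value)."""
    event_times = sorted({t for t, e in group1 + group2 if e == 1})
    obs_minus_exp = 0.0   # running sum of (observed - expected) events in group 1
    variance = 0.0
    for t in event_times:
        n1 = sum(1 for ti, _ in group1 if ti >= t)             # at risk, group 1
        n2 = sum(1 for ti, _ in group2 if ti >= t)             # at risk, group 2
        d1 = sum(1 for ti, e in group1 if ti == t and e == 1)  # events, group 1
        d2 = sum(1 for ti, e in group2 if ti == t and e == 1)  # events, group 2
        n, d = n1 + n2, d1 + d2
        obs_minus_exp += d1 - d * n1 / n
        if n > 1:
            variance += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    chi2 = obs_minus_exp ** 2 / variance
    p = math.erfc(math.sqrt(chi2 / 2))  # chi-square survival function, 1 df
    return chi2, p

chi2, p = logrank(control, treated)
print(f"log-rank chi-square = {chi2:.3f}, p = {p:.3f}")
```

One caveat worth knowing before trusting any single number: with the same count of rejections in each group (5 of 12) and identical censoring, the log-rank statistic on these data comes out small, because the test weighs differences across the whole survival curve and both curves end at the same level. A test aimed specifically at the *timing* of rejections may behave quite differently, so this is a good case for talking to a statistician rather than stopping at one p-value.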

I hope this all made sense, and any help is greatly appreciated.