But I must tell you how idiosyncratic, course-specific, and nonstandard I find it.
It seems to me that this goes with a particular textbook, maybe computer science?
I hope you find someone here who is familiar with this approach.
But again, it does seem to me to be out of the mainstream.
I hope someone here can help you.
Well, you definitely need to assume . And then, since you are trying to prove , I would assume as the start of a subproof. Then, it seems to me, you should try elimination on the assumption. Remember: being able to derive anything from a contradiction is a powerful tool! Often you reach a contradiction and can then jump straight to your conclusion within the current subproof.
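The strategy above (assume the antecedent in a subproof, reach a contradiction, and conclude anything) can be sketched in Lean 4, whose proof terms mirror Fitch-style subproofs. The proposition names P and Q here are my own placeholders, not the original problem:

```lean
-- A hedged sketch, not the original exercise: assume the antecedent
-- to open a subproof, then close it via contradiction.
example (P Q : Prop) (h : ¬P) : P → Q := by
  intro hp           -- assume P, starting the subproof
  exact absurd hp h  -- hp : P and h : ¬P contradict;
                     -- from a contradiction, anything (here Q) follows
```

The `absurd` step is Lean's packaging of exactly the "jump to your conclusion from a contradiction" move described above.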
Reply to Plato: I don't know how widespread Fitch is; it certainly seems all the rage at Stanford. It was the method I was taught at Virginia Tech (of course, the prof I had came from Stanford, so perhaps that explains that). The Barwise and Etchemendy book Language, Proof, and Logic uses Fitch in conjunction with the natural deduction rules.
I like the natural deduction rules because they are an incredibly well-organized way of remembering your inference rules. They compare very favorably with Copi's long list of rules. Each symbol has its introduction and its elimination rule, and that's really all you have to remember.
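That intro/elim pairing can be illustrated in Lean 4, where the term constructors for each connective line up with the natural deduction rules (the example itself is mine, chosen just to show the pairing):

```lean
-- ∧ has one introduction rule (And.intro) and two elimination
-- rules (And.left, And.right); proving Q ∧ P from P ∧ Q uses
-- both eliminations followed by the introduction.
example (P Q : Prop) (h : P ∧ Q) : Q ∧ P :=
  And.intro (And.right h) (And.left h)
```

Knowing the connective in your goal tells you which introduction rule to try, and the connectives in your premises tell you which eliminations are available, which is exactly why there is so little to memorize.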
Taken together, Fitch and natural deduction rules are a very powerful, easy-to-use system, in my opinion.