I never set out to be a logician and never presumed myself to be one, but I have ended up doing and teaching a lot of logic anyway. This is partly because my work in philosophy of physics unavoidably intersected logic in more than one way, and partly because I have taught a great deal of logic over the past nearly thirty years. I am therefore a practitioner, more or less, of what is inexactly called "exact philosophy". I even served as the President of the Society for Exact Philosophy from 2014 to 2016. The duties of the President of that august group are not onerous, the most important being to give the toast at the annual banquet: "To syntax and the Void!" (whatever that may mean). But I also got to meet a lot of real logicians, and I've learned a lot from them at all those exacting meetings.
My research and writing interests in logic have gone in two main directions.
The first direction evolved out of my teaching introductory symbolic logic. Our second-year course at the University of Lethbridge covers propositional and introductory first-order predicate logic, with a strong emphasis on natural deduction techniques. All standard stuff, but I got intrigued by the challenge of finding an effective way of teaching the procedure of existential elimination. The most common way of presenting this involves an elaborate process of creating a subproof in order to get rid of the existential quantifier, doing the deduction you need to do, and then backing out of the subproof with minuet-like precision. It is a very complicated and indirect way of doing something that intuitively seems as if it ought to be simple, and even some of my best students found it incomprehensible. Some very elementary books do in fact use a simpler and more intuitive way of doing existential elimination, but unfortunately their method happens to be (strictly speaking) invalid. Could there not be an easier way to do existential elimination that is also valid? The trick in solving a problem like this is to find a way of seeing the obvious. One day, on a drive through Northern Ontario, the penny dropped. I found a simple notation for expressing so-called denoting phrases, expressions of the form "the so-and-so" or "a so-and-so". I won't get into the details here, but this gives a simple way of doing existential elimination and also greatly simplifies deductions using so-called definite descriptions. I think it may also give a simpler tool for expressing the semantics of quantificational statements, but this is a work in progress. In fact, I eventually learned that my notational tricks were really just a somewhat simpler form of Hilbert's "epsilon calculus". I call it an epsilon calculus without the epsilon.
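Since I won't spell out my notation here, a rough sketch in standard textbook notation may still convey the contrast: first the usual existential elimination rule, with its subproof and its fresh name, and then the Hilbert-style epsilon shortcut, which simply instantiates the quantifier with a witness-denoting term and carries on in the main proof. (My denoting-phrase notation is, as noted, a somewhat simpler variant of the latter; the LaTeX below is only an illustration, not my system.)

    \documentclass{article}
    \usepackage{amsmath}
    \begin{document}

    % Standard existential elimination: to use \exists x\,\varphi(x), open a
    % subproof assuming \varphi(a) for a fresh name a, derive a conclusion
    % \psi in which a does not occur, then discharge the assumption.
    \[
    \frac{\exists x\,\varphi(x)
          \qquad
          \begin{array}{c} [\varphi(a)] \\ \vdots \\ \psi \end{array}}
         {\psi}
    \;\exists\mathrm{E}
    \qquad (\text{$a$ fresh and not occurring in $\psi$})
    \]

    % Hilbert-style epsilon shortcut: instantiate the quantifier directly
    % with a term that denotes a witness, and keep working in the main proof.
    \[
    \frac{\exists x\,\varphi(x)}{\varphi\bigl(\varepsilon x\,\varphi(x)\bigr)}
    \]

    \end{document}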
I am currently working on a text on symbolic logic that incorporates some of the tricks and short-cuts I've developed over the years in my teaching. It will include a system of predicate natural deduction using denoting phrases. Who knows, it might catch on.
The other direction in which my thinking in logic has wandered is the notion that some of the new techniques in quantum computation and information theory could be applied to long-standing puzzles and paradoxes in classical logic, such as the paradox of the liar and Curry's Paradox. My work in this area is very sketchy since I haven't really had much time for it so far. (Worrying about collapsing ice sheets seems to be more immediately important.) Quantum computation offers a natural generalization of classical logic that is analogous to the way in which complex numbers are a natural generalization of real-valued mathematics. As George Spencer Brown suggested in his Laws of Form (1969), extending logic into the complex realm may offer a kind of completeness analogous to the completeness that complex numbers provide and real numbers cannot. (Many polynomial equations have no real solutions, but the Fundamental Theorem of Algebra guarantees that every polynomial equation of degree n has exactly n solutions, counted with multiplicity, provided we allow that some may be complex.) Thus, the natural generalization from classical to quantum logic that Hilary Putnam envisioned in his famous "Is Logic Empirical?" (1968) may come through quantum computation.
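A standard toy example from quantum computing (offered here only as an illustration of the general point, not as a summary of the argument) is the "square root of NOT": a gate with complex entries whose square is classical negation, even though no ordinary two-valued truth function composes with itself to give NOT. A minimal numerical sketch in Python:

    import numpy as np

    # Classical NOT as a 2x2 permutation matrix acting on the basis states |0>, |1>.
    NOT = np.array([[0, 1],
                    [1, 0]], dtype=complex)

    # The "square root of NOT" gate: a unitary matrix with complex entries
    # whose square is NOT.  No 0/1-valued truth function has this property.
    SQRT_NOT = 0.5 * np.array([[1 + 1j, 1 - 1j],
                               [1 - 1j, 1 + 1j]])

    # Applying the gate twice recovers classical negation.
    print(np.allclose(SQRT_NOT @ SQRT_NOT, NOT))   # True

    # Applied once to |0>, it yields an equal-weight superposition of |0> and |1>,
    # a state with no classical truth-value analogue.
    ket0 = np.array([1, 0], dtype=complex)
    print(SQRT_NOT @ ket0)                         # [0.5+0.5j  0.5-0.5j]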
Here is a short but I hope suggestive presentation I gave in January 2020 at the World Logic Day meeting organized at the University of Lethbridge by Gillman Payette: The Square Root of Falsehood.