Intuitively, d is supposed to represent some arbitrarily small (or infinitesimal) quantity, so if we wrote i = d/dt it wouldn't have much meaning on its own. (It would acquire some significance if we multiplied it by some function that depends on time.) i = dq/dt expresses the first of the two concepts you mentioned, namely that i is equal to the change in charge as time changes, for all real number values of time. Since the change in charge over some time intervals may be sharper than it is over others, i is itself a function (not necessarily constant) of t.
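To make this concrete, here is a small numerical sketch. The charge function q(t) = t^2 is just an assumption for illustration (not from your problem); its derivative is i(t) = 2t, so the current genuinely varies with t:

```python
# Hypothetical charge function for illustration: q(t) = t**2,
# whose exact derivative (the current) is i(t) = 2t.
def q(t):
    return t * t

def current(t, h=1e-6):
    # Central difference quotient: approximates dq/dt at time t.
    return (q(t + h) - q(t - h)) / (2 * h)

print(current(1.0))  # close to 2.0
print(current(3.0))  # close to 6.0
```

The two printed values differ, which is the point: i = dq/dt assigns a (generally different) current to each instant t.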

Now I'm not really sure how this concept differs from the second thing you wrote, "current equal to the change in charge with respect to the change in time". What is the change in charge with respect to the change in time, other than simply the change in charge as time changes? Perhaps you have some sense of how these are different, which I'm not seeing.

It is probably helpful not to think of Leibniz notation as anything other than shorthand for the official definition of the derivative, which is the limit, as h goes to zero, of the difference quotient. (Note: a non-standard form of calculus has been developed which in some sense literally has infinitesimal quantities, and it is provably equivalent to the calculus you're learning. So there is a kind of legitimate way of thinking about infinitesimals, but since they're not formally introduced to you, and since you can do everything you've done so far without them, you might just satisfy yourself with an intuitive understanding which, if you ever want to come back for a more formal treatment, you can ultimately do away with and replace with the formal notion of the limit of the difference quotient.)
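You can watch that limit happen numerically. Below, the example function q(t) = t^3 is my own choice for illustration; its exact derivative at t = 2 is 3·2^2 = 12, and the difference quotient (q(t + h) − q(t))/h gets closer to 12 as h shrinks:

```python
# Illustrative function (an assumption, not from the original question):
# q(t) = t**3, whose exact derivative at t = 2 is 12.
def q(t):
    return t ** 3

def diff_quotient(t, h):
    # The quantity whose limit as h -> 0 defines dq/dt.
    return (q(t + h) - q(t)) / h

for h in (1e-1, 1e-3, 1e-5):
    print(h, diff_quotient(2.0, h))  # approaches 12 as h shrinks
```

So dq/dt is not literally a ratio of two infinitesimal numbers; it is the value these ordinary ratios approach.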

Hope this helps.