Based on the video that
@Red Leg posted, I'd say the answer to that question is no. I have to imagine that the idea of sending a drone into combat under its own "thinking" would be unsettling to any commander. Simply sending one off with the goal of killing or destroying anything it deems to be the enemy is far too broad a mandate.
In the video it was stated that Ukraine will build one million drones this year. That number may be accurate, but I can guarantee it is constrained by a budget. As with any deployed weapon, one wants to get as much "bang for the buck" as possible, and I'm not sure AI technology has progressed to that point yet.
The main use I see for AI in these drones is finding targets that are not stationary. If a target is stationary, any accurate guidance system that already exists will do. It's when the target is moving that you need a brain on the control side — one that decides whether the target found is worth the cost incurred when the drone is ultimately sacrificed to destroy it.
Knowing that an AI-based autonomous drone is capable of such things would require testing. How does one set up a test that proves the viability of such technology? That, I think, would be very challenging.