A vector field with only curved flux lines can have a non-zero divergence. True or false? Explain.
True. Gauss's (divergence) theorem says that, for any field, the integral of its divergence over any region of space equals the flux of the field through the boundary surface of that region, i.e. the flux of field lines crossing that surface. If such a surface encloses a point at which field lines start or end, the flux through it does not vanish, so the divergence of the field cannot be zero there.

The divergence is basically the surface integral of the field out of a small box, or other small closed shape, divided by the shape's volume, in the limit as the volume goes to zero. It can be looked at as how much the vectors of the field in a small region point away from a point, that is, how much they diverge, meaning go in different directions. What decides the divergence is this net outward flux, not whether the field lines are straight or curved, so purely curved flux lines are no obstacle to a non-zero divergence.
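Written in standard notation (F the vector field, V a region of space with closed boundary surface S, dA the outward area element), the two facts used above are

\[ \int_V (\nabla\cdot\mathbf{F})\,dV \;=\; \oint_S \mathbf{F}\cdot d\mathbf{A} \]

\[ \nabla\cdot\mathbf{F} \;=\; \lim_{V\to 0}\,\frac{1}{V}\oint_S \mathbf{F}\cdot d\mathbf{A} \]

where the second line is the coordinate-free definition of divergence as net outward flux per unit volume.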
For example, if at a point the arrows used to represent the field all point in the same direction and have the same length (a locally uniform field), they are not diverging, and the divergence is zero. Looking at it from the point of view of the flux out of a small surface, the flux into the surface on one side is cancelled by the flux out of it on the other side.
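As a quick check in Cartesian coordinates (F_0 here is just a constant introduced for the example), a uniform field gives

\[ \mathbf{F} = F_0\,\hat{\mathbf{x}} \quad\Rightarrow\quad \nabla\cdot\mathbf{F} = \frac{\partial F_0}{\partial x} = 0, \]

so whatever flux \(F_0\,\Delta y\,\Delta z\) enters one face of a small box leaves through the opposite face.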
However, if the arrows change from one side of the shape to the other, say curving around so that they start to point in different directions, or growing stronger or weaker, they are diverging. In terms of flux, either the flux through one side of the shape is not as large as the opposing flux through the other side, so the fluxes don't cancel, or the arrows point in different directions on opposite sides of the shape, so the flux is all into or all out of the shape. Either way there is a non-zero net flux and hence a non-zero divergence, and nothing in this argument requires the field lines to be straight.
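A concrete field that illustrates this (my own choice of example, not part of the question, written in cylindrical coordinates \((\rho,\phi,z)\) with unit vectors \(\hat{\boldsymbol{\rho}},\hat{\boldsymbol{\phi}}\)) is

\[ \mathbf{F} = \hat{\boldsymbol{\rho}} + \hat{\boldsymbol{\phi}}, \qquad \nabla\cdot\mathbf{F} = \frac{1}{\rho}\frac{\partial(\rho\cdot 1)}{\partial\rho} + \frac{1}{\rho}\frac{\partial(1)}{\partial\phi} = \frac{1}{\rho} \neq 0. \]

Its field lines satisfy \(d\rho = \rho\,d\phi\), i.e. \(\rho = \rho_0 e^{\phi}\), which are spirals: every flux line is curved, yet the divergence is non-zero everywhere the field is defined.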
If it's useful, please upvote.