Earlier this week, CNN Business published a video of its transportation editor, Michael Ballaban, testing Tesla's "Full Self-Driving" (FSD) advanced driver assistance system on the streets of New York City. On a number of occasions, he was forced to assume control as the system made jerky, indecisive, or seemingly dangerous decisions.

Although the nature of the system, which remains in Beta form, has been disputed, it is widely alleged that the name oversells the product. While FSD can take over many driving responsibilities, the "Full Self-Driving" branding has been criticized as misleading to customers.

Indeed, the test shows that while the system functioned well when the road was clear, human intervention was frequently required in trickier situations involving other vehicles, pedestrians, and cyclists.


In one scenario, with a cyclist positioned between the Tesla and a vehicle ahead, the FSD system swerved unnecessarily to avoid the cyclist and turned into the path of an oncoming delivery truck.

In another situation, the car failed to detect fences blocking a lane on the other side of an intersection and turned aggressively into the next lane. In a third incident, the Tesla stopped at a green light.

“You know, it does seem to be making other drivers upset,” said Ballaban of the system. He admitted that his own unease with the system may have affected the results of this informal test, but the footage suggests that there were situations in which an accident may have occurred without his intervention.

Although driving through New York City is challenging for human drivers, too, Ballaban pointed out that calling the system "Full Self-Driving" is simply inaccurate.