On the other hand, when the motor inertia is larger than the load inertia, the motor will require more power than is otherwise necessary for the particular application. This increases costs, both because it requires paying more for a motor that's larger than necessary and because the increased power consumption raises operating costs. The solution is to use a gearhead to match the inertia of the motor to the inertia of the load.

Recall that inertia is a measure of an object's resistance to change in its motion and is a function of the object's mass and shape. The greater an object's inertia, the more torque is needed to accelerate or decelerate the object. This means that when the load inertia is much larger than the motor inertia, the mismatch can cause excessive overshoot or increase settling times. Both conditions can reduce production line throughput.

Inertia Matching: Today's servo motors produce more torque relative to frame size, thanks to dense copper windings, lightweight materials, and high-energy magnets. This creates greater inertial mismatches between servo motors and the loads they are trying to move. Using a gearhead to better match the inertia of the motor to the inertia of the load allows the use of a smaller motor and results in a more responsive system that's easier to tune. Again, this is accomplished through the gearhead's ratio, where the inertia of the load reflected back to the motor is reduced by a factor of 1/ratio^2.
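To make the reflected-inertia arithmetic concrete, here is a minimal sketch in Python. The inertia values and the 10:1 ratio are illustrative assumptions, not figures from a particular application.

```python
def reflected_inertia(load_inertia, ratio):
    """Load inertia as seen by the motor through a gearhead.

    Reflected inertia drops by the square of the gear ratio.
    """
    return load_inertia / ratio**2

# Illustrative numbers: a 0.05 kg-m^2 load driven by a motor with
# 0.0005 kg-m^2 rotor inertia, a 100:1 mismatch when direct-driven.
load_j = 0.05     # kg-m^2 (assumed)
motor_j = 0.0005  # kg-m^2 (assumed)

print(load_j / motor_j)                         # 100.0 -> hard to tune
print(reflected_inertia(load_j, 10) / motor_j)  # 1.0 -> matched by a 10:1 gearhead
```

Because the ratio enters as a square, even a modest gearhead ratio can collapse a large mismatch: here a 10:1 ratio turns a 100:1 mismatch into a 1:1 match.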

As servo technology has evolved, with manufacturers producing smaller yet more powerful motors, gearheads have become increasingly essential partners in motion control. Finding the optimum pairing must take into account many engineering considerations.
So how does a gearhead deliver the performance required by today's more demanding applications? It all goes back to the basics of gears and their ability to change the magnitude or direction of an applied force.
The gears and the number of teeth on each gear create a ratio. If a motor can generate 20 in-lb of torque and a 10:1 gearhead is mounted on its output, the resulting torque will be close to 200 in-lb. With the ongoing emphasis on developing smaller footprints for motors and the equipment they drive, the ability to pair a smaller motor with a gearhead to achieve the desired torque output is invaluable.
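As a rough illustration of that multiplication, the sketch below applies the ratio and an assumed gearhead efficiency to the 20 in-lb example; the 95% efficiency figure is a placeholder, not a value from this article.

```python
def output_torque(motor_torque, ratio, efficiency=0.95):
    """Ideal gearhead output torque, derated by an assumed efficiency.

    Torque scales with the ratio; real gearheads lose a few percent
    to friction, which is why the result is close to, rather than
    exactly, ratio times the input. The 95% efficiency is a placeholder.
    """
    return motor_torque * ratio * efficiency

print(output_torque(20, 10))  # 190.0 in-lb, close to the ideal 200
```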
A motor could be rated at 2,000 rpm, but your application may only require 50 rpm. Trying to run the motor at 50 rpm may not be optimal, for the following reasons:
1. If you are running at a very low velocity, such as 50 rpm, and your motor feedback resolution is not high enough, the update rate of the electronic drive can cause velocity ripple in the application. For example, with a motor feedback resolution of 1,000 counts/rev, you have a measurable count every 0.36 degrees of shaft rotation. If the electronic drive you are using to control the motor has a velocity loop update of 0.125 milliseconds, it will look for that measurable count every 0.0375 degrees of shaft rotation at 50 rpm (300 deg/sec). When it does not see that count, it speeds the motor up to find it. By the time it finds the next measurable count, the rpm has become too fast for the application, so the drive slows the motor back down to 50 rpm and the whole process starts over. This constant increase and decrease in rpm is what causes velocity ripple in an application (the arithmetic is sketched just after this list).
2. A servo motor running at low rpm operates inefficiently. Eddy currents are loops of electric current induced within the motor during operation. These eddy currents produce a drag force within the motor and have a greater negative effect on motor performance at lower rpm.
3. An off-the-shelf motor's parameters may not be well suited to running at low rpm. When an application runs such a motor at 50 rpm, it is essentially not using all of the motor's available rpm. Because the voltage constant (V/krpm) of the motor is set for a higher rpm, the torque constant (Nm/A), which is directly linked to it, is lower than it needs to be. As a result, the application requires more current to drive it than if it had a motor designed specifically for 50 rpm.
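Here is the arithmetic behind the first point as a small sketch, using the 1,000 counts/rev and 0.125 ms figures from the example. It shows that at 50 rpm roughly ten velocity-loop updates pass between feedback counts, which is the gap that produces the hunting behavior described above.

```python
# Velocity-loop updates per feedback count at low speed (example figures).
counts_per_rev = 1000     # feedback resolution from the example
loop_period_s = 0.125e-3  # drive velocity-loop update period, 0.125 ms
rpm = 50

deg_per_count = 360 / counts_per_rev        # 0.36 deg between measurable counts
deg_per_s = rpm * 360 / 60                  # 300 deg/s at 50 rpm
deg_per_update = deg_per_s * loop_period_s  # 0.0375 deg per loop update

# About 9.6 loop updates pass between measurable counts, so most updates
# see no new count and the drive hunts around the 50 rpm setpoint.
print(deg_per_count / deg_per_update)
```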
A gearhead's ratio reduces the motor rpm, which is why gearheads are sometimes called gear reducers. With a 40:1 gearhead, a motor speed of 2,000 rpm at the gearhead's input becomes 50 rpm at its output. Operating the motor at the higher rpm lets you avoid the concerns raised in points 1 and 2. For point 3, it allows the design to draw less torque and current from the motor, thanks to the mechanical advantage of the gearhead.
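A short sketch of that trade-off, reusing the 40:1 ratio from the example: the gearhead divides speed and divides the torque demanded of the motor by the same factor, so the motor can run near its rated rpm while the load turns at 50 rpm. The load torque and torque constant values below are assumed placeholders, not figures from the article.

```python
# Speed/torque trade through a 40:1 gearhead (illustrative values).
ratio = 40
motor_rpm = 2000     # motor runs near its rated speed
load_torque = 200.0  # in-lb required at the load, assumed placeholder
kt = 4.0             # motor torque constant, in-lb per amp, assumed placeholder

output_rpm = motor_rpm / ratio      # 50 rpm at the load
motor_torque = load_torque / ratio  # 5 in-lb from the motor (losses ignored)
motor_current = motor_torque / kt   # 1.25 A, versus 50 A if direct-driven

print(output_rpm, motor_torque, motor_current)
```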