Combining them gives you 240v. When people use 110v / 220v terminology they mean the same thing, so for practical purposes 110 = 115 = 120 and 220 = 230 = 240. Appliances are designed to operate within that range, and the installation manual should tell …
Oct 12, 2013 · The short easy answer is yes, you can plug your 220v appliance into a 120/240v outlet, provided the configuration of plug to receptacle is the same and the outlet is wired for 240v. If the configuration is different, it will be because the ampacity rating of the appliance differs from that of the receptacle.
Bias voltage is derived from the input power transformer, so if it was adjusted with a 240V input, it will not be correct with 220V. Check your bias; there is probably a procedure for doing this in the manual, or bring it to a shop where they can simulate your voltage and adjust the bias for you.
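A minimal sketch of the point above, assuming a simple unregulated bias supply whose voltage scales in proportion to the mains input. The -45 V figure and the proportional model are illustrative assumptions, not values from any particular amp's manual.

```python
# Hedged sketch: if the bias voltage comes straight off the mains
# transformer with no regulation, it scales roughly in proportion to
# the input voltage. All numbers here are illustrative.

def scaled_bias(bias_at_nominal, nominal_mains, actual_mains):
    """Estimate the bias voltage when the mains differs from the
    voltage the amp was adjusted at (simple proportional model)."""
    return bias_at_nominal * actual_mains / nominal_mains

# An amp biased to -45 V on a 240 V input, then run on 220 V mains:
bias_on_220 = scaled_bias(-45.0, 240.0, 220.0)  # -41.25 V, i.e. biased hotter
```

A roughly 8% shift like this is why the answer suggests re-checking bias after a mains-voltage change.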
Jan 31, 2019 · Most 220V devices will work fine on 240V. However, on some devices (mostly the ones that do not have a voltage regulator), it will cause over-voltage for internal circuits, which leads to a shortened life. · Yes, they are the same thing and used in US residential homes. If you take two 120v lines, …
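The "take two 120v lines" point can be sketched numerically: in US split-phase service the two 120 V RMS legs are 180° out of phase, so the voltage between them is 240 V RMS. The waveform sampling below is purely illustrative.

```python
import math

# Hedged sketch of US split-phase service: two 120 V RMS legs,
# 180 degrees apart. Leg-to-neutral is 120 V RMS; leg-to-leg is 240 V RMS.

def leg(v_rms, phase_deg, t, freq=60.0):
    """Instantaneous voltage of one leg at time t (seconds)."""
    return v_rms * math.sqrt(2) * math.sin(
        2 * math.pi * freq * t + math.radians(phase_deg))

# Sample one full 60 Hz cycle and measure the RMS of the difference
# between the two legs:
n = 10000
samples = [leg(120, 0, i / (60 * n)) - leg(120, 180, i / (60 * n))
           for i in range(n)]
rms_between_legs = math.sqrt(sum(s * s for s in samples) / n)  # ~240 V
```

This is why "combining them gives you 240v": the two legs add, rather than cancel, when measured across each other.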
Related questions:
- Is 220 volts the same as 240 volts? (Feb 01, 2019)
- What is the difference between 220 and 240 volt? (Jan 28, 2019)
- How does 208V differ from 220V? (Jan 24, 2019)
- Can you plug a 230V 50Hz appliance into a 240V 60Hz outlet? (Jan 20, 2019)
Nov 16, 2007 · I can save a lot of money by buying it online from the US, so would prefer to do that, but don’t want to fry it, myself, or the house. To add to my confusion, there are lots of similar previous questions, but some cite the mains voltage in the UK as 240v, some as 230v, and some as 220v.
Oct 22, 2006 · BlueStar. The only thing you might notice is that things with motors might run very slightly slower, but it would be hardly noticeable. After all, if you run an appliance at the end of a long extension cable, or have a shed at the end of the garden powered from the house, you will experience volt-drop, which will result in your 240v AC being decidedly …
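A minimal sketch of the volt-drop effect mentioned above. The cable resistance and load current are made-up illustrative values, not measurements of any real installation.

```python
# Hedged sketch: the voltage reaching an appliance at the end of a long
# cable is the supply voltage minus the I*R drop across the cable
# (Ohm's law). Values below are purely illustrative.

def voltage_at_load(supply_v, load_current_a, cable_resistance_ohm):
    """Voltage left at the appliance after the drop across the cable."""
    return supply_v - load_current_a * cable_resistance_ohm

# 240 V supply, 10 A load, long extension lead with 0.8 ohm round-trip
# conductor resistance:
v_at_shed = voltage_at_load(240.0, 10.0, 0.8)  # 232.0 V at the far end
```

So a shed at the end of the garden already runs a little below nominal, which is the poster's point: mains-powered gear tolerates this routinely.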
The mathematical reference is Ohm’s Law and the Power Triangle. If you plug a 110V appliance into a 220V outlet (same as 120v into 230v or 240v), you can only hope that some protection device disconnects the power to the appliance. Otherwise: if it is some kind of heating device (toaster, …
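The Ohm's-law arithmetic behind that warning can be made concrete: a purely resistive appliance (heater, toaster) at double its rated voltage draws double the current and dissipates four times the power. The 900 W toaster is an illustrative assumption.

```python
# Hedged sketch: for a fixed resistance, R = V^2 / P at the rating,
# so at a different voltage P' = V'^2 / R = rated_w * (V'/V)^2.

def power_at_voltage(rated_v, rated_w, applied_v):
    """Power a fixed-resistance appliance dissipates at applied_v."""
    return rated_w * (applied_v / rated_v) ** 2

# A hypothetical 110 V, 900 W toaster plugged into a 220 V outlet:
p_overvolted = power_at_voltage(110.0, 900.0, 220.0)  # 3600 W, 4x rated
```

Four times the rated heat is why the answer says you can only hope a fuse or breaker opens first.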
Best answer (13 votes): Probably nothing serious would happen, as has been mentioned already, but there is a potentially serious situation to consider. Consider an appliance only intended for operation on 240V AC but able to work from (say) 200V to 250V. To do so it might use a switch-mode power supply to regulate the internal DC voltages. Let’s say it required 100 watts internally, maybe some form of audio amplifier. At 250 volts AC it would draw 0.4 amps plus 10% more for inefficiencies – that’s a current of 440 mA. At 200 volts AC it would draw 550 mA. At 100 V AC it would try to draw a current of nearly an amp if it were able. The point is that it will try to draw more current at a lower AC voltage, and this could blow an internal fuse or damage the switching transistor – the average current may only be 1 amp but the switching current might be 10 amps. Also, at a lower voltage (with the increase in current) the reservoir capacitor after the bridge rectifier will struggle to maintain low ripple, and between cycles the DC voltage before the switching element may sag to only 50 volts – this means a higher instantaneous current draw on a cyclic basis and possibly more damage to the regulating switching transistor.

Answer (11 votes): There could be damage. With half the RMS AC voltage – that is, half the force pushing charge through the device – we might expect half the current flow. If the device acts like a simple resistor, that’s exactly true. That means 1/4 of the normal amount of power is used by the device. If the device has capacitive or inductive reactance, and has nonlinear effects, then no. Still, with no specific device under discussion, we may as well assume one quarter the power usage. If that power is primarily working a motor, then the motor will be spinning slower. (Duh.) Some motors depend on spinning at a high speed to keep themselves cool. If it’s not spinning fast enough, maybe it won’t keep itself as cool. But at (probably) 1/4 the power, it’s not getting as hot, either. Will friction or load keep the motor from spinning at all? Whether the cooling effect is diminished in the same proportion as the motor heating depends on the actual type of appliance, the load the motor is pushing, the presence of voltage regulating circuits, and, for all I know, the appliance’s astrological birth chart. That’s just considering basic motor physics. The range of parts and physical phenomena in a generic unspecified household appliance is vast, so it is not possible to rule out some other way that half-voltage input could cause damage. Short answer: without further info it’s guesswork, but the range of guesses must include the possibility of damage. There is only one way to find out, assuming you can make the plug fit the socket.

Answer (7 votes): In a linear situation (an electric blanket, for example), the power will just be reduced to 25%. Switching power supplies such as PC power supplies (the kind with a slide switch to select the input voltage) will attempt to produce the required output power with the available voltage, and unless some kind of under-voltage lockout or thermal protection kicks in they could be damaged – the power devices will get much hotter than normal. The most susceptible to damage are appliances like small refrigerators that require enough motor torque to get past the compressor torque humps. With low voltage in (such as a brown-out), the compressor can stall (reducing the motor back-EMF to zero) and thus draw a much higher current than usual, all of it converted to heat. As a bonus, any cooling fans will not operate at full efficiency, if at all.

Answer (3 votes): If you have a 240 VAC motor which is running at 90% efficiency and is putting out one horsepower into a load, then since one horsepower is 746 watts, it’ll be dissipating about 75 watts internally in order to deliver that 746 watts to the load. That’s about 820 watts total, and since the input to the motor is 240 VAC, it’ll be drawing about 3.4 amps once it gets up to speed. When it’s just starting, however, it can easily draw ten times that current and dissipate it in the stator’s winding resistance, so that power would be 240V * 34A = 8160 watts, and the stator’s winding resistance would be 240V / 34A ~ 7 ohms. Now, if you were to connect 120V to the motor and the static load on the shaft was high enough to keep the rotor from turning, then that 120V would see only the stator’s 7 ohm winding resistance, and it would cause the stator’s winding to dissipate: P = E² / R = 120V² / 7Ω ~ 2057 watts! Then, since the motor was designed, ostensibly, to rise to a fixed temperature above ambient with 75 watts, steady state, being dissipated in the stator’s winding, 2057 watts in it would certainly cause some damage after even a short time.

Answer (3 votes): I have a radial arm saw that was given to me by a friend. It started slow and did not have much power to cut (it bogged down easily and the motor smelled hot). I was going to take it to have the motor windings re-wound, and upon disassembly I discovered that it was a dual-voltage motor (120–240V) that had been plugged into a 120v outlet but was tapped for 240V operation. After I moved the input wiring to the 120V tap and reassembled the motor, it started much faster and ran with the expected cutting ability of the 3/4 hp it was rated; I no longer smelled “hot”. I know this is not a technical explanation but a practical example for comparison. The 120V supplied to the 240V motor “worked”, sort of, but it did not allow the motor to perform at its peak efficiency or expected torque.

Answer (1 vote): Many electronic devices nowadays are rated for any voltage between 100 and 240V, for precisely this reason. However, anecdotally, there are situations where too low a voltage can cause damage. A few years ago I owned a particularly cheap phone, which came with a particularly cheap charger, which was only rated for 230V. When running on 115V, I found that the charger would not charge the phone, and indeed appeared to discharge it. Given the low quality of both the phone and the charger, I suspect the charger simply transformed and rectified the voltage, which would mean the voltage dropped from 4.2V to 2.1V – too low to charge a Li-ion battery.

Answer (0 votes): I’m in the process of field testing a 277v single-phase motor I got for nothing at 120v. I am using it as a whole-house fan. It came with a squirrel cage fan and housing. I had planned to hook it up on a double pole switch feeding a three pole to rig it for either 240 or 120, but after hearing it run furiously at 240v, I decided to dial it back to just 120v. The current draw from this motor at 120v is ~2.3A and at 240v it is ~4.5A. I have heard a lot of otherwise smart-sounding people on here state that as voltage goes down, the amperage goes up. But I really think a lot of people are forgetting that this is only the case when the output power remains constant. I’m sure that with the motor running slower it will have more heating from trying to attain its design speed and always failing, but I think it’ll hold in there. So far, the motor has no problem getting to speed and doesn’t seem to get too hot.

Answer: No. If we connect any appliance below its rated voltage, it will just give output below what it can give at rated voltage. To explain simply: assume the impedance of the appliance is 50 ohms. Connected to 250 V it will draw 5 A, but connected to 100 V it will draw only 2 A, because without changing the frequency of the supply the impedance of the appliance will not change. So at lower voltage it will underperform, and at higher voltage it may be damaged.
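The stalled-motor arithmetic in the answers above can be sketched directly. The winding resistance is derived from the illustrative 34 A starting draw used there, not measured from any real motor.

```python
# Hedged sketch: with the rotor locked, the winding looks like a plain
# resistance, so all the input power is dissipated as heat: P = V^2 / R.
# Figures follow the worked example in the answers (illustrative only).

def stall_dissipation_w(applied_v, winding_ohm):
    """Power dissipated in the stator winding when the rotor is locked."""
    return applied_v ** 2 / winding_ohm

winding_ohm = 240.0 / 34.0        # ~7 ohm, inferred from the 34 A start
p_stalled_120 = stall_dissipation_w(120.0, winding_ohm)  # ~2040 W
```

Roughly two kilowatts into a winding designed for tens of watts of steady-state loss is the core of the damage argument: it is the stall, not the low voltage itself, that destroys the motor.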
A 110v to 220v converter is an incredibly useful device. With a voltage converter, you don’t have to call an electrician to get the voltage you need. If your appliance draws more than 20A or 4800W continuous at 220-240v, you cannot run it on a Quick 220 Can I run ANY 220v appliance …
Dec 19, 2014 · So I guess what you’re saying is that a 230v rated motor will run anywhere in my imagined 200-volt range, right? Maybe a bit more power on 240v or a little less on 220v or 208v. I’ll take a look at your link, wheels; it’ll just take me a little time to catch up. I ain’t the young sprout that I once was; one thing at a time now-a-days is all I can muster.
Mar 12, 2007 · Has anybody here ever run a 220v amp in the UK? Discussion in ‘Amps and Cabs’ started by Random. Correction: it’s a 5% margin! Therefore it’s possible to have 220v and 240v within the 230v zone – so everybody is happy and nothing changes. … and reduces internal voltages, as they say in the manual – and by doing so, you can run 6V6s in …
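The harmonisation point can be checked with a one-line band test. The 5% margin is taken from the post itself; actual tolerance bands have varied by country and era, so treat the figure as an assumption.

```python
# Hedged sketch: with a 230 V nominal and the 5% margin the poster
# mentions, both the old 220 V and 240 V nominals fall inside the band.

def in_band(nominal_v, margin_frac, actual_v):
    """True if actual_v lies within nominal_v * (1 +/- margin_frac)."""
    low = nominal_v * (1 - margin_frac)
    high = nominal_v * (1 + margin_frac)
    return low <= actual_v <= high

ok_220 = in_band(230.0, 0.05, 220.0)  # True: 218.5 V <= 220 V <= 241.5 V
ok_240 = in_band(230.0, 0.05, 240.0)  # True: 218.5 V <= 240 V <= 241.5 V
```

Which is exactly the "everybody is happy and nothing changes" outcome: the old 220v and 240v networks both satisfy the 230v paper standard without rewiring anything.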
Aug 09, 2005 · I’ve emailed Roland but expect a reply only after the next lunar eclipse. Any UK (or AU, or anyone in the know) forumites have any experience plugging a 220v adapter/appliance into 240v mains? I’ve managed to find out it will ‘run hot’, but am I likely to permanently damage the …
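A rough way to quantify the "run hot" warning: for the resistive parts of a 220v-rated unit, dissipation scales with the square of the voltage, so 240v mains pushes it about 19% above design. This is an illustrative first-order estimate, not a statement about any specific Roland adapter.

```python
# Hedged sketch: for a fixed resistance, power scales as voltage squared,
# so the over-dissipation ratio on higher mains is (actual/rated)^2.

def dissipation_ratio(rated_v, actual_v):
    """Ratio of dissipated power at actual_v vs at rated_v."""
    return (actual_v / rated_v) ** 2

ratio = dissipation_ratio(220.0, 240.0)  # ~1.19, i.e. about 19% more heat
```

A sustained ~19% over design dissipation usually means hotter running and shorter component life rather than instant failure, which matches the forum's "run hot" answer.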
You can do it, but it would require changing the pigtail (the 3-prong wire in the back) from a standard 220v to a 110v, and there would be no guarantee that your appliance would operate properly. It would probably run like a car missing half its cylinders.