Tesla: Development and Technology News
#1241
Ex-OEM King
Would explain the engrish.
I know but in this instance he's talking about home charging, not public charging.
It's fun backing him into a corner and watching him squirm though.
Still waiting for that Moab trip with the Ridgeline buddy.
#1242
In fact, that's the fun part of reading his posts. You never know what’s coming next, like listening to a radio station.
#1243
Sanest Florida Man
Here’s all the new shit in the 2022 3/Y
#1244
Sanest Florida Man
hairpin wound drive unit is something Sandy Munro had been wanting for a while
https://news.mgmotor.eu/why-1-percen...
Hairpin winding
Compared with round wire windings, this Hairpin technology increases the copper cross-section in the slot under the same conditions. The thicker the wire, the smaller the resistance and the less energy is lost as heat in the wire. And because the Hairpin motor has shorter winding ends than a round-wire motor, it also reduces copper loss and further improves efficiency. The electric motor thus uses less energy.
Never underestimate 1%
This technology increases the average efficiency of an electric motor by more than 1% compared to a motor without Hairpin wiring. Indeed, “only” 1%. But never underestimate 1%, as this improvement provides several significant benefits. First, it increases the area of the maximum-efficiency zone. This means you can enjoy the highest efficiency at low speeds in urban congestion, but at high speeds on the highway as well. Secondly, this “1%” refers to the average efficiency difference between the two types of motors under WLTP conditions, where the hairpin motor is 1.12% higher. Below that global average, however, the efficiency difference between the two motors can be 2%. At operating points with low speed and high torque, the difference can even reach an astonishing 10%.
In addition, more stator windings can be packed into a Hairpin motor. This means the motor can output higher power and torque at the same energy loss. The electric motor of the second-generation EDU electric drive gearbox in the new MG EHS Plug-in Hybrid has a power density of 4.7 kW/kg, an increase of more than 20%. Only a few competitors can match this.
Furthermore, the flat wire used in a Hairpin motor has a more regular shape and reduces the thermal resistance in the stator slot. It offers a higher heat transfer efficiency, which further improves the peak power and sustained performance.
And last but not least: because the winding provides more stiffness, the motor itself is stiffer. In the electromagnetic design, a smaller slot can be adopted, which reduces mechanical and electromagnetic noise.
I think you can see their hairpin windings at the 11-second mark of this video
@Legend2TL this is your kinda shit
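If you want to eyeball the math behind the thicker-wire claim, here's a rough back-of-the-envelope sketch in Python. The fill factors, conductor length, slot area, and current below are made-up placeholder numbers (not MG's figures); the point is just that copper loss scales as I²R with R = ρL/A, so more copper in the slot and shorter end windings both cut the loss:

```python
# Illustrative back-of-the-envelope only -- fill factors, lengths, and current
# are assumed placeholder numbers, not MG's actual motor data.

RHO_CU = 1.68e-8        # copper resistivity at 20 C, ohm*m

def copper_loss(current_a, turn_length_m, slot_area_m2, fill_factor):
    """I^2 * R loss for one conductor path at a given slot fill factor."""
    conductor_area = slot_area_m2 * fill_factor       # usable copper cross-section
    resistance = RHO_CU * turn_length_m / conductor_area
    return current_a ** 2 * resistance

I = 200.0               # phase current, A (assumed)
slot_area = 1.0e-4      # stator slot cross-section, m^2 (assumed)

# Round wire: ~45% slot fill, longer end windings -> longer conductor path
round_loss = copper_loss(I, turn_length_m=0.60, slot_area_m2=slot_area, fill_factor=0.45)

# Hairpin: ~70% slot fill, shorter end windings -> shorter conductor path
hairpin_loss = copper_loss(I, turn_length_m=0.55, slot_area_m2=slot_area, fill_factor=0.70)

print(f"round-wire copper loss : {round_loss:6.1f} W")
print(f"hairpin copper loss    : {hairpin_loss:6.1f} W")
print(f"reduction              : {1 - hairpin_loss / round_loss:.0%}")
```

Same theme with the power-density claim: 4.7 kW/kg quoted as "more than 20%" up implies the previous-generation unit was roughly 4.7 / 1.2 ≈ 3.9 kW/kg.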
#1245
Sanest Florida Man
#1246
Sanest Florida Man
At operating points with low speed and high torque, the efficiency difference can even reach an astonishing 10%.
#1247
Race Director
Thread Starter
Over-generalization is never a good thing and can easily lead to (and actually has on numerous occasions) a certain group of people indiscriminately hating other people simply for being part of a certain group.
It's a very delicate subject so offering any actual examples can be tricky. Alright, we'll go with something that doesn't have anything to do with nationality, religion, or the color of one's skin: veganism. A lot of people tend to become aggressive toward someone when they hear they are vegan just because they think that person will try to preach this lifestyle and never stop rambling about it. However, just because some do indeed do that, it doesn't mean you shouldn't give the benefit of the doubt first and see where it goes from there.
It's the same with Tesla owners and particularly FSD Beta users. Most outsiders regard them as these brain-washed drones who have no problem endangering everyone on the road for the sake of helping push the self-driving technology forward (and TSLA stock along the way). Needless to say, not every Tesla owner is as blind to the dangers the FSD Beta poses right now and it looks like more and more of them are starting to realize (or at least speak up about it).
Sadly, though, in some cases, the countless videos on the Internet showing the advanced driver assistance system's (ADAS) repeated fails aren't enough to paint a more realistic image of what it can and can't do, so it takes a similar personal experience for the coin to finally drop.
One such example comes from a Tesla Motors Club (TMC) forum user called "tr6990" based in Missouri who posted their recent experience with the system under the admittedly exaggerated (though not entirely incorrect) title: "FSD Beta Attempts to Kill Me; Causes Accident." So, what actually happened?
According to the user's recounting of the incident, the system was engaged while the Model Y was traveling on a "two-lane rural road." It was dark (right after sunset), but the conditions were clear and it was a section of the road the FSD Beta was no stranger to.
The hairy part began during a right-hand curve with another car approaching from the other side. About mid-way through the bend, for no apparent reason, the system decided it was time to go straight, putting the Model Y into the path of the oncoming vehicle.
The driver says they had their hands on the wheel so they could react immediately (about 0.3 of a second, according to their version of the story) by veering to the right and back into the correct lane. However, the correction applied was too severe and they lost control of the EV. The Model Y went through the ditch on the side of the road and into the woods. Luckily, they didn't hit any trees but the damage to the vehicle was still severe.
With the video accompanying the post now removed, we have no way to gauge what happened ourselves and must trust what the forum member wrote in their post. However, there is one thing about this story - and others like it - that raises the hairs on the back of my neck: the potential fate of the people in the other car.
For years - over a century now - we've been conditioned to accept that people tend to value their lives enough to stay in their lane and not come straight at us as soon as there is a bend in the road. It hasn't been a completely smooth ride (some people lose control of their vehicles when turning and others think it's a good idea to use a head-on crash as a method of committing suicide) but we'd argue the system worked reasonably well thus far.
Now, Tesla is basically asking us to trust a computer to do the same, even though the situation has two major caveats: one, as Tesla themselves put it, the system is still in its beta form, and therefore not ready for widespread public use, and two, as far as we know, computers aren't yet self-aware, so they couldn't care less about crashing.
The author of the post is now looking to take legal action against Tesla but, as everyone else on the forum told him, I too don't see how he could have a case. Tesla makes it pretty clear it washes its hands completely of anything that happens during FSD Beta use with all the responsibility falling on the driver.
The sad thing is, though, that while FSD Beta users can opt out, we can't. Nobody is asking us if we're OK with sharing the roads with an ADAS that's been proven to be faulty, with the only safeguard being a driver whose attentiveness, skill, and reaction times we know nothing about. You can argue the latter part has been the same since forever, but at least thus far, they've only had to manage their own mistakes, not those of an AI driver as well.
At the end of the day, there are three parties at fault here: Tesla, for releasing a potentially dangerous system to the public and marketing it aggressively as being more capable than it actually is; some of the drivers, for over-relying on FSD despite the warnings; and finally, the authorities, for taking their sweet time coming up with a set of regulations that would make it very clear what can and can't be done on public roads.
I don't know about you, but if I had to choose which of the three to trust to address the situation first, it would be the one in the middle. And it's posts like the one referenced earlier that give me hope. I just wish more people would speak up before going through a situation where they could have potentially injured others as well.
#1248
Sanest Florida Man
#1250
My first Avatar....
So... why are EVs so frigging expensive?
#1251
Sanest Florida Man
#1252
Semi factory is in Nevada
This should end the speculation about where the initial units will be built.
#1253
#1254
AZ Community Team
#1255
#1256
duh… Tesla seems to be doing fine with the profit margins so far. Maybe cause they don’t have no baggage.
#1257
My first Avatar....
Tesla is overpriced, just like all EVs.
#1258
Whats up with RDX owners?
This is from Twitter so the source cannot be refuted.
Oh and this is the description from the (now deleted) YouTube video:
#1259
My first Avatar....
Don't worry...comfy will explain it.
#1261
Race Director
Thread Starter
A Tesla Model 3 taxi cab was involved in a severe accident in Paris. The first official information is that the cause was an SUA (sudden unintended acceleration) episode combined with braking issues. About 20 people were hurt, five of them with life-threatening injuries.
The crash happened on December 11 at around 9 PM in the 13e arrondissement in Paris. Le Figaro reports that, according to the police, the driver was traveling on rue d’Ivry (it’s actually an avenue) when his car suddenly started accelerating and could not brake. Jérôme Coumet, the mayor of the 13e arrondissement, said on Twitter the crash was caused by a “technical failure.” He also said that “the accelerator (pedal) would have stuck.”
With no control, the EV first hit a bike rider on avenue d’Ivry. The taxi cab would then have hit two pedestrians at the crossing with rue de Tolbiac, smashed a glass recycling container (sending glass bits into other passersby), hit another pedestrian, and struck a traffic light at avenue de Choisy. It only stopped when it hit a van close to the restaurant Le Mandarin de Choisy, according to CNEWS.
French newspapers and websites have conflicting information about the number of injured people. CNEWS talks about eight people in ICUs, five with life-threatening injuries, and 12 more lightly hurt people. Le Figaro mentioned nine people hurt. Le Parisien said 7 people had life-threatening injuries, and 13 went to the emergency room. Unfortunately, Le Parisien seems to have talked to the people involved in the accident, but we have no access to the article.
This is not the first report of SUA incidents with Tesla vehicles. Still, it seems to be one of the worst in which such claims have been made. Jason Hughes, known as the Tesla Hacker, has already dismissed SUA incidents in Tesla vehicles as impossible due to how the brakes override the accelerator pedal whenever they are activated.
The latest hypothesis for these episodes is that cruise control is accidentally turned on: when the previously set speed is much higher than the speed the vehicle is traveling at, the car accelerates hard toward the set speed, which looks like an SUA episode. In China, the government obliged Tesla to take measures, and the company added a sound to warn that TACC (Traffic-Aware Cruise Control) had been switched on. Tesla has never clarified whether this is the cause of SUA events, and investigations have never concluded that this was really the case. Perhaps the French police will help determine what causes such problems in Tesla vehicles.
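To put that stale set-speed hypothesis in concrete terms, here's a toy sketch. This is not Tesla's actual TACC logic; the gain, acceleration limit, and speeds are assumed purely for illustration of how a simple speed controller behaves when the stored set speed is far above the current speed:

```python
# Toy sketch of the "stale set-speed" hypothesis described above.
# Controller gain, accel limit, and speeds are assumed for illustration;
# this is not Tesla's TACC implementation.

ACCEL_LIMIT = 4.0   # m/s^2, assumed maximum the controller will command
KP = 0.8            # proportional gain on speed error, assumed

def tacc_accel_command(current_kph: float, set_kph: float) -> float:
    """Very simplified proportional speed controller."""
    error_ms = (set_kph - current_kph) / 3.6          # speed error in m/s
    return max(-ACCEL_LIMIT, min(ACCEL_LIMIT, KP * error_ms))

# Normal case: set speed close to current speed -> gentle command
print(tacc_accel_command(current_kph=48, set_kph=50))   # ~0.44 m/s^2

# Hypothesized SUA lookalike: stale 110 km/h set-point engaged at 30 km/h
# -> the command immediately pins at the limit
print(tacc_accel_command(current_kph=30, set_kph=110))  # 4.0 m/s^2 (clamped)
```

The asymmetry is the point: a small speed error produces a gentle command, but engaging against a stale 110 km/h set-point at 30 km/h saturates the command instantly, which from the driver's seat would feel like the car taking off on its own.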
#1262
Sanest Florida Man
NHTSA Finds No Evidence Of Tesla Sudden Unintended Acceleration
After reviewing roughly 662,000 Tesla cars (Model S 2013-2019, Model X 2016-2019 and Model 3 2017-2019), the National Highway Traffic Safety Administration (NHTSA) found no defect or evidence of sudden unintended acceleration.
It basically clears the manufacturer of the suspicion that the car can accelerate on its own and crash.
The NHTSA has received tons of data from various incidents (see Defect Petition DP20-001, open from January 13, 2020, to January 8, 2021), but the evaluation of the material did not support the claims.
According to the NHTSA, in every instance in which event data was available for review, crashes were caused by pedal misapplication.
NHTSA denied the petition, which requested a recall of "all [Tesla] Model S, Model X, and Model 3 vehicles produced from 2013 to the present" due to sudden unintended acceleration.
Originally Posted by NHTSA
"After reviewing the available data, ODI has not identified evidence that would support opening a defect investigation into SUA in the subject vehicles. In every instance in which event data was available for review by ODI, the evidence shows that SUA crashes in the complaints cited by the petitioner have been caused by pedal misapplication. There is no evidence of any fault in the accelerator pedal assemblies, motor control systems, or brake systems that has contributed to any of the cited incidents. There is no evidence of a design factor contributing to increased likelihood of pedal misapplication. The theory provided of a potential electronic cause of SUA in the subject vehicles is based upon inaccurate assumptions about system design and log data."
#1263
Guess people are really hungry for some FUD.
#1264
Ex-OEM King
#1265
Sanest Florida Man
My brother did that once, he was moving my dad's car in the driveway and hit the gas instead of the brake and knocked over a brick fence we had. Should've blamed it on cruise control and he would've gotten away with it
#1266
Sanest Florida Man
#1267
Best antidote for FUD is truth.
#1268
Sanest Florida Man
#1269
Why did they stop at 10.8…? Should have gone straight to 11.0…
#1270
Sanest Florida Man
“Improved photon to control vehicle response latency by 20%”
means the FSDbeta reacts 20% faster to things around it
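For a rough sense of scale, here's what shaving 20% off that latency means in distance covered before the car starts to react. The 100 ms baseline is an assumed placeholder, since Tesla doesn't publish the actual photon-to-control figure:

```python
# What a 20% cut in photon-to-control latency buys, very roughly.
# The 100 ms baseline is an assumed placeholder, not a published Tesla number.

BASELINE_LATENCY_S = 0.100                       # assumed camera-to-control latency
IMPROVED_LATENCY_S = BASELINE_LATENCY_S * 0.8    # "improved by 20%"

for mph in (25, 45, 70):
    mps = mph * 0.44704                          # mph -> m/s
    saved = mps * (BASELINE_LATENCY_S - IMPROVED_LATENCY_S)
    print(f"{mph:>3} mph: car travels {saved:.2f} m less before it starts reacting")
```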
#1271
Ex-OEM King
#1272
#1273
Whenever the Cybertruck comes, cause that’s what both of us actually have preordered.
#1274
Sanest Florida Man
I use a car so rarely now, IDK if it makes sense to drop $50k on one. I just went 16 days without driving my car. We’ll see how long that trend lasts, but spending $600/mo on a vehicle I’ll use once a week or so seems stupid.
#1275
Sanest Florida Man
here’s the change log of the big yearly update
#1276
#1277
My first Avatar....
^
I lol'd.
#1278
Ex-OEM King
same, lol.
#1279
Sanest Florida Man
#1280
Sanest Florida Man