News Source Slashdot:Hardware
Data Centers in Nvidia's Hometown Stand Empty Awaiting Power
Two of the world's biggest data center developers have projects in Nvidia's hometown that may sit empty for years because the local utility isn't ready to supply electricity. From a report: In Santa Clara, California, where the world's biggest supplier of artificial-intelligence chips is based, Digital Realty Trust applied in 2019 to build a data center. Roughly six years later, the development remains an empty shell awaiting full energization. Stack Infrastructure, which was acquired earlier this year by Blue Owl Capital, has a nearby 48-megawatt project that's also vacant, while the city-owned utility, Silicon Valley Power, struggles to upgrade its capacity. The fate of the two facilities highlights a major challenge for the US tech sector and indeed the wider economy. While demand for data centers has never been greater, driven by the boom in cloud computing and AI, access to electricity is emerging as the biggest constraint. That's largely because of aging power infrastructure, a slow build-out of new transmission lines and a variety of regulatory and permitting hurdles. And the pressure on power systems is only going to increase. Electricity requirements from AI computing will likely more than double in the US alone by 2035, based on BloombergNEF projections. Nvidia's Jensen Huang and OpenAI's Sam Altman are among corporate leaders who have predicted trillions of dollars will pour into building new AI infrastructure.
Read more...
What's the Best Way for Humans to Explore Space?
Should we leave space exploration to robots — or prioritize human spaceflight, making us a multiplanetary species? Harvard professor Robin Wordsworth, who's researched the evolution and habitability of terrestrial-type planets, shares his thoughts: In space, as on Earth, industrial structures degrade with time, and a truly sustainable life support system must have the capability to rebuild and recycle them. We've only partially solved this problem on Earth, which is why industrial civilization is currently causing serious environmental damage. There are no inherent physical limitations to life in the solar system beyond Earth — both elemental building blocks and energy from the sun are abundant — but technological society, which developed as an outgrowth of the biosphere, cannot yet exist independently of it. The challenge of building and maintaining robust life-support systems for humans beyond Earth is a key reason why a machine-dominated approach to space exploration is so appealing... However, it's notable that machines in space have not yet accomplished a basic task that biology performs continuously on Earth: acquiring raw materials and utilizing them for self-repair and growth. To many, this critical distinction is what separates living from non-living systems... The most advanced designs for self-assembling robots today begin with small subcomponents that must be manufactured separately beforehand. Overall, industrial technology remains Earth-centric in many important ways. Supply chains for electronic components are long and complex, and many raw materials are hard to source off-world... If we view the future expansion of life into space in a similar way as the emergence of complex life on land in the Paleozoic era, we can predict that new forms will emerge, shaped by their changed environment, while many historical characteristics will be preserved.
For machine technology in the near term, evolution in a more life-like direction seems likely, with greater focus on regenerative parts and recycling, as well as increasingly sophisticated self-assembly capabilities. The inherent cost of transporting material out of Earth's gravity well will provide a particularly strong incentive for this to happen. If building space habitats is hard and machine technology is gradually developing more life-like capabilities, does this mean we humans might as well remain Earth-bound forever? This feels hard to accept because exploration is an intrinsic part of the human spirit... To me, the eventual extension of the entire biosphere beyond Earth, rather than either just robots or humans surrounded by mechanical life-support systems, seems like the most interesting and inspiring future possibility. Initially, this could take the form of enclosed habitats capable of supporting closed-loop ecosystems, on the moon, Mars or water-rich asteroids, in the mold of Biosphere 2. Habitats would be manufactured industrially or grown organically from locally available materials. Over time, technological advances and adaptation, whether natural or guided, would allow the spread of life to an increasingly wide range of locations in the solar system. The article ponders the benefits (and the history) of both approaches — with some fascinating insights along the way. "If genuine alien life is out there somewhere, we'll have a much better chance of comprehending it once we have direct experience of sustaining life beyond our home planet."
Read more...
NVIDIA Connects AI GPUs to Early Quantum Processors
"Quantum computing is still years away, but Nvidia just built the bridge that will bring it closer..." argues investment site The Motley Fool, "by linking today's fastest AI GPUs with early quantum processors..." NVIDIA's new hybrid system strengthens communication at microsecond speeds — orders of magnitude faster than before — "allowing AI to stabilize and train quantum machines in real time, potentially pulling major breakthroughs years forward." CUDA-Q, Nvidia's open-source software layer, lets researchers choreograph that link — running AI models, quantum algorithms, and error-correction routines together as one system. That jump allows artificial intelligence to monitor [in real time]... For researchers, that means hundreds of new iterations where there used to be one — a genuine acceleration of discovery. It's the quiet kind of progress engineers love — invisible, but indispensable... Its GPUs (graphics processing units) are already tuned for the dense, parallel calculations these explorations demand, making them the natural partner for any emerging quantum processor... Other companies chase better quantum hardware — superconducting, photonic, trapped-ion — but all of them need reliable coordination with the computing power we already have. By offering that link, Nvidia turns its GPU ecosystem into the operating environment of hybrid computing, the connective tissue between what exists now and what's coming next. And because the system is open, every new lab or start-up that connects strengthens Nvidia's position as the default hub for quantum experimentation... There's also a defensive wisdom in this move. If quantum computing ever matures, it could threaten the same data center model that built Nvidia's empire. CEO Jensen Huang seems intent on making sure that, if the future shifts, Nvidia already sits at its center.
By owning the bridge between today's technology and tomorrow's, the company ensures it earns relevance — and revenue — no matter which computing model dominates. So Nvidia's move "isn't about building a quantum computer," the article argues, "it's about owning the bridge every quantum effort will need."
Read more...
How the US Cut Climate-Changing Emissions While Its Economy More Than Doubled
alternative_right shares a report from The Conversation: Countries around the world have been discussing the need to rein in climate change for three decades, yet global greenhouse gas emissions -- and global temperatures with them -- keep rising. When it seems like we're getting nowhere, it's useful to step back and examine the progress that has been made. Let's take a look at the United States, historically the world's largest greenhouse gas emitter. Over those three decades, the U.S. population soared by 28% and the economy, as measured by gross domestic product adjusted for inflation, more than doubled. Yet U.S. emissions from many of the activities that produce greenhouse gases -- transportation, industry, agriculture, heating and cooling of buildings -- have remained about the same over the past 30 years. Transportation is a bit up; industry a bit down. And electricity, once the nation's largest source of greenhouse gas emissions, has seen its emissions drop significantly. Overall, the U.S. is still among the countries with the highest per capita emissions, so there's room for improvement, and its emissions (PDF) haven't fallen enough to put the country on track to meet its pledges under the 10-year-old Paris climate agreement. But U.S. emissions are down about 15% over the past 10 years. The report mentions how the U.S. managed to replace coal with cheaper, more efficient natural-gas plants while rapidly scaling wind, solar, and battery storage as their costs fell. At the same time, major gains in appliance, lighting, and building efficiency flattened per-capita power use. This also coincided with improved vehicle fuel economy that helped keep transportation emissions in check.
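The decoupling described above is easy to make concrete with rough arithmetic. This is an illustrative sketch only, using the summary's round figures (GDP "more than doubled", emissions "remained about the same") rather than actual inventory data:

```python
# Illustrative decoupling arithmetic based on the rough figures in the summary
# above -- not actual emissions-inventory data.
gdp_growth = 2.0        # assumption: GDP "more than doubled" over ~30 years
emissions_growth = 1.0  # assumption: emissions "remained about the same"

# Emissions intensity = emissions per unit of GDP.
intensity_change = emissions_growth / gdp_growth
print(f"Emissions per unit of GDP fell to {intensity_change:.0%} of the starting level")
# With these assumptions, intensity is at least halved even though absolute
# emissions are roughly unchanged -- the economy grew, the carbon didn't.
```

The same per-unit framing explains why flat absolute emissions alongside 28% population growth also implies falling per-capita emissions.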
Read more...
Ford Considers Scrapping F-150 EV Truck
According to the Wall Street Journal, Ford executives are considering scrapping the electric version of the F-150 pickup truck as losses, supply setbacks, slow sales, and the arrival of a cheaper midsize EV truck undermine the business case for its full-size electric pickup. Reuters reports: Last month, a union official told Reuters that Ford was pausing production at the Dearborn, Michigan, plant that makes its F-150 Lightning electric pickup due to a fire at a supplier's aluminum factory. "We have good inventories of the F-150 Lightning and will bring Rouge Electric Vehicle Center back up at the right time, but don't have an exact date at this time," Ford said in a statement on Thursday. The WSJ report added that General Motors executives have discussed discontinuing some electric trucks, citing people familiar with the matter. The Detroit Three, which includes Ford, GM and Chrysler-parent Stellantis, have rolled back their ambitious plans for EVs in the United States, pivoting to their gasoline-powered models.
Read more...
Manufacturer Bricks Smart Vacuum After Engineer Blocks It From Collecting Data
A curious engineer discovered that his iLife A11 smart vacuum was remotely "killed" after he blocked it from sending data to the manufacturer's servers. By reverse-engineering it with custom hardware and Python scripts, he managed to revive the device to run fully offline. Tom's Hardware reports: An engineer got curious about how his iLife A11 smart vacuum worked and monitored the network traffic coming from the device. That's when he noticed it was constantly sending logs and telemetry data to the manufacturer -- something he hadn't consented to. The user, Harishankar, decided to block the telemetry servers' IP addresses on his network, while keeping the firmware and OTA servers open. While his smart gadget worked for a while, it just refused to turn on soon after. After a lengthy investigation, he discovered that a remote kill command had been issued to his device. He sent it to the service center multiple times, wherein the technicians would turn it on and see nothing wrong with the vacuum. When they returned it to him, it would work for a few days and then fail to boot again. After several rounds of back-and-forth, the service center probably got tired and just stopped accepting it, saying it was out of warranty. Because of this, he decided to disassemble the thing to determine what killed it and to see if he could get it working again. [...] So, why did the A11 work at the service center but refuse to run in his home? The technicians would reset the firmware on the smart vacuum, thus removing the kill code, and then connect it to an open network, making it run normally. But once it connected again to the network that had its telemetry servers blocked, it was bricked remotely because it couldn't communicate with the manufacturer's servers. Since he blocked the appliance's data collection capabilities, its maker decided to just kill it altogether. "Someone -- or something -- had remotely issued a kill command," says Harishankar. 
"Whether it was intentional punishment or automated enforcement of 'compliance,' the result was the same: a consumer device had turned on its owner." In the end, the owner was able to run his vacuum fully locally without manufacturer control after all the tweaks he made. This helped him retake control of his data and make use of his $300 software-bricked smart device on his own terms. As for the rest of us who don't have the technical knowledge and time to follow his accomplishments, his advice is to "Never use your primary WiFi network for IoT devices" and to "Treat them as strangers in your home."
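Harishankar's "treat them as strangers" advice can be approximated on any Linux-based router. The sketch below is a hypothetical config fragment, not his actual setup: the subnet and the telemetry address (an RFC 5737 example IP) are stand-ins, since the article doesn't publish iLife's real endpoints.

```shell
# Hypothetical iptables sketch for isolating IoT devices -- illustrative only.
# Assumes IoT gear lives on a dedicated subnet (192.168.50.0/24); 203.0.113.10
# is a placeholder for a telemetry endpoint, not a real iLife server.

# Keep IoT devices from reaching anything else on the LAN:
iptables -A FORWARD -s 192.168.50.0/24 -d 192.168.0.0/16 -j DROP

# Block one specific telemetry host while leaving other traffic (e.g. firmware
# updates) untouched -- the selective approach described in the article:
iptables -A FORWARD -s 192.168.50.0/24 -d 203.0.113.10 -j DROP
```

As the bricking itself shows, a device may treat the loss of telemetry as a fault, so blocking is a trade-off rather than a guarantee; network isolation limits what the device can see, not how the vendor reacts.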
Read more...
China Achieves Thorium-Uranium Conversion Within Molten Salt Reactor
Longtime Slashdot reader hackingbear writes: South China Morning Post, citing Chinese state media, reported that an experimental reactor developed in the Gobi Desert by the Chinese Academy of Sciences' Shanghai Institute of Applied Physics has achieved thorium-to-uranium fuel conversion, paving the way for an almost endless supply of nuclear energy. It is the first time in the world that scientists have been able to acquire experimental data on thorium operations from inside a molten salt reactor, according to a report by Science and Technology Daily. Thorium is much more abundant and accessible than uranium and has enormous energy potential. One mine tailings site in Inner Mongolia is estimated to hold enough of the element to power China entirely for more than 1,000 years. At the heart of the breakthrough is a process known as in-core thorium-to-uranium conversion that transforms naturally occurring thorium-232 into uranium-233 -- a fissile isotope capable of sustaining nuclear chain reactions within the reactor itself. Thorium (Th-232) is not itself fissile and so is not directly usable in a thermal neutron reactor. Thorium fuels therefore need a fissile material as a 'driver' so that a chain reaction (and thus supply of surplus neutrons) can be maintained. The only fissile driver options are U-233, U-235 or Pu-239. (None of these are easy to supply.) In the 1960s, the Oak Ridge National Laboratory (USA) designed and built a demonstration MSR using U-233, derived externally from thorium as the main fissile driver.
Read more...
The World's Tallest Chip Defies the Limits of Computing: Goodbye To Moore's Law?
Longtime Slashdot reader dbialac shares a report from EL PAIS: For decades, the progress of electronics has followed a simple rule: smaller is better. Since the 1960s, each new generation of chips has packed more transistors into less space, fulfilling the famous Moore's Law. Formulated by Intel co-founder Gordon Moore in 1965, this law predicted that the number of transistors in an integrated circuit approximately doubles each year. But this race to the minuscule is reaching its physical limits. Now, an international team of scientists is proposing a solution as obvious as it is revolutionary: if we can't keep reducing the size of chips, let's build them up. Xiaohang Li, a researcher at King Abdullah University of Science and Technology (KAUST) in Saudi Arabia, and his team have designed a chip with 41 vertical layers of semiconductors and insulating materials, approximately ten times higher than any previously manufactured chip. The work, recently published in the journal Nature Electronics, not only represents a technical milestone but also opens the door to a new generation of flexible, efficient, and sustainable electronic devices. "Having six or more layers of transistors stacked vertically allows us to increase circuit density without making the devices smaller laterally," Li explains. "With six layers, we can integrate 600% more logic functions in the same area than with a single layer, achieving higher performance and lower power consumption."
Read more...
Australians To Get At Least Three Hours a Day of Free Solar Power - Even If They Don't Have Solar Panels
Australia's new "solar sharer" program will give households in NSW, south-east Queensland, and South Australia at least three hours of free solar power each day starting in 2026 -- even for those without rooftop panels. Other areas will potentially follow in 2027. The Guardian reports: The government said Australians could schedule appliances such as washing machines, dishwashers and air conditioners and charge electric vehicles and household batteries during this time. The solar sharer scheme would be implemented through a change to the default market offer that sets the maximum price retailers can charge customers for electricity in parts of the country. The climate change and energy minister, Chris Bowen, said the program would ensure "every last ray of sunshine was powering our homes" instead of some solar energy being wasted. Australians have installed more than 4m solar systems and there is regularly cheap excess generation in the middle of the day. Part of the rationale for the program is that it could shift demand for electricity from peak times -- particularly early in the evening -- to when it is sunniest. This could help minimize peak electricity prices and reduce the need for network upgrades and intervention to ensure the power grid was stable.
Read more...
LADWP Says It Will Shift Its Largest Gas Power Plant To Hydrogen
Bruce66423 shares a report from the Los Angeles Times: The board of the Los Angeles Department of Water and Power on Tuesday approved a controversial plan to convert part of the city's largest natural gas-fired power plant into one that also can burn hydrogen. In a 3-0 vote, the DWP board signed off on the final environmental impact report for an $800-million modernization of Units 1 and 2 of the Scattergood Generating Station in Playa del Rey. The power plant dates to the late 1950s and both units are legally required to be shut down by the end of 2029. In their place, the DWP will install new combined-cycle turbines that are expected to operate on a mixture of natural gas and at least 30% hydrogen with the ultimate goal of running entirely on hydrogen as more supply becomes available. The hydrogen burned at Scattergood is supposed to be green, meaning it is produced by splitting water molecules through a process called electrolysis. Hydrogen does not emit planet-warming carbon dioxide when it is burned, unlike natural gas. [...] Although burning hydrogen does not produce CO2, the high-temperature combustion process can emit nitrogen oxides, or NOx, a key component of smog. [...] [T]he approved plan contains no specifics about where the hydrogen will come from or how it will get to the site. "The green hydrogen that would supply the proposed project has not yet been identified," the environmental report says. Industry experts and officials said the project will help drive the necessary hydrogen production. "Burning hydrogen produced by 'excess' solar or wind power is a means of energy storage," adds Slashdot reader Bruce66423. "The hard question is whether it's the best solution to the storage problem given that other solutions appear to be emerging that would require less infrastructure investment (think pipes to move the hydrogen to the plant and tanks to store it for later use)."
Read more...
Ukraine First To Demo Open Source Security Platform To Help Secure Power Grid
concertina226 shares a report from The Register: [A massive power outage in April left tens of millions across Spain, Portugal, and parts of France without electricity for hours due to cascading grid failures, exposing how fragile and interconnected Europe's energy infrastructure is. The incident, though not a cyberattack, reignited concerns about the vulnerability of aging, fragmented, and insecure operational technology systems that could be easily exploited in future cyber or ransomware attacks.] This headache is one the European Commission is focused on. It is funding several projects looking at making electric grids more resilient, such as the eFort framework being developed by cybersecurity researchers at the independent non-profit Netherlands Organisation for Applied Scientific Research (TNO) and the Delft University of Technology (TU Delft). TNO's SOARCA tool is the first ever open source security orchestration, automation and response (SOAR) platform designed to protect power plants by automating the orchestration of the response to physical attacks, as well as cyberattacks, on substations and the network, and the first country to demo it will be Ukraine this year. At the moment, SOAR systems only exist for dedicated IT environments. The researchers' design includes a SOAR system in each layer of the power station: the substation, the control room, the enterprise layer, the cloud, or the security operations centre (SOC), so that the SOC and the control room work together to detect anomalies in the network, whether it's an attacker exploiting a vulnerability, a malicious device being plugged into a substation, or a physical attack like a missile hitting a substation. The idea is to be able to isolate potential problems and prevent lateral movement from one device to another or privilege escalation, so an attacker cannot go through the network to the central IT management system of the electricity grid. [...]
The SOARCA tool is underpinned by CACAO Playbooks, an open source specification developed by the OASIS Open standards body and its members (which include lots of tech giants and US government agencies) to create standardized predefined, automated workflows that can detect intrusions and changes made by malicious actors, and then carry out a series of steps to protect the network and mitigate the attack. Experts largely agree the problem facing critical infrastructure is only worsening as years pass, and the more random Windows implementations that are added into the network, the wider the attack surface is. [...] TNO's Wolthuis said the energy industry is likely to be pushed soon to take action by regulators, particularly once the Network Code on Cybersecurity (NCCS), which lays out rules requiring cybersecurity risk assessments in the electricity sector, is formalized.
Read more...
AMD Will Continue Game Optimization Support For Older Radeon GPUs After All
An anonymous reader quotes a report from Tom's Hardware: After a turbulent weekend of updates and clarifications, AMD has published an entire web page to assuage user backlash and reaffirm its commitment to continued support for its RDNA 1 and RDNA 2-based GPUs, following a spate of confusion surrounding its recent decision to put Radeon RX 5000 and 6000 series cards in "maintenance mode." This comes after AMD had to deny that the RX 7900 cards were losing USB-C power delivery moving forward, even though the driver changelog said something quite different. Just last week, AMD released a new driver update for its graphics cards, and it went anything but smoothly. First, the wrong drivers were uploaded, and even after that was corrected, several glaring errors in the release notes required clarification. AMD was forced to correct claims about its RX 7900 cards, but at the time clarified that, indeed, RX 5000 and 6000 graphics cards were entering "Maintenance Mode," despite some RX 6000 cards being only around four years old. Now, though, AMD has either rolled back that decision or someone higher up the food chain has made a new call, as game optimizations are back on the menu for RDNA 1 and RDNA 2 GPUs. "We've heard your feedback and want to clear up the confusion around the AMD Software: Adrenalin Edition 25.10.2 driver release," AMD said in a statement. "Your Radeon RX 5000 and RX 6000 series GPUs will continue to receive: Game support for new releases, Stability and game optimizations, and Security and bug fixes," AMD said.
Read more...
Manufacturer Remotely Bricks Smart Vacuum After Its Owner Blocked It From Collecting Data
"An engineer got curious about how his iLife A11 smart vacuum worked and monitored the network traffic coming from the device," writes Tom's Hardware. "That's when he noticed it was constantly sending logs and telemetry data to the manufacturer — something he hadn't consented to." The user, Harishankar, decided to block the telemetry servers' IP addresses on his network, while keeping the firmware and OTA servers open. While his smart gadget worked for a while, it just refused to turn on soon after... He sent it to the service center multiple times, wherein the technicians would turn it on and see nothing wrong with the vacuum. When they returned it to him, it would work for a few days and then fail to boot again... [H]e decided to disassemble the thing to determine what killed it and to see if he could get it working again... [He discovered] a GD32F103 microcontroller to manage its plethora of sensors, including Lidar, gyroscopes, and encoders. He created PCB connectors and wrote Python scripts to control them with a computer, presumably to test each piece individually and identify what went wrong. From there, he built a Raspberry Pi joystick to manually drive the vacuum, proving that there was nothing wrong with the hardware. From this, he looked at its software and operating system, and that's where he discovered the dark truth: his smart vacuum was a security nightmare and a black hole for his personal data. First of all, its Android Debug Bridge, which gives full root access to the vacuum, wasn't protected by any kind of password or encryption. The manufacturer added a makeshift security protocol by omitting a crucial file, which caused it to disconnect soon after booting, but Harishankar easily bypassed it. He then discovered that it used Google Cartographer to build a live 3D map of his home. This isn't unusual, by far. After all, it's a smart vacuum, and it needs that data to navigate around his home.
However, the concerning thing is that it was sending off all this data to the manufacturer's server. It makes sense for the device to send this data to the manufacturer, as its onboard SoC is nowhere near powerful enough to process all that data. However, it seems that iLife did not clear this with its customers. Furthermore, the engineer made one disturbing discovery — deep in the logs of his non-functioning smart vacuum, he found a command with a timestamp that matched exactly the time the gadget stopped working. This was clearly a kill command, and after he reversed it and rebooted the appliance, it roared back to life. Thanks to long-time Slashdot reader registrations_suck for sharing the article.
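The forensic step above — correlating a logged remote command with the exact moment the device died — can be sketched in a few lines. This is an illustrative reconstruction, not Harishankar's actual code: the log format below is invented for the example, since the article doesn't reproduce iLife's real log layout.

```python
from datetime import datetime

# Hypothetical log format, invented for this sketch; only the idea (a remote
# command stamped at the moment of failure) comes from the article.
SAMPLE_LOG = """\
2024-03-01T10:15:02 INFO telemetry upload failed: host unreachable
2024-03-01T10:15:30 INFO lidar scan complete
2024-03-01T10:16:11 CMD remote: shutdown_device
2024-03-01T10:16:12 ERROR main loop terminated
"""

def commands_near(log_text, failure_time, window_s=60):
    """Return (timestamp, message) for CMD entries within window_s of the failure."""
    hits = []
    for line in log_text.splitlines():
        stamp, level, message = line.split(maxsplit=2)
        if level != "CMD":
            continue
        t = datetime.fromisoformat(stamp)
        if abs((t - failure_time).total_seconds()) <= window_s:
            hits.append((stamp, message))
    return hits

failure = datetime.fromisoformat("2024-03-01T10:16:12")  # when the device died
print(commands_near(SAMPLE_LOG, failure))
# -> [('2024-03-01T10:16:11', 'remote: shutdown_device')]
```

A command logged seconds before the final error is exactly the kind of "smoking gun" timestamp match the engineer describes.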
Read more...
Amazon's Deployment of Rivian's Electric Delivery Vans Expands to Canada
"Amazon has deployed Rivian's electric delivery vans in Canada for the first time," reports CleanTechnica, with 50 now deployed in the Vancouver area. Amazon's director of Global Fleet and Products says there are now over 35,000 electric vans deployed globally — and that they've delivered more than 1.5 billion packages. More from the blog Teslarati: In December 2024, the companies announced they had successfully deployed 20,000 EDVs across the U.S. In the first half of this year, 10,000 additional vans were delivered, and Amazon's fleet had grown to 30,000 EDVs by mid-2025. Amazon's fleet of EDVs continues to grow rapidly and has expanded to over 100 cities in the United States... The EDV is a model that is exclusive to Amazon, but Rivian sells the RCV, or Rivian Commercial Van, openly. It detailed some of the pricing and trim options back in January when it confirmed it had secured orders from various companies, including AT&T.
Read more...
Researchers Consider The Advantages of 'Swarm Robotics'
The Wall Street Journal looks at swarm robotics, where no single robot is in charge, robots interact only with nearby robots — and the swarm accomplishes complex tasks through simple interactions. "Researchers say this approach could excel where traditional robots fail, like situations where central control is impractical or impossible due to distance, scale or communication barriers." For instance, a swarm of drones might one day monitor vast areas to detect early-stage wildfires that current monitoring systems sometimes miss... A human operator might set parameters like where to search, but the drones would independently share information like which areas have been searched, adjust search patterns based on wind and other weather data from other drones in the swarm, and converge for more complete coverage of a particular area when one detects smoke. In another potential application, a swarm of robots could make deliveries across wide areas more efficient by alerting each other to changing traffic conditions or redistributing packages among themselves if one breaks down. Robot swarms could also manage agricultural operations in places without reliable internet service. And disaster-response teams see potential for swarms in hurricane and tsunami zones where communication infrastructure has been destroyed. At the microscopic scale, researchers are developing tiny robots that could work together to navigate the human body to deliver medication or clear blockages without surgery... In recent demonstrations, teams of tiny magnetic robots — each about the size of a grain of sand — cleared blockages in artificial blood vessels by forming chains to push through the obstructions. The robots navigate individually through blood vessels to reach a clog, guided by doctors or technicians using magnetic fields to steer them, says researcher J.J. Wie, a professor of organic and nano engineering at Hanyang University in South Korea.
When they reach an obstruction, the robots coordinate with each other to team up and break through. Wie's group is developing versions of these robots that biodegrade after use, eliminating the need for surgical removal, and coatings that make the robots compatible with human tissue. And while robots the size of sand grains work for some applications, Wie says that they will need to be shrunk to nano scale to cross biological barriers, such as cell membranes, or bind to specific molecular targets, like surface proteins or receptors on cancer cells. Some researchers are even exploring emergent intelligence — "when simple machines, following only a few local cues, begin to organize and act as if they share a mind...beyond human-designed coordination." Thanks to long-time Slashdot reader fjo3 for sharing the article.
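The "simple local interactions" idea can be illustrated with a classic consensus rule: each robot repeatedly steps toward the average position of the neighbors it can sense, and the group clusters with no central controller. This is a minimal 1-D sketch of the general principle, not a model of any system described in the article:

```python
# Minimal swarm-consensus sketch: each robot senses only neighbors within
# `radius` and steps partway toward their average position. No robot is in
# charge, yet the swarm converges -- an illustration of local-rules
# coordination, not any specific research system.

def step(positions, radius=5.0, gain=0.5):
    new_positions = []
    for x in positions:
        # Each robot sees only nearby robots (including itself).
        neighbors = [y for y in positions if abs(y - x) <= radius]
        target = sum(neighbors) / len(neighbors)
        new_positions.append(x + gain * (target - x))
    return new_positions

def spread(positions):
    """Distance between the two outermost robots."""
    return max(positions) - min(positions)

robots = [0.0, 2.0, 4.0, 6.0, 8.0]  # 1-D positions, for simplicity
for _ in range(20):
    robots = step(robots)
# After repeated purely local averaging, the swarm has clustered tightly.
```

The same pattern — local sensing, a simple update rule, emergent global behavior — underlies the wildfire-search and delivery scenarios above; real systems add heading, collision avoidance, and task allocation on top of it.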
Read more...