In an iconic scene in Terminator 2, the T-1000 is peppered with holes from shotgun blasts, only to quickly heal itself, much to the dismay of our heroes. What seemed like science fiction decades ago could soon be a reality.
Researchers from Carnegie Mellon University have created a material that can spontaneously heal itself after extreme mechanical damage while maintaining an electrical current.
The material is made up of liquid metal droplets suspended in a soft rubber. Once damaged, the droplets rupture and form new connections with nearby droplets. The result is rerouted electrical signals without interruption.
Researchers put their discovery to the test by severing, puncturing, and otherwise damaging the material, all while the electrical current continued to flow, without interruption, to a clock on the other end.
Researchers say the new material could be used in a variety of applications from bio-inspired robots to human-machine interactions to wearable computing. It could also be used in power and data transmission.
CPU Trends: Apple is moving on from Intel because Intel isn’t moving anywhere
From The Verge, April 3, 2018
Excellent commentary & graphic on Intel chip evolution
A report from Bloomberg this week has made public something that should already have been apparent to tech industry observers: Apple is planning to replace Intel processors in Mac computers with its own chips starting sometime around 2020. The two California companies have enjoyed a long and fruitful partnership ever since Apple made the switch to Intel CPUs with the 2006 MacBook Pro and iMac, but recent trends have made the breakup between them inevitable. Intel’s chip improvements have stagnated at the same time as Apple’s have accelerated, and now iPhone systems-on-chip are outperforming laptop-class silicon from Intel’s Core line. Even if Intel never cedes its performance crown, the future that Apple is building will invariably be better served by its own chip designs.
Apple’s decision to ditch the world’s most popular CPU line for laptop and desktop computers may seem radical, but there are a number of key factors that actually make it obvious and unavoidable.
Attend any major tech exhibition and you’ll find Intel announcing or reannouncing mildly improved processors. Whether you’re at IFA in Berlin, CES in Las Vegas, or Computex in Taipei, the spiel is always the same: the future is wireless, battery life matters to everyone, and there are a lot of people with five-year-old PCs who might notice a difference if they buy a new Intel-powered computer. It’s all painfully incremental and out of sync with Apple’s product cadence. Apple will give you, at most, two years with an iPhone before enticing you into upgrading, whereas Intel is trying to convince people with PCs that are half a decade old to do the same.
In the past, Intel could rely on microarchitecture changes one year and production process shrinkage another year to maintain its momentum of improvement. But the infamous Moore’s Law sputtered to an end back in 2015. Intel is approaching the limits of what’s possible to achieve with silicon, and it hasn’t yet figured out its next step. The chart below, compiled by AnandTech, illustrates Intel’s predicament well. Notice how long the 14nm process node has endured, the question marks next to the release window for 10nm chips, and the almost total absence of a future road map. In previous years, Intel’s ambitious plans would be known well in advance. (The company hasn’t grown more secretive; it just doesn’t seem to have any secrets left.) And without the power efficiency gains that come from building smaller chips, Intel just can’t compete with ARM processors designed for efficiency first.
If there’s one unifying theme to define everything that Apple does, it’s integration. From integrating components on a logic board to integrating an entire ecosystem of Apple devices like the iPhone, Macs, AirPods, and HomePod to integrating supply and distribution lines under its centralized control. Apple started designing its own iPhone chips because it didn’t want to be dependent on Qualcomm. A year ago, it started making its own graphics processors to shed its reliance on Imagination Technologies. Apple also created its own Face ID system, acquired the maker of its Touch ID system, and it was recently reported to be secretly developing its own MicroLED screens for the Apple Watch.
Apple will tell you that it’s obsessed with delighting the consumer, crafting elegantly designed objects, or some other lofty aspiration, but the company’s overriding ambition is to control every last minute aspect of its products. The Intel chips that have been at the heart of MacBooks and Macs for over a decade aren’t minute; they’re central to how each computer can be designed and engineered. Apple has stuck with them for so long because of Intel’s once-insurmountable lead, but the way we use computers is changing, the workloads on a CPU are changing, and Apple’s A-series of chips are better designed to handle that new world of computing. Plus, the iPhone has shown the advantages of designing hardware and software in harmony, requiring smaller batteries and less RAM than comparable Android rivals.
The iOS laptop
Apple’s macOS, the operating system that runs on Intel’s x86 architecture, is now legacy software. This may sound like a blunt allegation to make, given that Apple still sells plenty of MacBooks and iMacs, but the development of that OS within Apple seems to have halted entirely. Today, macOS feels like it’s in maintenance mode, awaiting a widely anticipated change that will produce a unified iOS and macOS operating system, with iOS taking precedence.
Mobile computing has firmly established itself as the predominant mode of use these days, and that trend will only grow more pronounced in the future. Apple’s primary software focus is rightly fixed on iOS, which happens to run on ARM instructions, not Intel’s x86. So, if Apple is indeed intent on bringing iOS up into its less-portable computing line, and if it has chips that offer comparable performance to Intel’s consumer CPUs (which it does), why not build all of that on top of its own processor? Whether it’s presented as a new-age iPad Pro or MacBook Air, a device that combines the strengths of iOS and the convenience of a clamshell design with a generous touchpad is something a lot of people would love to have. By pursuing this course of action, Apple gets to scratch its vertical integration itch while sating existing demand.
The mobile office
The thing that makes it possible for Apple to even contemplate running its lithe mobile operating system on its more powerful computers is the way our computing habits are changing. Not only are we using mobile devices more often than desktop ones for entertainment, but we’re now doing most of our work on phones as well. You can be a professional photographer with just a Pixel 2, for instance. The phrase “phoning it in” certainly has a whole different ring to it in 2018 than it did at the beginning of this decade.
As investment and development dollars continue flowing into the dominant mobile platforms — Android and iOS — it’s logical to expect that every useful desktop application that hasn’t yet been adapted to them is already on its way there. Sure, Intel is likely to retain its dominance at the very high end of computing, but for the vast majority of people and situations, iOS will soon be able to provide all that users want. And once the software reaches that point, Apple will want to match it with hardware that’s powerful and ergonomic enough to take advantage.
It’s not just Apple that’s moving away from Intel processors. Google has been hiring and dabbling with its own custom chip designs, and Microsoft and Qualcomm this year started pushing Windows on ARM as an alternative to the typical Intel-powered laptops. The whole technology world is moving to developing and designing for mobile applications first, and Intel’s desktop roots keep holding it back from being competitive in that expanding market.
Patexia.com reports a continuing decline in patent litigation filings and a continuing increase in Inter Partes Review (IPR) filings.
Detailed information and graphs on these trends are provided in the Patexia.com article, including year-over-year data from 2015 through Q1 2017. The Q1 2017 data shows:
“In the first quarter of 2017 we saw a continued decline in patent litigation. The district court litigation was down 26 percent to 1,012, compared to 1,346 in Q4 of 2016. And it was down 5 percent year over year (1,067 in Q1 of 2016). For the same period, Inter Partes Review (IPR) was up 22 percent to 550, compared to 448 in Q4 of 2016. This increase was even sharper year over year. IPR saw a whopping increase of 64 percent in Q1 2017 versus Q1 2016, which saw 335.”
One key statistic related to the IPR process: “IPR activity per quarter was at an all-time high in Q1 2017. Since its inception in September 2012, IPR has been gaining popularity as a tool to challenge the validity of patents in lawsuits or licensing deals. …”
Related to patent litigation cases: “Patent litigation in district courts was at its lowest level since 2011. Although the litigation has dropped to pre-AIA levels, it is worth mentioning that post-AIA numbers are generally magnified because of joinder rules. …”
“The Remarkable Potential of Stem Cells” by Phil Kesten
The author is Prof. Phil Kesten, Associate Professor of Physics, Santa Clara University (SCU) | Associate Vice Provost, SCU Undergraduate Studies.
This is a very nice article entitled “The Remarkable Potential of Stem Cells” by Phil Kesten. It is laid out in an interesting, easy-to-read manner and shows where stem cell related therapies are headed, along with some potential applications.
Stem cell therapies, devices to deliver them, and other related technologies will be a new frontier for many years. The potential for innovative therapies is huge, but seemingly “simple” problems remain. One significant problem that I have studied involves retention of the stem cells at the target site after delivery.
See the full article at the link below, in the Santa Clara University “Illuminate” publication of September 9, 2016:
The anatomy of a human cell is shown in this figure:
and Prof. Kesten goes on to say in this article:
Over the past few decades, talk of stem cells has often been in the news. What exactly are stem cells, and why all the excitement? Let’s wonder a bit about the science of cells—and the remarkable potential of stem cells.
All living things are made up of cells. There are more than a trillion cells—perhaps more than 30 trillion—in the human body, including many kinds of specialized cells. Bone cells, nerve cells, skin cells, blood cells … and, yes, stem cells.
All cells are self-contained, with their insides separated from their environment by a cell membrane. This enclosure keeps cytoplasm—a thick, gel-like substance that comprises the bulk of a cell—from leaking out. The cell membrane also allows nutrients to flow in, while keeping out material that might damage the cell.
Within each cell is a nucleus that holds the cell’s genetic material. Most cells also contain mitochondria—tiny organic batteries that serve as the cell’s power supply. And within each cell is a structure called the endoplasmic reticulum, a network of membranes within the cytoplasm that carries material, such as nutrients, throughout the cell.
There are critical differences among various kinds of cells, each having specific jobs and roles to play, for instance, in enabling you to breathe, to walk, to fend off diseases. Yet with all this diversity among cell types, at the moment of conception, every living organism starts as a single cell. That cell divides into two, then four, then eight, and so on. And at this stage, when you were just a blob of cells, those were all embryonic stem cells.
The special, critical feature of stem cells is that, as they divide, they begin to differentiate. Some end up as nerve cells, some as blood cells, and some as muscle cells. While those specialized cells can only create more of their own kind of cell when they divide, stem cells give rise to any of the hundreds of kinds of specialized cells in your body.
Adults do have stem cells in their bodies. These adult stem cells are the body’s repair mechanisms. They can fix damaged tissues and organs by regenerating worn out or damaged or diseased cells, no matter what kind of specialized cells they are. Adult stem cells in your bone marrow, for example, can become red blood cells, white blood cells, or platelets, which are the cells that make up your blood.
The real power of stem cells, however, is not simply in their versatility. It is, rather, that stem cells can be grown in a laboratory. And even more powerful, in the past few years, scientists have learned how to reprogram specialized cells to become like stem cells. Indeed, the 2012 Nobel Prize in Physiology or Medicine was awarded to Shinya Yamanaka of Kyoto University for his work on converting mature skin cells into cells that closely resemble stem cells.
Scientists have already been exploring the use of stem cells to treat diseases such as multiple sclerosis and cerebral palsy, as well as to repair spinal cord and bone injuries. It will certainly be many years before stem cell therapies are widely available, but we can look forward to a future in which scientists can grow, say, a new liver for a patient whose own liver is failing: a new liver that is a perfect match for that patient, because it is grown from his or her own cells. Stem cell research promises an exciting future for regenerative medicine.
Having attended Stanford University myself for both a Master’s and PhD in Mechanical Engineering, I always feel a strong sense of pride when I see an article like this one related to “Most Innovative Universities”. Stanford is an amazing place, with so many “best in class” academic capabilities across many diverse fields. However, it is the medicine, science and engineering achievements that always catch my eye. When you look at how Stanford people have conceptualized and developed programs like the Medical Device Innovators series, the idea is always to break down the walls and collaborate across disciplines to identify needs, understand how they might be accomplished, and then develop devices and procedures to meet the goals.
The other thing that I look at is the number and diversity of fabulously successful companies and ideas that have come out of Stanford. The Silicon Valley ecosystem of top universities, the interest and drive to commercialize, and venture capital makes the entire area unique.
Here is the article by Thomson Reuters:
Stanford Again Tops “Most Innovative Universities” Rankings
Palo Alto, Calif. — Stanford University again tops this year’s newly released Reuters Top 100 ranking of the world’s most innovative universities, which aims to identify institutions doing the most to advance science, invent new technologies and help drive the global economy. MIT and Harvard round out the top three. The second annual rankings use proprietary data and analysis tools from Thomson Reuters to examine a series of patent and research-related metrics. “Stanford held fast to its first place ranking by consistently producing new patents and papers that influence researchers elsewhere in academia and in private industry,” the news service wrote. The complete rankings are at the link below.
Two weeks after releasing the Galaxy Note 7 smartphones, Samsung is literally and figuratively fighting fires! They have now recalled the roughly 2.5 million Galaxy Note 7 phones that have been distributed (about 1 million phones sold). This is clearly a serious safety and reliability issue that should have been identified before any shipments started. Beyond the costs associated with the recall, replacement, and possible personal injury and property damage, Samsung stock has taken a hit that knocked $2 billion off of its market value! The market can be massively punishing and unforgiving for mistakes like this one.
To date, Samsung has received 35 reports of fire or explosion issues. Samsung believes the problems are confined to fewer than 0.1% of the phones. Based on a population of 2 million phones, that would mean fewer than 2,000 affected units. This is still a huge number of failures, and 99.9% reliability (even if reliability is actually that high) is an unacceptable level in the consumer products world.
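The arithmetic behind those figures is easy to check. A quick sketch, using only the numbers quoted above (a 2 million phone population and the 0.1% defect-rate upper bound):

```python
# Failure-rate arithmetic from the figures quoted above:
# 2 million phones at Samsung's stated upper bound of 0.1% defective.
population = 2_000_000
defect_rate = 0.001  # 0.1 percent, Samsung's stated upper bound

max_defective = population * defect_rate          # worst-case affected units
reliability_pct = round((1 - defect_rate) * 100, 1)  # implied reliability

print(int(max_defective))    # 2000
print(reliability_pct)       # 99.9
```

Two thousand potential field failures, each a fire hazard, is exactly why a "three nines" defect rate is nowhere near good enough for a shipped consumer product.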
We expect these products not only to function reliably but also to be safe. Battery fire issues with hoverboards in late 2015 basically tanked the sales of that product.
Additional details including the press release can be found here.
Pro Football players in the NFL are bigger, faster, and stronger than ever before. All of these characteristics increase the acceleration, force, and energy associated with contact between players. When this contact occurs to the head it can translate into a concussion or just contribute to an ongoing series of cumulative smaller injuries.
Evidence is mounting that concussions or cumulative injuries have serious long-term effects. This applies not only to football, but also to things like battlefield blast loading and similar events.
Discussion on sensor technology and helmet improvements.
Reference articles with further information:
Energy harvesting (energy scavenging) has always been attractive since sources are almost always available and the energy available is just wasted if not used. In addition to the three sources discussed in the reference below (light, vibration, and heat), another attractive source is available from automotive vibrations (particularly for sensors) and the more significant and now more widely used source of regenerative braking.
Quoting from the excellent Design News article of April 22, 2015 by Warren Miller:
“Energy harvesting in particular seems to be moving at an accelerating pace. We now seem to be at a point where it is possible to run low-power systems primarily from energy harvesting sources. This is a big shift from even just a couple of years ago.
Three key trends seem to have accelerated this dramatic shift. The first is the wild growth in the low-power market. New applications like wearable devices, smart sensors, and disposable devices are driving the insatiable need for more processing power on a low-power budget.
This rapidly growing market drives the second trend: the availability of low-power MCUs and FPGAs. These devices now offer considerable, power-efficient processing that can be applied to the wide range of applications in the growing low-power market. The third trend is the growing availability of energy harvesting sources that produce enough power to run low-power MCUs and FPGAs for enough time to do useful work.
Shown in the figure below is a summary of the power harvesting capabilities of three common harvesting technologies. We are all familiar with solar power as an energy harvesting technology, and it has probably been the main energy harvesting technology used to power electronic devices up to this point.
But new technologies that provide alternative, and often more convenient, power sources have been developed. Piezoelectric effects, for example, can be used to harvest energy from vibration, motion, and pressure. This can be convenient for powering a variety of devices in areas such as wearable electronics for athletics, sensors on trucks or trains, and material flow control.
A piezoelectric energy source, as with many harvested energy sources, can be derived in bursts, which often need to be stored and accumulated for later use. In very simple systems, a simple capacitor storage system may be sufficient to give a very low-power MCU the juice needed to power up and perform simple calculations several times a second.
Smart use of the MCUs’ low-power states is usually critical in low-power applications, and newer MCUs can sleep indefinitely while using only microamperes of current, which makes it possible to use them in these types of very low-power applications.”
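To get a feel for the capacitor-storage approach described in the quote, here is a rough back-of-the-envelope sketch. All component values (the capacitance, the usable voltage window, and the MCU's active power draw) are illustrative assumptions of mine, not figures from the article:

```python
# Back-of-the-envelope: how long can a small storage capacitor run a
# low-power MCU between harvesting bursts? All values are assumed.
capacitance = 100e-6       # 100 uF storage capacitor (assumed)
v_full, v_min = 3.3, 1.8   # usable voltage window of the regulator (assumed)

# Usable energy between full charge and brown-out: E = 1/2 * C * (Vf^2 - Vm^2)
energy_j = 0.5 * capacitance * (v_full**2 - v_min**2)

mcu_power_w = 500e-6       # ~500 uW active power for a low-power MCU (assumed)
runtime_s = energy_j / mcu_power_w

print(f"{energy_j * 1e6:.1f} uJ usable, {runtime_s * 1e3:.0f} ms of active time")
```

Even with these modest assumed numbers, the capacitor holds enough energy for a fraction of a second of active computation, which is plenty for an MCU that wakes up, samples a sensor, does a simple calculation, and goes back to sleep several times a second.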
“Perhaps surprising is the large amount of harvested power available from thermal energy. On par with solar harvested power, thermal energy can perhaps be best used in industrial applications where sensors monitor extremes of pressure and temperature.
The large temperature gradients available in industrial process control applications can easily power low-power FPGAs to implement very complex sensing algorithms using digital signal processing filtering or transform functions. Small rechargeable batteries can be used to store power when the temperature gradient isn’t available, but because sensing is normally only required while temperature extremes exist, batteries can be small without impacting sensor availability.
Perhaps even more interesting is the possibility of harvesting small amounts of thermally produced energy when temperature differences are not as extreme. A wearable device, for example, might have available a 10- or 20-degree temperature difference. This might be sufficient to generate enough power over just a few hours to power an activity monitor, heart rate sensor, or position tracker.
A small wristband could provide enough area to generate the power required to run a monitor or sensor. Combining energy harvesting techniques, thermal and vibration for example, could be an even more efficient method for powering an activity monitor.”
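As a rough illustration of the wearable thermal-harvesting idea in the quote, the sketch below applies the Seebeck relation V = S·ΔT and the matched-load power formula P = V²/(4R). The module Seebeck coefficient, internal resistance, and effective temperature difference are all assumed, illustrative values, not figures from the article:

```python
# Rough thermoelectric (Seebeck) harvesting sketch for a wearable.
# All device parameters are illustrative assumptions.
seebeck_v_per_k = 0.01   # ~10 mV/K for a small multi-couple TEG module (assumed)
delta_t_k = 15.0         # midpoint of the 10-20 degree difference cited above

# Open-circuit voltage: V = S * dT
v_oc = seebeck_v_per_k * delta_t_k

# Max power into a matched load: P = V^2 / (4 * R_internal)
r_internal_ohm = 10.0    # module internal resistance (assumed)
p_max_w = v_oc**2 / (4 * r_internal_ohm)

hours = 3.0              # a few hours of wear, per the quote
energy_j = p_max_w * hours * 3600

print(f"{p_max_w * 1e6:.0f} uW peak, {energy_j:.1f} J over {hours:.0f} hours")
```

Hundreds of microwatts is in the same ballpark as the MCU power budgets discussed earlier, which supports the article's point: a few hours of wear can plausibly bank enough energy to run an activity monitor or heart rate sensor.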
The drought in California is in its 4th year. Water conservation and development of new water resources are becoming more important than ever. Clever ideas and the development and application of new technologies to improve our water supplies are critical. I also feel that the massive commercial and residential development going on in Silicon Valley and other highly populated areas of California is rapidly increasing demand at the same time that supplies are falling. Water may become the new currency, especially when you include the voracious water appetite of the agriculture industry!!
Clever ways to apply technology to make a real impact on water saved through conservation should all be pursued immediately. Conservation measures are the easiest to implement quickly. However, most are at the household level and require participation by many to have an impact.
Large-scale sources of new water supplies must also be pursued aggressively. Desalination plants offer a great opportunity to develop new supplies near the Pacific Coast. Cost and energy requirements are major issues that must be addressed to make desalination more feasible. I believe that the cost factor can be improved by standardized designs that are applied to the development of a large number of plants.
The energy required for desalination can be reduced and at least “green” energy sources can be substituted for traditional power sources.
Transporting water from geographic areas with supplies far in excess of their demand to areas where water is limited also needs to be explored. What are the most efficient options? What pipeline or canal sizes would be required to move large quantities of water? Canals and pipelines are already used in California for water transportation, but the sources are also in California and are also affected by the drought (the reduced Sierra snowpack, etc.).
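To put a rough number on the pipeline-sizing question, here is a hedged sketch using the basic flow relation Q = A·v for a circular pipe. The diameter and design velocity are assumed, illustrative values, not engineering figures:

```python
import math

# Rough pipeline capacity sketch: Q = A * v for a circular pipe.
# Diameter and flow velocity are illustrative assumptions.
diameter_m = 3.0       # a large water pipeline (assumed)
velocity_mps = 2.0     # a typical design flow velocity (assumed)

area_m2 = math.pi * (diameter_m / 2) ** 2
q_m3_per_s = area_m2 * velocity_mps

# Convert to acre-feet per year, the unit water planners commonly use.
seconds_per_year = 365 * 24 * 3600
m3_per_acre_foot = 1233.48
acre_feet_per_year = q_m3_per_s * seconds_per_year / m3_per_acre_foot

print(f"{q_m3_per_s:.1f} m^3/s, roughly {acre_feet_per_year:,.0f} acre-feet/year")
```

Under these assumptions a single large pipeline moves on the order of hundreds of thousands of acre-feet per year, which is why any serious inter-regional transfer scheme quickly becomes a question of very large, very expensive infrastructure.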
An excellent article by Michael Goldman gives great detail and context on the current historic drought. The article has the provocative title The California Drought — “Whiskey’s fer drinkin’ – Water’s fer fightin’” and is posted here:
A high percentage of the total water used in California goes to agriculture. Drip irrigation can dramatically reduce the water used and is appropriate for many types of crops. Incentives to promote the rapid conversion to drip irrigation in agriculture should be put in place immediately.
For the remaining part of the water that goes toward commercial and residential use, a large fraction is used for landscaping. Grey-water and other sources of reclaimed water can certainly be used for landscape applications.