RSS Technology News

How Technology Is Changing How We Do Leisure

Recently, while looking through old photos, one of my children asked me what an object was, pointing to a Walkman, the now-defunct portable cassette player. I started to explain it and then found myself referring to its precursor, and then to that object's precursor, and then that led to a discussion of Thomas Edison's and W.K.L. Dickson's experimental sound film of 1895. And this led to my showing my daughter how much of what we call leisure today is radically different from what it was even twenty years ago. In fact, new technology is not only changing how we operate in the quotidian, with the ability to buy bus tickets on our phone—no more running from store to store to change a dollar into quarters—but it is having its most profound effects on our culture, especially leisure-time activities.

I spoke with Giacomo Bruno, CEO of the Italian publisher Bruno Editore, who notes that leisure reading has changed a lot in recent years thanks to the technological advances of digital books, stating, "People used to go to the bookstore to browse through books in order to choose which ones to buy. Nowadays people go online and in a few seconds they download an entire ebook." Bruno notes that people today often read to learn new skills that give them a competitive advantage in the market and thus greater economic stability. "There are more than one billion people who read ebooks, with self-help and personal growth among the most widely read," he reports. What Bruno indicates is that our culture of reading has become more professionalized as it shifts away from the traditionally popular genres of romance and crime novels.

Another activity that has changed radically is leisure driving. The longstanding American practice of "going for a drive" has shifted in recent years. This is partly due to the cost of gasoline, but mostly because the sense of driving without a destination has been quelled by various driving apps, which have turned a pastime that used to be about doing nothing into an activity very much about doing something. There are apps to help people reach their destination with the least amount of traffic (Waze), apps that plot directions along routes with cheap gas stations (GasBuddy), and of course myriad podcasts that drivers listen to as the modern-day upgrade from books on tape. The entire ethos of driving to forget has moved from that space of deep meditation and purposelessness toward the end-goals of efficiency, errands, and literacy.

Another area where technology has morphed our cultural practices is how we move our bodies. We have seen how Ekso Bionics' exoskeletons reduce the bodily strain on workers, and now Harvard is transferring this technology to soft exosuits, structured textiles whose use is envisaged in sports. Similar in spirit is mech racing, contested in massive metal exoskeletons that resemble something out of Mad Max, though these races have not yet materialized.

In sports, we are seeing how miniaturized GPS units, accelerometers, and other data-collection tools inserted into players' jerseys and cleats provide biofeedback. Athletes can now track their heart rate, speed, jump height, fatigue, hydration levels, muscle activation, respiratory patterns, and neurological activity, all of which can inform future training regimes. New technology can also prevent injuries, or reduce their severity, even for non-professionals for whom sports are leisure. And for those who consider watching sports a passive pastime, immersive tech has turned the passive viewer into, at the very least, an active one, as spectators can now potentially talk directly with athletes and even interact with them through various online platforms.

Then there is how we mix various pastimes, like listening to music while playing sports. The Walkman revolutionized how people listened to music in the 1980s, as jogging soon became something people did with a Walkman clipped to their bodies. Two decades later, this technology was replaced by the iPod Shuffle, and today by the Apple Watch and Bluetooth headphones. Now, not only has the bothersome cord disappeared from music listening, but we are heading toward unforeseen uses such as xFyro's wireless and waterproof earbuds. What we do and how we do it are constantly changing, as the leisure of swimming is now something we can undertake while listening to our favorite songs. The technology of the body is as much about finding better and healthier ways of living as it is about creating new cultural niches, evolving how we engage in work and leisure.

Our notion of leisure time has been blown to pieces by advances in new technology; we no longer do what we did before in quite the same way. Part of me revels in how these changes allow us to keep up with new information and enjoy music in the least likely of places. But another part of me worries that we are injecting natural spaces of silence and nothingness with tasks to accomplish and information to learn. Maybe, just maybe, we need a new cultural trend of slow leisure?

Source: https://www.forbes.com/sites/julianvigo/2019/05/30/how-technology-is-changing-how-we-do-leisure/

Japan to limit foreign investment in its technology companies as fears over Chinese ownership escalate

Japan has said it will restrict foreign ownership of its technology and telecoms businesses in a move widely seen as an attempt to block China from gaining access to its trade secrets.

The country will introduce new rules from August 1 which require foreign investors to report themselves to the Japanese government if they plan to purchase more than 10pc of the shares of Japanese technology and telecom firms.

The investors would then undergo inspection by the government, and could be forced to change or drop their investment plans if they are deemed to be a national security concern.

The Japanese government has identified 15 new restricted sectors, including mobile phone and computer manufacturing. It has also strengthened restrictions on five existing protected sectors, which include telecoms businesses.

A spokesman for the government said that “based on increasing importance of ensuring cyber security in recent years, we decided to take necessary steps, including the addition of integrated circuit manufacturing, from the standpoint of preventing as appropriate a situation that will severely affect Japan’s national security.”

The decision by Japan comes after the US government imposed increased trade restrictions on Chinese technology business Huawei, blocking the company from buying goods and services from key American suppliers including chipmakers and Google.

The US has expressed concern that Huawei’s close relationship with the Chinese government could be exploited to force the company to use its devices to conduct espionage. Huawei has consistently denied this suggestion.

Last year, the Japanese government barred government use of computers and smartphones produced by Huawei and ZTE, another Chinese business.

US President Donald Trump held trade talks with Japanese Prime Minister Shinzo Abe in Tokyo on Monday, and said that he expects to make announcements about trade between the two countries in August.

“We’ll get the balance of trade, I think, straightened out rapidly,” Mr Trump said.

Japan has sought to charm Trump to avoid costly tariffs and retain positive relations with an ally that ensures its security against neighboring China and North Korea. At the same time, Trump is looking to reach a deal with Japan quickly as he escalates his trade war with China.

Source:
https://www.telegraph.co.uk/technology/2019/05/28/japan-limit-foreign-investment-technology-companies-fears-chinese/

Imagining How Technology Will Disrupt Future Energy Markets

It’s common today for observers to speculate about how the energy future must look, rather than trying to imagine how it might look. The camp that “proposes” focuses on what governments and bureaucrats could or must force on markets. Meanwhile, imagination is in short supply among the energy punditocracy.

The future that actually unfolds is always shaped by what engineers and entrepreneurs imagine and invent, things that either consume or produce energy. Consider the historical context.

When it comes to energy demand, who in 1919 could have imagined the future that actually unfolded because of technologies only invented a few years before? In the year 1919 there were still roughly as many horses as cars per capita. But 1919 was a full decade into the wildly successful Model T era, and 16 years after the Wright brothers' first flight. A world with far more automobiles and air travel was actually imaginable. But no one at the time foresaw the extent of the energy-consuming road-miles and air-miles to come, now counted in trillions per year.

And, regarding energy production, by 1919 the age of petroleum (which really did save the whales) was already a half-century old; global production had soared over 20-fold from early days. Consequently, 1919 saw the rise of the ‘industry’ of experts predicting peak oil supply. But innovators created a future that would see production rise by over 80-fold from that point. Some of the key technologies that enabled that growth had already been invented by 1919: the Hughes drill bit, patented in 1909, radically accelerated both speed and depth of drilling; the first off-shore platform, opening up vast new territories, had been built 20 years earlier; and scientists were toying with subsurface seismic imaging (1917 saw the first seismograph patent by Canadian Reginald Fessenden) to take the “wild” out of “wildcatters” drilling blindly.

Which brings us to 2019: Let's start by considering a half dozen examples of new or emerging technologies with demand implications similar to the arrival of the automobile, aircraft or aluminum. (Aircraft-grade aluminum was invented in 1909; its global production today consumes more electricity than the state of Texas uses.)

  1. Autonomous cars – Setting aside eager enthusiasts who think robocars are right around the corner, it is nonetheless reasonable to forecast that the safety, reliability and cost challenges will be conquered in due course. Affordable robocars will then bring an end to mass transit as we know it – why take a bus or subway if a robocar that takes you door-to-door is cost-competitive? Since fuel use per passenger-mile is far lower for buses and trains, shifting riders into autonomous cars will increase total energy use along with road-miles. Studies suggesting robocars will lower energy use unrealistically assume that citizens will choose to share a small vehicle that travels at slow speeds. And most analysts ignore another non-trivial feature of autonomy: the energy needed to power the silicon 'brains' of the robocar. In an all-robocar future, this last factor alone will lead to fuel use equivalent to that of all cars in California today.
  2. Hyperscale datacenters – Global computing already consumes twice as much electricity as does the entire country of Japan, and we're still in the early days of the computing age. Next comes the vastly more expansive third era of computing, characterized by energy-hungry artificial intelligence, virtual and augmented reality, all anchored in thousands of hyperscale datacenters (there are already hundreds of them), each covering more land than a dozen football fields and each inhaling 50 to 100 MW. The claim that computing will become efficient enough to offset this trend gets it precisely backwards: it is the astonishing improvement in efficiency that has driven, and will continue to drive, massive growth in data traffic (this delicious dynamic is the so-called "Jevons Paradox").
  3. 3D printers – 3D printers offer entirely new ways to both design and fabricate products of every kind; they will unleash an era of mass customization comparable in impact to the dawn of mass production. While 3D printers are energy-intensive — printing a plastic or metal object uses more energy per pound compared to conventional processes — their value lies in enabling designs or products that are impossible to fabricate conventionally, while adding flexibility as well as proximity to the end-user. 3D printers will become more energy-efficient, but one should expect that the ease of local, on-site and hyper-personalized fabrication will inspire “profligate” consumption.
  4. Magic and meta-materials – The advent of new classes of materials – e.g., graphene, carbon nanotubes, and meta-materials enabling such bizarre features as literal invisibility – together with the emergence of bio-electronics presage truly remarkable, seemingly magical kinds of products and services. But complex and exotic future materials invariably require more energy to fabricate. The materials that are used to build today’s digital infrastructures typically require 1,000 times more energy per pound to fabricate compared to the kinds of materials (steel, plastic, etc.) that dominated the industrial economies of the 19th and 20th centuries. Fabricating meta-materials will follow the same trajectory. Similarly, in due course, the energy needed to manufacture bio-electronics will match that of today’s silicon electronics industry.
  5. Air taxis – More than a dozen small companies, as well as large ones like Boeing, Airbus and Aston Martin and tech companies like Uber, are developing practical passenger 'drones'. One need no longer engage in cartoonish "Jetsons" fantasies to imagine that air taxis are coming. For such a vehicle, the challenge has always been weight; emerging and conceivable 'magic' materials provide the needed breakthrough. GPS-controlled and, likely, auto-piloted, 'fail-safe' air taxis will offer one of the few ways to significantly relieve urban congestion. But rather than fighting traffic, air taxis must fight gravity, which unavoidably leads to far greater energy use per urban mile. Yet who doubts that, at the right fare, there will be explosive demand for a 10-minute air ride to the airport instead of 65 minutes on clogged roads?
  6. Robots – We no longer have to wonder whether anthropomorphic robots are merely Hollywood fictions; just watch any Boston Dynamics video. Although the world must yet await the equivalent of a Model A (a general-purpose robot), we will soon see the proliferation of special-purpose robots like the wheeled last-mile delivery bots both UPS and FedEx are developing. But the path to walking automatons is now clear, even if it still seems fanciful, with applications first in hazardous environments, rescue, industries of all kinds, hospitals, and then, eventually, our homes. Like cars and computers, robots are extremely complex and energy-intensive to fabricate. They'll also, necessarily, consume fuel to operate. The artificial 'muscles' in robots require some 10 times more energy than the efficient biology powering humans. So, in some not-so-distant future when the market penetration of robots is the same as cars circa 1919 — one per 10 people — the energy consumed by those robots will likely rival the energy value in the food used to feed all humans (a rough arithmetic sketch of that claim follows this list).
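To see roughly why that final claim holds together, here is a minimal back-of-envelope sketch in Python. Every figure in it is an assumption chosen only for illustration (world population, a 2,000-kcal daily diet, a robot doing about one person's worth of physical work per day); none of them comes from the article itself.

```python
# Back-of-envelope check of the robots-vs-food-energy claim (illustrative assumptions only).

WORLD_POPULATION = 8.0e9             # people (assumed round number)
FOOD_MJ_PER_PERSON_DAY = 8.4         # ~2,000 kcal/day converted to megajoules (assumed)
ROBOTS_PER_PERSON = 1 / 10           # "one per 10 people", per the article's 1919-car analogy
MUSCLE_ENERGY_PENALTY = 10           # artificial 'muscles' taken as ~10x less efficient than biology

human_food_energy = WORLD_POPULATION * FOOD_MJ_PER_PERSON_DAY                # MJ per day
robot_fleet = WORLD_POPULATION * ROBOTS_PER_PERSON                           # number of robots
# Assume each robot performs roughly one person's worth of physical work per day.
robot_energy = robot_fleet * FOOD_MJ_PER_PERSON_DAY * MUSCLE_ENERGY_PENALTY  # MJ per day

print(f"Energy in all human food:    {human_food_energy:.2e} MJ/day")
print(f"Energy used by robot fleet:  {robot_energy:.2e} MJ/day")
print(f"Ratio (robots / human food): {robot_energy / human_food_energy:.1f}")
```

Under these assumptions the ratio comes out at roughly 1, which is the sense in which a robot fleet's energy draw could "rival" the energy content of the global food supply.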

The point of all the above? Today’s forecasts of slowing, even “peak” growth in energy use typically assume a future world that ignores the impact of new energy demands from new technologies.

Now, turning to the supply side of the energy equation: Since the world will need hydrocarbons for a long time yet, and because most forecasts focus on the future of alternative energy, let’s instead consider a half-dozen emerging technologies that might have impacts on hydrocarbon supply equivalent to the development of the Hughes drill bit, seismic imaging, or the offshore platform circa 1919.

  1. Robots – The oil and gas industry has, since founding, been hardware-centric with continual and often dramatically consequential advances in the mechanical “arts,” from improvements to the original Hughes drill bit to developing hydraulic fracturing (the latter of course, unlocking shale hydrocarbons). The next leap comes from automating the mechanical tasks, including fully automated drilling. Similarly, oil processing systems that can operate autonomously on the ocean-floor will expand the territory for hydrocarbon production as much as did the development of off-shore drilling from the ocean surface one century ago. For a peek at the autonomous future, check out Houston Mechatronics’ Aquanaut, the Tesla of the subsurface. The tetherless, autonomous and artificial-intelligence-driven Aquanaut is the kind of technology that will not only lower deep-water operating costs, but also enable entirely new business models.
  2. Amazon Effect – Artificial intelligence (AI) is the computing mega-trend of the 21st century. The Amazon effect could also be termed the Uber effect: the use of information platforms to radically improve operational efficacy in ways that traditional players failed to do. In retail domains, market disruption began before e-commerce captured 2 percent of all sales. The multi-trillion-dollar oil & gas industry is far more complex, and one of the least digitalized global businesses; thus there is vast untapped potential for true game-changers. (Full disclosure: our venture fund is focused on this domain.) The emergence of practical AI in oil & gas will be as consequential as the development of seismic imaging a century ago.
  3. Subsurface CAT-scans – Creating high-resolution images of subsurface features is one of science's great challenges. The complexity and volume of subsurface geology are challenging enough, and the task is complicated by the physics impediments to 'seeing' through earth and rock. As much as seismic imaging has improved and propelled discovery for a century, it remains nearly as much art as science. But, as with so many other domains, breakthroughs now emerge from better and cheaper sensors that generate astronomically more data. In combination with low-cost supercomputing power to separate the signal from the noise, what comes next is 'synthetic' high-resolution subsurface imaging.
  4. Hughes' Bit 2.0 – The invention of the original (1909) Hughes drill bit immediately increased drilling speed through rock six-fold (thereby reducing drilling costs at the time by 75%). Since then, descendant improvements have seen drilling speeds continue to increase. Computationally-designed alloys and chemicals that lead to tougher drill bits and superior well-boring fluids (that lubricate and carry away crushed rock) will continue that trend. And soon to see commercialization are rock-drilling high-power lasers pioneered by Foro, which, like Hughes, is a U.S. company. Lasers will open a path to a Hughes-like jump in drilling speed while radically reducing the power needed to drill.
  5. Computationally-engineered catalysts – Oil and gas wells, especially those in shale rock, produce both gaseous and liquid hydrocarbons, but rarely in the ratio most useful to markets. In fact, there is often so much co-production of natural gas (in pursuit of oil) that gas becomes negatively priced. If – or when – the emerging field of computational chemistry produces a catalyst that can inexpensively convert that gas to a liquid, oil supplies will balloon and prices will fall (again).
  6. Oil-eating superbugs – Finally, we should consider an advancement in environmental safety and the “social license” for the oil industry. Genetic engineering (“synthetic biology”) may yet produce hyper-efficient, biologically-safe petroleum ‘eating’ superbugs that can rapidly digest and render oil spills harmless.

One thing we know for certain about the future: technology will continue to advance. And we also know that technologies that lower the cost of hydrocarbon production will continue the pattern established by the shale-tech revolution: more and cheaper hydrocarbons “raise the bar” for competing energy forms. Of course evolving technology will also yield cost reductions in all other competing energy forms. The outcome is precisely what the world’s growing economies need: low-cost, abundant energy.

Source: https://www.forbes.com/sites/markpmills/2019/05/28/imagining-how-technology-will-disrupt-future-energy-markets/

Technology, lifestyles are changing how we define where we live

What makes a house or an apartment a home?

For some of us, home is a walk-up apartment that we share with a roommate or two. For others, it might be a center-hall house on a leafy suburban street, or a modern glass box overlooking the sea. The variations are endless. The only real universal feature is a roof over your head; everything else that distinguishes a home from mere shelter is different for each of us.

And evolving technology and lifestyles are changing what we want our homes to be.

“With so many entertainment and smart technology options at our fingertips, we find homeowners are spending more time at home. People are focusing on how they truly use a space to reflect how they live, versus what the room is ‘supposed to be,’” says Kerrie Kelly, an interior design expert for the online real-estate marketplace Zillow.

For instance, she notes, dining rooms are no longer just a place to eat. “Adults work from this space and kids do homework here, making a single-use room more multipurpose,” Kelly says. “We also see ‘library rooms’ in lieu of formal dining rooms, with more attention to comfortable seating for taking in a variety of media. And lastly, the laundry room isn’t just for washing clothes any more. Pet-washing stations are popping up more frequently instead of laundry tubs.”

For city dwellers, she’s noticed an increase in conversions of loft-like work spaces into living spaces.

The retailer Ikea surveyed people across the globe for its 2018 “Life at Home” report, and found that 1 in 4 respondents said they work more from home than ever before. Nearly 2 in 3 said they’d rather live in a small home in a great location than in a big home in a less ideal spot.

Jeffrey Dungan, an international architect based in Mountain Brook, Ala., reports that more clients want to use their homes for creative pursuits.

“There’s this idea that with the increasing popularity of the Maker movement, and people turning hobbies into successful businesses — whether it’s a side hustle or primary income — the home is more and more becoming a place of business,” he says. “Home is the place where you can do what you love unapologetically, and as more people turn what they love to do into a business, then in a way their business becomes home.”

A survey by the home-furnishings retailer Article in 2018 asked people what it took for them to finally call a dwelling a home. Many respondents said it takes a couple of holidays, barbecues, family visits, big sporting events and game nights before they really feel "at home." So feather the proverbial nest however you like, and have fun while you do it. Then invite somebody over.

Source: https://www.staradvertiser.com/2019/05/25/features/technology-lifestyles-are-changing-how-we-define-where-we-live/

Find The Human-Technology Balance To Champion Your Customers

It doesn’t take long today to come across headlines focusing on the rise of artificial intelligence (AI) and automation and its potential impact on jobs in the finance industry. In fact, a study from McKinsey finds that by 2030, up to an estimated 800 million jobs could be lost worldwide to automation, with data-centric roles like accounting being the most susceptible. Technology is introducing new tensions as well as opportunities for the finance industry to transform the way business is done.

While the industry navigates through disruption, I believe there is a way to strike a balance between adopting innovation and maintaining a human approach to doing business. Rather than resisting technological change, companies can control how they respond and take advantage of it in a way that doesn’t lose sight of the customer. For companies that are unsure of how to approach these changes without risking customer backlash, there are several things to consider to ensure these values coexist. 

Use tech to improve human relationships.

Today, companies need to evaluate how they can adopt technological innovations, while ensuring that doing so is in the name of solving real human problems. Making a conscious choice to balance the promise of technology with human benefits also allows organizations to take a more strategic approach to running a business. For instance, customer service is seen as a key area of focus for many companies in financial services. Adobe’s 2018 Digital Trends in Financial Services report found that 36% of those surveyed in the financial services and insurance sectors said customer experience is the main way they’ll differentiate from competitors over the next five years.

The importance of customer service to financial organizations shows exactly how the human-technology balance can help you come out ahead. Take the example of customer service agents using AI and machine learning to respond to customer requests. These technologies can allow them to quickly identify customer needs and spend more time solving problems and maintaining a human touch. Before these technologies were created, many agents struggled with the volume of requests or with understanding the issues at hand, and lost critical time on that instead of on customer concerns. By enhancing the human elements with their customers through technology, companies can strategically approach areas that impact their bottom line.

Refocus on becoming people-centric.

While many businesses consider themselves customer-centric, I challenge them to become people-centric. This means putting people first — including not just your customers, but employees — and placing them at the heart of your company’s mission and strategy. When people are placed at the center of your organization, adopting innovations will always focus on solving real human problems. 

This is the approach my company takes as it delivers accounting industry technology, while partnering with accountants. Accounting is often viewed as an industry that will lose jobs due to advances in AI and machine learning. And while new technology is changing the way accountants work, these developments are also evolving the role of accountants and providing new opportunities for them and their clients to thrive. This is the true power of “human technology” — taking advantage of the latest technological advances, while still prioritizing the unique things that only human interactions can deliver. 

Find ways for tech to complement what you do.

When faced with disruption, technology does not have to be the only solution — instead, organizations should uncover ways that tech complements what humans do. James Bessen outlines a clear example of this in his book “Learning by Doing: The Real Connection between Innovation, Wages, and Wealth.” In the mid-1990s, when ATMs started becoming widespread in the United States, many people assumed this would be the end of the bank teller job. However, Bessen shows that was far from the case. Even though people could use ATMs to deposit a check or withdraw cash without the help of a human, the number of teller jobs actually increased. 

Bessen explains that the rise of ATMs cut the average number of bank tellers from about 21 down to 13 per branch, so it became cheaper to operate a branch and banks opened up more of them. Soon after, the demand for bank tellers increased, signaling how this labor-saving technology was actually creating more jobs. Aside from banking, this has happened across other industries — including the rise of e-discovery software in the legal profession and scanning technology in grocery store cash registers — where new technology comes in and seemingly threatens the human aspect of these jobs. However, that is far from the case, and these examples show that technology can, in fact, enrich the human elements, which is especially relevant in today's age of AI and machine learning.

Advances in technology will continue to rewrite the rules for how business is done across all industries. Faced with these changes, companies should choose to champion their people and strike a balance between the human and technology elements that will let their business flourish in new ways.  

Source:https://www.forbes.com/sites/forbesfinancecouncil/2019/05/22/find-the-human-technology-balance-to-champion-your-customers/

Businesses and universities team up on a new digital technology credential

Mike Fasil had much to celebrate when he graduated Friday alongside thousands of others from George Mason University.

The son of Ethiopian immigrants and the first in his family to go to college, the 21-year-old from Northern Virginia received a bachelor’s degree in information systems and operations management. He minored in an increasingly popular subject, data analysis, and lined up a job as a business technology analyst.

What also sets his résumé apart is a digital technology credential he earned from George Mason that educators say will soon be offered in several universities in the District and Virginia.

This new marker of achievement reflects growing demand from employers for graduates with fluency in core tech subjects, no matter what their major. It also shows the business community’s deep ties to higher education — a relationship educators and executives insist will not compromise academic quality or independence.

Designed with unusually detailed guidance from major businesses in the Washington region, the digital tech credential aims to certify that graduates have knowledge and skills in fields such as statistics, data visualization and cybersecurity.

“It’s definitely something I’ll be able to have on my belt,” Fasil said. “I have much more exposure in fields I would not have even touched. That is very helpful for me.”

The credential program debuted this year at George Mason and Virginia Commonwealth universities. American University, the University of Richmond and Virginia Tech plan to launch comparable programs in the fall, and more schools may follow.

The credential is outside of higher-education tradition: It is neither a major, nor a minor, nor a formal certificate. It is, rather, a recognition that students have taken a short sequence of courses (five at GMU) that cover knowledge and skills in high demand.

The courses will vary from school to school.

To help universities select them, business leaders drew up a list of 41 skills they look for in a job candidate with general fluency in digital technology. For example, they want graduates who can:

●Demonstrate how data can be used to reduce uncertainty and risk in decision-making.

●Show knowledge of probability and standard statistical distributions.

●Use a computer application to manage large amounts of information.

●Visualize data using displays including tables, dashboards, graphs, maps and trees.

●Identify data situations vulnerable to insider threats.

The wish list underscores the huge appetite for digital-savvy workers.

“I have been struck by how universal the need is,” said Paul Feeko, a partner at EY. He noted his professional-services firm (known to many as Ernst & Young) worked on the project with a variety of businesses, from defense contractor Northrop Grumman to financial company Capital One.

“How different are we?” he said. “And yet when we talked about our needs, they were so similar, and similarly pervasive.”

Interest in digital technology has exploded in recent years on college campuses. Data science has become one of the hottest subjects for undergraduate and master’s students. Students are also flocking to computer science, computer engineering and majors related to analytics, cybersecurity, information systems and many other tech fields. Employers are hiring those kinds of graduates at a rapid pace in a prosperous economy.

But business leaders are thinking beyond bachelor’s degrees. They want all kinds of graduates to have digital skills. And many want a standardized credential to represent those skills.

“Employers are saying, ‘We’re not going to leave it vague,’ ” said Chauncy Lennon, vice president for the future of learning and work at the Lumina Foundation, based in Indiana, which promotes expansion of learning opportunities beyond high school. “We want this specific credential that’s clearly definite.”

The idea for the digital technology credential grew out of a nonprofit business-university collaboration announced last year, with backing from the Greater Washington Partnership, a civic group. Among the 13 educational institutions involved are the public flagship Universities of Virginia and Maryland, and the private Georgetown, George Washington, Johns Hopkins and Howard universities.

On the business side are 14 companies, including an Amazon subsidiary called Amazon Web Services. (Amazon founder and chief executive Jeff Bezos owns The Washington Post.)

From Richmond to Baltimore, businesses and universities share the goal of developing a tech-savvy workforce to expand the region’s economy. Top business executives started to meet last year with university presidents and provosts. Northrop Grumman hosted a key early meeting in April 2018 at its headquarters in Fairfax County.

“Most of them had never sat down with each other,” Peter L. Scher, chairman of the Mid-Atlantic region for JPMorgan Chase & Co., said. “We saw a lot of commonality.”

Businesses crave more graduates with problem-solving skills who can navigate the technical and ethical challenges of the digital economy. Universities want to make sure they are helping to meet the job needs of the region.

But getting them all to work together — within and across sectors — is a somewhat novel idea.

“Our instinct as universities is to seek differentiation — to compete with one another,” George Mason President Ángel Cabrera said. “It’s clear that in many, many areas, we would be better off by collaborating.”

Cabrera and other university leaders insist they are not ceding control of the academic enterprise to big business. They said they are merely learning more about what employers need so they can offer relevant programs to students.

“We’re not ashamed of our goal to help students be successful professionally,” Cabrera said.

GMU wants "well-rounded scholars," he said, with a liberal-arts background and high career potential. "If you believe that, then working with the private sector to know exactly what is needed is the smartest thing you can do."

Brian K. Fitzgerald, chief executive of the Business-Higher Education Forum, a workforce-development group based in Washington, said companies are not “dictating the curriculum.” Instead, they are sending “a very strong signal” about the workforce they need.

“What we’re really talking about is what’s the definition of a literate person in the 21st century,” Fitzgerald said. “There is definitely a digital component to that.”

At GMU, the digital technology credential is just getting off the ground. Fasil is one of four graduates this spring who completed it. The university said students who join the program will receive opportunities for job shadowing and mentoring, priority for internships and “guaranteed résumé review for open positions” with participating businesses. The credential does not show up yet on transcripts, the university said, but it will be visible as a “badge” through an online site that verifies documents related to education attainment.

Hannah Licea, another graduate who earned the credential, majored in psychology at GMU. The 21-year-old from Houston is pondering a career as a business consultant. When she heard about the digital technology credential, she signed up for a cybersecurity course to satisfy a requirement. It became one of her favorite classes.

Licea said she is more interested in using her skills than in talking up “every little credential or certificate I receive.” Learning about digital technology will pay off in the long run, she said. “This is something I can use at any point in my career, not just for my first job after graduation.”

Source: https://www.washingtonpost.com/local/education/businesses-and-universities-team-up-on-a-new-digital-technology-credential/2019/05/19/f7152632-726a-11e9-9f06-5fc2ee80027a_story.html?noredirect=on&utm_term=.02cec8a42f8d

TECHNOLOGY THAT COULD END HUMANITY—AND HOW TO STOP IT

In his 1798 An Essay on the Principle of Population, Thomas Malthus predicted that the world's population growth would outpace food production, leading to global famine and mass starvation. That hasn't happened yet. But a report from the World Resources Institute last year predicts that food producers will need to supply 56 percent more calories by 2050 to meet the demands of a growing population.

It turns out some of the same farming techniques that staved off a Malthusian catastrophe also led to soil erosion and contributed to climate change, which in turn contributes to drought and other challenges for farmers. Feeding the world without deepening the climate crisis will require new technological breakthroughs.

This situation illustrates the push and pull effect of new technologies. Humanity solves one problem, but the unintended side effects of the solution create new ones. Thus far civilization has stayed one step ahead of its problems. But philosopher Nick Bostrom worries we might not always be so lucky.

If you've heard of Bostrom, it's probably for his 2003 "simulation argument" paper which, along with The Matrix, made the question of whether we might all be living in a computer simulation into a popular topic for dorm room conversations and Elon Musk interviews. But since founding the Future of Humanity Institute at the University of Oxford in 2005, Bostrom has been focused on a decidedly more grim field of speculation: existential risks to humanity. In his 2014 book Superintelligence, Bostrom sounded an alarm about the risks of artificial intelligence. His latest paper, The Vulnerable World Hypothesis, widens the lens to look at other ways technology could ultimately devastate civilization, and how humanity might try to avoid that fate. But his vision of a totalitarian future shows why the cure might be worse than the disease.

WIRED: What is the vulnerable world hypothesis?

Nick Bostrom: It’s the idea that we could picture the history of human creativity as the process of extracting balls from a giant urn. These balls represent different ideas, technologies, and methods that we have discovered throughout history. By now we have extracted a great many of these and for the most part they have been beneficial. They are white balls. Some have been mixed blessings, gray balls of various shades. But what we haven’t seen is a black ball, some technology that by default devastates the civilization that discovers it. The vulnerable world hypothesis is that there is some black ball in the urn, that there is some level of technology at which civilization gets decimated by default.

WIRED: What might be an example of a “black ball?”

“The vulnerable world hypothesis is that there is some black ball in the urn, that there is some level of technology at which civilization gets decimated by default.”

NICK BOSTROM

NB: It looks like we will one day democratize the ability to create weapons of mass destruction using synthetic biology. But there isn't nearly the same kind of security culture in the biological sciences as there is in nuclear physics and nuclear engineering. After Hiroshima, nuclear scientists realized that what they were doing wasn't all fun and games and that they needed oversight and a broader sense of responsibility. Many of the physicists who were involved in the Manhattan Project became active in the nuclear disarmament movement and so forth. There isn't something similar in the bioscience communities. So that's one area where we could see possible black balls emerging.

WIRED: People have been worried that a suicidal lone wolf might kill the world with a “superbug” at least since Alice Bradley Sheldon’s sci-fi story “The Last Flight of Doctor Ain,” which was published in 1969. What’s new in your paper?

NB: To some extent, the hypothesis is kind of a crystallization of various big ideas that are floating around. I wanted to draw attention to different types of vulnerability. One possibility is that it gets too easy to destroy things, and the world gets destroyed by some evil doer. I call this “easy nukes.” But there are also these other slightly more subtle ways that technology could change the incentives that bad actors face. For example, the “safe first strike scenario,” where it becomes in the interest of some powerful actor like a state to do things that are destructive because they risk being destroyed by a more aggressive actor if they don’t. Another is the “worse global warming” scenario where lots of individually weak actors are incentivized to take actions that individually are quite insignificant but cumulatively create devastating harm to civilization. Cows and fossil fuels look like gray balls so far, but that could change.

“It looks like we will one day democratize the ability to create weapons of mass destruction using synthetic biology.”

NICK BOSTROM

I think what this paper adds is a more systematic way to think about these risks, a categorization of the different approaches to managing these risks and their pros and cons, and the metaphor itself makes it easier to call attention to possibilities that are hard to see.

WIRED: But technological development isn’t as random as pulling balls out of an urn, is it? Governments, universities, corporations, and other institutions decide what research to fund, and the research builds on previous research. It’s not as if research just produces random results in random order.

NB: What’s often hard to predict is, supposing you find the result you’re looking for, what result comes from using that as a stepping stone, what other discoveries might follow from this and what uses might someone put this new information or technology to.

In the paper I have this historical example: when nuclear physicists realized you could split the atom, Leo Szilard realized you could make a chain reaction and make a nuclear bomb. Now we know that making a nuclear explosion requires these difficult and rare materials. We were lucky in that sense.

And though we did avoid nuclear armageddon it looks like a fair amount of luck was involved in that. If you look at the archives from the Cold War it looks like there were many occasions when we drove all the way to the brink. If we’d been slightly less lucky or if we continue in the future to have other Cold Wars or nuclear arms races we might find that nuclear technology was a black ball.

“We might not think the possibility of drawing a black ball outweighs the risks involved in building a surveillance state.”

NICK BOSTROM

If you want to refine the metaphor and make it more realistic you could stipulate that it’s a tubular urn so you’ve got to pull out the balls towards the top of the urn before you can reach the balls further into the urn. You might say that some balls have strings between them so if you get one you get another automatically, you could add various details that would complicate the metaphor but would also incorporate more aspects of our real technological situation. But I think the basic point is best made by the original perhaps oversimplified metaphor of the urn.

WIRED: So is it inevitable that as technology advances, as we continue pulling balls from the urn so to speak, that we’ll eventually draw a black one? Is there anything we can do about that?

NB: I don’t think it’s inevitable. For one, we don’t know if the urn contains any black balls. If we are lucky it doesn’t.

If you want to have a general ability to stabilize civilization in the event that we should pull out the black ball, logically speaking there are four possible things you could do. One would be to stop pulling balls out of the urn. As a general solution, that's clearly no good. We can't stop technological development, and even if we did, that could be the greatest catastrophe of all. We can choose to deemphasize work on developing more powerful biological weapons. I think that's clearly a good idea, but that won't create a general solution.

The second option would be to make sure there is nobody who would use technology to do catastrophic evil even if they had access to it. That also looks like a limited solution because realistically you couldn't get rid of every person who would use a destructive technology. So that leaves two other options. One is to develop the capacity for extremely effective preventive policing, to surveil populations in real time so that if someone began using a black ball technology they could be intercepted and stopped. That has many risks and problems as well if you're talking about an intrusive surveillance scheme, but we can discuss that further. Just to put everything on the map, the fourth possibility would be effective ways of solving global coordination problems, some sort of global governance capability that would prevent great power wars, arms races, and destruction of the global commons.

WIRED: That sounds dystopian. And wouldn’t that sort of one-world government/surveillance state be the exact sort of thing that would motivate someone to try to destroy the world?

NB: It’s not like I’m gung-ho about living under surveillance, or that I’m blind about the ways that could be misused. In the discussion about the preventive policing, I have a little vignette where everyone has a kind of necklace with cameras. I called it a “freedom tag.” It sounds Orwellian on purpose. I wanted to make sure that everybody would be vividly aware of the obvious potential for misuse. I’m not sure every reader got the sense of irony. The vulnerable world hypothesis should be just one consideration among many other considerations. We might not think the possibility of drawing a black ball outweighs the risks involved in building a surveillance state. The paper is not an attempt to make an all things considered assessment about these policy issues.

WIRED: What if instead of focusing on general solutions that attempt to deal with any potential black ball we instead tried to deal with black balls on a case by case basis?

NB: If I were advising a policymaker on what to do first, it would be to take action on specific issues. It would be a lot more feasible and cheaper and less intrusive than these general things. To use biotechnology as an example, there might be specific interventions in the field. For example, perhaps instead of every DNA synthesis research group having their own equipment, maybe DNA synthesis could be structured as a service, where there would be, say, four or five providers, and each research team would send their materials to one of those providers. Then if something really horrific one day did emerge from the urn there would be four or five choke points where you could intervene. Or maybe you could have increased background checks for people working with synthetic biology. That would be the first place I would look if I wanted to translate any of these ideas into practical action.

But if one is looking philosophically at the future of humanity, it’s helpful to have these conceptual tools to allow one to look at these broader structural properties. Many people read the paper and agree with the diagnosis of the problem and then don’t really like the possible remedies. But I’m waiting to hear some better alternatives about how one would better deal with black balls.

Source:https://www.wired.com/story/technology-could-end-humanity-how-stop-it/

Prepare Yourself For The Shock Of Mass Implantable Brain Technology

One of the most controversial narratives of our time will be the discussion around identity and intention: that is, who or what is actually doing or thinking whatever it is you may be witnessing, and why. This disruptive shift will be about discerning between human intelligence, artificial intelligence, hybrids of sorts, and the types of parameters with which to best frame each category. Get ready, because we are all about to experience things of which we previously only dreamed. However, the advent of some will be so disconcerting, so questionable, that many will wish such visions had remained safely within a fantasy realm.

Our whole world is about to change right before our eyes, at a pace for which none of us voted, in which few will be early participants, and about which all should remain extremely vigilant. Artificial intelligence may be one thing, but the advent of actual implantable technology in the brain is a completely different phenomenon.

The new film I Am Human, directed and produced by two women, Taryn Southern and Elena Gaby, debuted at the Tribeca Film Festival and is an important new addition to the vital discussion around the delicate intersection of the brain and its augmentation by technology, which is anticipated to take place on a mass scale.

Typically relegated to academic environments, such topics around the brain-tech intersection started to be discussed in a slightly more pedestrian manner back in 2013, when President Obama announced across major media that $100 million in funding would be targeted for an unprecedented neuroscience initiative intended to reconstruct the activity of every single neuron as it fired simultaneously in different brain circuits.

Shortly thereafter, a project known as BigBrain, a collaboration between researchers in Europe and Canada, mapped the human brain with massive precision. Fast forward a bit to today, and we find Elon Musk hard at work trying to link brains to software via chip implant to create something like a situation a la Johnny Depp’s character in the film Transcendence.

And that is touching on only a few jolting highlights in the arena.

Somehow making your brain subject to, paired with, or integrated into technology is becoming a completely normal thing to ponder – at least for some. Indeed, Southern and Gaby note that several hundred thousand people in the world already have implantable technology in their brains, but that by the year 2029 this number is expected to triple due to a move from merely academic to general usage.

The first wave of evolution is expected to offer healing-of-sorts for various individuals such as those profiled in the film with Parkinson’s Disease, paralysis, blindness and more. The next wave is more about general usage.

Of course, who would deny any person suffering from neurological disorders the ability to possess a better quality of life through brain implants? But when such technology begins to be touted, via interviews in this documentary, as something that will help you jump higher, run faster, or rid yourself of this habit or that, or of one annoying personality trait or another, via programming, we could be teetering on some very shaky moral and spiritual ground.

Though the film is positioned more or less as a cheerleader for such technology improving lives, I Am Human touches on the ethics and morality of its subject only as an obligatory footnote rather than giving it the true deep dive that seems mandatory.

Brief is the few-minute segment of the film during which a professor addresses a class on neuroethics. She asks the class at what point such interventions in the brain via implantable technology become problematic, and who will have access to such procedures first. She encourages us all to ponder new questions in society, such as what type of legal protection you will want for your brain when you can, essentially, connect it to some type of technology that could be compromised at the push of an upload or download. Data privacy takes on an entirely new meaning when brain hacking on a new and exponential level could be just around the corner.

Indeed, during the Q&A after the film's premiere, one of the filmmakers excitedly described her experience with even just a removable brain interface, through which she learned things about herself in the moment via computer, like, well, that she was drowsy. But do we really need technology to tell us this? Do we miss the important part of developing a better mind-body connection without such aids, or of learning about our emotions and understanding ourselves and our bodies organically, which many say is essential to both spiritual and emotional maturity, by taking such shortcuts? Or does such tech access provide support to those who do not want to, or cannot, ever come to such realizations?

The issue is that so much of the answer to the above is gray, depends on each individual, and has no real precedent from which to make a one-size-fits-all decision. Further, none of us should be so sure that those purporting to mind the brain-interface/implant store, such as those seen in the film, are nearly qualified enough to make decisions for a very, very fragmented society where various agendas lurk around every corner of culture.

As the interest in brain implant technology quickly shifts from medical to mainstream, watch for debates to become very heated. The issue is, perhaps, not so much about the capabilities that this technology affords as it is that such advances now present themselves so quickly that we are not afforded the time to truly test, evaluate and reflect in order to make decisions before blasting for take-off. This is what is demanded and is, fortunately or unfortunately, still very much a human responsibility.

Source:https://www.forbes.com/sites/laurencoleman/2019/05/12/prepare-yourself-for-the-shock-of-mass-implantable-brain-technology/#77481c782407

New technology could help keep kids safe at school

LAS VEGAS (KTNV) — A technology company in Las Vegas thinks a communications system being marketed to large hotels and hospitals could help keep kids safe.

The technology uses Bluetooth beacons and an app to send messages during emergencies and to track the locations of employees; the tracking feature is activated only when a threat has been declared.

Employees and possibly students are then asked to respond with whether they are safe or not.

“You would probably test it in a single school to start with and see what challenges we have, if any,” said Peg McGregor, CEO of Technovation Solutions.

She also said the technology recently qualified for a federal program that lowers the cost for schools interested in the system.

Source:https://www.ktnv.com/news/new-technology-could-help-keep-kids-safe-at-school

Looking for a moon vacation? Billions being invested in burgeoning space tourism industry

Neil deGrasse Tyson, director of the Hayden Planetarium, famously declared that Earth’s first trillionaire will be a space miner. There’s gold in them thar asteroids.

Perhaps, but the first big money to be made in space will more likely come from tourism, says John Spencer, who founded the Space Tourism Society and designed elements of the International Space Station for NASA.

Commercial space exploration is already attracting vast amounts of capital, according to space-analytics company Bryce Space and Technology, which reported that space startups received $3.2 billion in investment in 2018 and $22 billion since 2000.

A portion of that money has bolstered the growing space-tourism industry.


The industry arguably took off in 2001, when Los Angeles businessman Dennis Tito became the first space tourist, paying $20 million to join a Russian cosmonaut crew on a Soyuz rocket to the International Space Station. Since then, six others have followed, paying to sojourn at the ISS.

Outside the Space Station, which no longer accepts tourists, an industry providing people a suborbital taste of what it’s like to be in space has flourished. Zero Gravity Corp., for example, allows people to feel as though they’re floating in zero gravity by flying a modified Boeing 727 in a parabolic arc, and a number of companies are gearing up to carry passengers to various heights above Earth’s surface, including a hot-air balloon that would provide a view of Earth’s curvature from 12 miles high, and Blue Origin’s passenger rocket, which would provide even more expansive views from 66 miles up — high enough for passengers to be considered astronauts.

And last fall, Elon Musk announced that a passenger had booked a trip on a rocket aiming even higher: the moon.

Japanese billionaire Yusaku Maezawa purchased seats for a one-week journey around the moon with up to eight artists on SpaceX’s as-yet-unbuilt Big Falcon Rocket. The tentative launch date for Earth’s first moon tourists is 2023.

Once journeys such as Maezawa’s become commonplace, Spencer imagines tourism in space will eventually evolve to resemble tourism on the ocean.

Think of it: the cruises, the hotels, the adventure sports and high-end yachts, each of which would require the staff of its Earth-bound counterpart, plus the support staff needed for space.

“Eventually, there will be a lot of jobs in space,” Spencer said. “We need people who cook and clean; we need a space guard service.” He foresees sports such as dune-buggy racing or low-gravity basketball bringing television crews, crowds and the corresponding infrastructure.

Clint Wallington, a hospitality professor who worked with the International Space Station’s education-outreach program and once ran Rochester Institute of Technology’s space-tourism course, agreed that space tourism would create a lot of jobs.

“You start to see how to put together a full hotel support,” he said. And then, he added, you would need people who know how to respond if equipment malfunctions or a passenger gets injured.

Wallington predicted that the type of infrastructure that would allow tourists to visit the moon or spend a few weeks on a space station is decades, if not a century, out.

But he remembers the excitement of the original space race, and does not discount what humanity can accomplish when it becomes captivated by an idea.

“If something becomes super-popular,” he said, “interesting things can happen in a short time.”

Source:https://www.houstonchronicle.com/local/space/mission-moon/article/Looking-for-a-moon-vacation-Billions-being-13818354.php