An artificial intelligence has been asked to create an image of what death looks like, and the results are simply stunning.
The artificial intelligence (AI) that was asked to create the images seen in the above video is called MidJourney. It was created by David Holz, co-founder of Leap Motion, and is currently run by a small self-funded team with several well-known advisors: Jim Keller, known for his work at AMD, Apple, Tesla, and Intel; Nat Friedman, the CEO of GitHub; and Bill Warner, the founder of Avid Technology and inventor of nonlinear video editing.
MidJourney is an incredible piece of technology, and it recently went into open beta, which means anyone can try it by simply heading over to its dedicated Discord server. Users enter “/imagine”, followed by a text prompt describing what they want the AI to produce. Users have been testing the AI’s capabilities with descriptive modifiers such as HD, hyper-realistic, 4K, and wallpaper, all of which work perfectly.
As for the predictive capability of MidJourney, none of the images seen in this article or any other source should be taken as a prediction. MidJourney was created to expand the human species’ imaginative power, not to make predictions.
Using MidJourney’s image generation algorithms, users can create ultra-realistic images of whatever they wish. The possibilities are truly endless, and with accurate text inputs, you can create wallpaper-worthy images. I tested the AI and created several images that are now being used as wallpapers, but what was more impressive was what the other users in the Discord were making. Below are some examples of what I found, along with the prompts users entered to get each result.
Use MidJourney AI here.
– A detailed futuristic soldier portrait gas mask, slightly visible shoulders, explosion in background
– A detailed oil painting of final fantasy XIII versus battle of light and darkness
– universe
– A young boy sleeping on a mat, smiling at the camera, big brown eyes, hyper realistic, 4K, very clear
A study on the new artificial neurons, titled “Nanosecond protonic programmable resistors for analog deep learning,” has been published in the journal Science.
Researchers from the Massachusetts Institute of Technology (MIT) have created new artificial “neurons” and “synapses” that exist within a new field of artificial intelligence called analog deep learning. Instead of using transistors like in digital processors, analog deep learning uses programmable resistors to “create a network of analog artificial ‘neurons’ and ‘synapses’” that can exceed the performance of a digital neural network, while using a fraction of the energy.
The MIT team’s artificial neurons and synapses are built using a new inorganic material, making the resulting devices one million times faster than previous iterations and one million times faster than the synapses found in the human brain. The new material is also compatible with existing silicon fabrication techniques, meaning it can be used to create nanometer-scale devices and potentially be integrated with existing computing hardware to accelerate deep-learning applications.
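To make the idea concrete, here is a minimal numerical sketch of how an analog crossbar of programmable resistors computes a neural-network layer’s matrix-vector product. This is an idealized textbook model, not the MIT team’s actual device: weights are stored as conductances, Ohm’s law performs the multiplications, and Kirchhoff’s current law performs the sums, so the whole multiply-accumulate happens in one physical step.

```python
# Hedged sketch: a crossbar of programmable resistors computes a
# matrix-vector product. Each weight is stored as a conductance G;
# applying input voltages V to the rows yields column currents
# I[j] = sum_i V[i] * G[i][j] (Ohm's law + Kirchhoff's current law).

weights = [[0.5, -0.2],    # 3 inputs, 2 outputs
           [0.1, 0.3],
           [-0.4, 0.25]]

# Real conductances are non-negative, so a differential pair of
# resistors (G_pos - G_neg) is commonly used to encode signed weights.
g_pos = [[max(w, 0.0) for w in row] for row in weights]
g_neg = [[max(-w, 0.0) for w in row] for row in weights]

voltages = [1.0, -0.5, 2.0]     # input activations applied as voltages

def column_currents(g, v):
    """Kirchhoff's current law: each output column sums I = G * V."""
    cols = len(g[0])
    return [sum(v[i] * g[i][j] for i in range(len(v))) for j in range(cols)]

# Analog multiply-accumulate via the differential pair.
currents = [ip - im for ip, im in
            zip(column_currents(g_pos, voltages),
                column_currents(g_neg, voltages))]

# The analog result matches the digital matrix-vector product.
expected = column_currents(weights, voltages)
assert all(abs(a - b) < 1e-9 for a, b in zip(currents, expected))
print(currents)   # [-0.35, 0.15]
```

The speedup claimed in the paper comes from how quickly those conductances can be reprogrammed, which is where the protonic resistor material matters.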
“Once you have an analog processor, you will no longer be training networks everyone else is working on. You will be training networks with unprecedented complexities that no one else can afford to, and therefore vastly outperform them all. In other words, this is not a faster car, this is a spacecraft,” said lead author and MIT postdoc Murat Onen.
“The speed certainly was surprising. Normally, we would not apply such extreme fields across devices, in order to not turn them into ash. But instead, protons ended up shuttling at immense speeds across the device stack, specifically a million times faster compared to what we had before. And this movement doesn’t damage anything, thanks to the small size and low mass of protons. It is almost like teleporting. The nanosecond timescale means we are close to the ballistic or even quantum tunneling regime for the proton, under such an extreme field,” said senior author Ju Li, the Battelle Energy Alliance Professor of Nuclear Science and Engineering and professor of materials science and engineering.
You can read more from the study here and from MIT’s breakdown here.
A viral artificial intelligence has been asked to produce an original image of the most closely held secret on Earth.
An artificial intelligence formerly known as DALL-E Mini, and currently referred to as Craiyon, has been asked to showcase what it believes to be the most closely held secret on Earth. The artificial intelligence uses the “DALL-E mini” model, which was trained by Boris Dayma and Pedro Cuenca using Google Cloud servers. The AI is capable of producing original images of whatever a user enters into its text prompt box.
The public can enter any question or phrase they like, and the artificial intelligence will usually spend less than a minute producing a set of images that will show a visual representation of the text entered into the box. While the AI doesn’t hold any predictive value for future events, it still can produce incredibly interesting images based on simple text requests. Try the artificial intelligence for yourself to test your imagination.
Read more: AI asked to show an image of humanity’s greatest threat
Try Craiyon here.
Jack Connor
Jak joined the TweakTown team in 2017 and has since reviewed hundreds of new tech products and kept us informed daily on the latest science and space news. Jak’s love for science, space, and technology, and, more specifically, PC gaming, began at 10 years old, the day his father showed him how to play Age of Empires on an old Compaq PC. Ever since that day, Jak has loved games and followed the progression of the technology industry in all its forms. Rather than typical FPS titles, Jak holds a very special spot in his heart for RTS games.
An Australian Aboriginal language only spoken by a handful of people in the Northern Territory has become the inspiration for a new artificial intelligence system, potentially helping people better communicate with machines.
Key points:
The AI system aims to help humans and robots to better communicate
Researchers developed it drawing on some features of the Aboriginal language, Jingulu
It’s considered an endangered language, with only a few elderly speakers remaining
Jingulu is considered an endangered language that’s traditionally spoken in the Northern Territory’s Barkly region.
A study, recently published in the academic journal Frontiers in Physics, suggests it has special characteristics that can easily be translated into commands for artificial intelligence (AI) swarm systems.
“Maybe one of the most powerful things with Jingulu [is] that it gives us the simplicity and flexibility which we can apply in lots of different applications,” lead researcher at University of New South Wales Canberra, Hussein Abbass, said.
AI swarm systems are used in machines to help them collaborate with humans and undertake complex tasks that humans command them to do.
Dr Abbass said he stumbled on the Jingulu language by accident, while developing a new communication system.
“When I started looking at the abstract, it didn’t take much time to click in my mind about how suitable it is, for the work I do on artificial intelligence and human AI teaming,” he said.
Language easily translatable into AI commands
Dr Abbass said it was normal for AI researchers to draw on different forms of communication for their work, including other human languages, body language and even music.
However, he said the Jingulu language was especially well-suited to AI because it had only three verbs — ‘go’, ‘come’ and ‘do’ — which meant it could be easily translated into commands.
“The specific AI model that we are working on relies on the very simple concepts of attraction and repulsion, in physics … and underneath this very simple concept fits the mathematics of our AI,” he said.
“We can apply the ‘go’ and ‘come’ to the attraction and repulsion concepts, in the mathematical model that we have, and the ‘do’, to when there’s no movement in a space.
“The structure of Jingulu matches extremely nicely to the mathematics, and that’s what made it really fascinating, for what we do.”
“I have not encountered another language that has all of these advantages simultaneously, and in alignment with AI.”
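The mapping Dr Abbass describes can be sketched in a few lines of code. This is an illustrative toy model, not the researchers’ actual JSwarm system: ‘come’ attracts an agent toward a point, ‘go’ repels it away, and ‘do’ triggers an action with no movement.

```python
import math

def step(agent, target, verb, gain=0.1):
    """Move `agent` one step relative to `target` according to a Jingulu-style verb."""
    ax, ay = agent
    tx, ty = target
    dx, dy = tx - ax, ty - ay
    dist = math.hypot(dx, dy) or 1e-9   # avoid division by zero
    if verb == "come":      # attraction: step toward the target
        return (ax + gain * dx / dist, ay + gain * dy / dist)
    if verb == "go":        # repulsion: step away from the target
        return (ax - gain * dx / dist, ay - gain * dy / dist)
    if verb == "do":        # no movement; perform an action in place
        return (ax, ay)
    raise ValueError(f"unknown verb: {verb}")

agent = (0.0, 0.0)
agent = step(agent, (1.0, 0.0), "come")   # moves toward (1, 0)
print(agent)                              # (0.1, 0.0)
```

Because the whole control vocabulary reduces to these two force terms plus a stationary action, a three-verb language maps onto the model with nothing left over, which is the alignment the researchers found striking.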
‘Unique’ elements of language beneficial for AI
Study co-author and University of Canberra professor in linguistics Eleni Petraki said Jingulu’s flexible sentence structure was also an advantage.
“[In most languages] words appear in a specific order … in Jingulu however you can split those elements,” she said.
With only a few elderly fluent speakers remaining, Jingulu is considered an endangered language, according to Rachel Nordlinger, a linguistics professor at the University of Melbourne and director of the Research Unit for Indigenous Language.
She said while there were related Aboriginal languages with similar features, Jingulu had some unique characteristics.
“What’s different about Jingulu is that other languages might have small numbers of verbs, but there might be 20 of them or 30 of them, whereas Jingulu has only three,” she said.
“The structure’s similar, but it’s different in having such a small number of verbs, that combine with other words.”
AI system has ‘almost infinite’ applications
The new artificial intelligence system created with Jingulu in mind, JSwarm, was initially developed to help farmers herd sheep, providing a language that would let an app used by farmers communicate with the unmanned aerial vehicles performing the task.
It has not yet been implemented, with its developers still working to secure funding.
However, Dr Abbass said the system could potentially be used beyond agriculture in the future, in areas ranging from medicine to defence.
“[There are] almost an infinite number of applications,” he said.
Sign of growing interest in Indigenous languages
Dr Abbass said the AI system was the first instance he was aware of in which an Australian Indigenous language had been used “at the interface of human and AI communication”.
“You never know where good ideas will come from, and without keeping our minds open, we won’t be able to innovate,” he said.
Dr Nordlinger said researchers’ use of Jingulu to develop the system was an example of the growing level of interest in Indigenous languages in Australia, both from Indigenous communities themselves and the wider Australian public.
“People are becoming more aware of how fascinating these languages are, but also how endangered they are, and therefore how precious they are,” she said.
“I think [this study] is a sign of the growing interest for sure, and it can be a real positive.
“It can only be a good thing to have more attention and more appreciation of these languages.”
Daniel Goleman has a blunt warning for jobseekers in 2022 and beyond: It’s no longer enough just to be smart.
Dr Goleman, an American author and psychologist, has spent decades touting the importance of ‘emotional intelligence’ in the workplace and other areas of life.
And it appears companies and organizations have caught up with him.
“[In the mid-1990s] someone said to me, ‘you know, you can’t use the word emotion in a business context’. Today, it’s very, very different,” he tells ABC RN’s Future Tense.
But what exactly is emotional intelligence or EI? And is it just more work-speak or ‘a must-have skill’ of the future?
What is emotional intelligence?
There are several definitions of emotional intelligence, but it boils down to understanding your emotions, understanding the emotions of those around you, and acting accordingly.
Dr Goleman, who put the term on the map with his 1995 book Emotional Intelligence: Why It Can Matter More Than IQ, says it has four main components.
First up: self-awareness. Or as Dr Goleman puts it: “Knowing what you’re feeling, why you feel it, how it makes you think and want to act, how it shapes your perceptions.” So, for example, being able to label an emotion like anger and know the causes behind it.
The second part is “using that information to manage your emotions, in a positive way. To stay motivated, to stay focused, to be adaptable and agile, instead of rigid and locked in.”
The third part involves connecting with other people’s emotions — practicing empathy. It’s “understanding how someone else feels without them telling you in words, because people don’t tell us in words, they tell us in tone of voice and facial expressions, and so on”.
And finally, relationship management: “putting that all together to have effective relationships.”
Dr Goleman also makes a key point: It’s not simply about being nice.
“There’s a difference between being nice and being kind. And it’s really important to understand. You might be nice just not to create waves and get along — but that doesn’t mean that you’re necessarily helping.”
Why does it matter?
Amol Khadikar is a program manager at the Capgemini Research Institute and is based in India.
“[Emotional intelligence] is increasingly seen as a very valuable thing, and its importance has only increased in the last couple of years,” Mr Khadikar says.
Mr Khadikar and his organization tried to measure this with a survey asking 750 executives and 1,500 non-supervisory employees around the world about emotional intelligence.
It found 74 per cent of executives and 58 per cent of non-supervisory employees believe that EI will become a “must-have” skill.
Mr Khadikar says EI will become more important in the years ahead because of one continuing development — as automation and AI see more manual or routine jobs replaced by machines, jobs involving interpersonal skills will be the dominant jobs of the future.
“We [already] see more and more of a demand for people to have skills which require relationship building, more client-facing work,” he says.
“And [the survey] found that the demand for emotional intelligence skills will multiply on average by about six times within the next three to five years.”
Mr Khadikar and his team also built a financial model to assess a potential upside from investing in emotional intelligence training — looking at outcomes like revenue, costs, productivity and workplace attrition.
“We clearly found that there is, essentially, an upside. We found that an investment of around $3 million in an average organization can potentially result in an incremental gain of about $6.8 million over the next three years … And this was a conservative scenario.”
He also cited a study conducted by French personal care company L’Oréal which found that employees with high EI skills outsold other salespeople on an annual basis by around $91,000, resulting in a net revenue increase of more than $2.5 million.
Backed up with training?
Dr Goleman says when he wrote his book in 1995, there was little, if any data, around the benefits of high emotional intelligence.
“Now we know it’s clear,” Dr Goleman says.
“In the workplace, it turns out that emotionally intelligent workers perform better, they’re more engaged in what they do. Leaders who have emotional intelligence get better productivity out of people, and people like working for them,” he says.
But when it comes to exactly how the concept is embraced, it’s much more of a patchwork.
“Most organizations will espouse some interest in [emotional intelligence] — some do it well, some don’t,” Dr Goleman says.
He says while “I think at [an executive level] many people have the luxury of being coached [on emotional intelligence],” training is not widespread outside executive roles.
It’s a point backed up by Mr Khadikar.
“[In our study] we actually found that only about 17 per cent of organizations conduct emotional intelligence training for their non-supervisory employees and only about 32 per cent do so for the middle management employees,” he says.
And Dr Goleman says at worst, some organizations only pay lip service to the idea: Promoting EI but not practicing it.
“It’s the same as with ‘greenwashing,’ where a company or a spokesperson for a company will say, ‘yes, we do this, we advocate emotional intelligence’ … But if you look at their current track record, you realize it’s BS, it’s not true.”
EI in a post-COVID workplace
The COVID-19 pandemic has disrupted traditional workplaces and as cases spike around Australia, some employers are advising their staff to work from home once again.
So what does emotional intelligence look like in a workplace connected through Teams or Zoom? Or more broadly, in increasingly digitized and fragmented professional environments?
Dr Goleman says workplaces need to make sure one-on-one time still exists, as our emotional wellbeing can take a battering if we’re all totally isolated from one another.
“But one-on-one can be digital too. The idea is that it’s personal, you’re talking to the person about themselves, not just about the task at hand, which tends to happen in group calls,” he says.
“So I think that it’s important to balance the isolation, the specialization that can go on in digital media, with having person-to-person [time] that’s in person or online.”
How do you improve your emotional intelligence?
Dr Goleman says we can all improve our emotional intelligence.
“It’s really about habit change,” he says.
He says the most prevalent manifestation of low emotional intelligence in the workplace is poor listening, so, for example, interrupting people or taking over a conversation too soon.
“If you want to change that, that’s a habit. You’ve practiced it thousands of times.”
Dr Goleman says: “First of all, be mindful that this is a moment I can change. Second, you have to have a different repertoire — a new habit to replace it with. [Then] practice that at every naturally occurring opportunity.”
“When you do that kind of learning, it changes the brain, the circuitry for that behavioral sequence, it takes on the new habit, and you do it automatically after a while,” he says.
“It does take a little work, it takes a little persistence, but our data shows it’s very possible.”
A study on the physics discovery, titled “Automated discovery of fundamental variables hidden in experimental data,” has been published in the journal Nature Computational Science.
Researchers from Columbia Engineering have developed a new artificial intelligence (AI) program that could derive the fundamental variables of physics from video footage of physical phenomena. The program analyzed videos of systems like the swinging double pendulum, for which researchers already know four “state variables” exist: the angle and angular velocity of each arm. Within a few hours, the AI determined there were 4.7 variables at play.
“We thought this answer was close enough. Especially since all the AI had access to was raw video footage, without any knowledge of physics or geometry. But we wanted to know what the variables actually were, not just their number,” said Hod Lipson, director of the Creative Machines Lab in the Department of Mechanical Engineering.
Two of the variables it identified correlated with the angles of each arm. The other two were unclear, however, as the program interprets and visualizes the variables differently from how humans intuitively understand them. But since the AI was making accurate predictions about the system, it clearly managed to identify four valid variables. The researchers then tested the AI on systems we don’t fully understand, like a lava lamp and a fireplace, identifying 8 and 24 variables, respectively.
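What “state variables” means here can be shown with a short sketch. This is my own simplified illustration using a single pendulum (two state variables: angle and angular velocity) rather than the paper’s double pendulum (four): knowing the state variables at one instant fully determines the system’s future, so two simulations started from the same state produce identical trajectories.

```python
import math

def simulate(theta, omega, steps=1000, dt=0.001, g=9.81, length=1.0):
    """Integrate a single pendulum forward with simple Euler steps.
    (theta, omega) is the complete state: nothing else is needed
    to predict the future of the system."""
    for _ in range(steps):
        omega -= (g / length) * math.sin(theta) * dt
        theta += omega * dt
    return theta, omega

a = simulate(theta=0.5, omega=0.0)
b = simulate(theta=0.5, omega=0.0)
assert a == b     # same state variables -> same future
print(a)
```

The Columbia program works in the opposite direction: given only video frames, it has to infer how many such numbers are needed to predict the system, without being told the physics in advance.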
“I always wondered, if we ever met an intelligent alien race, would they have discovered the same physics laws as we have, or might they describe the universe in a different way? Perhaps some phenomena seem enigmatically complex because we are trying to understand them using the wrong set of variables. In the experiments, the number of variables was the same each time the AI restarted, but the specific variables were different each time. So yes, there are alternative ways to describe the universe and it is quite possible that our choices aren’t perfect,” Lipson said.