In case you’re new here, I am a Known Nerd™ and therefore spend a lot of time thinking about Nerd Things™. In particular, I spend a lot of time thinking about my technology, and how I can get the best out of it.
One of my all-time biggest peeves about smartphone home screens, especially on phones that use a “grid of apps” paradigm like iPhone, Xiaomi, and Oppo, is that any amount of busyness in a home screen wallpaper makes finding or viewing icons difficult.
There are plenty of ways around this: use a flat color or gradient for your Home Screen wallpaper, find a less busy or less interesting Home Screen wallpaper, etc. My favorite hack for years was to add an acrylic blur to my Home Screen wallpaper. This way I had the same image on my Lock Screen and Home Screen, but the readability was significantly improved. There are tons of tutorials on how to do that manually, as well as multiple apps across both major phone platforms.
However! I was recently messing with a Nothing Phone (2) that I have in my possession, and I saw that their 2.0 software update included a pretty cool wallpaper effect they call Atmosphere:
I spent a lot of time griping internally about how I would love this functionality on the objectively better mobile operating system, but then I remembered that I am not totally helpless!
So I’ve decided to build my own iOS/iPadOS/macOS app that provides a lot of the same functionality. I will call it Mesosphere, because obviously I have to pay homage to the functionality I am ~~blatantly copying~~ inspired by. This feels like a good challenge, and a way to learn some skills I’ve been putting off for a while. So. Here’s part 1!
Tools
In order to build this app, I need to have a set of tools. I’ve chosen the following frameworks and software sets to accomplish my mission:
UX Prototyping: Figma
I have a decent amount of wireframing and prototyping experience in the Mac app Sketch, but Figma, being a super powerful web-based tool, is intriguing. Also, Figma is big in the industry as a design tool, so I figured I might as well learn.
Coding Language: Swift/SwiftUI
Obviously. I could mess with UIKit and Objective-C, but those are old as hell and not necessary given the current state of Apple’s newest language/framework combo. I’ve thought about also producing an Android version in the future, in which case I would use Dart/Flutter, but that’s a long way off¹.
Research
Next up, I needed to figure out how the Nothing OS Atmosphere did its thing. First, like any true idiot who spends a ton of time online, I asked ChatGPT and Google Gemini. ChatGPT was less than helpful, but Gemini did suggest that I use a mesh gradient, which will be helpful later.
Reverse engineering the behavior of the feature was made easier by my choice of wallpaper for the experiment. To provide clear colors to track throughout the process, I used the “Six Retro Stripes” wallpaper by Basic Apple Guy. I dropped the wallpaper on the phone and turned on screen recording.
I used both the light and dark mode versions of the wallpaper, and recorded three transitions. I then went through the videos frame by frame, grabbing what I thought of as key frames along the way where the change in color distribution was significant.
Using the highly advanced tool known as “markup in the iPadOS Photos app”, I marked off each of the seven starting colors from the wallpapers, and tracked their progression through the process to determine whether there was a pattern or logic to the color distribution.
Analysis
Initial analysis was…interesting. Within a few frames Nothing OS had begun to blend and blur the colors together, but also had added an additional color from somewhere. By the end of the process, however, only three of the original seven (eight?) remained.
Same story in each run: what started as seven (eight?) colors became three by the end. Interestingly, the three remaining colors (light blue, dark blue, and orange) were the same in each experiment. The distribution of the colors varied in each of the iterations.
Dark mode progressed similarly to light mode, though without the addition of an extra color. At the end of each iteration we were left with four colors from the original seven (including the background black). Looking closely at the light mode versions, I could see traces of color eight, the mysterious background color Nothing OS made up, so maybe the answer is four in both.
Both light mode and dark mode used orange and the light blue, while the dark mode opted for black instead of the dark blue found in light mode.
Another interesting detail in each of the iterations was the presence of small, almost nugget-shaped segments of each color that went flying around the screen to create the color blocks in the final wallpaper.
Looking closely at the color nuggets, I see some starburst shapes in each, almost like the inside of a sand dollar.
What we learned:
- Nothing OS picks and chooses colors from a wallpaper based on some preference mechanism, likely vibrance.
- Only selected colors are present in the final wallpaper.
- Instead of generating a true gradient from the wallpaper colors, it seems like Nothing OS is merely creating color blobs and expanding/rotating them into final position.
- Once the blobs are positioned, the software adds both a Gaussian blur and grain/noise to increase the texture and complete the look.
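To sanity-check that guessed pipeline, here’s a rough prototype of it in Python (easy to experiment with before I port anything to Swift). This is entirely my own speculation, not Nothing’s actual code: nearest-seed “blobs” stand in for the expanding color nuggets, a repeated box blur approximates the Gaussian blur, and uniform per-channel noise plays the role of the grain.

```python
import random

def blob_layer(w, h, seeds):
    """Assign each pixel the color of its nearest seed point --
    a crude stand-in for the expanding color blobs."""
    img = []
    for y in range(h):
        row = []
        for x in range(w):
            _, color = min(
                ((sx - x) ** 2 + (sy - y) ** 2, c) for (sx, sy), c in seeds
            )
            row.append(color)
        img.append(row)
    return img

def box_blur(img, passes=3):
    """Repeated 3x3 box blur, which roughly approximates a Gaussian."""
    h, w = len(img), len(img[0])
    for _ in range(passes):
        out = []
        for y in range(h):
            row = []
            for x in range(w):
                nbrs = [img[j][i]
                        for j in range(max(0, y - 1), min(h, y + 2))
                        for i in range(max(0, x - 1), min(w, x + 2))]
                row.append(tuple(sum(c[k] for c in nbrs) // len(nbrs)
                                 for k in range(3)))
            out.append(row)
        img = out
    return img

def add_grain(img, amount=8, seed=42):
    """Add uniform per-channel noise for the film-grain texture."""
    rng = random.Random(seed)
    return [[tuple(min(255, max(0, c + rng.randint(-amount, amount)))
                   for c in px) for px in row] for row in img]

# Three made-up seed colors, loosely inspired by the experiment above.
seeds = [((2, 2), (255, 140, 0)),
         ((13, 3), (80, 160, 255)),
         ((8, 12), (20, 40, 120))]
img = add_grain(box_blur(blob_layer(16, 16, seeds)))
```

A real implementation would obviously work on full-resolution bitmaps with proper image APIs, but even this toy version produces the same overall recipe: hard-edged color regions that get softened into each other and then textured.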
Bonus inspiration!
A few weeks after my initial research, I was using my OnePlus 10 Pro (A+ phone, love it) as a backup as I investigated some damage to my iPhone 15 Pro Max.
I noticed inside the wallpaper settings there was a toggle I had never seen before. When I tapped it, this happened:
Basically, OxygenOS is grabbing about four colors from the wallpaper image and using them in a number of generated wallpaper choices. These choices include a mesh gradient!
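For a feel of what “grabbing about four colors” could look like under the hood, here’s the most naive possible version in Python: quantize each RGB channel into coarse buckets and keep the most frequent buckets. I have no idea what OxygenOS actually does (it may well weight for vibrance, like I suspect Nothing OS does); this is just my own illustrative sketch.

```python
from collections import Counter

def dominant_colors(pixels, k=4, bucket=32):
    """Naively pick the k most common colors by quantizing each
    RGB channel into coarse buckets and counting frequency."""
    counts = Counter(
        (r // bucket * bucket, g // bucket * bucket, b // bucket * bucket)
        for r, g, b in pixels
    )
    return [color for color, _ in counts.most_common(k)]

# Toy "image": mostly orange and blue pixels, plus a few dark ones.
pixels = [(255, 140, 0)] * 50 + [(30, 60, 200)] * 30 + [(10, 10, 10)] * 5
print(dominant_colors(pixels, k=2))  # → [(224, 128, 0), (0, 32, 192)]
```

Frequency counting alone would happily pick a boring gray background over a vivid accent stripe, which is presumably why real implementations score candidates by saturation or vibrance rather than raw pixel count.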
What will Mesosphere do?
OK, so basic functionality for the Mesosphere app:
- Be able to select colors from an image.
- Use those colors to create a mesh gradient.
- Add Gaussian blur and noise/grain to increase texture.
- Output the gradient image for use as a wallpaper on iOS/iPadOS/macOS devices.
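The mesh-gradient step in that list sounds fancy, but the core math is just smooth interpolation between a grid of control-point colors. Here’s the simplest possible case sketched in Python (again, just prototyping the math before the Swift version): a 2×2 “mesh” of corner colors, filled in by bilinear interpolation. Function names are mine.

```python
def lerp(a, b, t):
    """Linearly interpolate between two RGB colors."""
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

def mesh_gradient_2x2(tl, tr, bl, br, w, h):
    """Fill a w x h image by bilinearly interpolating four corner
    colors -- the simplest possible mesh: a 2x2 control-point grid."""
    img = []
    for y in range(h):
        ty = y / (h - 1) if h > 1 else 0
        left, right = lerp(tl, bl, ty), lerp(tr, br, ty)
        img.append([lerp(left, right, x / (w - 1) if w > 1 else 0)
                    for x in range(w)])
    return img

grid = mesh_gradient_2x2((255, 140, 0), (80, 160, 255),
                         (20, 40, 120), (0, 0, 0), 5, 5)
```

A proper mesh gradient generalizes this to an N×M grid of movable control points (often with smoother-than-linear falloff), which is what gives those organic blobby wallpapers their look; the 2×2 case is just the building block.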
I have some other ideas of further expanded functionality, but those will wait a bit I think. Unless I get REALLY inspired.
Next up! UI/UX and user flow!
¹ Yes, I know I could use Flutter to code a cross-platform app, but from what I hear, SwiftUI is a significantly better framework for the Apple ecosystem.