
This release includes new features like automatic asset scaling for retina devices and image gamma correction, along with improvements and fixes to existing features such as additive alpha animations and macOS 10.13 path drawing.
What we’re especially excited to share, though, is something much subtler to notice. It is the reason this is our biggest release since launch.
As a creator in Lightwell, you will likely position and customize layers, choreograph animations, and add interactions. While you do this, Lightwell is generating a series of configuration files for your scenes. These configuration files are then read and translated into dynamic content when you run the project in the Lightwell Previewer or from Xcode.
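The exact file format isn’t shown in this post, so purely as an illustration, a scene configuration could be modeled and decoded along these lines. The JSON shape, type names, and fields below are hypothetical, not Lightwell’s real format:

```swift
import Foundation

// Hypothetical shapes -- the real Lightwell configuration format is not
// documented here; this only illustrates scene files being decoded into
// dynamic content at runtime.
struct ScenePoint: Codable {
    let x: Double
    let y: Double
}

struct LayerConfiguration: Codable {
    let imageName: String        // asset the layer draws
    let position: ScenePoint     // where the layer sits in the scene
    let scale: Double            // uniform scale applied to the asset
    let animations: [String]     // identifiers of choreographed animations
}

struct SceneConfiguration: Codable {
    let name: String
    let layers: [LayerConfiguration]
}

// Roughly how a previewer could read one of these files back in.
func loadScene(from url: URL) throws -> SceneConfiguration {
    let data = try Data(contentsOf: url)
    return try JSONDecoder().decode(SceneConfiguration.self, from: data)
}
```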
All of this works through Hullabalu StoryKit (“HSK”), a framework that we built for our Adventures of Pan app series. HSK manages everything from reading the configuration files to handling user and device input, and from progressing dialogue within a scene to navigating an entire app project.
Over the last few months since Lightwell’s release, we have been so inspired by the community of creators and the projects being created. The more we’ve learned about what you wanted out of Lightwell, the more we came to realize that the old version of StoryKit needed some significant updates.
One of the questions we get asked the most is: “Can I publish to Android?” The main reason we have historically said “no” has been our rendering engine.
A rendering engine is the code that does the heavy lifting of drawing image assets to the screen. This covers everything from reading in the pixel-by-pixel image data, to positioning and transforming that image, to animating it, and finally to drawing each pixel on the screen.
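As a loose sketch of those stages (this is not HSK’s actual interface; the types and functions below are invented for illustration), a single frame of a minimal engine might look like this:

```swift
// Invented types for illustration only -- just the four stages above in miniature.
struct Pixels {
    var width: Int
    var height: Int
    var rgba: [UInt8]             // 1. pixel-by-pixel image data read from the asset
}

struct Transform {
    var x = 0.0, y = 0.0          // 2. position in the scene
    var scale = 1.0
    var rotation = 0.0
}

struct Layer {
    var pixels: Pixels
    var transform: Transform
}

// 3. Animation: advance each layer's transform for this frame.
func animate(_ layer: inout Layer, deltaTime: Double) {
    layer.transform.rotation += 0.5 * deltaTime
}

// 4. Drawing: hand the transformed layers to the GPU.
func draw(_ layers: [Layer]) {
    for layer in layers {
        // A real engine would issue OpenGL/Metal draw calls here.
        _ = layer.transform
    }
}

// One pass of the loop: animate everything, then draw everything.
func renderFrame(layers: inout [Layer], deltaTime: Double) {
    for index in layers.indices {
        animate(&layers[index], deltaTime: deltaTime)
    }
    draw(layers)
}
```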
A rendering engine can be designed to be easy to develop with, to work for a wide range of applications, to run fast, or to perform expensive image effects. However, no single rendering engine gives you all of those features at once.
Hullabalu StoryKit was initially built with a rendering engine that was quick to set up, gave us access to the effects we needed at the time, and held up under reasonable application load: many full screens’ worth of animating images. The drawback: it was iOS and macOS only. So, we looked around for alternatives.
We had a very specific set of requirements for our rendering engine.
And, of course, the rendering engine had to be capable of doing all of this in real time: since mobile apps are inherently interactive, content needs to be dynamic and responsive.
After searching, we couldn’t find any preexisting solution that had everything we needed. So, we built our own. 🤓
Harnessing the power of OpenGL, Hullabalu StoryKit now includes a rendering engine that is built explicitly for your Lightwell projects. That means it is optimized to animate your layers in real time while fully responding to device inputs like touch and tilt. 👆
We have the ability to deform layers for parallax without sacrificing performance, and to add new functionality with first-class support. It’s all built with Swift, so the code-side interface is simple and easy to learn (we’ll be sharing more about the API soon).
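Since the real API hasn’t been shared yet, here is only a rough sketch of the idea behind tilt-driven parallax, with made-up names and parameters: layers nearer the viewer shift more than distant ones as the device tilts, which reads as depth.

```swift
// Illustrative only -- not the HSK parallax API, which has not been published.
struct ParallaxLayer {
    var baseX: Double
    var baseY: Double
    var depth: Double   // 0 = background (barely moves), 1 = foreground (moves most)
}

/// Returns a layer's on-screen position for a normalized tilt reading in -1...1.
func parallaxPosition(for layer: ParallaxLayer,
                      tiltX: Double,
                      tiltY: Double,
                      maxOffset: Double = 40) -> (x: Double, y: Double) {
    let offsetX = tiltX * maxOffset * layer.depth
    let offsetY = tiltY * maxOffset * layer.depth
    return (layer.baseX + offsetX, layer.baseY + offsetY)
}
```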
(👋 Android!)
As we build out cross-platform previewing and publishing support, we have some other exciting features on the horizon.
So, “can I publish to Android?”… Soon.
Thank you very much for your continued feedback and for sharing your incredible projects with us! We love hearing about what you’re putting together.
Drop us a line at [email protected] with any questions, requests, feedback, or just to say hello!
Happy creating! 🖖
Max