Audio Design: One footstep at a time

In Life Pictures you find yourself in a surreal place where you can wander through the forest and absorb nature's sounds. Every step you take briefly breaks the environment's tranquility, the noise confirming your physical presence in this place. Sound is essential for creating a good player experience and immersing the player in the world.

Let's see how we managed to give this wonderful place a bit of life by using sound. In this week's post we're going to dive into some technical details about how we made the character footsteps system. In future posts we'll talk about how the bushes crackle when the character goes through them and the sounds setup for the rest of the nature elements.
* Mild warning! A lot of blueprinting ahead!

Character footsteps system
While moving, the first person character must make footstep noises depending on the surface he's walking on, at a rate that depends on whether he's walking or running. So, planning this ahead, here's what we need to do in high-level steps:
  1. if the character is moving and touching the ground, trigger an event for each step he makes; repeat until the character stops or isn't touching the ground anymore.
  2. for every footstep event do a line trace at the bottom of the character capsule, at the imaginary feet position
  3. from the line trace collision info get the physical material of the surface the line intersected and play the footstep sound corresponding to that physical material
  4. rinse and repeat
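Stripped of the Unreal specifics, the loop described above can be sketched in plain C++. This is just an illustrative stand-in, not the actual Blueprint logic: the interval timer replaces the Timeline component and the material string replaces the line trace's physical material.

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Hypothetical stand-ins for the engine-side pieces (assumptions, not Unreal API):
// the material string plays the role of the line trace + physical-material lookup.
struct FootstepController {
    std::unordered_map<std::string, std::string> soundsByMaterial;
    std::string defaultSound = "Footstep_ForestRoad";
    double stepInterval = 0.5;   // seconds between steps while walking
    double timeSinceStep = 0.0;
    int stepsPlayed = 0;
    std::string lastSound;

    // Step 1: called every frame; fires one "footstep event" per interval
    // while the character is moving and grounded.
    void tick(double dt, bool isMoving, bool isGrounded, const std::string& groundMaterial) {
        if (!isMoving || !isGrounded) return;  // airborne or idle: no footsteps
        timeSinceStep += dt;
        while (timeSinceStep >= stepInterval) {
            timeSinceStep -= stepInterval;
            onFootstepEvent(groundMaterial);   // steps 2-3 would line-trace here
        }
    }

    // Steps 2-3 collapsed: look up the sound for the traced material,
    // falling back to a default when the material isn't mapped.
    void onFootstepEvent(const std::string& material) {
        auto it = soundsByMaterial.find(material);
        lastSound = (it != soundsByMaterial.end()) ? it->second : defaultSound;
        ++stepsPlayed;
    }
};
```

The real system replaces the fixed `stepInterval` with a scalable Timeline, as described below.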
To integrate all this in our character blueprint in a more decoupled and reusable way, we created a separate actor blueprint to encapsulate all the logic required for doing the above steps. Let's call this actor "BP_FootstepsSoundController" and see how we implemented the above steps.

Triggering footsteps events
In this step we want to trigger an event for each step the character makes as long as he is moving and touching the ground. This logic needs to be executed every frame. So we have to check if the character is moving and touching the ground in a blueprint Tick event like so:
Step 1 - Part 1
In the above blueprint, "OwnerMovementComponent" is a cached reference to the Movement Component of the owner actor, which in our case is our character controller blueprint where we'll integrate this "BP_FootstepsSoundController" actor.

You probably noticed that before doing any other checks in the "Event Tick", we call a function "UpdateFootstepsPlayRate". By setting the "WalkSpeed" and "RunSpeed" of our character on this controller blueprint and then setting the desired "walkFootstepsRateFactor" and "runFootstepsRateFactor" we can easily tweak how fast the footstep sounds should be triggered when the character is walking and running. This function does the smooth interpolation between walking and running accordingly:
UpdateFootstepsPlayRate function
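As a rough sketch of what this function likely computes (assuming a simple linear blend between the two rate factors; the names below are illustrative, not the actual Blueprint variables):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Maps the character's current ground speed onto a timeline play rate, so footstep
// events fire faster the closer the speed gets to RunSpeed.
double computeFootstepsPlayRate(double groundSpeed,
                                double walkSpeed, double runSpeed,
                                double walkRateFactor, double runRateFactor) {
    // Normalize speed into [0, 1] between walking and running, clamped.
    double t = (groundSpeed - walkSpeed) / (runSpeed - walkSpeed);
    t = std::clamp(t, 0.0, 1.0);
    // Smoothly interpolate between the two configured rate factors.
    return walkRateFactor + t * (runRateFactor - walkRateFactor);
}
```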
The function "GetGroundMoveSpeed" is a pretty simple custom Blueprint Utility function we made to calculate the ground-relative speed of a movement component, ignoring movement on the Z axis. It looks something like this:
GetGroundMoveSpeed function
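In plain C++ the same calculation looks roughly like this (illustrative names, not the actual Blueprint utility):

```cpp
#include <cassert>
#include <cmath>

// GetGroundMoveSpeed, roughly: the magnitude of the velocity projected onto the
// ground plane, i.e. with the Z component zeroed out.
double getGroundMoveSpeed(double vx, double vy, double vz) {
    (void)vz;                       // vertical movement is deliberately ignored
    return std::sqrt(vx * vx + vy * vy);
}
```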
So getting back to the initial screenshot Step 1 - Part 1, once the movement and ground touch conditions are true, we have to trigger an event for each theoretical footstep the character makes while he's moving. An easy way to trigger an event at a specific time interval and in a loopable manner is to use a Timeline component. Timelines are very useful for a lot of things, allowing you to easily design time-based animations and play them based on various in-game events. In our case, we used the Timeline component to create an event track named "Footstep Event" on which we defined key-frames for when a footstep sound should be played, within an interval of 1 second:
Timeline Footsteps Event Track
As you can see, we've set one footstep event at 0.25s and one at 0.75s, making them equally spaced and loopable. By making the timeline 1 second long we can then easily scale it using the Set Play Rate node, giving us extra control over how far apart the footstep events are triggered. For example, when running we can scale the timeline down by increasing the play rate from 1.0 to 1.5, making the footstep events trigger 50% faster. We saw this in the UpdateFootstepsPlayRate function discussed earlier.
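The event timing arithmetic can be checked numerically. With events keyed at 0.25s and 0.75s on a 1-second looping timeline, a play rate r makes consecutive events land 0.5 / r real seconds apart:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// The timeline's events sit at t = 0.25 and t = 0.75 on a 1-second loop.
// With play rate r, timeline time advances r seconds per real second, so the
// k-th event (k = 0, 1, 2, ...) fires at real time (0.25 + 0.5 * k) / r.
std::vector<double> footstepEventTimes(double playRate, int count) {
    std::vector<double> times;
    for (int k = 0; k < count; ++k)
        times.push_back((0.25 + 0.5 * k) / playRate);
    return times;
}
```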

So how does this timeline actually help us with these events? Well, on its own the Timeline component doesn't do much. It just plays its animation timeline in a loop. While the character is moving and is touching the ground we should start playing this Timeline Component but only if it isn't already playing:
Step 1 - Part 2
As you noticed the timeline node automatically generated an output pin with the name of our previously created event track "Footstep Event". This pin will be executed every time an event is triggered on our timeline while it's looping.

But what happens if the character suddenly stops moving, or jumps and no longer touches the ground? Continuing from the "Branch" node in the Step 1 - Part 1 screenshot shown earlier, on the "False" pin we should stop playing the timeline to avoid hearing footsteps while in the air, by calling our "StopFootstepsTimeline" function:
StopFootstepsTimeline function
The next time the timeline is restarted, it resumes where it left off, which is exactly what we want: if you move the character in small start/stop cycles, you get the impression that you can move him one step at a time - a nice detail to have.

So putting everything together for this step, we get this in our "BP_FootstepsSoundController" blueprint "Event Tick":
Step 1

Playing the right sound
In steps 2 and 3 of our plan, we need to detect which sound we have to play based on what our character's feet are touching. To do that, for every footstep event we do a line trace at the bottom of the character capsule and from the collision info we get the physical material of the surface the line intersected and play the footstep sound corresponding to that physical material.

Now that we have our timeline component triggering the footstep events, we can call a function to handle the logic of "querying" the surface the character is currently on by using a line trace and playing the corresponding footstep sound:

Let's split the "QueryGroundAndPlaySound" function in half to get a better view of what it does:
QueryGroundAndPlaySound function - Part 1  
This is where we actually use the line trace node and if we get a hit result we "break" the collision hit info to obtain the required "Physical Material" of the surface the line intersected.

To create the link between each physical material in the level and the corresponding footstep sound, we implemented a custom wrapper for Unreal's TMap data container so we can use it in Blueprints and store the objects as <key, value> pairs like this: <UObject, UObject>. But if you don't want to dive into C++ code, something similar can also be achieved by using an array of structs that contain the <key, value> pair of objects. The downside is that the physical material lookup in the array for every line trace is of O(n) complexity, so it won't scale well performance-wise as you add more physical materials. If you do want to dive into a bit of C++ code, you can check out below our simple "TMap" wrapper for blueprints.

* "UObjectsMap" header:

USTRUCT(BlueprintType)
struct FObjMapKeyValStruct
{
	GENERATED_USTRUCT_BODY()

	UPROPERTY(EditAnywhere, BlueprintReadWrite)
	UObject* keyObj;

	UPROPERTY(EditAnywhere, BlueprintReadWrite)
	UObject* valueObj;
};

/**
 * TMap<UObject*, UObject*> type
 * You can use this in Blueprints to map various UObjects as key and
 * value pairs allowing you to map just about any Blueprint type
 * instance in a dictionary-like data structure.
 */
UCLASS(BlueprintType)
class LIFEPICTURESRELEASE_API UObjectsMap : public UObject
{
	GENERATED_BODY()

public:
	TMap<UObject*, UObject*> nativeMap;

	UFUNCTION(BlueprintCallable, Category = "MobilityBlueprintUtils|ObjectsMap")
	void Add(UObject* keyObj, UObject* valueObj);

	UFUNCTION(BlueprintCallable, Category = "MobilityBlueprintUtils|ObjectsMap")
	int32 Remove(UObject* keyObj);

	UFUNCTION(BlueprintCallable, Category = "MobilityBlueprintUtils|ObjectsMap")
	bool Contains(UObject* keyObj);

	UFUNCTION(BlueprintCallable, Category = "MobilityBlueprintUtils|ObjectsMap")
	UObject* FindValue(UObject* keyObj);

	UFUNCTION(BlueprintCallable, Category = "MobilityBlueprintUtils|ObjectsMap")
	UObject* FindKey(UObject* valueObj);

	UFUNCTION(BlueprintCallable, BlueprintPure, Category = "MobilityBlueprintUtils|ObjectsMap")
	int32 Num() { return nativeMap.Num(); }

	UFUNCTION(BlueprintCallable, Category = "MobilityBlueprintUtils|Data Structures")
	static UObjectsMap* CreateObjectsMap();

	UFUNCTION(BlueprintCallable, Category = "MobilityBlueprintUtils|Data Structures")
	static UObjectsMap* CreateObjectsMapFrom(TArray<FObjMapKeyValStruct> keyValueArray);
};


* "UObjectsMap" implementation:

void UObjectsMap::Add(UObject* keyObj, UObject* valueObj)
{
	nativeMap.Add(keyObj, valueObj);
}

int32 UObjectsMap::Remove(UObject* keyObj)
{
	return nativeMap.Remove(keyObj);
}

bool UObjectsMap::Contains(UObject* keyObj)
{
	return nativeMap.Contains(keyObj);
}

UObject* UObjectsMap::FindValue(UObject* keyObj)
{
	UObject** valueObjMapRef = nativeMap.Find(keyObj);
	return (valueObjMapRef != nullptr) ? *valueObjMapRef : nullptr;
}

UObject* UObjectsMap::FindKey(UObject* valueObj)
{
	auto* keyObjMapRef = nativeMap.FindKey(valueObj);
	return (keyObjMapRef != nullptr) ? *keyObjMapRef : nullptr;
}

UObjectsMap* UObjectsMap::CreateObjectsMap()
{
	return NewObject<UObjectsMap>();
}

UObjectsMap* UObjectsMap::CreateObjectsMapFrom(TArray<FObjMapKeyValStruct> keyValueArray)
{
	UObjectsMap* objMap = CreateObjectsMap();
	for (auto keyValuePair : keyValueArray)
		objMap->nativeMap.Add(keyValuePair.keyObj, keyValuePair.valueObj);
	return objMap;
}


To create an instance of this wrapper in blueprints, you have 2 methods available, as you noticed in the above code: either populate it directly from an array of our custom <key, value> pair structs, using the "CreateObjectsMapFrom" node, or create an empty instance, using the "CreateObjectsMap" node.
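For readers more comfortable outside Unreal, here's a plain C++ analogue of the two lookup strategies (std::unordered_map standing in for TMap, std::string standing in for the UObject references):

```cpp
#include <cassert>
#include <string>
#include <unordered_map>
#include <vector>

// The struct-array version is what you'd build in pure Blueprints (an O(n) scan
// per line trace); the hash map mirrors what the UObjectsMap wrapper gets you
// (amortized O(1) lookup).
struct MaterialSoundPair {
    std::string material;  // stands in for the physical material reference
    std::string sound;     // stands in for the SoundCue reference
};

// Blueprint-style linear scan over an array of pairs.
const std::string* findSoundLinear(const std::vector<MaterialSoundPair>& pairs,
                                   const std::string& material) {
    for (const auto& p : pairs)
        if (p.material == material) return &p.sound;
    return nullptr;
}

// Map-based lookup, equivalent to UObjectsMap::FindValue.
const std::string* findSoundMapped(const std::unordered_map<std::string, std::string>& map,
                                   const std::string& material) {
    auto it = map.find(material);
    return (it != map.end()) ? &it->second : nullptr;
}
```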

Getting back to our previously shown "QueryGroundAndPlaySound" function, as you can see in the image below, each time we get a collision we check if the Physical Material is in our TMap container "footstepsSoundsMap". If it is, we get the corresponding sound and play it at the collision location; otherwise we play a default generic forest road footstep sound - the "DefaultFootstepsSoundCue" node:
QueryGroundAndPlaySound function - Part 2  
For footstep sounds we didn't use a single sound per physical material, because it would have been unnaturally weird and annoying to hear the exact same sound repeating. Instead, we used a set of 3-4 variations from which we randomly pick a different one each time we play the SoundCue, by using Unreal's SoundCue Random node:
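A minimal sketch of what the Random node effectively does per footstep (a uniform pick from the variation list; this is an illustration, not the node's actual implementation):

```cpp
#include <algorithm>
#include <cassert>
#include <random>
#include <string>
#include <vector>

// Each footstep picks one of a few recorded variations uniformly at random,
// so consecutive steps rarely sound identical.
std::string pickFootstepVariation(const std::vector<std::string>& variations,
                                  std::mt19937& rng) {
    std::uniform_int_distribution<size_t> pick(0, variations.size() - 1);
    return variations[pick(rng)];
}
```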

And this covers the core logic of the "BP_FootstepsSoundController" blueprint actor. In case you missed them above, below are the links from where you can copy the blueprints code:
  • "BP_FootstepsSoundController" blueprint "Tick" event
  • "UpdateFootstepsPlayRate" function
  • "GetGroundMoveSpeed" function
  • "StopFootstepsTimeline" function
  • "QueryGroundAndPlaySound" function

We can now easily integrate this actor as a child actor in any Character blueprint to generate footsteps sounds in a pretty configurable way. Of course the system can be much more complex, but this was more than enough for our needs. To integrate our footsteps controller we just have to drag and drop it in the FirstPersonCharacter blueprint components hierarchy, which should look something like this:

After we configure the footsteps sounds and the corresponding physical materials on the controller, in the "FirstPersonCharacter" blueprint all we have to do is just apply the character "WalkSpeed" and "RunSpeed" in the "BeginPlay" event and then let it do its thing:

Hope you find this useful for your future stealth/action/assassination games where you want to hear your character's footsteps while dancing in the blood puddles of his victims.

Thank you for reading and happy blueprinting!

Taking beautiful pictures


Last week we looked into a bunch of technical aspects regarding the photo camera in Life Pictures. Now it's time to take a closer look at how you'll be using the camera in the game.

Memories and time
We've mentioned several times by now that taking a picture and keeping it will take you to another memory further along the timeline. What does this mean and how does it actually work? When you begin the game, you're alone. You take a stroll through the woods and meet one of the 3 girls in Life Pictures. If you interact with her and take a picture, you'll be taken to a memory of your relationship with her. The picture can be of anything, not necessarily the girl, although we suspect you won't be able to help yourself the first several times.

In the new memory with the girl you've interacted with, you can take a new picture any time you want and time will again advance to another memory. While you're in a relationship with a girl, you don't have to interact with her again - advancing time will take you through the memories of that relationship. After the relationship has played out, you'll be able to interact with another of the remaining girls and revisit the memories of your time with her.

If you're alone and you take a picture without interacting with any girl, you'll continue alone, advancing time much further along. You'll have a new chance to meet a girl or to continue being alone. How you play out your lives is up to you! We encourage you to try as many combinations as you want, so you can find out more about your real life. Can you piece it together?

Camera functions
When taking a picture, you have several functions to help you get the best result. You can zoom in and out to get the best viewing distance to your subject. You can adjust your height - perfect for some under shots, for example. The coolest feature is that you can change the focus area: it starts at focus-to-infinity and moves closer to you. When the focus area gets small enough, it moves back away from you for a short distance, giving you a lot of control over what you can focus on. Last, but not least, you have the useful guidelines to help you get the best composition for your shots. You can, of course, hide them at any time to view only your subject without all the clutter. There's a battery indicator too, but we'll let you figure out how that works - no, don't worry, you won't need to recharge anything.

Pictures and albums
Once you take a picture, you can preview it, and if you choose to keep it instead of discarding it, time will advance. Where does the picture go? To your Life Pictures album, of course. As we mentioned before, you can have many lives in this game. Each life has its own picture album, which you can view at any time to remember your experiences and choices. While events repeat themselves between the various lives, your choices influence the order in which they happen, and that makes all the difference. We can't go into too many details without spoiling anything, but think about this: having a child at 20 is very different from having the same child at 40. Change the order of major events in our lives and that will make for a very different experience. Spoiler: the child was only an example, there won't be any children in Life Pictures.

Technical
We went into a lot of technical stuff in last week's post, so we'll only talk about one more thing now: how we made the focus effect. It's very simple, using only what Unreal puts right in front of us. On our camera component, in the post process settings, we activate the depth of field, using the BokehDOF. Below you can see the values we start with when the camera is set to focus on infinity. As we focus closer, we change the values for the scale, focal distance and focal region. The scale goes up pretty quickly from 0 to 2 and stays there. The focal region decreases fast, then decreases slowly and right at the end goes up just a tiny bit, for a nice increase in control. The focal distance is a bit more interesting, first it stays constant, then increases and in the later part of the curve decreases. Better have a look at the curves below, it's harder to explain in words.

What do the focal region and focal distance curves achieve? Imagine the focus region is a cube which at first starts from your position and goes really far - this would be the infinity focus. Then the cube gets smaller, still starting at your position and stretching closer and closer. When it hits a certain size, the cube also starts moving away from you. When the cube gets really small, it continues shrinking only slightly, while it keeps moving away from you. When it reaches minimal size, the cube starts getting closer to you, increasing slightly in size. The last part helps so you can move a minimal sized focal region closer or farther away from you.
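To make the curve shapes concrete, here's a hypothetical piecewise-linear stand-in driven by a focus parameter t in [0, 1] (t = 0: focus to infinity, t = 1: closest focus). All breakpoints and values are invented for illustration - the real curves are authored in the editor:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Hypothetical stand-ins for the three DOF curves described above.
struct DofSettings { double scale, focalDistance, focalRegion; };

static double blend(double a, double b, double t) { return a + (b - a) * t; }

DofSettings evaluateFocus(double t) {
    t = std::clamp(t, 0.0, 1.0);
    DofSettings s{};
    // Scale: ramps up quickly from 0 to 2, then stays there.
    s.scale = (t < 0.2) ? blend(0.0, 2.0, t / 0.2) : 2.0;
    // Focal region: decreases fast, then slowly, with a tiny rise at the end.
    if (t < 0.3)      s.focalRegion = blend(10000.0, 1000.0, t / 0.3);
    else if (t < 0.9) s.focalRegion = blend(1000.0, 50.0, (t - 0.3) / 0.6);
    else              s.focalRegion = blend(50.0, 80.0, (t - 0.9) / 0.1);
    // Focal distance: constant at first, then increases, then decreases.
    if (t < 0.3)      s.focalDistance = 0.0;
    else if (t < 0.7) s.focalDistance = blend(0.0, 500.0, (t - 0.3) / 0.4);
    else              s.focalDistance = blend(500.0, 200.0, (t - 0.7) / 0.3);
    return s;
}
```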

This article pretty much wraps up the information about the photo camera in Life Pictures. We hope you'll have fun with it and take some great pictures when the game releases! Thank you for reading!

Technical dive into our photo camera

As we mentioned in the first post about Life Pictures, the photo camera is a vital part of how you interact with the game. You can take a photo of a moment that catches your eye and, if you save it in your album, you'll advance time to a different memory in your life. But how does this "time travelling" photo camera work behind the scenes?
* Mild warning: technical gibberish and source code ahead!

Basic Setup
The most basic thing a photo camera needs to do is capture a photo. The easiest way of doing this in Unreal is to use the SceneCaptureComponent2D. It basically acts like a camera component which, under the hood, sets up a viewport used by the engine to render the scene into a render target texture instead of the screen target buffers. By default, in UE 4.10.4, this component captures the scene without any of the post-process effects because the "Capture Scene" parameter is set to "SceneColor (HDR)". But in our case we needed to capture the final scene render, so this setting should be changed to "Final Color (LDR with PostProcess)". You'll notice in the screenshot below that we also unchecked the "Capture Every Frame" flag because we obviously need only one frame to be captured.

Once we have this component set on an actor in the scene, we also need to tell it where exactly the final rendered scene should be output, so we need to set a render target texture in the "Texture Target" parameter. To set it up quickly you can create it directly as an asset, but we needed it created dynamically at run-time, so we went the Blueprints way. Unfortunately, there's currently no blueprint node that allows us to dynamically create a UTextureRenderTarget2D.

Unreal has a nice way of letting you call static C++ functions from Blueprint Function Libraries in the form of Blueprint nodes, so this wasn't a big problem but more of a research issue. We implemented a custom function that creates a 2D render target texture and returns its pointer in blueprints:

* header declaration:

UFUNCTION(BlueprintCallable, BlueprintPure, Category = "MobilityBlueprintUtils|Textures")
static UTextureRenderTarget2D* CreateRenderTarget2DTexture(int32 width, int32 height, bool makeHDR);


* cpp implementation:

UTextureRenderTarget2D* UMobilityBlueprintUtils::CreateRenderTarget2DTexture(int32 width, int32 height, bool makeHDR)
{
	UTextureRenderTarget2D* rtResult = NewObject<UTextureRenderTarget2D>();
	if (makeHDR)
		rtResult->InitAutoFormat(width, height);
	else
		rtResult->InitCustomFormat(width, height, PF_B8G8R8A8, false);
	return rtResult;
}


And the new custom Blueprint node generated by the engine for this function after you compile the code looks something like this:

Now we have an actor with a SceneCaptureComponent2D that can render the scene into a 2D render target texture. If you remember, previously we disabled the "Capture Every Frame" from this component to avoid rendering the scene every frame. Well, unfortunately, unchecking this flag only partially does what you'd expect. If you move the position of the actor or this component it will render the scene again because it detects that its transform is dirty. So to actually make sure this component renders the scene only on demand, we set its "Visible" property to false and each time we want to capture a frame we do the following:

As you noticed in the above blueprint, calling the "UpdateContent" function forces a scene render; after that we turn the visibility flag back off to avoid subsequent renders of the scene into the render target texture.

Congrats! With this simple setup described above you have the basic functionality of a photo camera: capturing the scene.

Our photo camera is a bit more complex under the hood: it's actually a separate Camera actor that renders the scene when the player is in photo camera mode, allowing us to apply different post-process effects and play with various camera settings like the FOV. The scene capture component mentioned earlier, attached to this camera actor, is only used when we need to capture a photo. To switch from the first person character camera to our photo camera we used the SetViewTargetWithBlend node.

Saving the photo
All things aside, what is a captured photo if we can't save it or load it back? Unfortunately, our current version of UE 4.10.4 doesn't offer an out of the box way of saving a render target texture to disk from Blueprints. But, it does have a lot of low level oomph that gives you the flexibility of doing this nonetheless. So after building up a good chunk of sweat from diving through the engine code and trying multiple ways of achieving this we managed to come up with yet another blueprint library function that we can call from Blueprints:

bool UMobilityBlueprintUtils::SaveRenderTargetToFile(UTextureRenderTarget2D* rt, FIntPoint newImageSize,
	const FString& fileDestination, bool applyCropIfResized /*= false*/,
	EUtilsImageCompressionType compressionType /*= EUtilsImageCompressionType::VE_PNG*/,
	int32 compressionQuality /*= 0*/)
{
	// Get game thread reference for the render target.
	FTextureRenderTargetResource* rtResource = rt->GameThread_GetRenderTargetResource();

	TArray<FColor> rtPixelsArray;
	TArray<FColor>* newPixelData = &rtPixelsArray;
	rtPixelsArray.AddUninitialized(rt->GetSurfaceWidth() * rt->GetSurfaceHeight());
	FReadSurfaceDataFlags readPixelFlags(RCM_UNorm);
	rtResource->ReadPixels(rtPixelsArray, readPixelFlags);

	// Check if we need to prepare data for resizing the render target texture data.
	TArray<FColor> dstPixelData;
	if (newImageSize.X == 0 || newImageSize.Y == 0)
	{
		newImageSize.X = rt->GetSurfaceWidth();
		newImageSize.Y = rt->GetSurfaceHeight();
	}
	else if (newImageSize.X != rt->GetSurfaceWidth() || newImageSize.Y != rt->GetSurfaceHeight())
	{
		dstPixelData.AddUninitialized(newImageSize.X * newImageSize.Y);
		if (applyCropIfResized)
		{
			FImageUtils::CropAndScaleImage(rt->GetSurfaceWidth(), rt->GetSurfaceHeight(),
				newImageSize.X, newImageSize.Y, rtPixelsArray, dstPixelData);
		}
		else
		{
			FImageUtils::ImageResize(rt->GetSurfaceWidth(), rt->GetSurfaceHeight(), rtPixelsArray,
				newImageSize.X, newImageSize.Y, dstPixelData, true);
		}
		newPixelData = &dstPixelData;
	}

	// Compress image and save to disk.
	bool imageSavedOk = false;
	TSharedPtr<IImageWrapper> imageCompressor;
	IImageWrapperModule* imageWrapperModule = FModuleManager::LoadModulePtr<IImageWrapperModule>(FName("ImageWrapper"));
	if (imageWrapperModule != nullptr)
	{
		EImageFormat::Type targetImageFormat = (EImageFormat::Type)compressionType;
		imageCompressor = imageWrapperModule->CreateImageWrapper(targetImageFormat);
	}

	int32 inRawDataSize = sizeof(FColor) * newImageSize.X * newImageSize.Y;
	if (imageCompressor.IsValid() &&
		imageCompressor->SetRaw((void*)newPixelData->GetData(), inRawDataSize,
			newImageSize.X, newImageSize.Y, ERGBFormat::BGRA, 8))
	{
		const TArray<uint8>& compressedData = imageCompressor->GetCompressed(compressionQuality);
		imageSavedOk = FFileHelper::SaveArrayToFile(compressedData, *fileDestination);
	}
	return imageSavedOk;
}


It's a big function, so let's break it down in chunks and try to understand this dark Unreal magic.

The first thing we need to do is tell the engine to give us a pointer to our render target texture's corresponding resource from the render thread. We need that pointer so we can use it on the main game thread, which is the actual thread we're running this function on:

// Get game thread reference for the render target.
FTextureRenderTargetResource* rtResource = rt->GameThread_GetRenderTargetResource();

The render target texture is a special texture that resides in GPU memory. Normally, we can't read pixels from such a texture directly from CPU-executed code. In Unreal, when you create a render target texture, by default it also allocates a secondary corresponding resource of type FTextureRenderTargetResource in main memory. So what we need to do is read the pixel array using the "FTextureRenderTargetResource", which acts as a bridge between the actual render target texture on the rendering thread and the main thread from which we call our function.

That's why below we use the "rtResource->ReadPixels(...)" function to fetch the render target pixel data into our "rtPixelsArray". The "newPixelData" is just a pointer to the same "rtPixelsArray", which we need later for making optional changes to the pixel data, like resizing or cropping the texture.

TArray<FColor> rtPixelsArray;
TArray<FColor>* newPixelData = &rtPixelsArray;
rtPixelsArray.AddUninitialized(rt->GetSurfaceWidth() * rt->GetSurfaceHeight());
FReadSurfaceDataFlags readPixelFlags(RCM_UNorm);
rtResource->ReadPixels(rtPixelsArray, readPixelFlags);


The code below is just for some extra steps that optionally change the size of the image we're saving on the fly using some useful Unreal utility methods we found by digging in the engine source code, in the FImageUtils class:

// Check if we need to prepare data for resizing the render target texture data.
TArray<FColor> dstPixelData;
if (newImageSize.X == 0 || newImageSize.Y == 0)
{
	newImageSize.X = rt->GetSurfaceWidth();
	newImageSize.Y = rt->GetSurfaceHeight();
}
else if (newImageSize.X != rt->GetSurfaceWidth() || newImageSize.Y != rt->GetSurfaceHeight())
{
	dstPixelData.AddUninitialized(newImageSize.X * newImageSize.Y);
	if (applyCropIfResized)
	{
		FImageUtils::CropAndScaleImage(rt->GetSurfaceWidth(), rt->GetSurfaceHeight(),
			newImageSize.X, newImageSize.Y, rtPixelsArray, dstPixelData);
	}
	else
	{
		FImageUtils::ImageResize(rt->GetSurfaceWidth(), rt->GetSurfaceHeight(), rtPixelsArray,
			newImageSize.X, newImageSize.Y, dstPixelData, true);
	}
	newPixelData = &dstPixelData;
}


Once we have the final modified pixel array in "newPixelData", we have to use Unreal's FModuleManager to get a reference to its "ImageWrapper" module, which is actually a factory for various types of image compressors. With it, we can create an image wrapper for our specified "targetImageFormat" to get an instance of the corresponding "imageCompressor":

bool imageSavedOk = false;
TSharedPtr<IImageWrapper> imageCompressor;
IImageWrapperModule* imageWrapperModule = FModuleManager::LoadModulePtr<IImageWrapperModule>(FName("ImageWrapper"));
if (imageWrapperModule != nullptr)
{
	EImageFormat::Type targetImageFormat = (EImageFormat::Type)compressionType;
	imageCompressor = imageWrapperModule->CreateImageWrapper(targetImageFormat);
}


This is where things get a bit more ugly and C++-looking, but it's pretty straightforward once you parse through all the parentheses. The image compressor isn't documented at all and the only way to find out how it should be used is by searching the engine source code. So, after much digging, we found that you must calculate the number of bytes of raw pixel data you want to compress - see "inRawDataSize" below - and use the "SetRaw(...)" function to set the raw pixel array, image size, and pixel storage format on the compressor.

Once this method returns ok, we can retrieve the compressed image data array by using the "GetCompressed(...)" function which we then save to our destination file. Luckily, saving the binary file to disk was more straightforward by using FFileHelper:

int32 inRawDataSize = sizeof(FColor) * newImageSize.X * newImageSize.Y;
if (imageCompressor.IsValid() &&
	imageCompressor->SetRaw((void*)newPixelData->GetData(), inRawDataSize,
		newImageSize.X, newImageSize.Y, ERGBFormat::BGRA, 8))
{
	const TArray<uint8>& compressedData = imageCompressor->GetCompressed(compressionQuality);
	imageSavedOk = FFileHelper::SaveArrayToFile(compressedData, *fileDestination);
}


Cool! After much sweat and engine code digging, we finally saved a photo to disk - in either PNG or JPG format. You can end up taking a lot of photos if you play all the possible combinations of choices, so to avoid using around 2 GB of space for the game saves, we picked the JPG format in the end.

Loading the photo
So, how do we bring the saved photo back into a texture at run-time? Because we'll eventually need to view the photos in a photo album, right?

This part should be a little easier to read now that you're warmed up from the above code. The main steps of the load function are:
  • set up the image compressor the same way as when we saved the photo
  • load the raw binary data from the file using "LoadFileToArray(...)"
  • feed the raw data to the image compressor using "SetCompressed(...)"
  • use the compressor's "GetRaw(...)" function to get the uncompressed pixel array
  • create a UTexture2D object with the corresponding image size
  • upload the raw pixel data to the texture's memory block
  • rejoice with the returned texture pointer in Blueprints

And here's the function in all its glory with some extra safety checks and logs for each step mentioned above:

UTexture2D* UMobilityBlueprintUtils::LoadTextureFromFile(const FString& filePath,
	EUtilsImageCompressionType compressionType /*= EUtilsImageCompressionType::VE_PNG*/)
{
	UTexture2D* loadedTex = nullptr;
	if (!FPlatformFileManager::Get().GetPlatformFile().FileExists(*filePath))
		return nullptr;

	// Get ImageWrapperModule instance and corresponding image wrapper API.
	IImageWrapperModule& imgModule = FModuleManager::LoadModuleChecked<IImageWrapperModule>(FName("ImageWrapper"));
	EImageFormat::Type targetImageFormat = (EImageFormat::Type)compressionType;
	IImageWrapperPtr imgWrapperPtr = imgModule.CreateImageWrapper(targetImageFormat);
	if (!imgWrapperPtr.IsValid())
	{
		UE_LOG(LogTemp, Warning, TEXT("[MobilityBlueprintUtils] LoadTextureFromFile: failed to create ImageWrapper to load image!"));
		return nullptr;
	}

	TArray<uint8> rawFileData;
	if (!FFileHelper::LoadFileToArray(rawFileData, *filePath))
	{
		UE_LOG(LogTemp, Warning, TEXT("[MobilityBlueprintUtils] LoadTextureFromFile: failed to load raw data from specified path: %s"), *filePath);
		return nullptr;
	}

	if (!imgWrapperPtr->SetCompressed(rawFileData.GetData(), rawFileData.Num()))
	{
		UE_LOG(LogTemp, Warning, TEXT("[MobilityBlueprintUtils] LoadTextureFromFile: ImageWrapper->SetCompressed failed for file: %s"), *filePath);
		return nullptr;
	}

	const TArray<uint8>* uncompressedData;
	if (!imgWrapperPtr->GetRaw(ERGBFormat::BGRA, 8, uncompressedData))
	{
		UE_LOG(LogTemp, Warning, TEXT("[MobilityBlueprintUtils] LoadTextureFromFile: ImageWrapper->GetRaw failed for file: %s"), *filePath);
		return nullptr;
	}

	// Create transient texture object.
	loadedTex = UTexture2D::CreateTransient(imgWrapperPtr->GetWidth(), imgWrapperPtr->GetHeight(), EPixelFormat::PF_B8G8R8A8);
	if (!loadedTex)
	{
		UE_LOG(LogTemp, Warning, TEXT("[MobilityBlueprintUtils] LoadTextureFromFile: failed to create UTexture2D for file: %s"), *filePath);
		return nullptr;
	}

	// Upload image data to the texture in the topmost mip level.
	// Get the pointer to the texture mip level 0 raw container and lock it for read/write operations,
	// because the texture is used on the render thread and we could otherwise cause memory corruption
	// through threading race conditions.
	// Note: for UTexture2D textures created from an Unreal editor asset file, "Lock(LOCK_READ_WRITE)"
	// will internally detach the texture data container from the asset (to avoid messing up the data
	// in case you also import a new file at run-time while in the editor) and reattach it on "Unlock()".
	void* textureData = loadedTex->PlatformData->Mips[0].BulkData.Lock(LOCK_READ_WRITE);
	// Copy the raw image data to the address from the above pointer.
	FMemory::Memcpy(textureData, uncompressedData->GetData(), uncompressedData->Num());
	// Unlock the texture data container (this invalidates the above pointer).
	loadedTex->PlatformData->Mips[0].BulkData.Unlock();
	// Update the texture GPU resource with the new data.
	loadedTex->UpdateResource();

	return loadedTex;
}


Now that you have this function available in Blueprints, you can load a photo, set it on a material and presto! You have your loaded photo rendered in-game wherever you apply the material. Nothing fancy, but quite handy to have around.

We hope this "little" post helps someone that might be at the starting line with UE4. We sure could've used it a while back. Thank you for reading!