Text rendering example (that uses canvas as texture) #82
This would be a super valuable sample! I definitely think we should have it. However, the APIs for getting canvas data into WebGPU are only just starting to take form.
Interesting, so does that mean WebGPU doesn't support canvases as textures yet, or just that it's not ergonomic? I've tried to get a canvas into a texture using something like this:

```js
const texture = device.createTexture({
  size: [canvas.width, canvas.height, 1],
  format: 'rgba8unorm',
  usage: GPUTextureUsage.SAMPLED | GPUTextureUsage.COPY_DST,
});

const ctx = canvas.getContext('2d');
const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
const canvasBitmap = await createImageBitmap(imageData);

device.queue.copyImageBitmapToTexture(
  { imageBitmap: canvasBitmap },
  { texture },
  [canvasBitmap.width, canvasBitmap.height, 1]
);
```

That doesn't seem to work, though?
It's not possible without some indirection through ImageBitmap or writeTexture, just like what you've done. By the way, you can skip the imageData step and call createImageBitmap(canvas) on the canvas directly.
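That tip can be sketched as follows. This is a browser-only illustration, the helper name `canvasToTexture` is hypothetical, and `copyImageBitmapToTexture` reflects the API snapshot used in this thread (later versions of the spec renamed it to `copyExternalImageToTexture`):

```javascript
// Hypothetical helper: upload a 2D canvas into a GPUTexture by passing the
// canvas element straight to createImageBitmap, skipping getImageData.
// Browser-only sketch; API names follow the snapshot used in this thread.
async function canvasToTexture(device, canvas) {
  // createImageBitmap accepts a canvas element directly.
  const bitmap = await createImageBitmap(canvas);
  const texture = device.createTexture({
    size: [bitmap.width, bitmap.height, 1],
    format: 'rgba8unorm',
    usage: GPUTextureUsage.SAMPLED | GPUTextureUsage.COPY_DST,
  });
  device.queue.copyImageBitmapToTexture(
    { imageBitmap: bitmap },
    { texture },
    [bitmap.width, bitmap.height, 1]
  );
  return texture;
}
```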
Sure, I'll see if I can come up with an example to test it; note though that I'm not actually certain if that way of creating an image bitmap from the canvas and using it as a texture works or not. I haven't been able to get anything rendered to the screen so far, porting from WebGL, so I was really just guessing there.
Thank you, that makes things a bit cleaner!
I was able to modify the fractal cube sample to use a canvas as its texture instead of the output from the previous render; here's the diff. (The code isn't great quality; for instance, it probably doesn't need to copy the texture every frame, but it should be fine as an example.)

```diff
diff --git a/src/pages/samples/fractalCube.ts b/src/pages/samples/fractalCube.ts
index 7677815..4908603 100644
--- a/src/pages/samples/fractalCube.ts
+++ b/src/pages/samples/fractalCube.ts
@@ -124,6 +124,20 @@ async function init(canvas: HTMLCanvasElement) {
     usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST,
   });
+  const c = document.createElement('canvas');
+  const el = document.getElementsByClassName('BasicExample_canvasContainer__3e5KH')[0];
+  if (el.childNodes.length > 1) { el.childNodes[1].remove(); }
+  el.appendChild(c);
+  c.width = 500;
+  c.height = 500;
+  const ctx = c.getContext('2d');
+  ctx.fillStyle = 'purple';
+  ctx.fillRect(0, 0, c.width, c.height);
+  ctx.fillStyle = 'white';
+  ctx.font = 'bold 48px serif';
+  ctx.fillText("This is text", 250, 250);
+  const bm = await createImageBitmap(c);
+
   const cubeTexture = device.createTexture({
     size: { width: canvas.width, height: canvas.height },
     format: 'bgra8unorm',
@@ -194,18 +208,7 @@ async function init(canvas: HTMLCanvasElement) {
     passEncoder.draw(36, 1, 0, 0);
     passEncoder.endPass();
-    commandEncoder.copyTextureToTexture(
-      {
-        texture: swapChainTexture,
-      },
-      {
-        texture: cubeTexture,
-      },
-      {
-        width: canvas.width,
-        height: canvas.height,
-      }
-    );
+    device.queue.copyImageBitmapToTexture({ imageBitmap: bm }, { texture: cubeTexture }, { width: c.width, height: c.height });
     device.queue.submit([commandEncoder.finish()]);
   };
```

With that diff, everything does indeed work fine.
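Since the comment above notes that the texture probably doesn't need to be copied every frame, here is a sketch of hoisting the upload out of the render loop. The helper names (`drawTextCanvas`, `initTextTexture`) are hypothetical, the code is browser-only, and it uses the same `copyImageBitmapToTexture` API snapshot as the rest of the thread:

```javascript
// Hypothetical refactor: draw the text canvas and upload it to the GPU once
// at init, instead of copying a bitmap into the texture every frame.
function drawTextCanvas(width, height, text) {
  const c = document.createElement('canvas');
  c.width = width;
  c.height = height;
  const ctx = c.getContext('2d');
  ctx.fillStyle = 'purple';
  ctx.fillRect(0, 0, width, height);
  ctx.fillStyle = 'white';
  ctx.font = 'bold 48px serif';
  ctx.fillText(text, width / 2, height / 2);
  return c;
}

async function initTextTexture(device, width, height, text) {
  const bm = await createImageBitmap(drawTextCanvas(width, height, text));
  const texture = device.createTexture({
    size: { width, height },
    format: 'bgra8unorm',
    usage: GPUTextureUsage.SAMPLED | GPUTextureUsage.COPY_DST,
  });
  // One-time upload; the per-frame render loop then only samples `texture`.
  device.queue.copyImageBitmapToTexture({ imageBitmap: bm }, { texture }, { width, height });
  return texture;
}
```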
This may be a bit much to ask (in which case no worries, I totally understand), but I figure I'll ask anyway: I think it would be quite useful to have an example of rendering text, specifically one that first renders the text to a canvas and then renders that with WebGPU via a texture.
A more complicated but also perhaps more useful example would render some chars to a canvas, create a texture from it, and then use that texture to construct various strings on-the-fly by sampling from different parts of the texture for each letter. (Note that a WebGL version of this is described in https://webglfundamentals.org/webgl/lessons/webgl-text-glyphs.html, which might be helpful.)
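A glyph-atlas approach like the one described boils down to computing, for each character, which sub-rectangle of the atlas texture to sample. A minimal sketch of that arithmetic, assuming a fixed-size-cell grid layout (all the names and layout numbers here are illustrative, not from the thread):

```javascript
// Hypothetical glyph-atlas math: the atlas is assumed to be a grid of
// fixed-size cells, one per printable ASCII character starting at ' '.
// glyphUV computes the normalized UV rectangle to sample for a character.
const ATLAS = {
  textureWidth: 256,  // atlas texture size in pixels (assumed)
  textureHeight: 256,
  cellWidth: 16,      // one glyph cell, in pixels (assumed)
  cellHeight: 32,
  columns: 16,        // cells per row (= textureWidth / cellWidth)
  firstChar: 0x20,    // ' ' is the first glyph in the atlas
};

function glyphUV(ch, atlas = ATLAS) {
  const index = ch.charCodeAt(0) - atlas.firstChar;
  const col = index % atlas.columns;
  const row = Math.floor(index / atlas.columns);
  return {
    u0: (col * atlas.cellWidth) / atlas.textureWidth,
    v0: (row * atlas.cellHeight) / atlas.textureHeight,
    u1: ((col + 1) * atlas.cellWidth) / atlas.textureWidth,
    v1: ((row + 1) * atlas.cellHeight) / atlas.textureHeight,
  };
}

// Building a string then means emitting one textured quad per character,
// each quad sampling its glyph's UV rectangle (monospace pen advance here):
const quads = 'Hi'.split('').map((ch, i) => ({
  x: i * ATLAS.cellWidth,
  ...glyphUV(ch),
}));
```

The vertex buffer for the on-screen text would then carry these per-quad positions and UVs, and the fragment shader just samples the atlas texture.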
I'm happy to help with the implementation of this if wanted and as I'm able, but unfortunately I think my knowledge is too lacking in both WebGPU and graphics programming in general to do the full PR myself.