
Text rendering example (that uses canvas as texture) #82

Open
smolck opened this issue Mar 11, 2021 · 5 comments
Labels
sample request (Request for a new sample), sample wanted (We definitely want to add this sample; contributions welcome)

Comments

@smolck

smolck commented Mar 11, 2021

This may be a bit much to ask (in which case no worries, I totally understand), but I figure I'll ask anyway: I think it would be quite useful to have an example of rendering text, specifically one that first renders the text to a canvas and then draws that canvas with WebGPU via a texture.

A more complicated but perhaps more useful example would render a set of characters to a canvas, create a texture from it, and then use that texture to construct arbitrary strings on the fly by sampling a different part of the texture for each letter. (Note that a WebGL version of this is described in https://webglfundamentals.org/webgl/lessons/webgl-text-glyphs.html, which might be helpful.)
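For concreteness, here is a minimal sketch of the atlas-building half of that idea in TypeScript; the character set, the fixed cell sizes, and the glyphRects map are all made up for illustration. A later WebGPU pass would upload this canvas as a texture and sample the recorded rectangle for each letter.

// Hypothetical sketch: draw a fixed character set into a 2D canvas and
// remember where each glyph landed, so a WebGPU pass can sample it later.
const chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789 ';
const cellWidth = 32;   // assumed fixed-size cells for simplicity
const cellHeight = 48;
const columns = 16;

const atlas = document.createElement('canvas');
atlas.width = cellWidth * columns;
atlas.height = cellHeight * Math.ceil(chars.length / columns);

const ctx = atlas.getContext('2d')!;
ctx.font = '32px monospace';
ctx.textBaseline = 'top';
ctx.fillStyle = 'white';

// Map each character to the texture-space rectangle it was drawn into.
const glyphRects = new Map<string, { x: number; y: number; w: number; h: number }>();
chars.split('').forEach((ch, i) => {
  const x = (i % columns) * cellWidth;
  const y = Math.floor(i / columns) * cellHeight;
  ctx.fillText(ch, x, y);
  glyphRects.set(ch, { x, y, w: cellWidth, h: cellHeight });
});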

I'm happy to help with the implementation of this if wanted and as I'm able, but unfortunately I think my knowledge is too lacking in both WebGPU and graphics programming in general to do the full PR myself.

@kainino0x
Collaborator

This would be a super valuable sample! I definitely think we should have it.

However, the APIs for getting canvas data into WebGPU are only just starting to take form.
gpuweb/gpuweb#1154
gpuweb/gpuweb#1415 (comment)
I suspect the simpler one (copyToTexture) would be ideal for building glyph caches/atlases, while GPUExternalTexture would be better for rendering an entire text block that changes every frame.
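A rough sketch of the simpler copy path for a glyph atlas, using the name the queue copy eventually landed on (copyExternalImageToTexture; at the time of this thread it was still copyImageBitmapToTexture). The device and the atlas canvas are assumed to already exist:

// Sketch: copy a 2D canvas holding a glyph atlas into a sampleable texture.
const atlasTexture = device.createTexture({
  size: [atlas.width, atlas.height],
  format: 'rgba8unorm',
  // The external-image copy requires COPY_DST and RENDER_ATTACHMENT on the destination.
  usage:
    GPUTextureUsage.TEXTURE_BINDING |
    GPUTextureUsage.COPY_DST |
    GPUTextureUsage.RENDER_ATTACHMENT,
});

device.queue.copyExternalImageToTexture(
  { source: atlas },
  { texture: atlasTexture },
  [atlas.width, atlas.height]
);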

@smolck
Author

smolck commented Mar 12, 2021

However, the APIs for getting canvas data into WebGPU are only just starting to take form.

Interesting, so does that mean WebGPU doesn't support canvases as textures yet, or just that it's not ergonomic? I've tried to get a canvas into a texture using something like this:

  const texture = device.createTexture({
    size: [canvas.width, canvas.height, 1],
    format: 'rgba8unorm',
    usage: GPUTextureUsage.SAMPLED | GPUTextureUsage.COPY_DST,
  })
  // Snapshot the 2D canvas contents, then copy the bitmap into the texture.
  const ctx = canvas.getContext('2d')
  const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height)
  const canvasBitmap = await createImageBitmap(imageData)
  device.queue.copyImageBitmapToTexture(
    { imageBitmap: canvasBitmap },
    { texture },
    [canvasBitmap.width, canvasBitmap.height, 1]
  )

That doesn't seem to work though?

@kainino0x
Collaborator

kainino0x commented Mar 12, 2021

It's not possible without some indirection through ImageBitmap or writeTexture just like what you've done.
Something like that should work (in Chromium) today, I don't know offhand why it doesn't. Could you provide a codepen/jsfiddle/html file to try it out?

BTW you can skip the imageData step and do await createImageBitmap(canvas) directly, which makes it theoretically possible to keep the data on-GPU (though Chromium might not).
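A minimal sketch of that shortened path, reusing the device and texture from the snippet above:

// Snapshot the canvas directly; no getImageData() round trip through CPU memory.
const canvasBitmap = await createImageBitmap(canvas)
device.queue.copyImageBitmapToTexture(
  { imageBitmap: canvasBitmap },
  { texture },
  [canvasBitmap.width, canvasBitmap.height, 1]
)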

@smolck
Author

smolck commented Mar 12, 2021

It's not possible without some indirection through ImageBitmap or writeTexture just like what you've done.
Something like that should work (in Chromium) today, I don't know offhand why it doesn't. Could you provide a codepen/jsfiddle/html file to try it out?

Sure, I'll see if I can come up with an example to test it. Note, though, that I'm not actually certain whether that way of creating an ImageBitmap from the canvas and using it as a texture works; I haven't been able to get anything rendered to the screen so far while porting from WebGL, so I was really just guessing there.

BTW you can skip the imageData step and do await createImageBitmap(canvas) directly, which makes it theoretically possible to keep the data on-GPU (though Chromium might not).

Thank you, that makes things a bit cleaner!

@smolck
Author

smolck commented Mar 12, 2021

I was able to modify the fractal cube sample to use a canvas as its texture instead of the output from the previous render. Here's the diff (the code isn't great quality; for instance, it probably doesn't need to copy the texture every frame, but it should be fine as an example):

Canvas as texture diff
diff --git a/src/pages/samples/fractalCube.ts b/src/pages/samples/fractalCube.ts
index 7677815..4908603 100644
--- a/src/pages/samples/fractalCube.ts
+++ b/src/pages/samples/fractalCube.ts
@@ -124,6 +124,20 @@ async function init(canvas: HTMLCanvasElement) {
     usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST,
   });
 
+  const c = document.createElement('canvas');
+  const el = document.getElementsByClassName('BasicExample_canvasContainer__3e5KH')[0];
+  if (el.childNodes.length > 1) { el.childNodes[1].remove(); }
+  el.appendChild(c);
+  c.width = 500;
+  c.height = 500;
+  const ctx = c.getContext('2d');
+  ctx.fillStyle = 'purple';
+  ctx.fillRect(0, 0, c.width, c.height);
+  ctx.fillStyle = 'white';
+  ctx.font = 'bold 48px serif';
+  ctx.fillText("This is text", 250, 250);
+  const bm = await createImageBitmap(c);
+
   const cubeTexture = device.createTexture({
     size: { width: canvas.width, height: canvas.height },
     format: 'bgra8unorm',
@@ -194,18 +208,7 @@ async function init(canvas: HTMLCanvasElement) {
     passEncoder.draw(36, 1, 0, 0);
     passEncoder.endPass();
 
-    commandEncoder.copyTextureToTexture(
-      {
-        texture: swapChainTexture,
-      },
-      {
-        texture: cubeTexture,
-      },
-      {
-        width: canvas.width,
-        height: canvas.height,
-      }
-    );
+    device.queue.copyImageBitmapToTexture({imageBitmap: bm}, {texture: cubeTexture}, { width: c.width, height: c.height });
 
     device.queue.submit([commandEncoder.finish()]);
   };

With that diff everything does indeed work fine:

[Screenshot: Screen Shot 2021-03-11 at 8 27 09 PM]
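As a follow-up to the note above about not needing to copy every frame, here is a hedged sketch of the cleaner structure: since the text canvas never changes in this example, the ImageBitmap copy can be issued once before the render loop instead of inside it. The frame body is elided; it would encode and submit the render pass exactly as the sample already does.

// Copy the text canvas into cubeTexture once, up front, instead of every frame.
const bm = await createImageBitmap(c);
device.queue.copyImageBitmapToTexture(
  { imageBitmap: bm },
  { texture: cubeTexture },
  { width: c.width, height: c.height }
);

function frame() {
  // ...encode and submit the render pass as in the sample, with no per-frame copy...
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);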

@kainino0x kainino0x added the sample request and sample wanted labels on Mar 5, 2024