metalkit

OpenGL ES deprecated in iOS 12 and SKShader

落爺英雄遲暮 submitted on 2020-01-24 08:52:09
Question: I am very new to the concept and use of shaders in SpriteKit. I found this tutorial on how to render a Mandelbrot fractal with a custom shader file, Fractal.fsh, attached to a Color Sprite's Custom Shader property: https://www.weheartswift.com/fractals-Xcode-6/ It works fine, and I thought to myself that learning about OpenGL ES and custom shaders in SpriteKit would be a fun exercise. According to Apple, though, OpenGL ES is deprecated as of iOS 12. https://developer.apple.com/library …
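For context, the deprecation does not break this tutorial: SKShader takes GLSL-style fragment source and SpriteKit compiles it on your behalf (translating it for the Metal renderer on modern systems), so you never call OpenGL ES directly. A minimal sketch, with an illustrative shader body:

    import SpriteKit

    // v_tex_coord, u_time, and gl_FragColor are built-in SKShader symbols.
    let shaderSource = """
    void main() {
        vec2 uv = v_tex_coord;
        gl_FragColor = vec4(uv.x, uv.y, abs(sin(u_time)), 1.0);
    }
    """

    let sprite = SKSpriteNode(color: .white, size: CGSize(width: 256, height: 256))
    sprite.shader = SKShader(source: shaderSource)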

MTKView texture correct color build-up

我与影子孤独终老i submitted on 2020-01-24 01:03:30
Question: I am working on a Metal-backed drawing application where brushstrokes are drawn on an MTKView by stamping a textured square repeatedly along a path. I am having color-accumulation issues, as illustrated with the picture below: For alpha values [0.5 - 1.0] the results are more or less what I expect. However, for small alpha values the result looks patchy and never achieves the uniform/correctly saturated value of the original fully opaque brush color (i.e. like the top brushstroke in the above …
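The small-alpha behavior follows from the source-over blend math: n overlapping stamps at alpha a accumulate coverage 1 - (1 - a)^n, which approaches full opacity only asymptotically, and the overlap count varies along the stroke, hence the patchiness. A small illustrative sketch (not the poster's code):

    import Foundation

    // Coverage left by n overlapping source-over stamps of a color at alpha a.
    func accumulatedCoverage(alpha: Double, stamps: Int) -> Double {
        1.0 - pow(1.0 - alpha, Double(stamps))
    }

    accumulatedCoverage(alpha: 0.1, stamps: 10)  // ≈ 0.65, well short of 1.0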

What is the coordinate system used in Metal?

孤者浪人 submitted on 2020-01-16 08:40:13
Question: In Metal, what coordinate system is used inside a shader (in and out)? And when we render to a texture, is it the same? What about the z-buffer? Are there any inconsistencies? Finally, what are the differences between Metal, OpenGL, and DirectX? Answer 1: Metal Coordinate Systems. Metal defines several standard coordinate systems to represent transformed graphics data at different stages along the rendering pipeline. 1) NDC (Normalized Device Coordinates): this coordinate system is used by developers to construct …
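For reference, Metal's NDC puts x and y in [-1, 1] with +y up and the origin at the center, while depth runs from 0 (near) to 1 (far), unlike OpenGL's default [-1, 1] depth range. A hedged sketch converting a UIKit-style view point (origin top-left, y down) into Metal NDC:

    import CoreGraphics

    // Map a point in view coordinates to Metal NDC x/y.
    func viewPointToNDC(_ p: CGPoint, viewSize: CGSize) -> (x: Float, y: Float) {
        let x = Float(p.x / viewSize.width) * 2.0 - 1.0
        let y = 1.0 - Float(p.y / viewSize.height) * 2.0  // flip: view y grows down, NDC y grows up
        return (x, y)
    }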

How to render each pixel of a bitmap texture to each native physical pixel of the screen on macOS?

我怕爱的太早我们不能终老 submitted on 2019-12-31 04:00:50
Question: As modern macOS devices use a scaled HiDPI resolution by default, bitmap images get blurred on screen. Is there a way to render a bitmap pixel by pixel to the true native physical pixels of the display? Any Core Graphics, OpenGL, or Metal API that would allow this without changing the display mode of the screen? If you are thinking of those convertXXXXToBacking and friends, stop. Here is the explanation for you. A typical 13-inch MacBook Pro now has a native 2560x1600 pixel …
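One way to see the mismatch the question describes (a hedged sketch, assuming the largest listed mode is the panel's native resolution): compare the current display mode's pixel size against every mode reported for the display.

    import CoreGraphics

    let displayID = CGMainDisplayID()
    if let mode = CGDisplayCopyDisplayMode(displayID) {
        // On a scaled HiDPI setting this differs from the native panel size,
        // so even backing-store pixels are resampled before reaching the panel.
        print("current mode pixels: \(mode.pixelWidth) x \(mode.pixelHeight)")
    }
    if let modes = CGDisplayCopyAllDisplayModes(displayID, nil) as? [CGDisplayMode] {
        let native = modes.max { $0.pixelWidth * $0.pixelHeight < $1.pixelWidth * $1.pixelHeight }
        print("largest mode pixels: \(native?.pixelWidth ?? 0) x \(native?.pixelHeight ?? 0)")
    }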

Texture Brush (Drawing Application) Using Metal

杀马特。学长 韩版系。学妹 submitted on 2019-12-30 07:49:43
Question: I am trying to implement a Metal-backed drawing application where brushstrokes are drawn on an MTKView by stamping a textured square repeatedly along the finger position. I am drawing this with alpha 0.2. Where the squares overlap, the colors add up. How can I draw with a uniform alpha of 0.2? Answer 1: I think you need to draw the brush squares to a separate texture, initially cleared to transparent, without blending. Then draw that whole texture to your view with blending. If you draw the brush squares directly to …
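A minimal sketch of the compositing pass described in the answer, assuming illustrative shader names (quadVertex/quadFragment): stamps go into the offscreen stroke texture with blending disabled, and only this final pass blends source-over onto the view.

    import Metal

    func makeCompositePipeline(device: MTLDevice, library: MTLLibrary,
                               pixelFormat: MTLPixelFormat) throws -> MTLRenderPipelineState {
        let desc = MTLRenderPipelineDescriptor()
        desc.vertexFunction = library.makeFunction(name: "quadVertex")      // assumed name
        desc.fragmentFunction = library.makeFunction(name: "quadFragment")  // assumed name
        let att = desc.colorAttachments[0]!
        att.pixelFormat = pixelFormat
        att.isBlendingEnabled = true                       // blend only in this final pass
        att.rgbBlendOperation = .add
        att.alphaBlendOperation = .add
        att.sourceRGBBlendFactor = .sourceAlpha
        att.destinationRGBBlendFactor = .oneMinusSourceAlpha
        att.sourceAlphaBlendFactor = .one
        att.destinationAlphaBlendFactor = .oneMinusSourceAlpha
        return try device.makeRenderPipelineState(descriptor: desc)
    }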

Fill Path in Metal

﹥>﹥吖頭↗ submitted on 2019-12-24 03:23:42
Question: I am creating a drawing application in Metal. I need to fill a path created from the user's touch points. What algorithm can be used to achieve that effect in Metal? Here's an example image showing what I'm trying to do: Source: https://stackoverflow.com/questions/54859460/fill-path-in-metal
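One standard approach (not from the thread, a hedged sketch only) is the two-pass "stencil then cover" fill: fan triangles from the first touch point while inverting the stencil, so concave regions toggle to the right coverage, then draw a bounding quad wherever the stencil is nonzero (with the encoder's stencil reference value set to 0).

    import Metal

    func makeFillStencilStates(device: MTLDevice)
        -> (mark: MTLDepthStencilState?, cover: MTLDepthStencilState?) {
        let mark = MTLDepthStencilDescriptor()
        mark.frontFaceStencil.stencilCompareFunction = .always
        mark.frontFaceStencil.depthStencilPassOperation = .invert   // toggle coverage
        mark.backFaceStencil = mark.frontFaceStencil

        let cover = MTLDepthStencilDescriptor()
        cover.frontFaceStencil.stencilCompareFunction = .notEqual   // pass where stencil != 0
        cover.frontFaceStencil.depthStencilPassOperation = .zero    // reset for the next fill
        cover.backFaceStencil = cover.frontFaceStencil
        return (device.makeDepthStencilState(descriptor: mark),
                device.makeDepthStencilState(descriptor: cover))
    }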

Rendering small CIImage centered in MTKView

你。 submitted on 2019-12-23 04:58:19
Question: I'm rendering a CIImage to an MTKView, and the image is smaller than the drawable.

    let centered = image.transformed(by: CGAffineTransform(
        translationX: (view.drawableSize.width - image.extent.width) / 2,
        y: (view.drawableSize.height - image.extent.height) / 2))
    context.render(centered,
                   to: drawable.texture,
                   commandBuffer: buffer,
                   bounds: centered.extent,
                   colorSpace: CGColorSpaceCreateDeviceRGB())

I'd expect the code above to render the image in the center of the view, but the image is positioned …
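A hedged note (not from the thread): the bounds rect of CIContext.render selects the region of the image's coordinate space that is written into the texture starting at the texture's origin, so passing centered.extent effectively re-anchors the image at (0, 0). Passing the full drawable rect preserves the translation:

    let drawableBounds = CGRect(origin: .zero, size: view.drawableSize)
    context.render(centered,
                   to: drawable.texture,
                   commandBuffer: buffer,
                   bounds: drawableBounds,
                   colorSpace: CGColorSpaceCreateDeviceRGB())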

MTLBuffer allocation + CPU/GPU synchronisation

二次信任 submitted on 2019-12-23 04:47:45
Question: I am using a Metal Performance Shaders kernel (MPSImageHistogram) to compute something in an MTLBuffer that I grab, perform computations on, and then display via MTKView. The MTLBuffer output from the shader is small (~4K bytes). So I am allocating a new MTLBuffer object for every render pass, and there are at least 30 renders per second for every video frame.

    calculation = MPSImageHistogram(device: device, histogramInfo: &histogramInfo)
    let bufferLength = calculation.histogramSize(forSourceFormat: …
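A common alternative to per-frame allocation (a hedged sketch, not the poster's code) is a small fixed pool of shared buffers guarded by a semaphore, so the CPU never touches a buffer the GPU is still reading; signal it from the command buffer's completion handler, e.g. commandBuffer.addCompletedHandler { _ in pool.release() }.

    import Metal
    import Dispatch

    final class HistogramBufferPool {
        private let buffers: [MTLBuffer]
        private let semaphore: DispatchSemaphore
        private var index = 0

        init?(device: MTLDevice, length: Int, count: Int = 3) {
            var pool: [MTLBuffer] = []
            for _ in 0..<count {
                guard let b = device.makeBuffer(length: length, options: .storageModeShared)
                else { return nil }
                pool.append(b)
            }
            buffers = pool
            semaphore = DispatchSemaphore(value: count)
        }

        func acquire() -> MTLBuffer {           // blocks until a buffer is free
            semaphore.wait()
            defer { index = (index + 1) % buffers.count }
            return buffers[index]
        }

        func release() { semaphore.signal() }   // call from addCompletedHandler
    }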

How to convert an MTLTexture to a CVPixelBuffer to write into an AVAssetWriter?

∥☆過路亽.° submitted on 2019-12-19 09:25:07
Question: I have a requirement to apply filters to live video, and I'm trying to do it in Metal. But I have run into a problem converting the MTLTexture into a CVPixelBuffer after encoding the filter into the destination texture. Reference: https://github.com/oklyc/MetalCameraSample-master-2 Here is my code:

    if let pixelBuffer = pixelBuffer {
        CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
        let region = MTLRegionMake2D(0, 0, Int(currentDrawable.layer.drawableSize …
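The general technique (a hedged sketch, identifiers illustrative) is to lock the pixel buffer and copy the texture's bytes into its base address with getBytes, assuming the texture's .bgra8Unorm format matches the buffer's kCVPixelFormatType_32BGRA; note that a drawable's texture must not be framebuffer-only for getBytes to work.

    import Metal
    import CoreVideo

    func copyTexture(_ texture: MTLTexture, into pixelBuffer: CVPixelBuffer) {
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
        texture.getBytes(base, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
    }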

Add alpha value when creating an image from an MTLTexture

南笙酒味 submitted on 2019-12-13 03:48:22
Question: I am creating a UIImage from the current drawable texture as follows.

    func createImageFromCurrentDrawable() -> UIImage {
        let context = CIContext()
        let texture = metalView.currentDrawable!.texture
        let kciOptions = [kCIContextWorkingColorSpace: CGColorSpace(name: CGColorSpace.sRGB)!,
                          kCIContextOutputPremultiplied: true,
                          kCIContextUseSoftwareRenderer: false] as [String: Any]
        let cImg = CIImage(mtlTexture: texture, options: kciOptions)!
        let cgImg = context.createCGImage(cImg, from: cImg.extent)!
        let …