Harbeth
Harbeth is a tiny set of utilities and extensions over Apple's Metal framework, dedicated to making your Swift GPU code much cleaner and letting you prototype your pipelines faster.
Graphics processing and filter production.👒👒👒
Features
🟣 At the moment, the most important features of the Metal module can be summarized as follows:
- Supports chaining filters with an operator.
- Supports quickly designing custom filters.
- Supports merging multiple filter effects.
- Supports fast extension of output sources.
- Supports camera capture effects.
- Supports adding filter effects to video.
- Supports matrix convolution.
- The filter part is roughly divided into the following modules:
  - Blend: image blending filters.
  - Blur: blur effects.
  - ColorProcess: basic pixel-level color processing.
  - Effect: effect processing.
  - Lookup: lookup table (LUT) filters.
  - Matrix: matrix convolution filters.
  - Shape: image shape and size adjustments.
  - VisualEffect: dynamic visual effects.
A total of 100+ kinds of filters are currently available.✌️
- Zero-intrusion injection of filters into existing code.

```swift
// Original code:
ImageView.image = originImage

// Injected filter code:
let filter = C7ColorMatrix4x4(matrix: Matrix4x4.sepia)
var filter2 = C7Granularity()
filter2.grain = 0.8
var filter3 = C7SoulOut()
filter3.soul = 0.7
let filters = [filter, filter2, filter3]

// Use:
ImageView.image = try? originImage.makeGroup(filters: filters)

// Or use the chain operator:
let filterTexture = C7FilterTexture(texture: originImage.mt.toTexture()!)
let result = filterTexture ->> filter ->> filter2 ->> filter3
ImageView.image = result.outputImage()

// Or even:
var texture = originImage.mt.toTexture()!
filters.forEach { texture = texture ->> $0 }
ImageView.image = texture.toImage()
```
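The `->>` chain reads left to right: each step feeds its output into the next filter. A standalone analogy using a toy custom operator over plain values (pure Swift; this is an illustration of the chaining idea, not Harbeth's actual texture operator):

```swift
import Foundation

infix operator ->>: AdditionPrecedence

// A toy version of the chain operator: feed a value through a transform.
// (Analogy only; Harbeth's ->> works on textures and filters.)
func ->> (value: Double, transform: (Double) -> Double) -> Double {
    return transform(value)
}

let darken: (Double) -> Double = { $0 * 0.5 }
let lift: (Double) -> Double = { $0 + 0.1 }

// Reads left to right, exactly like texture ->> filter1 ->> filter2.
let out = 0.8 ->> darken ->> lift   // (0.8 * 0.5) + 0.1
```

Because the operator is left-associative, swapping the two steps generally changes the result, which is why the order of filters in a chain matters.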
- Camera capture generating pictures.

```swift
// Inject an edge detection filter:
var filter = C7EdgeGlow()
filter.lineColor = UIColor.red

// Inject a granularity filter:
var filter2 = C7Granularity()
filter2.grain = 0.8

// Generate the camera collector:
let camera = C7CollectorCamera(callback: { [weak self] (image) in
    self?.ImageView.image = image
})
camera.captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
camera.filters = [filter, filter2]
```
Overview

- Core, the basic core module:
  - C7FilterProtocol: filter designs must follow this protocol.
    - modifier: the encoder type and the corresponding kernel function name.
    - factors: the filter's modifiable parameter factors; each must be converted to `Float`.
    - otherInputTextures: extension point for multiple input sources; an array containing `MTLTexture` objects.
    - outputSize: changes the size of the output image.
- Outputs, the output section:
  - C7FilterOutput: output content protocol; all outputs must implement this protocol.
    - make: generates data based on filter processing.
    - makeGroup: combines multiple filters. Note that the order in which filters are added may affect the generated image.
  - C7FilterImage: image input source based on C7FilterOutput; currently only encoders based on parallel computing are supported.
  - C7FilterTexture: MTLTexture input source based on C7FilterOutput; the input texture is converted into a filter-processed texture.
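The note that filter order in `makeGroup` can change the result is just function composition: applying the same two transforms in different orders generally yields different outputs. A standalone sketch with plain per-pixel value transforms (hypothetical stand-ins, not Harbeth filters):

```swift
import Foundation

// Two simple pixel-value transforms standing in for filters
// (hypothetical stand-ins, not Harbeth APIs).
let brighten: (Double) -> Double = { min($0 + 0.3, 1.0) }     // add, then clamp
let contrast: (Double) -> Double = { ($0 - 0.5) * 2.0 + 0.5 } // stretch around 0.5

// Applying a "filter group" is a left-to-right reduce, like makeGroup.
func applyGroup(_ value: Double, filters: [(Double) -> Double]) -> Double {
    return filters.reduce(value) { current, filter in filter(current) }
}

let a = applyGroup(0.4, filters: [brighten, contrast]) // brighten first
let b = applyGroup(0.4, filters: [contrast, brighten]) // contrast first
// a and b differ: the order of the group matters.
```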
Usages
- For example, how to design a soul filter.🎷
- Implement `C7FilterProtocol`:

```swift
public struct C7SoulOut: C7FilterProtocol {
    public var soul: Float = 0.5
    public var maxScale: Float = 1.5
    public var maxAlpha: Float = 0.5

    public var modifier: Modifier {
        return .compute(kernel: "C7SoulOut")
    }

    public var factors: [Float] {
        return [soul, maxScale, maxAlpha]
    }

    public init() { }
}
```
- Configure any additional required textures.
- Configure the parameter factors to pass; only the `Float` type is supported. This filter requires three parameters:
  - `soul`: the adjusted soul amount, from 0.0 to 1.0, default 0.5.
  - `maxScale`: the maximum scale of the soul copy.
  - `maxAlpha`: the transparency of the soul copy at maximum scale.
- Write a kernel function shader based on parallel computing:

```metal
kernel void C7SoulOut(texture2d<half, access::write> outputTexture [[texture(0)]],
                      texture2d<half, access::sample> inputTexture [[texture(1)]],
                      constant float *soulPointer [[buffer(0)]],
                      constant float *maxScalePointer [[buffer(1)]],
                      constant float *maxAlphaPointer [[buffer(2)]],
                      uint2 grid [[thread_position_in_grid]]) {
    constexpr sampler quadSampler(mag_filter::linear, min_filter::linear);
    const half4 inColor = inputTexture.read(grid);
    const float x = float(grid.x) / outputTexture.get_width();
    const float y = float(grid.y) / outputTexture.get_height();
    const half soul = half(*soulPointer);
    const half maxScale = half(*maxScalePointer);
    const half maxAlpha = half(*maxAlphaPointer);

    const half alpha = maxAlpha * (1.0h - soul);
    const half scale = 1.0h + (maxScale - 1.0h) * soul;

    const half soulX = 0.5h + (x - 0.5h) / scale;
    const half soulY = 0.5h + (y - 0.5h) / scale;

    const half4 soulMask = inputTexture.sample(quadSampler, float2(soulX, soulY));
    const half4 outColor = inColor * (1.0h - alpha) + soulMask * alpha;

    outputTexture.write(outColor, grid);
}
```
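The per-pixel math in that kernel can be checked on the CPU: the "soul" is an enlarged, semi-transparent sample blended over the original pixel. A minimal pure-Swift reference of the same blend (for understanding only; the function name is hypothetical and not part of Harbeth):

```swift
import Foundation

// CPU reference of the C7SoulOut per-pixel math. `sample` stands in for
// reading the input texture at normalized coordinates.
func soulBlend(inColor: Double, sample: (Double, Double) -> Double,
               x: Double, y: Double,
               soul: Double, maxScale: Double, maxAlpha: Double) -> Double {
    let alpha = maxAlpha * (1.0 - soul)        // soul copy fades as soul grows
    let scale = 1.0 + (maxScale - 1.0) * soul  // grows from 1 up to maxScale
    // Sampling coordinates move toward the center by 1/scale,
    // which enlarges the sampled copy around the image center.
    let soulX = 0.5 + (x - 0.5) / scale
    let soulY = 0.5 + (y - 0.5) / scale
    let soulMask = sample(soulX, soulY)
    return inColor * (1.0 - alpha) + soulMask * alpha
}

// At soul == 1 the copy is fully transparent, so the pixel is unchanged.
let unchanged = soulBlend(inColor: 0.8, sample: { _, _ in 0.8 },
                          x: 0.25, y: 0.75,
                          soul: 1.0, maxScale: 1.5, maxAlpha: 0.5)

// At soul == 0 the copy has alpha == maxAlpha and scale == 1.
let blended = soulBlend(inColor: 0.2, sample: { _, _ in 1.0 },
                        x: 0.5, y: 0.5,
                        soul: 0.0, maxScale: 1.5, maxAlpha: 0.5)
```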
- Simple to use; since the design is based on a parallel computing pipeline, images can be generated directly:

```swift
var filter = C7SoulOut()
filter.soul = 0.5
filter.maxScale = 2.0

/// Display directly in the ImageView:
ImageView.image = try? originImage.make(filter: filter)
```
- As for the animation above, it is also very simple: add a timer and keep changing the value of `soul`, and you are done.
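A minimal sketch of that timer loop. The looping phase function is plain Swift; the commented `Timer` part is illustrative only and assumes the `C7SoulOut` filter and `make(filter:)` call shown above:

```swift
import Foundation

// The animation only needs a value that sweeps 0 → 1 repeatedly.
// Each timer tick you would assign it to filter.soul and re-render.
func soulValue(tick: Int, period: Int) -> Float {
    // Ramp from 0.0 toward 1.0 over `period` ticks, then wrap around.
    return Float(tick % period) / Float(period)
}

// Driving it with a Timer (illustrative; on iOS, update the image view
// on the main thread):
//
// var tick = 0
// Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
//     filter.soul = soulValue(tick: tick, period: 30)
//     ImageView.image = try? originImage.make(filter: filter)
//     tick += 1
// }
```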
Advanced usage
- Operator chain processing:

```swift
/// 1. Convert to BGRA:
let filter1 = C7ColorConvert(with: .color2BGRA)

/// 2. Adjust the granularity:
var filter2 = C7Granularity()
filter2.grain = 0.8

/// 3. Adjust the white balance:
var filter3 = C7WhiteBalance()
filter3.temperature = 5555

/// 4. Adjust highlights and shadows:
var filter4 = C7HighlightShadow()
filter4.shadows = 0.4
filter4.highlights = 0.5

/// 5. Combine the operations:
let filterTexture = C7FilterTexture(texture: originImage.mt.toTexture()!)
let result = filterTexture ->> filter1 ->> filter2 ->> filter3 ->> filter4

/// 6. Get the result:
filterImageView.image = result.outputImage()
```
- Batch processing:

```swift
/// 1. Convert to RBGA:
let filter1 = C7ColorConvert(with: .color2RBGA)

/// 2. Adjust the granularity:
var filter2 = C7Granularity()
filter2.grain = 0.8

/// 3. Soul effect:
var filter3 = C7SoulOut()
filter3.soul = 0.7

/// 4. Combine the operations:
let group: [C7FilterProtocol] = [filter1, filter2, filter3]

/// 5. Get the result:
filterImageView.image = try? originImage.makeGroup(filters: group)
```
Both approaches can apply multiple filters; pick whichever you prefer.✌️
CocoaPods
- To import the Metal module, add to your Podfile:

```
pod 'Harbeth'
```

- To import the OpenCV image module, add to your Podfile:

```
pod 'OpencvQueen'
```
Remarks
The general process is roughly as described above; the Demo is written in great detail, so feel free to check it out for yourself.🎷
Tip: if you find this helpful, please give it a star. If you have any questions or needs, feel free to open an issue.
Thanks.🎇
About the author
- 🎷 E-mail address: yangkj310@gmail.com 🎷
- 🎸 GitHub address: yangKJ 🎸
License
Harbeth is available under the MIT license. See the LICENSE file for more info.