arkypita/LaserGRBL

Speed modulation in Line to Line tracing

brakthehun opened this issue · 15 comments

My lasers are capable of 15,000 mm/minute.
It would be nice to have the option to speed up the laser rather than dim it.
I think jobs would get done much faster, and you would have much more tonal depth than 255 grayscale levels.

So, you would choose your minimum speed (darkest) and maximum speed (lightest), and then choose linear or logarithmic speed mapping.

This would also mean you don't need PWM to run grayscale.

This is a very interesting suggestion. I think I'll work on it during my first weekend of spare time.

I'm working on it. First results are encouraging.

There is only one problem with this method, and it has been tried before: rapid speed changes require massive acceleration and deceleration. My laser is capable of 30,000 mm/minute, but going from stationary to full speed requires about 30mm, which means high-contrast, highly detailed areas don't look sharp at all. As it is, I give my jobs about 30mm before and after each line to allow the head to get up to speed.

The acceleration required to get the head from stationary to full speed in the space of one pixel would probably break the belt or something else. I have calculated that for my machine to accelerate to a feed of 10,000 in the space of 0.1mm, the width of one pixel, would require approx. 40 G of acceleration, or 400,000 mm/sec/sec. Just not practical. 40 G is probably beyond the specs of the mirrors in a CO2 laser, and probably beyond the specs of a diode laser as well. I highly doubt the belts can take that much force either.

To put it into perspective, the acceleration of a bullet down the barrel of a gun is approx. 200,000 mm/sec/sec, or about 20 G.
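For anyone who wants to plug in their own numbers, here is a rough sketch of that estimate assuming constant acceleration over the pixel width (v² = 2·a·d). The exact figure depends on the motion model you assume, but it always lands far beyond any realistic acceleration setting:

```python
# Rough sketch: acceleration needed to go from standstill to a given feed
# within one pixel, assuming constant acceleration (v^2 = 2*a*d).
def accel_to_reach_feed(feed_mm_min: float, distance_mm: float) -> float:
    v = feed_mm_min / 60.0                  # mm/s
    return v * v / (2.0 * distance_mm)      # mm/s^2

a = accel_to_reach_feed(10000, 0.1)         # feed 10,000 mm/min within one 0.1 mm pixel
print(f"{a:,.0f} mm/s^2 (~{a / 9810:.0f} G)")
```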

Hi @mayhem2408 and @brakthehun

As you may understand, I develop and test LaserGRBL with a 2W self-built laser engraving machine, and LaserGRBL is developed with this kind of machine as its target. For my tests I use speeds between 2000 and 6000. I can imagine that a professional or semi-professional CO2 engraver may have really different needs and problems, but since I'm not able to test them in the field, I rely on your experience and feedback.

As far as I know, the grbl firmware implements an excellent acceleration manager that prevents the damage @mayhem2408 describes, because it turns sudden changes in speed or direction into progressive variations according to the $120, $121 and $122 settings.

Of course this means that any kind of speed modulation with constant laser power will be affected by this behavior, which can limit its effectiveness (loss of detail and sharpness).

By the way, the newest grbl v1.1 also includes a "dynamic laser power mode" which can compensate for the effects of the acceleration manager. All this may sound great, but it is only possible if the laser power can be modulated.
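To give an idea of the scale of the effect, here is a small sketch using the acceleration and feed values mentioned in this thread. The formula is just plain kinematics (d = v²/2a), not grbl's actual planner code:

```python
# Sketch: how much travel grbl needs to reach a given feed, given an
# acceleration setting ($120/$121, in mm/s^2). Roughly the "blur" you get
# from pure speed modulation. Plain kinematics, not the planner code.
def distance_to_reach_feed(feed_mm_min: float, accel_mm_s2: float) -> float:
    v = feed_mm_min / 60.0                  # mm/s
    return v * v / (2.0 * accel_mm_s2)      # mm

print(distance_to_reach_feed(6000, 8000))   # ~0.63 mm at F6000 with $120=8000
```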

I think the ways to obtain quality grayscale without power modulation could be:

  • speed modulation, but at low speeds (this also means a low constant power, or it will burn everything)
  • multiple fast passes that progressively darken the image
  • 1-bit dithering at high resolution

What do you think about those methods?

@arkypita The 1-bit dithering is something I have been experimenting with for a while. I have a python script which takes an image and generates gcode with a line spacing of 0.1mm (10 lines per mm), where each pixel on the line is 0.025mm (40 pixels per mm). Doing this increases the pixel resolution 4 times without affecting the speed/time to engrave. The results have been very promising. The script, however, is not very user friendly.
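It's not the actual script, but the core idea looks roughly like this (a sketch assuming Pillow for the Floyd-Steinberg dither, a placeholder fixed power on M3, and none of the real script's handling of scan direction, headers and so on):

```python
from PIL import Image   # assumes Pillow is available

X_STEP = 0.025   # 40 pixels per mm along each scan line
Y_STEP = 0.1     # 10 scan lines per mm
FEED   = 2000    # mm/min

def dither_to_gcode(path: str) -> str:
    img = Image.open(path).convert("L").convert("1")   # 1-bit Floyd-Steinberg dither
    px = img.load()
    gcode = ["G90", f"F{FEED}"]
    for y in range(img.height):
        x = 0
        while x < img.width:
            if px[x, y] == 0:                           # black pixel: start of a burn run
                run_end = x
                while run_end < img.width and px[run_end, y] == 0:
                    run_end += 1
                gcode.append(f"G0 X{x * X_STEP:.3f} Y{y * Y_STEP:.3f}")
                gcode.append("M3 S1000")                # laser on at a fixed power (placeholder)
                gcode.append(f"G1 X{run_end * X_STEP:.3f}")
                gcode.append("M5")                      # laser off
                x = run_end
            else:
                x += 1
    return "\n".join(gcode)
```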

  • Multiple passes sound interesting, but also time-consuming.
  • Speed modulation, even at low speeds, has never given me crisp, clean, sharp images. I gave up on this idea 18 months ago.
  • 1-bit dithering is what I am playing with at the moment.

20170521_105531s
This is a greyscale laser engrave with the full 256 greys, using PWM modulated at 16 kHz at 10 pixels per mm (no greyscale curve calibration).

20170521_105521s
This is a greyscale laser engrave with the full 256 greys, using PWM modulated at 16 kHz at 10 pixels per mm, with a calibrated greyscale curve.

20170521_105604s
This is a dithered laser engrave at 10 lines per mm vertically and 40 pixels per mm horizontally.

All three images are the same size and took exactly the same time to engrave.

These were done at a feed rate of 2000mm/min using a 3.5W blue diode laser.

I am not sure how to add pictures. I do a lot of pixel art for engraving, which means long lines of single colors. My acceleration is 8000 on my Y axis, so it snaps to the next speed just fine.

Here's what I like to do, as an example. I would find a picture of the Wind Fish from Link's Awakening. I would google it, then download the highest quality image I can find. I might make it blockier in Inkscape if I need to.

Once I had that image, I would bring it into your beautiful software: line to line, vertical, then next. M4, S-MIN 1, S-MAX 11.

Make it 400mm x 250mm or so.
Let it create Gcode.

Then I would send $31=11. This is to avoid a ton of data being sent through, so there are only 12 shades that can be sent. When I run a 5-hour job, sometimes I sit there and watch these long traces at S3. I set the speed to 1600; it would be great to see those lines speed up to 6400.
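Roughly, the effect I'm after is just quantising the greys down to a handful of S values, something like this (a plain linear mapping sketch, not what LaserGRBL actually does internally):

```python
# Sketch of the shade-reduction trick: quantise an 8-bit grey down to a small
# S range (here 1..11). Plain linear mapping, not LaserGRBL's internal code.
def grey_to_s(grey: int, s_min: int = 1, s_max: int = 11) -> int:
    darkness = (255 - grey) / 255.0         # 0.0 = white, 1.0 = black
    return s_min + round(darkness * (s_max - s_min))

print([grey_to_s(g) for g in (0, 64, 128, 192, 255)])   # [11, 8, 6, 3, 1]
```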

@mayhem2408
I agree with you: multiple passes would be a very long process, but I have tested it manually and the results are quite good. Of course, if you want 10 grey tones you must do 10 passes... a very big waste of time!!

Using different X/Y resolutions sounds like a good idea. How does your script produce the dithered image and the gcode? Would you like to share the idea with me? Maybe I can implement something similar in LaserGRBL.

@brakthehun

I am not sure how to add pictures.
addimage

Then I would send $31=11. This is to avoid a ton of data being sent through, so there are only 12 shades that can be sent.

Yes, LaserGRBL translates every homogeneous sequence of the same color into a single line. This is a good optimization, but usually images have only very small differences in color between neighbouring pixels.

Your trick of reducing the number of colors is correct, and it can speed up the process because it reduces the number of gcode characters to be sent (producing longer lines instead of many small segments). I think I will add a feature to reduce the number of colours directly in the software, without changing S-MIN, S-MAX and $30.
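The optimization is essentially a run-length encoding of each scan line, something like this sketch (not the actual LaserGRBL generator):

```python
# Sketch of the run-length optimization (not the actual LaserGRBL generator):
# consecutive pixels that share the same S value become one G1 segment, so
# reducing the number of shades directly produces fewer, longer segments.
def row_to_segments(s_values, pixel_mm=0.1):
    segments, start = [], 0
    for i in range(1, len(s_values) + 1):
        if i == len(s_values) or s_values[i] != s_values[start]:
            segments.append(f"G1 X{i * pixel_mm:.2f} S{s_values[start]}")
            start = i
    return segments

print(row_to_segments([3, 3, 3, 3, 7, 7, 0, 0, 0]))
# ['G1 X0.40 S3', 'G1 X0.60 S7', 'G1 X0.90 S0']
```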

However, keep in mind that there are some intrinsic speed limitations in using gcode+grbl as a way to send raster data, as discussed in other posts.

#38
gnea/grbl#197

By the way, I will continue the development of speed modulation... I want to see if it could be useful.

links20awakening2020the

I don't think multiple passes would be a waste of time. What's the difference between one pass at 1800 mm/min for full darkness and four passes at 7200 mm/min? I suppose I would have to test that.

Doubling the line speed does not halve a pixel's burn intensity. It would be very nice if everything were that linear. I have a number of lasers ranging from 3W to 7W. My 5W laser has been professionally calibrated and is as close to perfectly linear power output as we could get: 0% = 0W, 25% = 1.2W, 50% = 2.6W, 75% = 3.8W and 100% = 5W. However, the burn on wood is not linear. As an extreme example, 100% (5000mW) at a feed of 3000 produces a nice dark line, while 1% (50mW) at a feed of 30 does absolutely nothing.

The other problem with multiple passes is that after the first pass the material is darker, so it absorbs the laser's energy faster and hence darkens even more quickly. If you have ever done a greyscale test, you will probably have noticed that at the lower end of the scale nothing happens, and then there is a point where the burn changes quite rapidly. I have spent countless hours trying to calibrate the greyscale curve of an image to get the best result.

As you can see here, this compares the linear greyscale of the image with the calibrated greyscale needed to produce a smooth gradient burn. The non-linear nature of the burn makes things very difficult. The top half is the linear greyscale.

laser_temp_id_layer3_step6

You will notice that the calibrated grey ramps up much more slowly right until the end. This is because, as the wood gets darker, it burns quicker.

Yes @mayhem2408, I know exactly what you mean. You can also add that different materials react in very different ways, so you need a calibration scale for each material.

A non-linear color conversion is already on my development roadmap.
I think I will add a color-curve control like the one in Photoshop.

image

In my opinion, dithering with a very small dot size produces the best results; it's the only technique that is not affected by this problem.

@arkypita I have spent probably hundreds of hours working on different methods to calibrate the grey curve and have had very good results. Currently I have a calibration file that produces 21 boxes from 0% to 100% in 5% increments. I then take the measured grey value of each box and run it through a script I have written to adjust the power curve. It works very well. However, some of the best burns I have to date are high-resolution dithers. As shown in one of my earlier posts in this thread, the high-resolution dither looks much better than the greyscale images. The other thing that really frustrates me with greyscale is that even the ambient air temperature will change the grey curve: if the wood is cold, it takes more laser power to start burning; if the humidity is high, it takes more laser power to start burning; different grains in the wood burn at different rates. None of this tends to be as big a problem with high-res dithering, which is where I am focusing my scripts at the moment.
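The calibration script itself is nothing magical; the core of the idea is inverting the measured response by interpolation, roughly like this (a sketch with made-up measurements, assuming numpy):

```python
import numpy as np

# Sketch of the power-curve calibration idea (not the actual script):
# `powers` are the 21 commanded steps (0..100% in 5% increments) and
# `measured_grey` is the grey value each box produced (made-up numbers,
# 255 = untouched wood, lower = darker burn).
powers = np.linspace(0, 100, 21)
measured_grey = np.array([255, 255, 254, 252, 248, 240, 228, 212, 193, 172,
                          150, 129, 109,  91,  76,  63,  52,  43,  36,  31,  27])

def power_for_grey(target_grey: float) -> float:
    """Invert the measured response: what power produces `target_grey`?"""
    # np.interp needs increasing x, so flip the (darkening) response curve.
    return float(np.interp(target_grey, measured_grey[::-1], powers[::-1]))

print(power_for_grey(128))   # power needed for a mid grey, per this fake data
```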

The biggest problem I am having with high-res dithers is the bandwidth of getting the dithered data to GRBL. I have had to abandon GRBL and GRBL-Mega and go with GRBL-LPC, as it is impressively fast. At 40 pixels per mm (1016 DPI) and a feed of 3000 mm per minute, that is 2000 pixels per second. To date I have managed 6000+ pixels per second with GRBL-LPC.
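For reference, the pixel-rate arithmetic is just resolution times feed:

```python
# Pixel rate = resolution (px/mm) * feed (mm/min) / 60.
def pixels_per_second(px_per_mm: float, feed_mm_min: float) -> float:
    return px_per_mm * feed_mm_min / 60.0

print(pixels_per_second(40, 3000))   # 2000.0 px/s at 40 px/mm (1016 DPI) and F3000
```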

I think you should not make the calibration too complicated by using freely editable curves.
It would be hard to handle, and it would take a lot of time for the user to adjust the curve by trial and error.
A simple gamma curve should work.
A gamma curve needs just 3 inputs: the minimum laser power (white level), the maximum laser power (black level) and the gamma value.
The curve is calculated between minimum and maximum. Depending on your laser, you can set the gamma higher than 1, which increases the contrast, or below 1, which lowers it. The gamma calculation is quick and easy, and you don't need any lookup tables.
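In code the whole mapping is only a couple of lines. This is a sketch of the usual gamma form with made-up S values; the exact mapping to use in LaserGRBL is of course up to you:

```python
# Sketch of the gamma mapping: white -> minimum power, black -> maximum power,
# with gamma shaping the mid-tones. The example S range is made up.
def grey_to_power(grey: int, s_min: float, s_max: float, gamma: float) -> float:
    darkness = (255 - grey) / 255.0         # 0.0 = white, 1.0 = black
    return s_min + (s_max - s_min) * darkness ** gamma

for g in (0, 64, 128, 192, 255):
    print(g, round(grey_to_power(g, s_min=0, s_max=1000, gamma=1.8), 1))
```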

Naturally you will never get 255 real grey steps with a laser, but I think the gamma correction would give much better results than the current behaviour.
gamma_high
gamma_low

I have a plan to rewrite the whole g-code generator of LaserGRBL, to clean up the code.
I'll add this feature to the roadmap, but I will work on it after the code cleanup.