
Talking Tech: Inside the novel Leica BLK2GO PULSE

By Christopher Curley | 11/30/2023

The Leica BLK2GO PULSE represents a new frontier in handheld laser scanning, including a novel implementation of dual Time-of-Flight (ToF) sensors that creates a “first-person” scanning experience. We sat down with BLK Product Manager Gian-Philipp Patri to discuss Leica Geosystems’ drive to realize innovative new technologies and how the BLK2GO PULSE came to be.

Let’s talk about the genesis of the BLK2GO PULSE. Where did it come from?

Hexagon invests a lot in research and development. The foundational elements of the BLK2GO PULSE came from a joint venture with Sony, which developed a new Time-of-Flight sensor. Our job was to look at this technology and imagine the possibilities. Could this be something? And if so, what could it be useful for?

The more familiar we became with it, and the more we looked at it from different perspectives, the more we realized that certain elements provide unique benefits compared to our current solutions, from the way the sensors capture with a uniform grid to the instantly colorized point clouds that are now possible. It starts with stepping out of our comfort zone and finding the opportunities within.

I know a crucial element of the BLK2GO PULSE is not just the sensors but the fact that there are two of them. How did that come together?

With Time-of-Flight, you get a certain field of view, and we had in mind to integrate more than one of those ToFs due to several benefits — the most important of which was capturing more data in less time.  

A BLK2GO with dual-axis LiDAR captures the full dome, often including data you don’t intend to capture. 

With the dual ToFs, we were invited to think differently, and where we landed was: what if we set them side by side? Taking two of the sensor units and combining them into one solution was not an easy task, because the whole calibration that you do for one sensor needs to work for both together. It took quite some effort.

But two well-arranged sensors can capture everything a user might need while also optimizing costs to reduce the end price for our customers.

This is how “first-person scanning” was born.

This is a new term. What does “first-person scanning” really mean?

Essentially, it means that you see what the scanner sees. The two sensors are positioned like “eyes,” and when you connect the BLK2GO PULSE to the BLK Live app on your smartphone mounted to the device, what you see through the display is actually what you are capturing as it’s happening. 

This is a new way of doing things, and it also makes workflows faster and the data much more lightweight because you only capture the exact data you need.  

Think of a floor plan. Why should I capture the ceiling if I want to capture the floor? It just makes your data unnecessarily big, and then you have to transfer it, process it, and then spend time removing this extra data, right? It's not an intelligent way to do things.  

And then, even if you feed this BLK2GO PULSE data into post-processing, the package is smaller: you don't have to post-process the data on the ceiling or objects you aren't interested in, because you were in control and simply didn't capture them. Perfect.

I’ve got another one for you. PULSE Technology — that’s more than just a new sensor, right? 

Yes. PULSE Technology is the fusion of this dual ToF LiDAR with GrandSLAM. 

GrandSLAM is a combination of Visual Inertial System (VIS) and LiDAR, where VIS is a combination of the cameras and the Inertial Measurement Unit (IMU). All this allows you to track where you are in space and to perform Simultaneous Localization and Mapping (SLAM). So, you map the world from an unknown position in real time.
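The fusion idea described above can be sketched in a few lines: a high-rate inertial sensor predicts motion, and lower-rate visual/LiDAR observations periodically correct the accumulated drift. This is a deliberately simplified 1-D toy, not Leica's GrandSLAM algorithm; the function name, the blending scheme, and all numbers are invented for illustration.

```python
# Toy illustration of SLAM-style sensor fusion: an IMU integrates
# motion at high rate (and drifts), while occasional visual/LiDAR
# observations pull the estimate back toward reality.
# Not Leica's algorithm; all names and values are illustrative.

def fuse_pose(imu_deltas, corrections, blend=0.5):
    """1-D position tracker: integrate IMU steps, blend in absolute fixes."""
    x = 0.0
    fixes = dict(corrections)  # {step_index: absolute position observed}
    for i, dx in enumerate(imu_deltas):
        x += dx                # dead-reckoning prediction (accumulates bias)
        if i in fixes:         # a visual/LiDAR observation is available
            x = (1 - blend) * x + blend * fixes[i]  # correct toward it
    return x

# An IMU that overestimates each 1.0 m step by 10%; two periodic fixes.
imu = [1.1] * 10                 # biased IMU increments
obs = {4: 5.0, 9: 10.0}          # ground-truth-like observations
print(round(fuse_pose(imu, obs), 3))   # prints 10.375
```

Pure integration of the biased increments would end at 11.0 m; blending in the two observations keeps the estimate near the true 10.0 m, which is the essence of why VIS and LiDAR are fused rather than used alone.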

It would be easy to say, “Okay, we can replace the dual-axis LiDAR with this time-of-flight LiDAR, so the same information simply gets fed by a different kind of sensor.”

But we then realized it's not just replacing one piece with another, but really thinking about our unique advantages with this new sensor. As we dug into it, we discovered that we could colorize in real-time.


I’ll bite: Why can the BLK2GO PULSE do this while the BLK2GO can’t?

That’s a good question. It has to do with the volume of data and synchronization. With GrandSLAM, the cameras capture visual data at a specific interval, and on the BLK2GO, the dual-axis LiDAR captures data at an entirely different rate: 420,000 points per second. So you get color information regularly, but less frequently than the LiDAR sensor is gathering points. As a result, many of the points you capture don't yet have color information. You cannot solve this in real time because the frequencies don't match. You can only do it in post.

That’s different with the BLK2GO PULSE because a time-of-flight sensor is essentially a 3D camera: with each pulse, we get 3D information, and we can sync those pulses with the camera frequency. This means that with one shot on the BLK2GO PULSE, you can go from the LiDAR system to the camera system to RGB color.

And this allows you to accurately say, in real time, that this single point has this particular color. Put all those points together, and you have an instantly colorized point cloud.
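The per-pulse colorization described here can be sketched as follows, assuming (as the interview describes) that each ToF depth frame is time-synced with a camera frame, so every 3D point can be looked up in the matching image. The pinhole projection model, the function name, and all numbers are illustrative assumptions, not the device's actual pipeline.

```python
# Toy sketch of per-pulse colorization: project each synchronized
# ToF return into the camera frame and take the pixel's RGB value.
# Pinhole model and all parameters are illustrative only.

def colorize_pulse(points, image, fx=100.0, fy=100.0, cx=2.0, cy=2.0):
    """Assign an RGB color to each 3-D point by projecting it into
    the time-synced camera frame (simple pinhole camera model)."""
    colored = []
    h, w = len(image), len(image[0])
    for (x, y, z) in points:
        u = int(fx * x / z + cx)   # pixel column
        v = int(fy * y / z + cy)   # pixel row
        if 0 <= u < w and 0 <= v < h:
            colored.append(((x, y, z), image[v][u]))
    return colored

# A 4x4 "camera frame" where every pixel is a distinct RGB triple.
frame = [[(r, c, 0) for c in range(4)] for r in range(4)]
pts = [(0.0, 0.0, 5.0), (0.05, -0.05, 5.0)]   # two ToF returns
print(colorize_pulse(pts, frame))
```

Because the depth frame and the camera frame share a timestamp, each point gets its color in the same shot; with mismatched rates, as on the BLK2GO, many points would have no matching frame and could only be colorized in post.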

That’s a more illustrative and less technically correct explanation than R&D would give you, but the point is that it’s very technically complex to make this sensor fusion work. Hexagon and Sony, however, have figured it out together, and the result is an instant in-field rendering of the point cloud in a way that’s never been done before.