Digitising orchards - an experiment

What if you could bring people close to an orchard through their computer, smartphone or virtual reality? Here's what we explored.

Digitising an orchard

As part of the project, we planned to scan an orchard, take 360 degree images and use drone photography. 

Architect, extended reality (XR) technology enthusiast and founder of 'Wooducation', Georgina McLeod, volunteered to collaborate with Luton Orchards on this experiment.

The idea was to recreate the orchard environment in virtual reality, to test whether it could be useful as a communication and mapping tool, for education, and as a way to preserve and share orchards digitally.

Interrupting the work

After the collaboration had to be paused in April 2024, Luton Orchards project coordinator Konni Deppe picked it up again in a conversation with Georgina McLeod on 13 September 2024.

What follows is an edited transcript of that conversation, broken down under headings, describing our experiences and the lessons learnt.

Setting up the Matterport camera.

Scanning an orchard 

Konni Deppe: So, our work was all about exploring digital ways to make orchards accessible.

Georgina McLeod: Yes, we were trying to digitise an orchard as a communication and mapping tool, and for educational purposes.

K: And to see if orchards could be preserved and shared digitally. We started with scanning the Stockingstone Road Allotments orchard one rainy day in April. You used the little camera, what was that called?

G: It was a Matterport camera. It produces what we call a photogrammetry survey. It takes photos, but it also maps sort of mesh data. Now, obviously, being an architect, I'm used to surveying buildings, so I do a little bit of landscape. 

If I'm surveying a building, there will be an element of landscape. But this obviously was all landscape and it was on quite a transverse slope. And, you know, it's a repetitive landscape as well. The camera sort of uses image mapping to stitch the different scenes together. 

So that was quite challenging in itself, because we had to keep moving the camera back and often it wouldn’t work.

Georgina (left) and Konni during a break in the rain. 

K: We took 96 scans in total, I think. 

G: Yes, 96 scans. And it rained!

K: And it seemed like the software struggled to tell: is this another tree, or is this the same tree? Do you think that's what it was?

G: Yes. Normally when I scan a building, you have a lot of differences. Even if you've got rhythm and, you know, repetition in a building, there's still something that distinguishes one space from another. But with the trees, I think the technology was really struggling. So I think Matterport probably wasn't the best platform to use for the orchards.

K: Because once it was stitched together and you would jump from tree to tree, it would not know where to attach the info spot that would then have other information, like the tag number and blossom photographs.

G: Yes. So it became rather discombobulated. And I think also the method of surveying around the perimeter - in hindsight, it would have been better to pick more central points. Dividing the site up into a grid of, say, ten squares might have worked better.

K: Oh, like quadrants or whatever that's called. Interesting. And because we took the scans with a Matterport camera rather than with a phone, the free account did not let us publish the location. So it was kind of useless, as we could not share it.

Handwritten map

Still unbeatable in the field: a handwritten record of the trees.

Sticks in a meadow

The software struggled to stitch together our orchard. Were there too few distinguishing features between the images?

Setting up the camera

Using state-of-the-art equipment - thankfully it was showerproof too. 

360 degree images and 2D floorplans in Kuula

G: The next thing I tried was taking 360 [degree] images. I took the Matterport scan and some of the data that we'd collected, and I modelled and created a 3D site plan with indicative trees. And then I put that into Kuula virtual reality. 

What I'd pasted into there, into the Kuula platform, was a series of 360 degree panoramic images that were taken from the 3D model. And the 3D model was produced in software called Twinmotion, which is kind of like gaming software. Twinmotion enables us to navigate through the digital twin, essentially without all the stitching problems we'd had. 

Kuula Stockingstone Allotments property - this was our 'sandbox' to try things out and is still a work in progress. Many things, such as the names of the cultivars and the drone footage, are not implemented yet.

K: So how did you manage to have the dimensions for the floorplan?

G: So the Matterport scan had created mesh data for me and I was able to take exact, precise dimensions.

Matterport struggled with the photogrammetry - the 360 [degree] bit that you spin around in on the photos. But the mesh data we'd collected was actually pretty accurate. If you strip away the photogrammetry, sort of like the textures and reality, you end up with a mesh. 

And that was perfect - I was able to put it into software and model it in 3D. And then we can add, y'know, as much or as little data as we like into that.

K: And in theory, could this mesh data also produce a 2D plan? And it still exists? Would you say it's good enough if something could come of it?

G: Absolutely. We can convert that into 2D plans. We can add more information. And I also used Google Earth and a screenshot aerial view from Google Earth, and I laid it over the mesh data to compare, so that I could then model some of the surrounding context - because you'll notice I put some of the other allotments and things like that in there.

K: Okay. And then I remember you sending me that. You had already put, for example, the tag numbers in there. Were they accurate? Or were they more dummy numbers? 

G: No, I had modelled the trees based on the hand[-drawn] survey we'd done on the day. So we did a sort of land survey, and we went around and we tagged the trees, and we sort of did an inventory. So I used a combination of Google Earth, trying to see the trees, the map that we'd drawn, and photographs. So the trees are not in the exact location. They are indicative, but in terms of their order and the types of tree, they are right. And the tags should correspond. I'm not sure I'd actually tag them all. I've done a couple as an example.

K: On the day I remember, I took a photograph of your hand-drawn map and I actually managed to ID three of the cultivars, based on the original planting plan. And it was so helpful that they were tagged. I just hope the tags stay on!
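
To make the mesh-to-plan step above a little more concrete, here is a minimal sketch of how an exported photogrammetry mesh could be flattened into a top-down footprint. It assumes the scan is exported as an OBJ file (the file name is a placeholder) and uses the trimesh and matplotlib Python libraries; it is only an illustration, not the workflow Georgina used.

```python
# Sketch only: flatten an exported photogrammetry mesh into a top-down 2D footprint.
# "orchard_scan.obj" is a placeholder file name; trimesh and matplotlib are assumed tools.
import trimesh
import matplotlib.pyplot as plt

mesh = trimesh.load("orchard_scan.obj", force="mesh")  # load the export as a single mesh
xy = mesh.vertices[:, :2]                              # keep easting/northing, drop the height axis

fig, ax = plt.subplots(figsize=(8, 8))
ax.scatter(xy[:, 0], xy[:, 1], s=0.2, color="grey")    # each mesh vertex becomes a dot on the plan
ax.set_aspect("equal")                                 # keep real-world proportions
ax.set_title("Top-down footprint from the scan mesh (indicative)")
fig.savefig("orchard_plan.png", dpi=200)
```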

Drone footage and getting flight permissions

K: And then we tried to do drone footage at Wigmore Valley Peace Garden Orchard.

G: Oh god!

K: And we thought we had everything in place - you have a proper operator ID to fly a drone. 

G: Correct. 

Everything in place?

We thought we'd prepared, but the unlock code came 48 hours later, so we had to abandon the first drone filming attempt. 

4 April 2024: too early

Would this have been a good day to take drone footage? Probably not. Too little foliage on the trees would have made it hard to see anything. 

9 May 2024: perfect

The improvised May filming date turned out to be much more suitable for taking orchard footage. See the footage at Bide-a-While orchard.

K: And if I remember correctly, you got all the right permissions. We actually got the [London Luton Airport] tower to fast-track our application, but we didn't know about the time lag with the drone manufacturer. 

G: It was the code on the actual device. And I got that permission about 48 hours later. So I didn't realise how long it would take. And that was my mistake, because I'd never had to fly in a no-fly zone before. 

So, obviously, when you obtain a drone flyer's ID licence, it's like doing a theory test for driving a car. And there is a whole section on flying in a flight path and it tells you about gaining these permissions, but it doesn't tell you the specifics on how long it will take. 

We'd managed to get lucky: we got the permission from the tower directly. We had to call them to let them know what time we were taking off and how long we were going to be airborne, and then call them again once we'd landed, so that if they identified us on their system, they knew who we were. However, I had assumed that the tower's permission would be enough. The drone itself has a safety feature using GPS, so it knows when it's in a flight zone, and of course it wouldn't allow us to take off. We needed a separate permission to unlock the device itself, and you have to apply for that in advance.

K: So when you got that code through 48 hours later - do you think, if we had had it on the day, it would have actually unlocked the drone?

G: Yes, I've looked into it and it would have. So it was a real shame, but a good learning curve.

K: Yeah. And would you be up for trying again next year?

G: Absolutely. Now we know what to do, y'know, and this is great, because I have to teach my students surveying. We use drones, obviously, we don't fly in no-fly zones, but I've actually been using this as a case study to say to them, look, this is what happens.

K: And it's expensive, say, if your client is sitting there and you didn’t get the drone unlocked.

G: Yeah, exactly. So the best way of learning is actually making errors. So it's been really valuable and we know for next time.

The best time to scan and photograph trees

K: Another thing is that we were a bit early on 4 April, because very few trees were in blossom and everything else was just bare sticks.

G: Yes. We need some more leafage there. More leafage, a bit more blossom.

4 April 2024: only cherries and pears were in blossom

Our April date was before most trees were in blossom and leaf. That made it hard for the software. 

4 April 2024: apple buds

The apple blossom hadn't opened yet. 

9 May 2024: the orchard in leaf and bloom, with some fruitlets

This was the day when another drone photographer could help out - and the timing was perfect. 

K: So after our collaboration was cut short, a local artist from another Lottery-funded project recommended a photographer who was available at super-short notice. We went out to Bide-a-While, which is easier than Wigmore because it's further away from the Luton Airport no-fly zone. And we couldn't have timed it better, even compared to the day you and I had wanted to meet, because it was a week or two later, on 9 May. 

It was the perfect time for trees to be in full blossom everywhere. That was a day when the trees looked stunning.

And it shows. The beauty of an orchard in blossom is something you can capture. What's very nice is this quite slow movement of the drone, and the ability to fly. Anything digital that's interactive would need to be able to convey the same beauty and gracefulness.

Future explorations: total station

G: I have another suggestion on what to try next. So there's another device that I have that I'm trained on now to teach my students for surveying. It's called a total station. 

K: A what? 

G: A total station. So it's sort of a laser measurer and it's what professional land surveyors will use, and we could use that to create what we call a topographical survey. 

So it would be more about pinpointing the exact boundaries of the site and drawing a plan. Not necessarily in 3D, but what it would enable us to do, if we're successful, is to get the levels of the site, which could be quite interesting. As you know, when we measure levels on site, we work from a datum, which is based on height above sea level. So we can use the total station to pinpoint different markers around the site, and we can actually pinpoint the exact locations of the trees as well, which would be really helpful. It would be a very basic line CAD drawing, but it would be accurate.

K: And what can you do with that line drawing now? Does it sit on a platform or can you export it?

G: You can export it to anything. We could pop it into the 3D model that I've already created and adjust the 3D model to ensure it's 100% accurate. You could export it as a PDF. 

You know, there's a variety of things that we could do with a topographical survey, but mainly in my line of work, we would use it as what we call a DWG drawing. So it's like a CAD format drawing, and then we would develop that in CAD or import it into 3D modelling software.
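
To give a rough idea of how surveyed coordinates could end up in a CAD drawing, here is a minimal sketch that writes a handful of hypothetical tree positions (tag, easting, northing, level) into a DXF file - an open CAD exchange format - using the ezdxf Python library. The tags, coordinates and file names are made up for illustration; this is not how the project's drawings were produced.

```python
# Sketch only: turn surveyed tree positions into a basic CAD (DXF) drawing.
# The tags, coordinates and levels below are made up; ezdxf is an assumed tool.
import ezdxf

# (tag, easting in m, northing in m, level in m above the site datum)
trees = [
    ("T01", 12.40, 8.15, 101.32),
    ("T02", 18.95, 9.60, 101.78),
    ("T03", 25.10, 11.05, 102.24),
]

doc = ezdxf.new("R2010")              # new DXF document
msp = doc.modelspace()

for tag, easting, northing, level in trees:
    msp.add_point((easting, northing))    # a survey point for each tree
    msp.add_text(
        f"{tag} ({level:.2f} m)",         # label the point with its tag and level
        dxfattribs={"insert": (easting + 0.3, northing + 0.3), "height": 0.4},
    )

doc.saveas("orchard_topo.dxf")        # open in CAD or import into 3D modelling software
```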

Future applications: immersive classrooms

G: I've been working on a project in further education, and as part of this project, we're delivering these immersive classrooms to colleges. Could you repeat your talk at one of these colleges? 

The way immersive classrooms work is a bit like a projector in the cinema. You have a series of projectors on the ceiling and blank walls in a box, and they use projection mapping. There are also sensors on the walls, so they become interactive spaces. 

The platform is usually on an iPad, where you can create content, and that might be photographs, videos, or 360 [degree] footage, like what we took on the orchard site.

And they are actually creating one to demo at the moment on forest bathing. And the idea is bringing the outside into this immersive space, so it becomes accessible for all. 

You know, if you think about people who can't get out to those spaces - people in wheelchairs, for example - that's a lot of people. 

Accessibility at your sites is actually something to consider. There's a big slope at Stockingstone. So how can you take the orchard and allow other people to experience it in an immersive and interactive way? Because you can show them photographs, you can show them videos, you can show them drone footage. But what if they could sit in a room where it is at one-to-one scale, and they could go up and physically touch the trees?

And we have just completed one of these immersive rooms and they are looking for case studies, and people to come and try it out. We could get you in the immersive room and set up somebody to film you doing a talk in this immersive way.

K: Oh god, that sounds amazing. 

G: It would be an opportunity for you to showcase your work in an immersive way. 

We've already got your 360 images. We might need to take some more, but we could also put them in a virtual reality headset so they could virtually walk around your orchards. That's very simple to do. 

K: My only caveat would be that we didn't capture the orchard at the time when it's stunning. That's the thing. Otherwise it’s just a meadow with sticks.

G: Well, we'll have to get some more footage when it's all looking good!

Searching for the purpose of a digital orchard

K: It's interesting, because I'm still thinking about the key purpose of doing these digital mapping things. Like, where is the usefulness? What's the problem we're trying to fix? 

G: Exactly.

K: Even if you had a super-accurate map, it's still hard to tell which tree you're looking at when you're standing in front of it. I don't know how accurate GPS can be.

One thing we did at Wigmore shortly afterwards, in May, was amazing: we went there with students from Barnfield College. And they tagged all the trees - they tagged all the stations with a unique number and registered any cultivar tags that were still on the trees.

And then we did some general maintenance work. And now that is, for me, so useful, even though sometimes it's hard to explain why I find it so useful. 

I had the original planting plan that had 70 trees, or 69. There was someone who did it in Arbortrack, which is the council’s tree software, and they had 67 trees, so there was a discrepancy. 

And then, without actually having a physical something on each individual station, there's just no way you can be sure what you're looking at.

G: Exactly. Precisely. Well, you know, we could use the stuff that I've produced as more of a visual aid with embedded information.
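
As a footnote to the discrepancy Konni describes between the planting plan and the Arbortrack record: once both lists exist as spreadsheets, a cross-check of two tree inventories can be sketched in a few lines of Python. The file and column names below are purely illustrative.

```python
# Sketch only: compare two tree inventories to find discrepancies.
# "planting_plan.csv", "arbortrack_export.csv" and the "tag" column are illustrative names.
import csv

def read_tags(path: str) -> set[str]:
    """Read the tag column from a CSV inventory into a set of tag numbers."""
    with open(path, newline="") as f:
        return {row["tag"].strip() for row in csv.DictReader(f) if row["tag"].strip()}

plan_tags = read_tags("planting_plan.csv")
arbortrack_tags = read_tags("arbortrack_export.csv")

print(f"Planting plan: {len(plan_tags)} trees, Arbortrack: {len(arbortrack_tags)} trees")
print("In the plan but not in Arbortrack:", sorted(plan_tags - arbortrack_tags))
print("In Arbortrack but not in the plan:", sorted(arbortrack_tags - plan_tags))
```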