- B23N-1941: Crowns, Competition, and Continuity: A Machine Learning Approach for Annual Yield Prediction of Pinus taeda (L.) Using UAV-LiDAR Data
Board 1941, Hall EFG (Poster Hall), NOLA CC
Author(s):
Gunjan Barua, Virginia Polytechnic Institute and State University (First Author, Presenting Author)
David Carter, Michigan State University
Valerie Thomas, Virginia Polytechnic Institute and State University
Patrick Green, Virginia Polytechnic Institute and State University
Phil Radtke, Virginia Polytechnic Institute and State University
Thomas Pingel, Binghamton University
Rachel Cook, North Carolina State University Raleigh
Timothy Albaugh, Virginia Polytechnic Institute and State University
Rafael Rubilar, University of Concepcion
Otavio Campoe, Federal University of Lavras
Matthew Sumnall, Virginia Polytechnic Institute and State University
Foresters need to know how much wood their trees will produce each year, but measuring every tree by hand is slow and expensive. This study flew a drone equipped with a laser scanner (LiDAR) over two loblolly pine (Pinus taeda) plantations in the southeastern United States when the trees were eight years old. The scan captured the size and shape of 13,547 individual trees planted at three different spacings. The team fed these measurements, plus information about how crowded each tree was, into machine-learning models (random forests and support-vector regression) and a traditional mixed-effects model. They asked a hard question: can one model, built from this single flight, accurately predict each tree's yearly volume growth for the next seven years, matching the accuracy of annual ground surveys? The study will show how planting density affects errors, which LiDAR features matter most at different ages, and how errors accumulate when tree-level predictions are summed to the whole stand. If successful, the approach offers a low-cost way to generate reliable, year-by-year yield forecasts from a single drone mission, helping managers adapt thinning and harvest plans in near-real time.
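The modeling pipeline described above (crown metrics plus a competition index fed into random forests and support-vector regression to predict annual volume increment) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' code: the feature names, the synthetic growth relationship, and all parameter values are assumptions chosen for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
n = 500

# Hypothetical LiDAR-derived crown metrics and a crowding index
# (stand-ins for the kinds of features a UAV-LiDAR segmentation yields).
height = rng.uniform(5.0, 15.0, n)       # tree height (m)
crown_area = rng.uniform(1.0, 12.0, n)   # crown projection area (m^2)
competition = rng.uniform(0.0, 1.0, n)   # 0 = open-grown, 1 = heavily crowded
X = np.column_stack([height, crown_area, competition])

# Synthetic annual volume increment (m^3/yr): larger crowns grow more,
# crowding suppresses growth, plus measurement noise.
y = 0.002 * height * crown_area * (1.2 - competition) + rng.normal(0.0, 0.01, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Two of the learners named in the abstract.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
svr = SVR(C=10.0, epsilon=0.005).fit(X_tr, y_tr)

for name, model in [("random forest", rf), ("SVR", svr)]:
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.4f} m^3/yr")

# Tree-level errors aggregate when summed to the stand: compare
# predicted vs. true total increment over the test trees.
stand_error = abs(rf.predict(X_te).sum() - y_te.sum())
print(f"stand-level absolute error (RF): {stand_error:.3f} m^3/yr")
```

In the actual study the predictors would come from segmented UAV-LiDAR point clouds and the response from repeated ground measurements; the sketch only shows how the fit-and-aggregate workflow fits together.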
