(Preprint) AAS 23-294

DEEP MONOCULAR HAZARD DETECTION FOR SAFE SMALL BODY LANDING

Travis Driver⋆*, Kento Tomita⋆†, Koki Ho‡, and Panagiotis Tsiotras§

Hazard detection and avoidance is a key technology for future robotic small body sample return and lander missions. Current state-of-the-practice methods rely on high-fidelity, a priori terrain maps, which require extensive human-in-the-loop verification and expensive reconnaissance campaigns to resolve mapping uncertainties. We propose a novel safety mapping paradigm that leverages deep semantic segmentation techniques to predict landing safety directly from a single monocular image, thus reducing reliance on high-fidelity, a priori data products. We demonstrate precise and accurate safety mapping performance on real in-situ imagery of prospective sample sites from the OSIRIS-REx mission.

INTRODUCTION

Hazard detection and avoidance (HD&A) is a key technology for future robotic small body sample return and lander missions.
Current approaches rely on high-fidelity digital elevation maps (DEMs) derived from digital terrain models (DTMs), i.e., local topography and albedo maps, generated on the ground.1 However, DTM construction involves extensive human-in-the-loop verification, carefully designed image acquisition plans, and expensive reconnaissance campaigns to resolve mapping uncertainties.2,3 Instead, we propose a novel safety mapping paradigm that leverages Bayesian deep learning techniques to accurately predict landing safety maps directly from monocular images, reducing reliance on expensive high-fidelity, a priori data products (i.e., DTMs).

Safety mapping methodologies that leverage deep learning have demonstrated potential to improve the accuracy of onboard hazard detection. Previous works4,5 have leveraged deep semantic segmentation to classify safe and unsafe landing locations from DEMs derived from simulated LiDAR scans.
However, generating reliable DEMs from LiDAR scans is non-trivial and requires accurate state estimates and range measurements. Moreover, LiDARs typically feature a relatively small effective operating range6 and increased size, weight, and power (SWaP) requirements relative to passive sensors such as monocular cameras. Thus, we propose to derive landing safety maps directly from monocular images without assuming any a priori data or relying on the fidelity of the current state estimate. Deep semantic segmentation has been previously employed for surface characterization of small bodies from simulated monocular images, primarily focusing on boulder detection.7,8 Conversely, we apply our models to real images and directly predict safety maps that conform to realistic landing parameters and constraints.

⋆These authors contributed equally to this work.
*PhD Student, Institute for Robotics and Intelligent Machines, School of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA.
†PhD Student, School of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA.
‡Associate Professor, School of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA.
§David & Andrew Lewis Chair, Professor, Institute for Robotics and Intelligent Machines, School of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA.

arXiv:2301.13254v1 [cs.CV] 30 Jan 2023

The contributions of this paper are as follows: first, we develop a novel safety mapping paradigm that leverages Bayesian deep learning techniques to predict landing safety directly from monocular images; second, we construct a dataset of real monocular images and corresponding landing safety maps that conform to realistic landing parameters for training and testing our models; third, we demonstrate precise and accurate safety mapping performance on real imagery of prospective sample sites from the recent OSIRIS-REx mission to Asteroid 101955 Bennu. Our code, data, and trained models will be made available to the public at https://github.com/travisdriver/deep_monocular_hd.

RELATED WORK

Current hazard detection methodologies for small body missions rely on high-fidelity digital elevation maps (DEMs) derived from digital terrain models (DTMs), i.e., local topography and albedo maps.1
However, DTM construction typically involves extensive human-in-the-loop verification and carefully designed image acquisition plans to achieve optimal results.2,3 Consequently, autonomous hazard detection and avoidance (HD&A) has been identified as a high-priority technology9 to promote and enable new mission concepts to near-Earth asteroids, comets, the Moon, Mars, and beyond. The Autonomous Landing Hazard Avoidance Technology (ALHAT)10,11 program was launched in 2005, followed by the Safe & Precise Landing—Integrated Capabilities Evolution (SPLICE) program12 in 2018, in order to develop autonomous landing technologies. These programs have focused on developing HD&A algorithms that operate on DEMs generated from range measurements acquired by active sensors such as flash LiDARs. However, these methods are constrained by the relatively small effective operating range and the increased size, weight, and power (SWaP) requirements of LiDARs relative to passive sensors such as monocular cameras.
Indeed, the OSIRIS-REx Guidance, Navigation, and Control (GNC) flash LiDAR had a maximum operational range of approximately 1 km,13,14 while preliminary testing of the Hazard Detection LiDAR of the SPLICE program demonstrated a 5 cm ground sample distance at 500 meters and near-nadir pointing.12 Conversely, the OSIRIS-REx Camera Suite (OCAMS)15 was able to acquire 5 cm GSD images at almost 4 km, providing higher-resolution measurements earlier in the mission than the onboard active sensors and allowing for detailed surface characterization during the early phases of the mission.6 Moreover, constructing a DEM from LiDAR scans is non-trivial and requires accurate range measurements and precise knowledge of the spacecraft's relative pose to the landing plane. Instead, we focus on estimating landing safety directly from a single monocular image.

Safety mapping methodologies that leverage deep learning have demonstrated potential to improve hazard detection accuracy and have also been shown to offer competitive runtimes on flight-relevant hardware.16 Previous works have leveraged deep semantic segmentation to classify safe and unsafe landing locations from high-resolution DEMs.
Moghe and Zanetti4 leverage a deep neural network architecture for predicting safety maps from a DEM and design a novel loss function intended to decrease the false safe rate and encourage more precise safe predictions. Tomita et al.5 employ a Bayesian SegNet architecture17 for segmentation of input DEMs into safe and unsafe landing locations. The Bayesian architecture implemented by Tomita et al.5 enables uncertainty quantification of the predicted safety map through the predictive entropy of the model, allowing for more precise predictions through global thresholding with respect to this uncertainty measure. We build upon this work and demonstrate its efficacy on monocular imagery.

Methods based on deep learning have also been developed for surface segmentation from monocular imagery. Pugliatti and Maestrini7 employ a custom U-Net architecture for classification of surface landmarks, i.e., boulders and crater rims. Caroselli et al.8 apply deep semantic segmentation to boulder detection on synthetic images of a fabricated small body model and post-process the network prediction to derive a landing safety map based on boulder density. Conversely, we apply our models to real images and directly predict safety maps that conform to realistic landing parameters and constraints.

PROPOSED APPROACH

Our novel safety mapping paradigm leverages Bayesian deep learning techniques to develop an uncertainty-aware semantic segmentation model to predict safety maps from monocular imagery. We train and test our model on real in-situ imagery from the OSIRIS-REx mission to Asteroid 101955 Bennu with corresponding ground truth safety maps generated using realistic landing parameters and constraints.

Bayesian Deep Learning

Given training input data X = {x1, ..., xN} with corresponding labels Y = {y1, ..., yN}, Bayesian deep learning employs Bayesian inference to maximize the posterior distribution of the network parameters θ given the training data X, Y:

p(θ | X, Y) = p(Y | X, θ) p(θ) / p(Y | X).  (1)

The distribution above can then be used to predict the likelihood of an output y* for a new input x* via

p(y* | x*, X, Y) = E_{p(θ | X, Y)}[ p(y* | x*, θ) ],  (2)

where we may assume a softmax likelihood for p(y* | x*, θ). However, computing p(θ | X, Y) is intractable and must be approximated using variational inference.
Gal and Ghahramani18,19 showed that training a deep convolutional neural network (CNN) with dropout layers is equivalent to approximate variational inference with a variational distribution q(θ) which imposes a Bernoulli distribution over the model weights. Specifically, consider a convolutional layer i with c_{i−1} input channels, c_i output channels, and kernel size k. Then dropout can be viewed as imposing a distribution over the layer weights W_i according to

W_i = M_i diag([ϵ_j]_{j=1..c_i}),  ϵ_j ∼ Bern(p_j),  (3)

where the ϵ_j are Bernoulli-distributed random variables with parameter p_j (which we take to be 0.5), and M_i ∈ R^{c_{i−1} × k × k × c_i} are the variational weight parameters optimized during training.
Therefore, Equation (2) may be approximated by

p(y* | x*, X, Y) ≈ E_{q(θ)}[ p(y* | x*, θ) ].  (4)

Finally, employing dropout at test time permits the use of the predictive entropy, approximated through T stochastic forward passes through the network, as a measure of uncertainty:20

H[y | x, X, Y] ≈ − Σ_{k=1..d} [ (1/T) Σ_{t=1..T} p(y = k | x, θ_t) ] log[ (1/T) Σ_{t=1..T} p(y = k | x, θ_t) ],  (5)

where θ_t corresponds to a realization of the network parameters, distributed according to q(θ), sampled during a forward pass through the network, and the average over the forward passes is taken to be the final prediction probabilities. This process is referred to as Monte Carlo (MC) dropout.18,19

Figure 1: Bayesian ICNet architecture. Multiscale fusion is conducted within the cascade feature fusion (CFF)21 modules. The ratio in parentheses denotes the relative magnitude of the spatial dimensions with respect to the original image.

For the task of semantic segmentation, assume x is a tuple (X, u) containing an image tensor X ∈ R^{h×w×c} and an image coordinate u ∈ R^2, and k ∈ {1, ..., d} is a pixel-wise class label for the pixel located at u. We will demonstrate that leveraging this uncertainty measure for our network predictions leads to increased precision and accuracy of safe landing locations.
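As a concrete numerical illustration of Equations (4) and (5), the following NumPy sketch runs T stochastic forward passes of a toy dropout classifier and computes the mean prediction and the predictive entropy. The two-layer network, its random weights, and the dropout placement are illustrative stand-ins for the actual segmentation model, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def stochastic_forward(x, W1, W2, p=0.5):
    """One forward pass with dropout left active, i.e., one sample theta_t ~ q(theta)."""
    h = np.maximum(x @ W1, 0.0)        # hidden layer with ReLU
    keep = rng.random(h.shape) < p     # Bernoulli keep-mask, cf. Equation (3)
    h = h * keep / p                   # inverted-dropout scaling
    return softmax(h @ W2)             # per-class probabilities

def mc_dropout_predict(x, W1, W2, T=8):
    """Monte Carlo dropout: average T stochastic passes (Eq. 4); entropy per Eq. (5)."""
    probs = np.stack([stochastic_forward(x, W1, W2) for _ in range(T)])
    mean_probs = probs.mean(axis=0)    # final prediction probabilities
    entropy = -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=-1)
    return mean_probs, entropy

# Toy batch of 4 "pixels" with 3 input features and d = 2 classes (safe / unsafe).
x = rng.normal(size=(4, 3))
W1 = rng.normal(size=(3, 16))
W2 = rng.normal(size=(16, 2))
mean_probs, entropy = mc_dropout_predict(x, W1, W2, T=8)
```

For two classes the entropy lies in [0, log 2]; pixels near the upper end are the ones an uncertainty-thresholding scheme would discard.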
Uncertainty-Aware Semantic Segmentation

We leverage an uncertainty-aware semantic segmentation architecture based on the image cascade network (ICNet),21 shown in Figure 1. ICNet is a highly efficient segmentation architecture that blends coarse prediction maps obtained from down-sampled inputs with high-resolution feature maps obtained from high-throughput networks that operate on the full-resolution image, allowing for fast inference on high-resolution images while maintaining accuracy. Multiscale feature map fusion is conducted by the cascade feature fusion (CFF) modules, whereby a reduced-resolution segmentation map is computed from the two multiscale feature map inputs. The multiscale predictions are used to train the network via a weighted softmax cross-entropy loss.21

We implement a Bayesian version of ICNet, which we denote as BICNet, where dropout layers are added to allow for stochastic sampling with respect to the model parameters using techniques from Bayesian deep learning, i.e., MC dropout, as described in the previous subsection. Ideally, a Bayesian NN would feature a dropout layer after every hidden layer of the network.18,19 However, as observed in previous works,17,20 adding dropout layers after every convolutional layer in more complex networks is too strong of a regularizer, resulting in underfitting. Therefore, we follow the work of Mukhoti et al.20 and Kendall and Cipolla17 and only insert dropout layers after the central encoder and decoder layers. At test time, we perform T = 8 stochastic forward passes and use the predictive entropy, defined in Equation (5), as a measure of uncertainty, and we use the average of this measure over all training instances as a threshold to mask out high-uncertainty regions in the image.

Figure 2: OSIRIS-REx TAG site datasets. (a) Global shape model; (b) DTM (left) and reconnaissance imagery (right) for each TAG site (Nightingale, Osprey, Kingfisher, Sandpiper). TAG site locations are indicated by the corresponding color in the global shape model.

Data Generation

High-fidelity DTMs (i.e., 5 cm ground sample distance) of the four prospective Touch-And-Go (TAG) sample sites developed as part of the OSIRIS-REx mission to Asteroid 101955 Bennu, i.e., Nightingale, Kingfisher, Osprey, and Sandpiper, were used to generate ground truth safety map labels for reconnaissance imagery from the mission. Specifically, we leverage monocular reconnaissance imagery and the corresponding camera pose labels, relative to a body-fixed frame of the asteroid, provided through the AstroVision dataset.22
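The entropy-threshold masking described above can be sketched as follows. The maps and threshold value below are synthetic placeholders; in practice the threshold would be the mean predictive entropy over the training set.

```python
import numpy as np

def mask_uncertain(pred_safe, entropy_map, threshold):
    """Demote predicted-safe pixels whose predictive entropy exceeds the threshold."""
    return pred_safe & (entropy_map <= threshold)

# Synthetic 2x3 safety prediction and per-pixel entropy map.
pred_safe = np.array([[True, True,  False],
                      [True, False, True]])
entropy = np.array([[0.05, 0.60, 0.10],
                    [0.20, 0.65, 0.01]])
threshold = 0.30  # placeholder for the training-set mean entropy
masked = mask_uncertain(pred_safe, entropy, threshold)
# masked == [[True, False, False], [True, False, True]]
```

Only confidently safe pixels survive; uncertain regions are treated as unsafe, which trades sensitivity for the higher precision that landing site selection requires.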
For each image, DEMs are constructed by transforming the DTM into a local coordinate system in which the +z-axis points opposite the vector corresponding to the direction of the gravitational force due to the target body at the point on the surface closest to the center of the image. The gravity due to the body was computed using a global shape model of Bennu23 and assuming a constant-density polyhedron.24 Safety mapping was conducted on the DEM and then projected back into the image to produce pixel-wise landing safety labels. Example reconnaissance images for each prospective TAG site are provided in Figure 2.

Landing safety was computed from the DEMs using the method developed by the Autonomous Landing Hazard Avoidance Technology (ALHAT) project.25 The ALHAT method evaluates the lander contact locations for all pixels and for all orientations to assess the worst-case surface slope and roughness values with respect to the surface elevation data contained in the ground truth DEMs.
Specifically, a landing plane is computed for each pixel by assessing the elevation of four evenly spaced contact points, emulating lander foot pads, on the perimeter of a circle specified by the diameter of the lander. Slope is defined as the largest angle between the landing plane and the x-y plane of the ground truth DEM for all orientations, and roughness is the largest perpendicular distance to the terrain above the landing plane for all orientations. Any pixel with slope or roughness exceeding a given threshold is labeled as unsafe, where we chose a threshold of 30° for slope and 3.5 cm for roughness. We specify a lander with a 35 cm diameter, similar to the MASCOT (Mobile Asteroid Surface SCOuT) lander that was deployed during the Hayabusa2 mission to Asteroid 162173 Ryugu.26

Figure 3: Ground truth safety map example. (a) Image; (b) Slope; (c) Roughness; (d) Slope & Roughness Safety Map; (e) Roughness-only Safety Map. Safe and unsafe regions are drawn in green and red, respectively, in the safety map.

An example safety map along with its corresponding monocular image is provided in Figure 3. Moreover, we provide the data distributions of our datasets with respect to the ground sample distance, imaging depth, viewing angle, and visibility ratio in Figure 4. Ground sample distance (GSD) measures the average distance on the surface spanned by a single pixel, which is a function of the distance to the surface and the camera intrinsics; landing safety becomes increasingly difficult to observe as the relative size of the lander in the image decreases.
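The slope-and-roughness test described above can be sketched as follows. This is a simplified, single-cell version of the ALHAT-style check (a fixed number of orientations, nearest-pixel footpad sampling, and a least-squares landing plane); the thresholds mirror the values quoted in the text, but the implementation details are illustrative assumptions rather than the paper's exact code.

```python
import numpy as np

def is_safe(dem, cell, radius_px, gsd, slope_max_deg=30.0, rough_max_m=0.035, n_orient=8):
    """Simplified ALHAT-style safe/unsafe label for one DEM cell.

    dem: HxW elevation grid in meters; cell: (row, col) candidate landing pixel;
    radius_px: lander radius in pixels; gsd: meters per pixel.
    """
    ci, cj = cell
    ii, jj = np.mgrid[0:dem.shape[0], 0:dem.shape[1]]
    footprint = (ii - ci) ** 2 + (jj - cj) ** 2 <= radius_px ** 2  # cells under lander

    worst_slope, worst_rough = 0.0, 0.0
    for k in range(n_orient):
        phi = (np.pi / 2) * k / n_orient  # four footpads repeat every 90 degrees
        angles = phi + np.array([0.0, 0.5, 1.0, 1.5]) * np.pi
        pr = np.clip(np.round(ci + radius_px * np.sin(angles)).astype(int), 0, dem.shape[0] - 1)
        pc = np.clip(np.round(cj + radius_px * np.cos(angles)).astype(int), 0, dem.shape[1] - 1)
        x, y, z = pr * gsd, pc * gsd, dem[pr, pc]  # footpad coordinates in meters
        # Landing plane z = a*x + b*y + c fit through the four footpad elevations.
        A = np.c_[x, y, np.ones(4)]
        (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
        worst_slope = max(worst_slope, np.degrees(np.arctan(np.hypot(a, b))))
        plane = a * ii * gsd + b * jj * gsd + c
        worst_rough = max(worst_rough, float((dem - plane)[footprint].max()))

    return worst_slope <= slope_max_deg and worst_rough <= rough_max_m

# Flat 21x21 terrain at 5 cm GSD with a single 10 cm rock at the center;
# a 35 cm diameter lander spans a 3.5-pixel radius at this resolution.
dem = np.zeros((21, 21))
dem[10, 10] = 0.10
center_unsafe = is_safe(dem, (10, 10), radius_px=3.5, gsd=0.05)  # rock: roughness 10 cm > 3.5 cm
corner_safe = is_safe(dem, (4, 4), radius_px=3.5, gsd=0.05)      # flat under the footprint
```

Note the rock cell fails on roughness even though its landing plane is flat, which is exactly why the roughness check is needed in addition to slope.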
The imaging depth measures the average distance to the surface when the image was taken, and provides context for the GSD values. Specifically, the MapCam of the OSIRIS-REx Camera Suite (OCAMS),15 with a focal length of ∼125 mm, can provide 5 cm GSD measurements of the surface at distances of approximately 1 km, while the PolyCam, with a focal length of ∼620 mm, provides the same resolution at distances of almost 4 km. Viewing angle measures the angle between the −z-axis of the ground truth DEM and the camera boresight. Finally, the visibility ratio is the ratio of visible (i.e., not occluded by shadows) pixels to total pixels in the image and provides a measure of the illumination conditions in the image.

[Figure 4: Data distributions of the PolyCam, MapCam, and SamCam images with respect to imaging depth, GSD, viewing angle, and visibility ratio.]
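The GSD relation is simply distance times pixel pitch over focal length. The sketch below backs the pixel pitch out of the stated MapCam figures (an inferred quantity, not an OCAMS specification) and checks consistency with the PolyCam numbers.

```python
def gsd(distance_m, focal_length_m, pixel_pitch_m):
    """Ground sample distance: surface length imaged by a single pixel."""
    return distance_m * pixel_pitch_m / focal_length_m

# Pixel pitch implied by the stated MapCam figures (5 cm GSD at 1 km with a
# ~125 mm focal length) -- an inferred value, not an instrument spec.
pitch = 0.05 * 0.125 / 1000.0  # ~6.25 micrometers

mapcam = gsd(1000.0, 0.125, pitch)   # MapCam at 1 km: ~5 cm
polycam = gsd(4000.0, 0.620, pitch)  # PolyCam at ~4 km: ~4 cm
```

With the same (assumed) detector pitch, the PolyCam's roughly 5x longer focal length reaches 5 cm GSD at just under 5 km, consistent with "almost 4 km" providing at least that resolution.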
Our dataset features a total of 770 images annotated with per-pixel safety labels: 133 of Kingfisher, 342 of Nightingale, 162 of Osprey, 91 of Sandpiper, and 42 from the TAG sample collection event at Nightingale.

RESULTS

In this section, we first present our suite of metrics used to evaluate the performance of our approach. We then validate our approach on two different experiments using real images from the OSIRIS-REx mission to Asteroid 101955 Bennu, including images captured during the actual TAG sample collection event.

Metric Definitions

We measure the quality of the predicted per-pixel safety map labels of our model with respect to precision, sensitivity, accuracy, and mean intersection over union (mIoU):

precision = true safe / (true safe + false safe), (6)
sensitivity = true safe / (true safe + false unsafe), (7)
accuracy = (true safe + true unsafe) / valid pixels, (8)
mIoU = (1/2) [ true safe / (valid pixels − true unsafe) + true unsafe / (valid pixels − true safe) ]. (9)

Table 1: Overall performance for the Sandpiper landing site experiment. The values in parentheses are the metrics with shadowed pixels ignored. All reported values are percentages.

METHOD                                   PRECISION      SENSITIVITY    ACCURACY       MIOU
SLOPE & ROUGHNESS, WITHOUT UNCERTAINTY   60.66 (62.91)  67.21 (70.05)  69.53 (69.41)  52.05 (52.27)
SLOPE & ROUGHNESS, WITH UNCERTAINTY      76.98 (77.67)  20.09 (21.86)  82.29 (82.01)  65.76 (65.93)
ROUGHNESS ONLY, WITHOUT UNCERTAINTY      77.24 (78.92)  61.61 (63.66)  63.53 (64.41)  44.87 (45.31)
ROUGHNESS ONLY, WITH UNCERTAINTY         85.71 (86.32)  28.78 (31.63)  73.77 (73.93)  55.11 (54.98)

True safe (false safe) includes pixels predicted to be safe by our models that are safe (unsafe) in the ground truth labels, and true unsafe (false unsafe) includes pixels predicted to be unsafe that are unsafe (safe) in the ground truth labels. Note that false unsafe includes safe pixels that are ignored and not labeled safe due to high uncertainty. For our application, we can interpret precision as the reliability of the pixels predicted to be safe, and sensitivity as the detection rate of true safe sites.
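Given a confusion count over the valid pixels, Eqs. (6)-(9) translate directly to code; the function below is a straightforward transcription with the four counts as its only inputs.

```python
def safety_metrics(true_safe, false_safe, true_unsafe, false_unsafe):
    """Precision, sensitivity, accuracy, and mIoU per Eqs. (6)-(9)."""
    valid = true_safe + false_safe + true_unsafe + false_unsafe
    precision = true_safe / (true_safe + false_safe)          # Eq. (6)
    sensitivity = true_safe / (true_safe + false_unsafe)      # Eq. (7)
    accuracy = (true_safe + true_unsafe) / valid              # Eq. (8)
    # Eq. (9): mean of the per-class IoUs. For the safe class,
    # valid - true_unsafe = true_safe + false_safe + false_unsafe,
    # i.e. the union of predicted-safe and truly-safe pixels.
    miou = 0.5 * (true_safe / (valid - true_unsafe)
                  + true_unsafe / (valid - true_safe))
    return precision, sensitivity, accuracy, miou
```

For example, counts of 50 true safe, 10 false safe, 30 true unsafe, and 10 false unsafe over 100 valid pixels give 80% accuracy and roughly 65.7% mIoU.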
Accuracy and mIoU are evaluated over the valid pixels, i.e., the pixels whose uncertainty falls below the threshold. In other words, valid pixels correspond to the predictions that the network is most “certain” about. For the results without uncertainty thresholding, accuracy and mIoU are evaluated over all pixels with valid safety labels. In the following analysis, we refer to the ratio of pixels that fall above our uncertainty threshold, and are consequently marked as unsafe, to valid pixels as the screening rate.

Experiment 1: Prospective Landing Site Sandpiper

We train our model on images from three of the prospective landing sites from the OSIRIS-REx mission, namely, Nightingale, Osprey, and Kingfisher, and test our model on images of the remaining sample site, Sandpiper. This emulates a scenario in which data from previously mapped landing sites could be used to train a network to predict landing safety in a new, unexplored region of the target body without requiring the construction of high-fidelity DEMs.
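The screening step described above can be sketched as follows, assuming a per-pixel safe-probability map and uncertainty map; the threshold value u_max here is a hypothetical placeholder, not the paper's chosen setting.

```python
import numpy as np

def threshold_safety(safe_prob, uncertainty, u_max=0.5):
    """Apply uncertainty thresholding to a predicted safety map.

    Pixels whose predictive uncertainty exceeds u_max are screened out,
    i.e. overwritten as unsafe. The screening rate is the fraction of
    valid pixels removed this way.
    """
    safe = safe_prob > 0.5          # raw per-pixel safety decision
    screened = uncertainty > u_max  # high-entropy pixels to ignore
    safe_out = safe & ~screened     # screened pixels become unsafe
    screen_rate = screened.mean()
    return safe_out, screen_rate
```

Note that screening can only lower sensitivity (safe pixels may be overwritten) while typically raising precision, matching the trade-off reported in Table 1.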
These results are detailed in Table 1, and qualitative examples are provided in Figure 5, where we consider performance with respect to identifying both slope and roughness hazards and roughness-only hazards.

[Figure 5: Qualitative monocular safety mapping results for the Sandpiper experiment: test image, prediction without uncertainty, prediction with uncertainty, and uncertainty map for (a) the slope & roughness case and (b) the roughness-only case. Green, yellow, blue, and red labels represent true safe, true unsafe, false unsafe, and false safe, respectively.]

The results illustrate that our models are able to predict safety maps from just a single monocular image of the prospective landing site, which is completely unseen during training, with accuracy over 69% for the slope and roughness hazard detection case, and over 63% for the roughness-only hazard detection case, even without uncertainty thresholding. Moreover, our Bayesian ICNet architecture enables uncertainty thresholding, which further boosts performance by ignoring regions in which the model's prediction has high entropy. With the uncertainty threshold, accuracy increases to 82.29% and 73.77% for the slope and roughness and roughness-only cases, respectively, at the cost of decreased sensitivity. Importantly, we achieve 76.98% and 85.71% precision for the slope and roughness and roughness-only cases, respectively, after uncertainty thresholding. We also observe a slight increase in all metrics when shadowed pixels are ignored, as reported in parentheses in Table 1.

Comparing the two hazard detection tasks, i.e., slope and roughness hazards versus roughness-only hazards, the roughness-only case has lower sensitivity, accuracy, and mIoU, but higher precision. This suggests that the roughness-only case is a harder task in terms of precisely labeling safe and unsafe pixels on average, resulting in lower accuracy and mIoU, but an easier task in terms of identifying only safe pixels, resulting in higher precision. This is partially due to the higher incidence of safe pixels in the roughness-only case as compared to the slope and roughness case, as illustrated in Figure 3.
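The paper's Bayesian ICNet machinery is not reproduced here; one standard way to obtain the per-pixel predictive entropy used for this kind of thresholding is to average several stochastic forward passes (e.g., Monte Carlo dropout), sketched below under that assumption.

```python
import numpy as np

def predictive_entropy(prob_samples, eps=1e-12):
    """Per-pixel predictive entropy from T stochastic forward passes.

    prob_samples: array of shape (T, H, W) holding the predicted
    probability of the "safe" class with dropout active at test time.
    Pixels where the passes disagree get probabilities near 0.5 and
    hence high entropy; confident, consistent pixels get low entropy.
    """
    p = prob_samples.mean(axis=0)  # posterior-mean safe probability
    return -(p * np.log(p + eps) + (1.0 - p) * np.log(1.0 - p + eps))
```

Thresholding this entropy map then yields the "with uncertainty" behavior: disagreeing regions are screened and treated as unsafe.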
Additionally, we analyzed the per-image metrics with respect to GSD, viewing angle, and visibility ratio for the slope and roughness case, shown in Figure 6, and the roughness-only case, shown in Figure 7, in order to identify possible causes of uncertainty in the predictions. As a general trend, we observe that higher uncertainty coincides with lower precision, sensitivity, accuracy, and mIoU. Intuitively, low visibility is a common factor that raises our model's uncertainty in both the slope and roughness case and the roughness-only case. Our models also assign higher uncertainty to images with larger GSD in the slope and roughness case, and to images with larger viewing angle in the roughness-only case. Note that the increased uncertainty for images at higher GSDs may also be due to these instances being less represented in the training data, as shown in Figure 4.
In either case, we demonstrate that the uncertainty threshold serves as a powerful tool for detecting and accounting for difficult or out-of-distribution input conditions, allowing our models to predict precise and accurate safety maps across multiple GSDs, viewing angles, and illumination conditions.

Experiment 2: OSIRIS-REx TAG Sequence

For the second experiment, we trained two models using different combinations of images from the OSIRIS-REx mission: one model, denoted BICNet-NKO, is trained on Nightingale, Kingfisher, and Osprey, and the other, denoted BICNet-KOS, is trained on Kingfisher, Osprey, and Sandpiper. Both models were trained for the slope and roughness case only. We tested the two models on images captured during the TAG sample collection event at the Nightingale sample site, and present both to illustrate the effect of the different training data distributions on the test results. A subset of the 42-image sequence is shown in Figure 8.
Note that the 42 test images are not included in the training set for the Nightingale images. Table 2 and Figure 10 show quantitative results and qualitative examples, respectively. Comparing BICNet-NKO and BICNet-KOS in Table 2, we see that the sensitivity of BICNet-KOS is significantly lower than that of BICNet-NKO after uncertainty thresholding. Indeed, BICNet-KOS assigns a high uncertainty to almost all regions of the input images, which are thus overwritten as unsafe after uncertainty thresholding, as shown in Figure 10; this is not the case for BICNet-NKO. These differences are most likely explained by the difference between the training and testing data distributions for BICNet-KOS, as illustrated in Figure 4. Specifically, the TAG images have less overlap

[Figure 6: Per-image precision, sensitivity, accuracy, IoU, uncertainty, and screening rate versus GSD (m/pixel), viewing angle (deg), and visibility ratio for the slope and roughness case.]
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='9 Screen Rate Figure 6: Per-image metrics for slope & roughness safety on the Sandpiper experiment with respect to GSD, viewing angle, and visibility ratio.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 11 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='9 Precision 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='02 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='03 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='04 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='05 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='06 GSD (m/pixel) 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='0 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='2 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 Sensitivity 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 Accuracy uncertainty 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='40 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='44 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='48 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='52 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='56 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='60 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='3 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 IoU 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='02 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='03 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='04 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='05 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='06 GSD (m/pixel) 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='40 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='45 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='50 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='55 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='60 Uncertainty 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 Screen Rate 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='9 Precision 10 20 30 40 50 Viewing Angle ( ) 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='0 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='2 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 Sensitivity 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 Accuracy uncertainty 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='40 0.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='44 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='48 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='52 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='56 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='60 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='3 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 IoU 10 20 30 40 50 Viewing Angle ( ) 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='40 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='45 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='50 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='55 0.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='60 Uncertainty 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 Screen Rate 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='9 Precision 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 Visibility Ratio 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='0 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='2 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 Sensitivity 0.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 Accuracy uncertainty 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='40 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='44 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='48 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='52 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='56 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='60 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='3 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='7 IoU 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 Visibility Ratio 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='40 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='45 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='50 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='55 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='60 Uncertainty 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='8 Screen Rate Figure 7: Per-image metrics for roughness-only safety on the Sandpiper experiment with re- spect to GSD, viewing angle, and visibility ratio.' 
Figure 8: Frames from the OSIRIS-REx TAG sequence captured by the SamCam (2020-10-20T21:30:48 through 2020-10-20T21:41:48, one frame per minute).

…with the images of the prospective landing sites except for Nightingale with respect to viewing angle and visibility ratio. Therefore, the exclusion of Nightingale from the training data increases the predictive uncertainty for TAG images at test time for BICNet-KOS. Note that the Nightingale images (excluding the TAG images) were used for validation during training of the BICNet-KOS model in order to rule out overfitting as a cause of the decreased prediction performance. This effect of the training data distribution on the uncertainty level, and on the accompanying predictive performance, is consistent with the per-image metrics with respect to GSD, as shown in Figure 9. Indeed, for BICNet-NKO, as the GSD of the test images approaches the peak of the training data distribution, predictive performance increases and uncertainty decreases.
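For concreteness, the per-image metrics and the uncertainty-based screening discussed in this section can be sketched as below. This is a minimal illustration, not the paper's implementation: the conservative policy of treating screened (high-uncertainty) pixels as unsafe, and the threshold name `u_max`, are assumptions.

```python
import numpy as np

def safety_metrics(pred_safe, true_safe, uncertainty, u_max=0.5):
    """Per-image safety-map metrics with uncertainty screening.

    pred_safe, true_safe: boolean HxW maps (True = safe).
    uncertainty: HxW per-pixel predictive uncertainty.
    u_max: pixels above this uncertainty are screened out; a conservative
           policy (assumed here) never declares a screened pixel safe.
    """
    screened = uncertainty > u_max
    screen_rate = screened.mean()

    # Screened pixels are treated as unsafe before scoring.
    pred = pred_safe & ~screened

    tp = np.sum(pred & true_safe)    # safe pixels correctly declared safe
    fp = np.sum(pred & ~true_safe)   # hazardous pixels declared safe
    fn = np.sum(~pred & true_safe)   # safe pixels declared unsafe
    tn = np.sum(~pred & ~true_safe)  # hazardous pixels correctly declared unsafe

    return {
        "precision": tp / max(tp + fp, 1),
        "sensitivity": tp / max(tp + fn, 1),
        "accuracy": (tp + tn) / pred.size,
        "iou": tp / max(tp + fp + fn, 1),
        "screen_rate": screen_rate,
    }
```

Under this policy, raising the uncertainty threshold's strictness (lowering `u_max`) trades sensitivity for precision, which matches the high-screen-rate, low-sensitivity behavior reported for BICNet-KOS.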
Lower precision for the higher-GSD images is also partly due to the very low incidence of safe regions in these instances (see Figure 10). Conversely, for BICNet-KOS, all test images exhibit a high screening rate and low sensitivity due to the high uncertainty, likely a consequence of the training data being out-of-distribution with respect to viewing angle and visibility ratio, together with the relatively small size of the training set (386 images). We do not report per-image metrics with respect to viewing angle and visibility ratio, as these values remain relatively constant over the entire TAG sequence, at ∼7° and ∼71%, respectively, as shown in Figure 4. These results illustrate the effect of the training data distribution and the ability of the uncertainty measure to identify out-of-distribution data for uncertainty-aware segmentation networks. We postulate that training our model on a more comprehensive set of images will decrease prediction uncertainty and increase performance.

CONCLUSION

In this paper we presented a novel landing hazard detection approach for small body missions that predicts safety maps directly from monocular imagery.
We implemented an efficient, uncertainty-aware segmentation network that achieved hazard detection performance of over 80% accuracy and over 85% precision on real images of unseen landing sites captured during the OSIRIS-REx mission to asteroid 101955 Bennu. We believe that monocular safety mapping is a promising technology for reducing reliance on the human-in-the-loop procedures used in current safety mapping methodologies. Future work will involve developing a more comprehensive distribution of training data and identifying and rectifying causes of uncertainty to increase the reliability of the proposed approach.
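As a rough illustration of how an uncertainty-aware segmentation network yields the per-pixel uncertainty used for screening, the sketch below computes a mean safe-probability map and its predictive entropy from repeated stochastic forward passes (e.g. Monte Carlo dropout). This is a generic sketch, not the BICNet architecture; the function name and array shapes are illustrative.

```python
import numpy as np

def predictive_uncertainty(prob_samples):
    """Mean prediction and predictive entropy from stochastic passes.

    prob_samples: (T, H, W) array of per-pixel safe-probabilities from T
    stochastic forward passes of the network. Returns the mean
    safe-probability map and a per-pixel entropy map in [0, ln 2];
    high entropy flags pixels (e.g. out-of-distribution terrain) that a
    conservative lander would screen out.
    """
    p = prob_samples.mean(axis=0)
    # Clip only for the logarithms, to avoid log(0) at confident pixels.
    q = np.clip(p, 1e-7, 1.0 - 1e-7)
    entropy = -(q * np.log(q) + (1.0 - q) * np.log1p(-q))
    return p, entropy
```

Pixels where the stochastic passes disagree (probabilities averaging near 0.5) approach the maximum entropy ln 2, while pixels with consistent predictions approach zero entropy.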
Figure 9: Per-image metrics for the TAG experiment with respect to GSD for slope & roughness safety. (a) BICNet-NKO. (b) BICNet-KOS.
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 Uncertainty 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='1 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='2 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='3 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 Uncertainty (a) BICNet-NKO 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='1 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='2 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='3 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 Uncertainty 0.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='1 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='2 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='3 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 Uncertainty 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='1 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='2 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='3 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='4 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='5 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='6 Uncertainty Test Image Without Uncertainty With Uncertainty Uncertainty (b) BICNet-KOS Figure 10: Qualitative monocular safety mapping results for the TAG experiment.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Green, yellow, blue, and red labels represent true safe, true unsafe, false unsafe, and false safe, respectively.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 15 Table 2: Overall performance for the TAG experiment for slope & roughness safety.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' BICNet- NKO is our Bayesian ICNet model trained on Nightingale, Kingfisher, and Osprey, and BICNet- KOS is trained on Kingfisher, Osprey, and Sandpiper.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' All reported values are percentages.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' METHOD PRECISION SENSITIVITY ACCURACY MIOU BICNET-NKO WITHOUT UNCERTAINTY 49.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='02 (50.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='21) 61.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='16 (66.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='53) 67.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='44 (67.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='09) 48.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='65 (48.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='66) WITH UNCERTAINTY 61.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='38 (61.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='66) 26.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='60 (30.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='80) 78.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='62 (77.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='09) 60.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='04 (59.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='23) BICNET-KOS WITHOUT UNCERTAINTY 50.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='08 (50.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='42) 42.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='18 (50.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='37) 67.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='24 (65.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='32) 45.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='48 (45.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='11) WITH UNCERTAINTY 65.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='05 (64.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='89) 2.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='87 (4.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='05) 83.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='11 (82.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='65) 52.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='58 (55.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='87) approach.' 
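The per-pixel classification metrics reported in Table 2 (precision, sensitivity, accuracy, and mean IoU) and the error categories of Figure 10 (true/false safe, true/false unsafe) follow directly from a binary confusion matrix over the predicted and ground-truth safety maps. The sketch below is an illustrative computation, not the authors' evaluation code; it assumes "safe" is treated as the positive class and that mIoU averages the safe and unsafe classes, which the table caption does not spell out.

```python
import numpy as np

def safety_metrics(pred, truth):
    """Per-pixel binary metrics for a predicted safety map.

    pred, truth: boolean arrays of the same shape, True = 'safe'
    (treating 'safe' as the positive class is an assumption).
    Returns (precision, sensitivity, accuracy, mIoU), with mIoU
    averaged over the safe and unsafe classes.
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.sum(pred & truth)      # true safe (green in Fig. 10)
    tn = np.sum(~pred & ~truth)    # true unsafe (yellow)
    fn = np.sum(~pred & truth)     # false unsafe: safe site screened out (blue)
    fp = np.sum(pred & ~truth)     # false safe: hazard labeled safe (red)
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    iou_safe = tp / (tp + fp + fn)
    iou_unsafe = tn / (tn + fn + fp)
    miou = 0.5 * (iou_safe + iou_unsafe)
    return precision, sensitivity, accuracy, miou
```

Under this convention, the uncertainty screening in Table 2 trades sensitivity for precision: marking uncertain pixels unsafe converts false safes (fp) into false unsafes (fn), which raises precision while lowering sensitivity.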
Our code, data, and trained models will be made available to the public at https://github.com/travisdriver/deep_monocular_hd.

ACKNOWLEDGMENTS

This work was supported by a NASA Space Technology Graduate Research Opportunity and the NASA Early Career Faculty Program (grant no. 80NSSC20K0064). The authors would like to thank Kenneth Getzandanner and Michael Shoemaker from NASA Goddard Space Flight Center for several helpful discussions and comments.

REFERENCES

[1] K. Berry, K. M. Getzandanner, M. C. Moreau, S. M. Rieger, P. G. Antreasian, C. D. Adam, D. Wibben, J. M. Leonard, A. H. Levine, J. Geeraert, et al., "Contact with Bennu! Flight Performance Versus Prediction of OSIRIS-REx 'TAG' Sample Collection," AIAA SciTech Forum, 2022, p. 2521.
[2] O. Barnouin, M. Daly, E. Palmer, C. Johnson, R. Gaskell, M. Al Asad, E. Bierhaus, K. Craft, C. Ernst, R. Espiritu, H. Nair, G. Neumann, L. Nguyen, M. Nolan, E. Mazarico, M. Perry, L. Philpott, J. Roberts, R. Steele, J. Seabrook, H. Susorney, J. Weirich, and D. Lauretta, "Digital terrain mapping by the OSIRIS-REx mission," Planetary and Space Science, Vol. 180, 2020, p. 104764.
[3] E. E. Palmer, R. Gaskell, M. G. Daly, O. S. Barnouin, C. D. Adam, and D. S. Lauretta, "Practical Stereophotoclinometry for Modeling Shape and Topography on Planetary Missions," Planetary Science, Vol. 3, No. 102, 2022, pp. 1–16.
[4] R. Moghe and R. Zanetti, "A Deep Learning Approach to Hazard Detection for Autonomous Lunar Landing," J. of the Astronautical Sciences, Vol. 67, No. 4, 2020, pp. 1811–1830.
[5] K. Tomita, A. K. Skinner, and K. Ho, "Bayesian Deep Learning for Segmentation for Autonomous Safe Planetary Landing," J. of Spacecraft and Rockets, 2022, https://doi.org/10.2514/1.A35104.
[6] D. A. Lorenz, R. Olds, A. May, C. Mario, M. E. Perry, E. E. Palmer, and M. Daly, "Lessons learned from OSIRIS-REx autonomous navigation using natural feature tracking," IEEE Aerospace Conf., 2017, pp. 1–12.
[7] M. Pugliatti and M. Maestrini, "Small-Body Segmentation Based on Morphological Features with a U-Shaped Network Architecture," J. of Spacecraft and Rockets, 2022, pp. 1–15.
[8] E. Caroselli, F. Belien, A. Falke, F. Curti, and R. Förstner, "Deep Learning-Based Passive Hazard Detection for Asteroid Landing in Unexplored Environment," AAS Guidance, Navigation and Control (GN&C) Conf., 2022, pp. 1–16.
[9] "NASA Technology Taxonomy," tech. rep., National Aeronautics and Space Administration (NASA), 2020.
[10] C. D. Epp, E. A. Robertson, and T. Brady, "Autonomous landing and hazard avoidance technology (ALHAT)," IEEE Aerospace Conf., IEEE, 2008, pp. 1–7.
[11] J. M. Carson, N. Trawny, E. Robertson, V. E. Roback, D. Pierrottet, J. Devolites, J. Hart, and J. N. Estes, "Preparation and integration of ALHAT precision landing technology for Morpheus flight testing," AIAA SPACE Conf., 2014, pp. 1–16.
[12] R. R. Sostaric, S. Pedrotty, J. M. Carson, J. N. Estes, F. Amzajerdian, A. M. Dwyer-Cianciolo, and J. B. Blair, "The SPLICE Project: Safe and Precise Landing Technology Development and Testing," AIAA SciTech Forum, 2021, pp. 1–9.
[13] E. Church, T. Bourbeau, J. …
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Curriden, A.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Deguzman, F.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Jaen, H.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Ma, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Mahoney, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Miller, B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Short, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Waldorff, et al.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=', “Flash Lidar On-Orbit Performance at Asteroid Bennu,” AAS Guidance, Navigation and Control (GN&C) Conf.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=', 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [14] J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Leonard, M.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Moreau, P.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' G.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Antreasian, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Getzandanner, E.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Church, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Miller, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' G.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Daly, O.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Barnouin, and D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' S.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Lauretta, “Cross-Calibration of GNC and OLA LIDAR Systems Onboard OSIRIS-REx,” AAS Guidance, Navigation and Control (GN&C) Conf.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=', No.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 22-166, 2022.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [15] B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Rizk, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' d’Aubigny, D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Golish, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Fellows, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Merrill, P.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Smith, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Walker, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Hendershot, J.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Han- cock, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Bailey, D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' DellaGiustina, D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Lauretta, R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Tanner, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Williams, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Harshman, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Fitzgibbon, W.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Verts, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Chen, T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Connors, D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Hamara, A.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Dowd, A.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Lowman, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Dubin, R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Burt, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Whiteley, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Watson, T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' McMahon, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Ward, D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Booher, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Read, B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Williams, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Hunten, E.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Little, T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Saltz- man, D.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Alfred, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' O’Dougherty, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Walthall, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Kenagy, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Peterson, B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Crowther, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Perry, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' See, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Selznick, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Sauve, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Beiser, W.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Black, R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Pfisterer1, A.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Lancaster, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Oliver, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Oquest, D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Crow- ley, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Morgan, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Castle, R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Dominguez, and M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Sullivan, “OCAMS: The OSIRIS-REx Camera Suite,” Space Science Reviews, Vol.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 214, No.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 26, 2018, pp.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 1–55.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [16] T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Claudet, K.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Tomita, and K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Ho, “Benchmark Analysis of Semantic Segmentation Algorithms for Safe Planetary Landing Site Selection,” IEEE Access, Vol.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 10, 2022, pp.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 41766–41775.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [17] V.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Alex Kendall and R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Cipolla, “Bayesian SegNet: Model Uncertainty in Deep Convolutional Encoder-Decoder Architectures for Scene Understanding,” British Machine Vision Conf.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' (BMVC), BMVA Press, September 2017, pp.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 57.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='1–57.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='12.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [18] Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Gal and Z.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Ghahramani, “Bayesian convolutional neural networks with Bernoulli approximate vari- ational inference,” arXiv preprint arXiv:1506.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='02158, 2015.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [19] Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Gal and Z.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Ghahramani, “Dropout as a Bayesian approximation: Representing model uncertainty in deep learning,” Int.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Conf.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' on Machine Learning (ICML), PMLR, 2016, pp.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 1050–1059.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [20] J.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Mukhoti and Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Gal, “Evaluating Bayesian deep learning methods for semantic segmentation,” Int.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Conf.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' on Learning Representations (ICLR), 2018.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [21] H.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Zhao, X.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Qi, X.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Shen, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Shi, and J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Jia, “ICNet for real-time semantic segmentation on high- resolution images,” European Conf.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' on Computer Vision (ECCV), 2018, pp.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 405–420.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [22] T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Driver, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Skinner, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Dor, and P.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Tsiotras, “AstroVision: Towards Autonomous Feature Detection and Description for Missions to Small Bodies Using Deep Learning,” Special Issue on AI for Space, Acta Astronautica, 2022.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [23] J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' A.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Seabrook, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' G.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Daly, O.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Barnouin, E.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' E.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Palmer, R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' W.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Gaskell, H.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Nair, and D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Lauretta, “Building a High-resolution Digital Terrain Model of Bennu from Laser Altimetry Data,” The Planetary Science J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=', Vol.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 3, No.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 12, 2022, p.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 265.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [24] R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' A.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Werner and D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Scheeres, “Exterior gravitation of a polyhedron derived and compared with harmonic and mascon gravitation representations of asteroid 4769 Castalia,” Celestial Mechanics and Dynamical Astronomy, Vol.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 65, 1996, pp.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 313–344.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [25] T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Ivanov, A.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Huertas, and J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Carson, “Probabilistic hazard detection for autonomous safe landing,” AIAA Guidance, Navigation, and Control (GNC) Conf.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=', 2013, pp.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 1–13.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' [26] T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content='-M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Ho, V.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Baturkin, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Grimm, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Grundmann, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Hobbie, E.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Ksenik, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Lange, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Sasaki, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Schlotterer, M.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' Talapina, et al.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=', “MASCOT—the mobile asteroid surface scout onboard the Hayabusa2 mission,” Space Science Reviews, Vol.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 208, No.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 1, 2017, pp.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 339–374.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'} +page_content=' 17' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/9NFQT4oBgHgl3EQfJDUU/content/2301.13254v1.pdf'}