feat: merge planar-brute-embed into the topola monorepo

- feat: use OrderedPair instead of custom BandName/BandUid (note: changes ordering of BandUid)
- fix(crates): rename planar-brute-embed to planar-incr-embed
Ellen Emilia Anna Zscheile 2025-01-31 14:13:28 +01:00
parent a39546f0c9
commit 6eb941a137
39 changed files with 3185 additions and 143 deletions

View File

@@ -34,7 +34,7 @@ features = ["serde-1"]
[workspace.dependencies.serde]
version = "1"
-features = ["derive"]
+features = ["derive", "rc"]
[package]
name = "topola"
@@ -59,6 +59,10 @@ serde.workspace = true
spade.workspace = true
thiserror.workspace = true
+[dependencies.planar-incr-embed]
+path = "crates/planar-incr-embed"
+features = ["serde"]
[dependencies.specctra-core]
path = "crates/specctra-core"
features = ["rstar"]

LICENSES/Apache-2.0.txt (new file, 73 lines)
View File

@@ -0,0 +1,73 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
(a) You must give any other recipients of the Work or Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.
You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@@ -45,10 +45,11 @@ repository on [GitHub](https://github.com/mikwielgus/topola).
## Licence
-Topola is licensed under the [MIT licence](LICENSE). Files present in
+Topola is licensed under the [MIT licence](LICENSES/MIT.txt). Files present in
the `assets/` directory are dual-licensed as under MIT or
-[Creative Commons Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/)
-licence.
+[Creative Commons Attribution 4.0 International](LICENSES/CC-BY-4.0.txt)
+licence. The file `crates/planar-incr-embed/src/math.rs` is dual-licensed as under
+MIT or [Apache 2.0](LICENSES/Apache-2.0.txt) license.
## Gallery

View File

@@ -15,7 +15,12 @@ SPDX-FileCopyrightText = "2024 Topola contributors"
SPDX-License-Identifier = "MIT"
[[annotations]]
-path = ["tests/single_layer/**", "tests/multilayer/**"]
+path = [
+"crates/planar-incr-embed/tests/**",
+"crates/planar-incr-embed/src/planarr/snapshots/**",
+"tests/single_layer/**",
+"tests/multilayer/**"
+]
SPDX-FileCopyrightText = "2024 Topola contributors"
SPDX-License-Identifier = "MIT"

View File

@@ -0,0 +1,33 @@
# SPDX-FileCopyrightText: 2024 Topola contributors
#
# SPDX-License-Identifier: MIT
[package]
name = "planar-incr-embed"
version = "0.1.0"
edition = "2021"
license = "MIT"
[features]
serde = ["dep:serde", "spade/serde"]
[dependencies]
log.workspace = true
num-traits = "0.2"
peeking_take_while = "1.0"
spade.workspace = true
thiserror.workspace = true
[dependencies.serde]
workspace = true
optional = true
[dev-dependencies]
ron = "0.8"
[dev-dependencies.insta]
version = "1.42"
features = ["json"]
[dev-dependencies.serde]
workspace = true

View File

@@ -0,0 +1,9 @@
<!--
SPDX-FileCopyrightText: 2025 Topola contributors
SPDX-License-Identifier: MIT
-->
# planar-incr-embed
WIP implementation of incrementally finding planar graph embeddings modulo homotopy equivalence with fixed vertex positions.

View File

@@ -0,0 +1,118 @@
// SPDX-FileCopyrightText: 2024 Topola contributors
//
// SPDX-License-Identifier: MIT
//
//! planar embedding / plane graph algorithms
pub mod pmg_astar;
use crate::{navmesh::NavmeshRef, NavmeshBase, NavmeshIndex};
use alloc::{collections::BTreeSet, vec::Vec};
/// The goal of one search iteration (a single path to be etched / embedded)
#[derive(Clone, Debug)]
pub struct Goal<PNI, EP> {
pub source: PNI,
pub target: BTreeSet<PNI>,
pub label: EP,
}
#[derive(Clone, Debug)]
pub struct TargetNetwork<Scalar>(Vec<spade::Point2<Scalar>>);
#[derive(Clone, Debug)]
pub struct PreparedGoal<B: NavmeshBase> {
pub source: B::PrimalNodeIndex,
pub target: BTreeSet<B::PrimalNodeIndex>,
pub label: B::EtchedPath,
pub target_network: TargetNetwork<B::Scalar>,
pub minimal_costs: B::Scalar,
}
impl<PNI: Clone + Eq + Ord, EP: Clone + Eq> Goal<PNI, EP> {
pub fn target_network<B: NavmeshBase<PrimalNodeIndex = PNI, EtchedPath = EP>>(
&self,
navmesh: NavmeshRef<'_, B>,
) -> TargetNetwork<B::Scalar> {
TargetNetwork(
self.target
.iter()
.cloned()
.map(|i| &navmesh.node_data(&NavmeshIndex::Primal(i)).unwrap().pos)
.cloned()
.collect(),
)
}
pub fn estimate_costs_for_source<B: NavmeshBase<PrimalNodeIndex = PNI, EtchedPath = EP>>(
&self,
navmesh: NavmeshRef<'_, B>,
source: &NavmeshIndex<PNI>,
) -> B::Scalar
where
B::Scalar: num_traits::Float,
{
self.target_network(navmesh)
.estimate_costs(&navmesh.node_data(source).unwrap().pos)
}
pub fn estimate_costs<B: NavmeshBase<PrimalNodeIndex = PNI, EtchedPath = EP>>(
&self,
navmesh: NavmeshRef<'_, B>,
) -> B::Scalar
where
B::Scalar: num_traits::Float,
{
self.estimate_costs_for_source(navmesh, &NavmeshIndex::Primal(self.source.clone()))
}
pub fn prepare<B: NavmeshBase<PrimalNodeIndex = PNI, EtchedPath = EP>>(
self,
navmesh: NavmeshRef<'_, B>,
) -> PreparedGoal<B>
where
B::Scalar: num_traits::Float,
{
let target_network = self.target_network(navmesh);
let minimal_costs = target_network.estimate_costs(
&navmesh
.node_data(&NavmeshIndex::Primal(self.source.clone()))
.unwrap()
.pos,
);
PreparedGoal {
source: self.source,
target: self.target,
label: self.label,
target_network,
minimal_costs,
}
}
}
impl<Scalar: num_traits::Float> TargetNetwork<Scalar> {
pub fn estimate_costs(&self, source_pos: &spade::Point2<Scalar>) -> Scalar {
self.0
.iter()
.map(|i| crate::utils::euclidean_distance(source_pos, i))
.reduce(Scalar::min)
.unwrap()
}
}
impl<B: NavmeshBase> PreparedGoal<B>
where
B::Scalar: num_traits::Float,
{
pub fn estimate_costs_for_source<CT>(
&self,
navmesh: NavmeshRef<'_, B>,
source: &NavmeshIndex<B::PrimalNodeIndex>,
) -> B::Scalar {
self.target_network
.estimate_costs(&navmesh.node_data(source).unwrap().pos)
}
}
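
The snippet below is an editorial illustration, not part of the commit: it shows how a `Goal` might be populated, with `usize` primal node indices and a `u32` path label chosen purely as placeholder types. `estimate_costs` then boils down to the smallest straight-line distance from the source position to any of the target positions.

```rust
use std::collections::BTreeSet;

use planar_incr_embed::algo::Goal;

fn main() {
    // Placeholder types: primal node indices as `usize`, path labels as `u32`.
    let goal: Goal<usize, u32> = Goal {
        source: 0,
        target: BTreeSet::from([3, 5]), // reaching node 3 or node 5 completes this goal
        label: 1,                       // identifier of the path to be etched
    };
    // `goal.prepare(navmesh.as_ref())` would precompute the target positions and
    // the minimal remaining cost (the closest target by Euclidean distance).
    println!("{goal:?}");
}
```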

View File

@@ -0,0 +1,494 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
// SPDX-FileCopyrightText: 2021 petgraph contributors
//
// SPDX-License-Identifier: MIT
//
//! planar multi-goal A*-like path search implementation
use crate::{
algo::{Goal, PreparedGoal},
navmesh::{EdgeIndex, EdgePaths, Navmesh, NavmeshRef, NavmeshRefMut},
Edge, NavmeshBase, NavmeshIndex, RelaxedPath,
};
use alloc::collections::{BTreeMap, BinaryHeap};
use alloc::{boxed::Box, sync::Arc, vec::Vec};
use core::{cmp::Ordering, ops::ControlFlow};
use num_traits::float::TotalOrder;
/// A walk task
#[derive(Clone, Debug)]
pub struct Task<B: NavmeshBase> {
/// index of current goal
pub goal_idx: usize,
/// costs / weights accumulated so far
pub costs: B::Scalar,
/// estimated minimal costs until goal
pub estimated_remaining: B::Scalar,
/// estimated minimal costs for all remaining goals after this one
pub estimated_remaining_goals: B::Scalar,
/// the current navmesh edge paths
pub edge_paths: Box<[EdgePaths<B::EtchedPath, B::GapComment>]>,
/// the currently selected node
pub selected_node: NavmeshIndex<B::PrimalNodeIndex>,
/// the previously selected node
pub prev_node: NavmeshIndex<B::PrimalNodeIndex>,
/// the introduction position re: `selected_node`
pub cur_intro: usize,
}
/// Results after a [`Task`] is done.
#[derive(Clone, Debug)]
pub struct TaskResult<B: NavmeshBase> {
/// index of current goal
pub goal_idx: usize,
/// costs / weights accumulated so far
pub costs: B::Scalar,
/// the current navmesh edge paths
pub edge_paths: Box<[EdgePaths<B::EtchedPath, B::GapComment>]>,
/// the previously selected node
pub prev_node: NavmeshIndex<B::PrimalNodeIndex>,
/// the introduction position re: `target`
pub cur_intro: usize,
}
/// The main path search data structure
#[derive(Clone, Debug)]
pub struct PmgAstar<B: NavmeshBase> {
/// task queue, ordered by costs ascending
pub queue: BinaryHeap<Task<B>>,
// constant data
pub nodes:
Arc<BTreeMap<NavmeshIndex<B::PrimalNodeIndex>, crate::Node<B::PrimalNodeIndex, B::Scalar>>>,
pub edges: Arc<
BTreeMap<EdgeIndex<NavmeshIndex<B::PrimalNodeIndex>>, (Edge<B::PrimalNodeIndex>, usize)>,
>,
pub goals: Box<[PreparedGoal<B>]>,
}
impl<B: NavmeshBase> Task<B>
where
B::Scalar: num_traits::Float,
{
fn edge_paths_count(&self) -> usize {
self.edge_paths.iter().map(|i| i.len()).sum::<usize>()
}
fn estimated_full_costs(&self) -> B::Scalar {
self.costs + self.estimated_remaining + self.estimated_remaining_goals
}
}
impl<B: NavmeshBase> PartialEq for Task<B>
where
B::PrimalNodeIndex: Ord,
B::EtchedPath: PartialOrd,
B::GapComment: PartialOrd,
B::Scalar: num_traits::Float + num_traits::float::TotalOrder + PartialOrd,
{
fn eq(&self, other: &Self) -> bool {
self.estimated_full_costs()
.total_cmp(&other.estimated_full_costs())
== Ordering::Equal
&& self.goal_idx == other.goal_idx
&& other.edge_paths_count() == self.edge_paths_count()
&& self.selected_node == other.selected_node
&& self.prev_node == other.prev_node
&& self.cur_intro == other.cur_intro
&& self
.edge_paths
.partial_cmp(&other.edge_paths)
.map(|i| i == Ordering::Equal)
.unwrap_or(true)
}
}
impl<B: NavmeshBase> Eq for Task<B>
where
B::PrimalNodeIndex: Ord,
B::EtchedPath: PartialOrd,
B::GapComment: PartialOrd,
B::Scalar: num_traits::Float + num_traits::float::TotalOrder + PartialOrd,
{
}
// `BinaryHeap` is a max-heap, so tasks are ordered such that smaller costs and higher goal indices compare as larger (better)
impl<B: NavmeshBase> Ord for Task<B>
where
B::PrimalNodeIndex: Ord,
B::EtchedPath: PartialOrd,
B::GapComment: PartialOrd,
B::Scalar: num_traits::Float + num_traits::float::TotalOrder + PartialOrd,
{
fn cmp(&self, other: &Self) -> Ordering {
// smaller costs are better
other
.estimated_full_costs()
.total_cmp(&self.estimated_full_costs())
// higher goal index is better
.then_with(|| self.goal_idx.cmp(&other.goal_idx))
// fewer inserted paths in edges are better
.then_with(|| other.edge_paths_count().cmp(&self.edge_paths_count()))
// tie-break on the rest
.then_with(|| self.selected_node.cmp(&other.selected_node))
.then_with(|| self.prev_node.cmp(&other.prev_node))
.then_with(|| self.cur_intro.cmp(&other.cur_intro))
.then_with(|| {
self.edge_paths
.partial_cmp(&other.edge_paths)
.unwrap_or(Ordering::Equal)
})
}
}
impl<B: NavmeshBase> PartialOrd for Task<B>
where
B::PrimalNodeIndex: Ord,
B::EtchedPath: PartialOrd,
B::GapComment: PartialOrd,
B::Scalar: num_traits::Float + num_traits::float::TotalOrder + PartialOrd,
{
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
Some(self.cmp(other))
}
}
impl<B: NavmeshBase<Scalar = Scalar>, Scalar: num_traits::Float + core::iter::Sum> PmgAstar<B> {
fn estimate_remaining_goals_costs(&self, start_goal_idx: usize) -> Scalar {
self.goals
.get(start_goal_idx + 1..)
.map(|rgoals| rgoals.iter().map(|i| i.minimal_costs).sum())
.unwrap_or_else(Scalar::zero)
}
}
impl<B: NavmeshBase> PreparedGoal<B>
where
B::GapComment: Clone,
B::Scalar: num_traits::Float + core::iter::Sum,
{
/// start processing the goal
fn start_pmga<'a, F: Fn(NavmeshRef<B>) -> Option<B::Scalar>>(
&'a self,
navmesh: NavmeshRef<'a, B>,
goal_idx: usize,
env: &'a PmgAstar<B>,
evaluate_navmesh: &'a F,
) -> Option<impl Iterator<Item = Task<B>> + 'a> {
let source = NavmeshIndex::Primal(self.source.clone());
let estimated_remaining_goals = env.estimate_remaining_goals_costs(goal_idx);
Some(
navmesh
.node_data(&source)?
.neighs
.iter()
.filter_map({
let source = source.clone();
move |neigh| {
navmesh
.resolve_edge_data(source.clone(), neigh.clone())
.map(|(_, epi)| {
let edge_len = navmesh.access_edge_paths(epi).len();
(neigh, epi, edge_len)
})
}
})
.flat_map(move |(neigh, epi, edge_len)| {
let source = source.clone();
// A*-like remaining costs estimation
let estimated_remaining =
self.estimate_costs_for_source::<B::GapComment>(navmesh, neigh);
(0..=edge_len).filter_map(move |i| {
let mut edge_paths = Box::from(navmesh.edge_paths);
let mut navmesh = NavmeshRefMut {
nodes: navmesh.nodes,
edges: navmesh.edges,
edge_paths: &mut edge_paths,
};
navmesh.access_edge_paths_mut(epi).with_borrow_mut(|mut j| {
j.insert(i, RelaxedPath::Normal(self.label.clone()))
});
evaluate_navmesh(navmesh.as_ref()).map(|costs| Task {
goal_idx,
costs,
estimated_remaining,
estimated_remaining_goals,
edge_paths,
selected_node: neigh.clone(),
prev_node: source.clone(),
cur_intro: edge_len - i,
})
})
}),
)
}
}
impl<B: NavmeshBase> Task<B>
where
B::EtchedPath: PartialOrd,
B::GapComment: Clone + PartialOrd,
B::Scalar: num_traits::Float + num_traits::float::TotalOrder,
{
pub fn run<F>(
self,
env: &mut PmgAstar<B>,
evaluate_navmesh: F,
) -> ControlFlow<TaskResult<B>, (Self, Vec<NavmeshIndex<B::PrimalNodeIndex>>)>
where
F: Fn(NavmeshRef<B>) -> Option<B::Scalar>,
{
if let NavmeshIndex::Primal(primal) = &self.selected_node {
if env.goals[self.goal_idx].target.contains(primal) {
let Self {
goal_idx,
costs,
estimated_remaining: _,
estimated_remaining_goals: _,
edge_paths,
prev_node,
cur_intro,
selected_node: _,
} = self;
return ControlFlow::Break(TaskResult {
goal_idx,
costs,
edge_paths,
prev_node,
cur_intro,
});
} else {
panic!("wrong primal node selected");
}
}
let forks = self.progress(env, evaluate_navmesh);
ControlFlow::Continue((self, forks))
}
/// progress to the next step, splitting the task into new tasks (make sure to call `done` beforehand)
fn progress<F>(
&self,
env: &mut PmgAstar<B>,
evaluate_navmesh: F,
) -> Vec<NavmeshIndex<B::PrimalNodeIndex>>
where
F: Fn(NavmeshRef<B>) -> Option<B::Scalar>,
{
let goal_idx = self.goal_idx;
let navmesh = NavmeshRef {
nodes: &env.nodes,
edges: &env.edges,
edge_paths: &self.edge_paths,
};
let goal = &env.goals[goal_idx];
let Some((_, other_ends)) = navmesh.planarr_find_all_other_ends(
&self.selected_node,
&self.prev_node,
self.cur_intro,
true,
) else {
return Vec::new();
};
let mut ret = Vec::new();
env.queue
.extend(other_ends.filter_map(|(neigh, stop_data)| {
if let NavmeshIndex::Primal(primal) = &neigh {
if !goal.target.contains(primal) {
return None;
}
}
// A*-like remaining costs estimation
let estimated_remaining =
goal.estimate_costs_for_source::<B::GapComment>(navmesh, &neigh);
let mut edge_paths = self.edge_paths.clone();
let mut navmesh = NavmeshRefMut {
nodes: &env.nodes,
edges: &env.edges,
edge_paths: &mut edge_paths,
};
let cur_intro = navmesh
.edge_data_mut(self.selected_node.clone(), neigh.clone())
.unwrap()
.with_borrow_mut(|mut x| {
x.insert(
stop_data.insert_pos,
RelaxedPath::Normal(goal.label.clone()),
);
x.len() - stop_data.insert_pos - 1
});
ret.push(neigh.clone());
evaluate_navmesh(navmesh.as_ref()).map(|costs| Task {
goal_idx,
costs,
estimated_remaining,
estimated_remaining_goals: self.estimated_remaining_goals,
edge_paths,
selected_node: neigh.clone(),
prev_node: self.selected_node.clone(),
cur_intro,
})
}));
ret
}
}
#[derive(Clone, Debug, PartialEq)]
pub struct IntermedResult<B: NavmeshBase> {
pub edge_paths: Box<[EdgePaths<B::EtchedPath, B::GapComment>]>,
pub goal_idx: usize,
pub forks: Vec<NavmeshIndex<B::PrimalNodeIndex>>,
pub selected_node: NavmeshIndex<B::PrimalNodeIndex>,
pub maybe_finished_goal: Option<B::Scalar>,
}
impl<B> PmgAstar<B>
where
B: NavmeshBase,
B::EtchedPath: PartialOrd,
B::GapComment: Clone + PartialOrd,
B::Scalar: Default
+ core::fmt::Debug
+ core::iter::Sum
+ num_traits::Float
+ num_traits::float::TotalOrder,
{
/// * `evaluate_navmesh` calculates the exact cost of a given navmesh (lower cost is better)
pub fn new<F>(
navmesh: &Navmesh<B>,
goals: Vec<Goal<B::PrimalNodeIndex, B::EtchedPath>>,
evaluate_navmesh: F,
) -> Self
where
F: Fn(NavmeshRef<B>) -> Option<B::Scalar>,
{
let mut this = Self {
queue: BinaryHeap::new(),
goals: goals
.into_iter()
.map({
let navmesh = navmesh.as_ref();
move |i| i.prepare(navmesh)
})
.collect(),
nodes: navmesh.nodes.clone(),
edges: navmesh.edges.clone(),
};
// fill queue with first goal
if let Some(first_goal) = this.goals.first() {
this.queue = {
let navmesh = NavmeshRef {
nodes: &this.nodes,
edges: &this.edges,
edge_paths: &navmesh.edge_paths,
};
let tmp = if let Some(iter) =
first_goal.start_pmga(navmesh, 0, &this, &evaluate_navmesh)
{
iter.collect()
} else {
BinaryHeap::new()
};
tmp
};
}
this
}
pub fn queue_len(&self) -> usize {
self.queue.len()
}
/// run one step of the path-search
pub fn step<F>(
&mut self,
evaluate_navmesh: F,
) -> ControlFlow<
Option<(B::Scalar, Box<[EdgePaths<B::EtchedPath, B::GapComment>]>)>,
IntermedResult<B>,
>
where
B::PrimalNodeIndex: core::fmt::Debug,
F: Fn(NavmeshRef<B>) -> Option<B::Scalar>,
{
let Some(task) = self.queue.pop() else {
log::info!("found no complete result");
return ControlFlow::Break(None);
};
ControlFlow::Continue(match task.run(self, &evaluate_navmesh) {
ControlFlow::Break(taskres) => {
let next_goal_idx = taskres.goal_idx + 1;
let navmesh = NavmeshRef {
nodes: &self.nodes,
edges: &self.edges,
edge_paths: &taskres.edge_paths,
};
let edge_count = taskres.edge_paths.iter().map(|i| i.len()).sum::<usize>();
match self.goals.get(next_goal_idx) {
None => {
// done with all goals
log::info!(
"found result with {} edges and costs {:?}",
edge_count,
taskres.costs
);
return ControlFlow::Break(Some((taskres.costs, taskres.edge_paths)));
}
Some(next_goal) => {
// prepare next goal
log::debug!(
"found partial result (goal {}) with {} edges and costs {:?}",
taskres.goal_idx,
edge_count,
taskres.costs,
);
let mut tmp = if let Some(iter) =
next_goal.start_pmga(navmesh, next_goal_idx, self, &evaluate_navmesh)
{
iter.collect()
} else {
BinaryHeap::new()
};
let forks = tmp.iter().map(|i| i.selected_node.clone()).collect();
self.queue.append(&mut tmp);
IntermedResult {
goal_idx: taskres.goal_idx,
forks,
edge_paths: taskres.edge_paths,
selected_node: NavmeshIndex::Primal(next_goal.source.clone()),
maybe_finished_goal: Some(taskres.costs),
}
}
}
}
ControlFlow::Continue((task, forks)) => {
// task got further branched
IntermedResult {
goal_idx: task.goal_idx,
forks,
edge_paths: task.edge_paths,
selected_node: task.selected_node,
maybe_finished_goal: None,
}
}
})
}
}
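
To make the search loop above easier to follow, here is a hypothetical end-to-end sketch (not part of the commit). The marker type `Base`, the four points, the single goal and the toy cost function are all invented for illustration; only the `planar-incr-embed` items introduced by this commit and the `spade` triangulation API are assumed.

```rust
use std::collections::BTreeSet;
use std::ops::ControlFlow;

use planar_incr_embed::{
    algo::{pmg_astar::PmgAstar, Goal},
    navmesh::{Navmesh, NavmeshRef, NavmeshSer, TrianVertex},
    NavmeshBase,
};
use spade::{DelaunayTriangulation, Point2, Triangulation};

// Invented marker type tying the generic parameters together.
struct Base;

impl NavmeshBase for Base {
    type PrimalNodeIndex = usize;
    type EtchedPath = u32;
    type GapComment = ();
    type Scalar = f64;
}

fn main() {
    // Build a Delaunay triangulation of a few fixed vertex positions.
    let mut tri: DelaunayTriangulation<TrianVertex<usize, f64>> = DelaunayTriangulation::new();
    for (idx, (x, y)) in [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0), (6.0, 3.0)].into_iter().enumerate() {
        tri.insert(TrianVertex { idx, pos: Point2::new(x, y) }).unwrap();
    }

    // Turn it into a topological navmesh.
    let navmesh: Navmesh<Base> = NavmeshSer::<Base>::from_triangulation(&tri).into();

    // One goal: etch a path labelled `1` from primal node 0 to primal node 3.
    let goals = vec![Goal { source: 0usize, target: BTreeSet::from([3usize]), label: 1u32 }];

    // Toy cost function: the fewer paths squeezed through navmesh edges, the better.
    let evaluate = |nm: NavmeshRef<Base>| -> Option<f64> {
        Some(nm.edge_paths.iter().map(|p| p.len()).sum::<usize>() as f64)
    };

    let mut search = PmgAstar::new(&navmesh, goals, &evaluate);
    // Step the search with a budget as a safety net; each step either branches
    // tasks further, finishes a goal, or terminates the whole search.
    for _ in 0..10_000 {
        match search.step(&evaluate) {
            ControlFlow::Continue(_intermediate) => continue,
            ControlFlow::Break(Some((costs, _edge_paths))) => {
                println!("embedded all goals with costs {costs}");
                break;
            }
            ControlFlow::Break(None) => {
                println!("no embedding found");
                break;
            }
        }
    }
}
```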

View File

@@ -0,0 +1,121 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
//
// SPDX-License-Identifier: MIT
//
//! WIP implementation of incrementally finding planar graph embeddings modulo homotopy equivalence with fixed vertex positions
//!
//! ## commonly used generic parameter names
//! * `EP`: type of etched path descriptor
//! * `NI`: type of node indices
//! * `PNI`: type of primal node indices
//! * `Scalar`: type of coordinates of points and vectors (distances)
#![allow(clippy::type_complexity)]
#![no_std]
extern crate alloc;
pub mod algo;
pub mod math;
pub mod mayrev;
pub mod navmesh;
pub mod planarr;
mod utils;
use alloc::boxed::Box;
use core::fmt;
/// Immutable data associated to a topological navmesh edge
#[derive(Clone, Copy, Debug, PartialEq, PartialOrd)]
#[cfg_attr(feature = "serde", derive(serde::Deserialize, serde::Serialize))]
pub struct Edge<PNI> {
pub lhs: Option<PNI>,
pub rhs: Option<PNI>,
}
impl<PNI> Edge<PNI> {
#[inline]
pub fn flip(&mut self) {
core::mem::swap(&mut self.lhs, &mut self.rhs);
}
#[inline]
pub fn as_ref(&self) -> Edge<&PNI> {
Edge {
lhs: self.lhs.as_ref(),
rhs: self.rhs.as_ref(),
}
}
}
#[derive(Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
#[cfg_attr(
any(test, feature = "serde"),
derive(serde::Deserialize, serde::Serialize)
)]
pub enum RelaxedPath<EP, CT> {
Normal(EP),
Weak(CT),
}
pub trait GetIndex {
type Idx: Clone + Eq;
/// extract the index of the underlying node
fn get_index(&self) -> Self::Idx;
}
/// Trait as container for navmesh type-level parameters
// in order to avoid extremely long type signatures
pub trait NavmeshBase {
type PrimalNodeIndex: Clone + Eq + Ord + fmt::Debug;
/// type for `RelaxedPath::Normal` component
type EtchedPath: Clone + Eq + fmt::Debug;
/// type for `RelaxedPath::Weak` component
type GapComment: Clone + fmt::Debug;
type Scalar: Clone + fmt::Debug;
}
#[derive(Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
#[cfg_attr(feature = "serde", derive(serde::Deserialize, serde::Serialize))]
pub enum DualIndex {
/// refers to a Delaunay face
Inner(usize),
/// refers to a Delaunay edge
Outer(usize),
}
#[derive(Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
#[cfg_attr(feature = "serde", derive(serde::Deserialize, serde::Serialize))]
pub enum NavmeshIndex<PNI> {
Dual(DualIndex),
Primal(PNI),
}
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Hash)]
#[cfg_attr(feature = "serde", derive(serde::Deserialize, serde::Serialize))]
pub struct Node<PNI, Scalar> {
pub neighs: Box<[NavmeshIndex<PNI>]>,
pub pos: spade::Point2<Scalar>,
/// if this node is `DualOuter`, then it has an `open_direction`
pub open_direction: Option<spade::Point2<Scalar>>,
}
impl<PNI, Scalar> Node<PNI, Scalar> {
pub fn dual_neighbors(&self) -> impl Iterator<Item = &DualIndex> {
self.neighs.iter().filter_map(|i| match i {
NavmeshIndex::Dual(dual) => Some(dual),
NavmeshIndex::Primal(_) => None,
})
}
pub fn primal_neighbors(&self) -> impl Iterator<Item = &PNI> {
self.neighs.iter().filter_map(|i| match i {
NavmeshIndex::Dual(_) => None,
NavmeshIndex::Primal(prim) => Some(prim),
})
}
}
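
A small editorial illustration (not part of the commit) of `NavmeshIndex` and the `Node` neighbor helpers; the `usize` index type and the coordinates are invented placeholders.

```rust
use planar_incr_embed::{DualIndex, NavmeshIndex, Node};

fn main() {
    let node: Node<usize, f64> = Node {
        neighs: vec![
            NavmeshIndex::Dual(DualIndex::Inner(0)),
            NavmeshIndex::Primal(7),
            NavmeshIndex::Dual(DualIndex::Outer(2)),
        ]
        .into_boxed_slice(),
        pos: spade::Point2::new(1.0, 2.0),
        open_direction: None, // only outer dual nodes carry an open direction
    };
    // The helpers split the mixed neighbor list by kind.
    assert_eq!(node.dual_neighbors().count(), 2);
    assert_eq!(node.primal_neighbors().copied().collect::<Vec<_>>(), vec![7]);
}
```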

View File

@@ -0,0 +1,37 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
// SPDX-FileCopyrightText: 2024 Spade contributors
// SPDX-FileCopyrightText: 2024 Stefan Altmayer <stoeoef@gmail.com>
//
// SPDX-License-Identifier: MIT OR Apache-2.0
use num_traits::float::Float;
use spade::{Point2, SpadeNum};
// source: https://github.com/Stoeoef/spade/blob/083b7744d91b994f4b482cdfaf0787a049e94858/src/delaunay_core/math.rs,
// starting at line 359, modified to avoid private methods of spade
pub fn circumcenter<S>(positions: [Point2<S>; 3]) -> (Point2<S>, S)
where
S: SpadeNum + Float,
{
let [v0, v1, v2] = positions;
let b = Point2 {
x: v1.x - v0.x,
y: v1.y - v0.y,
};
let c = Point2 {
x: v2.x - v0.x,
y: v2.y - v0.y,
};
let one = S::one();
let two = one + one;
let d = two * (b.x * c.y - c.x * b.y);
let len_b = b.x * b.x + b.y * b.y;
let len_c = c.x * c.x + c.y * c.y;
let d_inv: S = one / d;
let x = (len_b * c.y - len_c * b.y) * d_inv;
let y = (-len_b * c.x + len_c * b.x) * d_inv;
let result = Point2::new(x + v0.x, y + v0.y);
(result, x * x + y * y)
}
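
A quick editorial sanity check (not part of the commit): for the right triangle (0,0), (2,0), (0,2) the circumcenter is the midpoint of the hypotenuse, (1,1), and the second return value is the squared circumradius (the squared distance from the first vertex to the center), here 1² + 1² = 2.

```rust
use spade::Point2;

fn main() {
    let (center, r2) = planar_incr_embed::math::circumcenter([
        Point2::new(0.0_f64, 0.0),
        Point2::new(2.0, 0.0),
        Point2::new(0.0, 2.0),
    ]);
    // Circumcenter of the right triangle sits at the hypotenuse midpoint.
    assert!((center.x - 1.0).abs() < 1e-12 && (center.y - 1.0).abs() < 1e-12);
    // Squared circumradius: distance² from (0,0) to (1,1).
    assert!((r2 - 2.0).abs() < 1e-12);
}
```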

View File

@@ -0,0 +1,294 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
//
// SPDX-License-Identifier: MIT
//
//! `MaybeReversed`: Structure making it easier to interact with a vector/slice and its reversal uniformly.
use alloc::{sync::Arc, vec::Vec};
use core::{borrow, iter::DoubleEndedIterator, marker::PhantomData, ops};
/// Structure making it easier to interact with a vector/slice and its reversal uniformly.
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct MaybeReversed<I, T: ?Sized> {
pub inner: I,
pub reversed: bool,
item_marker: PhantomData<fn(T) -> T>,
}
impl<I: Clone, T: ?Sized> Clone for MaybeReversed<I, T> {
fn clone(&self) -> Self {
Self {
inner: self.inner.clone(),
reversed: self.reversed,
item_marker: PhantomData,
}
}
}
impl<I: Copy, T: ?Sized> Copy for MaybeReversed<I, T> {}
impl<I, T> MaybeReversed<I, T> {
pub fn new(inner: I) -> Self {
Self {
inner,
reversed: false,
item_marker: PhantomData,
}
}
pub fn map<J, U, F>(self, f: F) -> MaybeReversed<J, U>
where
F: FnOnce(I) -> J,
{
let Self {
inner, reversed, ..
} = self;
MaybeReversed {
inner: f(inner),
reversed,
item_marker: PhantomData,
}
}
#[inline]
#[must_use]
pub fn flip(mut self) -> Self {
self.reversed ^= true;
self
}
}
impl<T> MaybeReversed<&mut Arc<[T]>, T> {
pub fn with_borrow_mut<R, F>(&mut self, f: F) -> R
where
F: FnOnce(MaybeReversed<&mut Vec<T>, T>) -> R,
T: Clone,
{
let mut inner: Vec<T> = self.inner.iter().cloned().collect();
let ret = f(MaybeReversed {
inner: &mut inner,
reversed: self.reversed,
item_marker: PhantomData,
});
*self.inner = Arc::from(inner.into_boxed_slice());
ret
}
}
impl<'a, I: ?Sized + borrow::Borrow<[T]>, T> MaybeReversed<&'a I, T> {
pub fn as_ref(&self) -> MaybeReversed<&'a [T], T> {
MaybeReversed {
inner: self.inner.borrow(),
reversed: self.reversed,
item_marker: PhantomData,
}
}
#[inline]
pub fn len(&self) -> usize {
self.inner.borrow().len()
}
pub fn resolve_index(&self, index: usize) -> usize {
if self.reversed {
self.len().checked_sub(index + 1).unwrap()
} else {
index
}
}
#[inline]
pub fn iter(&self) -> MaybeReversed<core::slice::Iter<'a, T>, T> {
MaybeReversed {
inner: self.inner.borrow().iter(),
reversed: self.reversed,
item_marker: PhantomData,
}
}
}
impl<I: ?Sized + borrow::Borrow<[T]>, T> MaybeReversed<&mut I, T> {
#[inline(always)]
pub fn as_ref(&self) -> MaybeReversed<&'_ I, T> {
MaybeReversed {
inner: &*self.inner,
reversed: self.reversed,
item_marker: PhantomData,
}
}
#[inline]
pub fn len(&self) -> usize {
self.inner.borrow().len()
}
#[inline]
pub fn is_empty(&self) -> bool {
self.inner.borrow().is_empty()
}
#[inline]
pub fn resolve_index(&self, index: usize) -> usize {
self.as_ref().resolve_index(index)
}
#[inline]
pub fn iter(&self) -> MaybeReversed<core::slice::Iter<'_, T>, T> {
self.as_ref().iter()
}
}
impl<T> MaybeReversed<&mut [T], T> {
#[inline]
pub fn iter_mut(&mut self) -> MaybeReversed<core::slice::IterMut<'_, T>, T> {
MaybeReversed {
inner: self.inner.iter_mut(),
reversed: self.reversed,
item_marker: PhantomData,
}
}
}
impl<T> MaybeReversed<&mut Vec<T>, T> {
#[inline(always)]
pub fn as_ref_slice(&self) -> MaybeReversed<&'_ [T], T> {
MaybeReversed {
inner: &self.inner[..],
reversed: self.reversed,
item_marker: PhantomData,
}
}
#[inline(always)]
pub fn as_mut_slice(&mut self) -> MaybeReversed<&'_ mut [T], T> {
MaybeReversed {
inner: &mut self.inner[..],
reversed: self.reversed,
item_marker: PhantomData,
}
}
pub fn insert(&mut self, index: usize, element: T) {
if self.reversed {
self.inner
.insert(self.len().checked_sub(index).unwrap(), element);
} else {
self.inner.insert(index, element);
}
}
}
impl<I: ?Sized + borrow::Borrow<[T]> + ops::Index<usize, Output = T>, T> ops::Index<usize>
for MaybeReversed<&I, T>
{
type Output = <I as ops::Index<usize>>::Output;
fn index(&self, index: usize) -> &Self::Output {
self.inner.index(self.resolve_index(index))
}
}
impl<I: ?Sized + borrow::Borrow<[T]> + ops::Index<usize, Output = T>, T> ops::Index<usize>
for MaybeReversed<&mut I, T>
{
type Output = <I as ops::Index<usize>>::Output;
fn index(&self, index: usize) -> &Self::Output {
self.inner.index(self.resolve_index(index))
}
}
impl<I: ?Sized + borrow::Borrow<[T]> + ops::IndexMut<usize, Output = T>, T> ops::IndexMut<usize>
for MaybeReversed<&mut I, T>
{
fn index_mut(&mut self, index: usize) -> &mut Self::Output {
self.inner.index_mut(self.resolve_index(index))
}
}
#[inline]
pub fn index<T>(obj: &[T], idx: MaybeReversed<usize, T>) -> &T {
use ops::Index;
obj.index(
MaybeReversed {
inner: obj,
reversed: idx.reversed,
item_marker: PhantomData,
}
.resolve_index(idx.inner),
)
}
#[inline]
pub fn index_mut<T>(obj: &mut [T], idx: MaybeReversed<usize, T>) -> &mut T {
use ops::IndexMut;
obj.index_mut(
MaybeReversed {
inner: &obj[..],
reversed: idx.reversed,
item_marker: PhantomData,
}
.resolve_index(idx.inner),
)
}
#[inline]
pub fn index_forward<T, U>(obj: &[T], idx: MaybeReversed<usize, U>) -> MaybeReversed<&T, U> {
MaybeReversed {
inner: &obj[idx.inner],
reversed: idx.reversed,
item_marker: PhantomData,
}
}
#[inline]
pub fn index_mut_forward<T, U>(
obj: &mut [T],
idx: MaybeReversed<usize, U>,
) -> MaybeReversed<&mut T, U> {
MaybeReversed {
inner: &mut obj[idx.inner],
reversed: idx.reversed,
item_marker: PhantomData,
}
}
impl<I: DoubleEndedIterator, T> Iterator for MaybeReversed<I, T> {
type Item = <I as Iterator>::Item;
fn next(&mut self) -> Option<<I as Iterator>::Item> {
match self.reversed {
false => self.inner.next(),
true => self.inner.next_back(),
}
}
fn fold<B, F>(self, init: B, f: F) -> B
where
F: FnMut(B, Self::Item) -> B,
{
match self.reversed {
false => self.inner.fold(init, f),
true => self.inner.rfold(init, f),
}
}
}
impl<I: DoubleEndedIterator, T> DoubleEndedIterator for MaybeReversed<I, T> {
fn next_back(&mut self) -> Option<<I as Iterator>::Item> {
match self.reversed {
false => self.inner.next_back(),
true => self.inner.next(),
}
}
fn rfold<B, F>(self, init: B, f: F) -> B
where
F: FnMut(B, Self::Item) -> B,
{
match self.reversed {
false => self.inner.rfold(init, f),
true => self.inner.fold(init, f),
}
}
}
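
A small hypothetical usage sketch of `MaybeReversed` (not part of the commit): the same slice viewed forwards and backwards through one interface, and an insertion through a reversed view.

```rust
use planar_incr_embed::mayrev::MaybeReversed;

fn main() {
    let data = [10, 20, 30];
    let fwd: MaybeReversed<&[i32], i32> = MaybeReversed::new(&data[..]);
    let rev = fwd.flip(); // `MaybeReversed<&[T], T>` is `Copy`, so `fwd` stays usable

    assert_eq!(fwd[0], 10);
    assert_eq!(rev[0], 30); // indices are resolved against the reversed view
    assert_eq!(rev.resolve_index(0), 2);
    assert_eq!(rev.iter().copied().collect::<Vec<_>>(), vec![30, 20, 10]);

    // Insertions through a reversed view land at the mirrored position.
    let mut vec = vec![1, 2, 3];
    let mut view: MaybeReversed<&mut Vec<i32>, i32> = MaybeReversed::new(&mut vec).flip();
    view.insert(0, 9); // position 0 of the reversed view == end of the underlying vec
    assert_eq!(vec, vec![1, 2, 3, 9]);
}
```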

View File

@@ -0,0 +1,380 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
// SPDX-FileCopyrightText: 2024 Spade contributors
//
// SPDX-License-Identifier: MIT
//
//! Topological navmesh generation from Delaunay triangulation
// idea: see issue topola/topola#132
use alloc::collections::{btree_map::Entry, BTreeMap, BTreeSet};
use alloc::{boxed::Box, sync::Arc, vec::Vec};
use core::ops;
use num_traits::{Float, FloatConst};
use spade::{
handles::{DirectedEdgeHandle, VoronoiVertex as VoroV},
HasPosition, Point2, Triangulation,
};
use crate::{
navmesh::{EdgeIndex, EdgePaths, NavmeshSer},
DualIndex, Edge, NavmeshBase, NavmeshIndex, Node,
};
#[derive(Clone, Debug, PartialEq)]
pub struct TrianVertex<PNI, T: spade::SpadeNum> {
pub idx: PNI,
pub pos: Point2<T>,
}
impl<PNI, T: spade::SpadeNum> HasPosition for TrianVertex<PNI, T> {
type Scalar = T;
#[inline]
fn position(&self) -> spade::Point2<T> {
self.pos
}
}
#[derive(Clone, Copy, Debug)]
struct EdgeMeta<PNI, Scalar> {
//hypot: Scalar,
/// `atan2`, but into the range `[0, 2π)`.
angle: Scalar,
data: Edge<PNI>,
}
impl<PNI, Scalar: Float + FloatConst + ops::AddAssign + ops::SubAssign> EdgeMeta<PNI, Scalar> {
/// flip the edge (swap `lhs`/`rhs`) and rotate the angle by 180°
fn flip(&mut self) {
self.data.flip();
self.angle += Scalar::PI();
if self.angle >= Scalar::TAU() {
self.angle -= Scalar::TAU();
}
}
}
impl<B: NavmeshBase> NavmeshSer<B>
where
B::Scalar: spade::SpadeNum
+ Float
+ FloatConst
+ num_traits::float::TotalOrder
+ ops::AddAssign
+ ops::SubAssign,
{
pub fn from_triangulation<T>(triangulation: &T) -> Self
where
T: Triangulation<Vertex = TrianVertex<B::PrimalNodeIndex, B::Scalar>>,
{
let mut nodes =
BTreeMap::<_, (Point2<B::Scalar>, BTreeSet<_>, Option<Point2<B::Scalar>>)>::new();
// note that all the directions are in the range [0, 2π).
let mut edges = BTreeMap::<
EdgeIndex<NavmeshIndex<B::PrimalNodeIndex>>,
EdgeMeta<B::PrimalNodeIndex, B::Scalar>,
>::new();
// insert edge for each Voronoi edge
for edge in triangulation.undirected_voronoi_edges() {
let edge = edge.as_directed();
let (a_vert, b_vert) = (edge.from(), edge.to());
let (a_idx, a) = insert_dual_node_position::<B, _, _, _>(&mut nodes, &a_vert, None);
let (b_idx, b) =
insert_dual_node_position::<B, _, _, _>(&mut nodes, &b_vert, Some(a_idx.clone()));
nodes.get_mut(&a_idx).unwrap().1.insert(b_idx.clone());
// https://docs.rs/spade/2.12.1/src/spade/delaunay_core/handles/public_handles.rs.html#305
let delaunay = edge.as_delaunay_edge();
insert_edge(
&mut edges,
a_idx,
&a,
b_idx,
&b,
Edge {
lhs: Some(delaunay.from().data().idx.clone()),
rhs: Some(delaunay.to().data().idx.clone()),
},
);
}
// insert edge for each {primal node} * {neighbors of that node}
for node in triangulation.vertices() {
// iterate over neighbors to generate sectors information
let idx = NavmeshIndex::Primal(node.data().idx.clone());
let a = node.data().pos;
let mut primal_neighs = BTreeSet::new();
for edge in node.as_voronoi_face().adjacent_edges() {
// to convert dual edges around a node into dual nodes around a node,
// we use the dual nodes that the edges point to.
let dual_node = edge.to();
let (dual_idx, b) = insert_dual_node_position::<B, _, _, _>(
&mut nodes,
&dual_node,
Some(idx.clone()),
);
primal_neighs.insert(dual_idx.clone());
insert_edge(
&mut edges,
idx.clone(),
&a,
dual_idx,
&b,
Edge {
lhs: None,
rhs: None,
},
);
}
assert!(nodes.insert(idx, (a, primal_neighs, None)).is_none());
}
// insert hull edges between `DualOuter` vertices
{
let convex_hull = triangulation.convex_hull().collect::<Vec<_>>();
// if the convex hull only consists of two entries, we only have two primals, and the
// convex hull edges generated by the code below would be invalid.
// but since we want to find shortest paths, and with only two entries the shortest
// path is unique, we can simply skip the generation in that case
if convex_hull.len() > 2 {
for cvhedges in convex_hull.windows(2) {
let [edge1, edge2] = cvhedges else {
continue;
};
insert_convex_hull_edge::<B, _, _, _>(&mut nodes, &mut edges, edge1, edge2);
}
insert_convex_hull_edge::<B, _, _, _>(
&mut nodes,
&mut edges,
&convex_hull[convex_hull.len() - 1],
&convex_hull[0],
);
}
}
let nodes = finalize_nodes::<B>(nodes, &edges);
let empty_edge: EdgePaths<B::EtchedPath, B::GapComment> =
Arc::from(Vec::new().into_boxed_slice());
Self {
nodes: Arc::new(nodes),
edges: edges
.into_iter()
.map(|(k, emeta)| (k, (emeta.data, empty_edge.clone())))
.collect(),
}
}
}
fn voronoi_vertex_get_index<V: HasPosition, DE, UE, F>(
vertex: &spade::handles::VoronoiVertex<'_, V, DE, UE, F>,
) -> DualIndex {
match vertex {
VoroV::Inner(face) => DualIndex::Inner(face.index()),
VoroV::Outer(out) => DualIndex::Outer(out.index()),
}
}
// TODO: bound to root_bbox + some padding
fn voronoi_vertex_get_position<V, DE, UE, F>(
vertex: &spade::handles::VoronoiVertex<'_, V, DE, UE, F>,
) -> (
Point2<<V as HasPosition>::Scalar>,
Option<Point2<<V as HasPosition>::Scalar>>,
)
where
V: HasPosition,
<V as HasPosition>::Scalar: num_traits::float::Float,
{
match vertex {
VoroV::Inner(face) => (face.circumcenter(), None),
VoroV::Outer(halfspace) => {
let delauney = halfspace.as_delaunay_edge();
let from = delauney.from().position();
let to = delauney.to().position();
let orth = Point2 {
x: -(to.y - from.y),
y: to.x - from.x,
};
(
Point2 {
x: (from.x + to.x) / ((2.0).into()) + orth.x,
y: (from.y + to.y) / ((2.0).into()) + orth.y,
},
Some(orth),
)
}
}
}
fn insert_edge<PNI, Scalar>(
edges: &mut BTreeMap<EdgeIndex<NavmeshIndex<PNI>>, EdgeMeta<PNI, Scalar>>,
a_idx: NavmeshIndex<PNI>,
a: &Point2<Scalar>,
b_idx: NavmeshIndex<PNI>,
b: &Point2<Scalar>,
data: Edge<PNI>,
) where
PNI: Ord,
Scalar: Float + FloatConst + ops::AddAssign + ops::SubAssign,
{
let direction = Point2 {
x: b.x - a.x,
y: b.y - a.y,
};
let angle = direction.y.atan2(direction.x);
let mut edgemeta = EdgeMeta {
// hypot: Scalar::hypot(direction.x, direction.y),
angle: if angle.is_sign_negative() {
angle + Scalar::TAU()
} else {
angle
},
data,
};
if a_idx > b_idx {
edgemeta.flip();
}
edges.insert(EdgeIndex::from((a_idx, b_idx)), edgemeta);
}
fn insert_dual_node_position<B: NavmeshBase, DE, UE, F>(
nodes: &mut BTreeMap<
NavmeshIndex<B::PrimalNodeIndex>,
(
Point2<B::Scalar>,
BTreeSet<NavmeshIndex<B::PrimalNodeIndex>>,
Option<Point2<B::Scalar>>,
),
>,
vertex: &spade::handles::VoronoiVertex<
'_,
TrianVertex<B::PrimalNodeIndex, B::Scalar>,
DE,
UE,
F,
>,
new_neighbor: Option<NavmeshIndex<B::PrimalNodeIndex>>,
) -> (NavmeshIndex<B::PrimalNodeIndex>, Point2<B::Scalar>)
where
B::Scalar: spade::SpadeNum + Float,
{
let idx = NavmeshIndex::Dual(voronoi_vertex_get_index(vertex));
let ret = match nodes.entry(idx.clone()) {
Entry::Occupied(occ) => occ.into_mut(),
Entry::Vacant(vac) => {
let (x, x_odir) = voronoi_vertex_get_position(vertex);
vac.insert((x, BTreeSet::new(), x_odir))
}
};
if let Some(neigh) = new_neighbor {
ret.1.insert(neigh);
}
(idx, ret.0)
}
fn insert_convex_hull_edge<B: NavmeshBase, DE, UE, F>(
nodes: &mut BTreeMap<
NavmeshIndex<B::PrimalNodeIndex>,
(
Point2<B::Scalar>,
BTreeSet<NavmeshIndex<B::PrimalNodeIndex>>,
Option<Point2<B::Scalar>>,
),
>,
edges: &mut BTreeMap<
EdgeIndex<NavmeshIndex<B::PrimalNodeIndex>>,
EdgeMeta<B::PrimalNodeIndex, B::Scalar>,
>,
edge1: &DirectedEdgeHandle<TrianVertex<B::PrimalNodeIndex, B::Scalar>, DE, UE, F>,
edge2: &DirectedEdgeHandle<TrianVertex<B::PrimalNodeIndex, B::Scalar>, DE, UE, F>,
) where
B::Scalar: spade::SpadeNum + Float + FloatConst + ops::AddAssign + ops::SubAssign,
{
let (edge1to, edge2to) = (
edge1.as_voronoi_edge().from(),
edge2.as_voronoi_edge().from(),
);
assert!(matches!(edge1to, VoroV::Outer(_)));
assert!(matches!(edge2to, VoroV::Outer(_)));
let a_idx = NavmeshIndex::Dual(voronoi_vertex_get_index(&edge1to));
let b_idx = NavmeshIndex::Dual(voronoi_vertex_get_index(&edge2to));
let a_pos = {
let a = nodes.get_mut(&a_idx).unwrap();
a.1.insert(b_idx.clone());
a.0
};
let b_pos = {
let b = nodes.get_mut(&b_idx).unwrap();
b.1.insert(a_idx.clone());
b.0
};
// > The edges are returned in clockwise order as seen from any point in the triangulation.
let rhs_prim = edge1.to().data().idx.clone();
debug_assert!(edge2.from().data().idx == rhs_prim);
insert_edge(
edges,
a_idx,
&a_pos,
b_idx,
&b_pos,
Edge {
lhs: None,
rhs: Some(rhs_prim),
},
);
}
/// extracts sector information
fn finalize_nodes<B: NavmeshBase>(
nodes: BTreeMap<
NavmeshIndex<B::PrimalNodeIndex>,
(
Point2<B::Scalar>,
BTreeSet<NavmeshIndex<B::PrimalNodeIndex>>,
Option<Point2<B::Scalar>>,
),
>,
edges: &BTreeMap<
EdgeIndex<NavmeshIndex<B::PrimalNodeIndex>>,
EdgeMeta<B::PrimalNodeIndex, B::Scalar>,
>,
) -> BTreeMap<NavmeshIndex<B::PrimalNodeIndex>, Node<B::PrimalNodeIndex, B::Scalar>>
where
B::Scalar: Float + FloatConst + num_traits::float::TotalOrder + ops::AddAssign + ops::SubAssign,
{
nodes
.into_iter()
.map(|(idx, (pos, neighs, open_direction))| {
let mut neighs: Box<[_]> = neighs.into_iter().collect();
neighs.sort_by(|a, b| {
let mut dir_a = edges[&EdgeIndex::from((idx.clone(), a.clone()))].clone();
if idx > *a {
dir_a.flip();
}
let mut dir_b = edges[&EdgeIndex::from((idx.clone(), b.clone()))].clone();
if idx > *b {
dir_b.flip();
}
<B::Scalar as num_traits::float::TotalOrder>::total_cmp(&dir_a.angle, &dir_b.angle)
});
(
idx,
Node {
neighs,
pos,
open_direction,
},
)
})
.collect()
}
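
As a standalone editorial sketch (not crate API), the angle convention used above can be reproduced directly: `atan2` yields values in (-π, π], `insert_edge` shifts negative results by τ so every stored angle lies in [0, 2π), and `flip` adds π and wraps back into that range.

```rust
// Mirror of the normalization in `insert_edge`: map atan2 output into [0, 2π).
fn angle_in_tau(dx: f64, dy: f64) -> f64 {
    let a = dy.atan2(dx);
    if a.is_sign_negative() { a + std::f64::consts::TAU } else { a }
}

// Mirror of `EdgeMeta::flip`: rotate by 180° and wrap back into [0, 2π).
fn flip_angle(angle: f64) -> f64 {
    let a = angle + std::f64::consts::PI;
    if a >= std::f64::consts::TAU { a - std::f64::consts::TAU } else { a }
}

fn main() {
    use std::f64::consts::FRAC_PI_4;
    let a = angle_in_tau(-1.0, -1.0); // direction into the third quadrant
    assert!((a - 5.0 * FRAC_PI_4).abs() < 1e-12);
    assert!((flip_angle(a) - FRAC_PI_4).abs() < 1e-12); // reversed edge points back
}
```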

View File

@@ -0,0 +1,477 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
//
// SPDX-License-Identifier: MIT
//
//! A topological navmesh implementation
// idea: see issue topola/topola#132
use alloc::collections::BTreeMap;
use alloc::{boxed::Box, sync::Arc, vec::Vec};
use crate::{mayrev::MaybeReversed, planarr, Edge, NavmeshBase, NavmeshIndex, Node, RelaxedPath};
mod generate;
mod ordered_pair;
pub use generate::TrianVertex;
pub use ordered_pair::OrderedPair;
pub type EdgeIndex<T> = ordered_pair::OrderedPair<T>;
pub type EdgePaths<EP, CT> = Arc<[RelaxedPath<EP, CT>]>;
/// A topological navmesh (here in comfortably serializable form),
/// built upon the merging of the primal and dual graph,
/// and with an operation to create barriers.
///
/// This is basically a planar graph embedding.
#[derive(Clone, Debug)]
#[cfg_attr(feature = "serde", derive(serde::Deserialize, serde::Serialize))]
#[cfg_attr(
feature = "serde",
serde(bound(
deserialize = "B: NavmeshBase,
B::PrimalNodeIndex: serde::Deserialize<'de>,
B::EtchedPath: serde::Deserialize<'de>,
B::GapComment: serde::Deserialize<'de>,
B::Scalar: serde::Deserialize<'de>",
serialize = "B: NavmeshBase,
B::PrimalNodeIndex: serde::Serialize,
B::EtchedPath: serde::Serialize,
B::GapComment: serde::Serialize,
B::Scalar: serde::Serialize"
))
)]
pub struct NavmeshSer<B: NavmeshBase> {
pub nodes: Arc<BTreeMap<NavmeshIndex<B::PrimalNodeIndex>, Node<B::PrimalNodeIndex, B::Scalar>>>,
pub edges: BTreeMap<
EdgeIndex<NavmeshIndex<B::PrimalNodeIndex>>,
(
Edge<B::PrimalNodeIndex>,
EdgePaths<B::EtchedPath, B::GapComment>,
),
>,
}
/// A topological navmesh,
/// built upon the merging of the primal and dual graph,
/// and with an operation to create barriers.
///
/// This is basically a planar graph embedding, and is used for enumeration of such embeddings.
#[derive(Clone, Debug)]
#[cfg_attr(feature = "serde", derive(serde::Deserialize, serde::Serialize))]
#[cfg_attr(
feature = "serde",
serde(bound(
deserialize = "B: NavmeshBase,
B::PrimalNodeIndex: serde::Deserialize<'de>,
B::EtchedPath: serde::Deserialize<'de>,
B::GapComment: serde::Deserialize<'de>,
B::Scalar: serde::Deserialize<'de>",
serialize = "B: NavmeshBase,
B::PrimalNodeIndex: serde::Serialize,
B::EtchedPath: serde::Serialize,
B::GapComment: serde::Serialize,
B::Scalar: serde::Serialize"
))
)]
pub struct Navmesh<B: NavmeshBase> {
pub nodes: Arc<BTreeMap<NavmeshIndex<B::PrimalNodeIndex>, Node<B::PrimalNodeIndex, B::Scalar>>>,
pub edges: Arc<
BTreeMap<EdgeIndex<NavmeshIndex<B::PrimalNodeIndex>>, (Edge<B::PrimalNodeIndex>, usize)>,
>,
pub edge_paths: Box<[EdgePaths<B::EtchedPath, B::GapComment>]>,
}
pub struct NavmeshRef<'a, B: NavmeshBase> {
pub nodes: &'a BTreeMap<NavmeshIndex<B::PrimalNodeIndex>, Node<B::PrimalNodeIndex, B::Scalar>>,
pub edges: &'a BTreeMap<
EdgeIndex<NavmeshIndex<B::PrimalNodeIndex>>,
(Edge<B::PrimalNodeIndex>, usize),
>,
pub edge_paths: &'a [EdgePaths<B::EtchedPath, B::GapComment>],
}
pub struct NavmeshRefMut<'a, B: NavmeshBase> {
pub nodes: &'a BTreeMap<NavmeshIndex<B::PrimalNodeIndex>, Node<B::PrimalNodeIndex, B::Scalar>>,
pub edges: &'a BTreeMap<
EdgeIndex<NavmeshIndex<B::PrimalNodeIndex>>,
(Edge<B::PrimalNodeIndex>, usize),
>,
pub edge_paths: &'a mut [EdgePaths<B::EtchedPath, B::GapComment>],
}
impl<B: NavmeshBase> Clone for NavmeshRef<'_, B> {
#[inline(always)]
fn clone(&self) -> Self {
*self
}
}
impl<B: NavmeshBase> Copy for NavmeshRef<'_, B> {}
impl<B: NavmeshBase> Default for Navmesh<B> {
fn default() -> Self {
Self {
nodes: Arc::new(BTreeMap::new()),
edges: Arc::new(BTreeMap::new()),
edge_paths: Vec::new().into_boxed_slice(),
}
}
}
impl<B: NavmeshBase> NavmeshBase for NavmeshSer<B> {
type PrimalNodeIndex = B::PrimalNodeIndex;
type EtchedPath = B::EtchedPath;
type GapComment = B::GapComment;
type Scalar = B::Scalar;
}
impl<B: NavmeshBase> NavmeshBase for Navmesh<B> {
type PrimalNodeIndex = B::PrimalNodeIndex;
type EtchedPath = B::EtchedPath;
type GapComment = B::GapComment;
type Scalar = B::Scalar;
}
impl<B: NavmeshBase> NavmeshBase for NavmeshRef<'_, B> {
type PrimalNodeIndex = B::PrimalNodeIndex;
type EtchedPath = B::EtchedPath;
type GapComment = B::GapComment;
type Scalar = B::Scalar;
}
impl<B: NavmeshBase> NavmeshBase for NavmeshRefMut<'_, B> {
type PrimalNodeIndex = B::PrimalNodeIndex;
type EtchedPath = B::EtchedPath;
type GapComment = B::GapComment;
type Scalar = B::Scalar;
}
impl<B: NavmeshBase> From<NavmeshSer<B>> for Navmesh<B> {
fn from(this: NavmeshSer<B>) -> Navmesh<B> {
let mut edge_paths = Vec::with_capacity(this.edges.len());
let edges = this
.edges
.into_iter()
.map(|(key, (edge, paths))| {
let idx = edge_paths.len();
edge_paths.push(paths);
(key, (edge, idx))
})
.collect::<BTreeMap<_, _>>();
Self {
nodes: this.nodes,
edges: Arc::new(edges),
edge_paths: edge_paths.into_boxed_slice(),
}
}
}
impl<B: NavmeshBase> From<Navmesh<B>> for NavmeshSer<B> {
fn from(this: Navmesh<B>) -> NavmeshSer<B> {
Self {
nodes: this.nodes,
edges: this
.edges
.iter()
.map(|(key, (value, idx))| {
(key.clone(), (value.clone(), this.edge_paths[*idx].clone()))
})
.collect(),
}
}
}
impl<B: NavmeshBase> Navmesh<B> {
#[inline(always)]
pub fn as_ref(&self) -> NavmeshRef<B> {
NavmeshRef {
nodes: &self.nodes,
edges: &self.edges,
edge_paths: &self.edge_paths,
}
}
#[inline(always)]
pub fn as_mut(&mut self) -> NavmeshRefMut<B> {
NavmeshRefMut {
nodes: &self.nodes,
edges: &self.edges,
edge_paths: &mut self.edge_paths,
}
}
}
pub(crate) fn resolve_edge_data<PNI: Ord, EP>(
edges: &BTreeMap<EdgeIndex<NavmeshIndex<PNI>>, (Edge<PNI>, usize)>,
from_node: NavmeshIndex<PNI>,
to_node: NavmeshIndex<PNI>,
) -> Option<(Edge<&PNI>, MaybeReversed<usize, EP>)> {
let reversed = from_node > to_node;
let edge_idx: EdgeIndex<NavmeshIndex<PNI>> = (from_node, to_node).into();
let edge = edges.get(&edge_idx)?;
let mut data = edge.0.as_ref();
if reversed {
data.flip();
}
let mut ret = MaybeReversed::new(edge.1);
ret.reversed = reversed;
Some((data, ret))
}
impl<'a, B: NavmeshBase + 'a> NavmeshRefMut<'a, B> {
#[inline(always)]
pub fn as_ref(&'a self) -> NavmeshRef<'a, B> {
NavmeshRef {
nodes: self.nodes,
edges: self.edges,
edge_paths: self.edge_paths,
}
}
pub fn edge_data_mut(
&mut self,
from_node: NavmeshIndex<B::PrimalNodeIndex>,
to_node: NavmeshIndex<B::PrimalNodeIndex>,
) -> Option<
MaybeReversed<
&mut Arc<[RelaxedPath<B::EtchedPath, B::GapComment>]>,
RelaxedPath<B::EtchedPath, B::GapComment>,
>,
> {
self.resolve_edge_data(from_node, to_node)
.map(|(_, item)| item)
.map(|item| self.access_edge_paths_mut(item))
}
#[inline(always)]
pub fn resolve_edge_data(
&self,
from_node: NavmeshIndex<B::PrimalNodeIndex>,
to_node: NavmeshIndex<B::PrimalNodeIndex>,
) -> Option<(
Edge<&B::PrimalNodeIndex>,
MaybeReversed<usize, RelaxedPath<B::EtchedPath, B::GapComment>>,
)> {
resolve_edge_data(self.edges, from_node, to_node)
}
/// ## Panics
/// This function panics if the given `item` is out of bounds
/// (which only happens if it was produced by a `Navmesh` with different edges)
#[inline(always)]
pub fn access_edge_paths_mut(
&mut self,
item: MaybeReversed<usize, RelaxedPath<B::EtchedPath, B::GapComment>>,
) -> MaybeReversed<
&mut Arc<[RelaxedPath<B::EtchedPath, B::GapComment>]>,
RelaxedPath<B::EtchedPath, B::GapComment>,
> {
crate::mayrev::index_mut_forward(self.edge_paths, item)
}
}
impl<'a, B: NavmeshBase + 'a> NavmeshRef<'a, B> {
#[inline(always)]
pub fn resolve_edge_data(
&self,
from_node: NavmeshIndex<B::PrimalNodeIndex>,
to_node: NavmeshIndex<B::PrimalNodeIndex>,
) -> Option<(
Edge<&B::PrimalNodeIndex>,
MaybeReversed<usize, RelaxedPath<B::EtchedPath, B::GapComment>>,
)> {
resolve_edge_data(self.edges, from_node, to_node)
}
#[inline(always)]
pub fn node_data(
&self,
node: &NavmeshIndex<B::PrimalNodeIndex>,
) -> Option<&'a Node<B::PrimalNodeIndex, B::Scalar>> {
self.nodes.get(node)
}
pub fn edge_data(
&self,
from_node: NavmeshIndex<B::PrimalNodeIndex>,
to_node: NavmeshIndex<B::PrimalNodeIndex>,
) -> Option<
MaybeReversed<
&'a Arc<[RelaxedPath<B::EtchedPath, B::GapComment>]>,
RelaxedPath<B::EtchedPath, B::GapComment>,
>,
> {
self.resolve_edge_data(from_node, to_node)
.map(|(_, item)| self.access_edge_paths(item))
}
/*
pub fn check_edge_rules<'b, ObjectKind: Copy + Eq + Ord>(
&self,
from_node: NavmeshIndex<B::PrimalNodeIndex>,
to_node: NavmeshIndex<B::PrimalNodeIndex>,
) -> Option<(B::Scalar, bool)>
where
for<'i> &'i B::PrimalNodeIndex: topola_rules::GetConditions<'i, ObjectKind = ObjectKind>,
for<'i> &'i B::EtchedPath: topola_rules::GetConditions<'i, ObjectKind = ObjectKind>,
B::EtchedPath: topola_rules::GetWidth<Scalar = B::Scalar>,
R: topola_rules::AccessRules<Scalar = B::Scalar, ObjectKind = ObjectKind>,
B::Scalar: Default + num_traits::Float + core::iter::Sum,
{
let edgd = self.edge_data(from_node.clone(), to_node.clone())?;
if edgd.inner.is_empty() {
return Some((B::Scalar::default(), true));
}
let (NavmeshIndex::Dual(_), NavmeshIndex::Dual(_)) = (from_node.clone(), to_node.clone())
else {
// we only check dual-dual connections for now
return Some((B::Scalar::default(), true));
};
// decode dual nodes into the unique orthogonal primal nodes
let start_node_data = self.node_data(&from_node)?;
let start_node_neighs = &start_node_data.neighs;
let start_target_pos = start_node_neighs
.iter()
.enumerate()
.find(|(_, i)| **i == from_node)?
.0;
let prev = &start_node_neighs
[(start_node_neighs.len() + start_target_pos - 1) % start_node_neighs.len()];
let next = &start_node_neighs[(start_target_pos + 1) % start_node_neighs.len()];
let filtered_edgd: Vec<_> = Iterator::filter_map(edgd.iter(), |i| match i {
RelaxedPath::Normal(i) => Some(i),
RelaxedPath::Weak(_) => None,
})
.collect();
let usage_width: B::Scalar =
Iterator::map(filtered_edgd.iter(), topola_rules::GetWidth::width).sum();
let conds: Vec<_> = filtered_edgd
.iter()
.filter_map(|i| i.conditions())
.collect();
let mut usage = usage_width
+ conds
.windows(2)
.filter_map(|x| if let [i, j] = x { Some((i, j)) } else { None })
.map(|(i, j)| self.rules.clearance((i, j).into()))
.sum();
let (NavmeshIndex::Primal(prev), NavmeshIndex::Primal(next)) = (prev, next) else {
return if let (
NavmeshIndex::Dual(DualIndex::Outer(_)),
NavmeshIndex::Dual(DualIndex::Outer(_)),
) = (from_node, to_node)
{
// `DualOuter`-`DualOuter` are only associated to a single primal
// TODO: but even those should have a maximum capacity (e.g. up to PCB border)
// TODO: handle (more difficult) how large polygons are and such...
Some((usage, true))
} else {
// something is wrong
None
};
};
let stop_node_data = self.node_data(&to_node)?;
let to_primals: BTreeSet<_> = stop_node_data.primal_neighbors().collect();
assert!(to_primals.contains(&prev));
assert!(to_primals.contains(&next));
use topola_rules::GetConditions;
let prev_cond = prev.conditions();
let next_cond = next.conditions();
if let Some(prev_cond) = prev_cond {
usage = usage
+ self
.rules
.clearance((&prev_cond, conds.first().unwrap()).into());
}
if let Some(next_cond) = next_cond {
usage = usage
+ self
.rules
.clearance((&next_cond, conds.last().unwrap()).into());
}
// TODO: handle (more difficult) how large polygons are and such...
Some((
usage,
usage
<= B::Scalar::hypot(
stop_node_data.pos.x - start_node_data.pos.x,
stop_node_data.pos.y - start_node_data.pos.y,
),
))
}
*/
/// ## Panics
/// This function panics if the given `item` is out of bounds
/// (which should only happen if it was produced by a `Navmesh` with different edges).
#[inline(always)]
pub fn access_edge_paths(
&self,
item: MaybeReversed<usize, RelaxedPath<B::EtchedPath, B::GapComment>>,
) -> MaybeReversed<
&'a Arc<[RelaxedPath<B::EtchedPath, B::GapComment>]>,
RelaxedPath<B::EtchedPath, B::GapComment>,
> {
crate::mayrev::index_forward(self.edge_paths, item)
}
/// See [`find_other_end`](planarr::find_other_end).
pub fn planarr_find_other_end(
self,
node: &NavmeshIndex<B::PrimalNodeIndex>,
start: &NavmeshIndex<B::PrimalNodeIndex>,
pos: usize,
already_inserted_at_start: bool,
stop: &NavmeshIndex<B::PrimalNodeIndex>,
) -> Option<(usize, planarr::OtherEnd)> {
planarr::find_other_end(
self.nodes[node].neighs.iter().map(move |neigh| {
let edge = self
.edge_data(node.clone(), neigh.clone())
.expect("unable to resolve neighbor");
(neigh.clone(), edge)
}),
start,
pos,
already_inserted_at_start,
stop,
)
}
/// See [`find_all_other_ends`](planarr::find_all_other_ends).
pub fn planarr_find_all_other_ends(
self,
node: &'a NavmeshIndex<B::PrimalNodeIndex>,
start: &'a NavmeshIndex<B::PrimalNodeIndex>,
pos: usize,
already_inserted_at_start: bool,
) -> Option<(
usize,
impl Iterator<Item = (NavmeshIndex<B::PrimalNodeIndex>, planarr::OtherEnd)> + 'a,
)> {
planarr::find_all_other_ends(
self.nodes[node].neighs.iter().map(move |neigh| {
let edge = self
.edge_data(node.clone(), neigh.clone())
.expect("unable to resolve neighbor");
(neigh.clone(), edge)
}),
start,
pos,
already_inserted_at_start,
)
}
}

View File

@ -0,0 +1,70 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
//
// SPDX-License-Identifier: MIT
#[derive(Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
#[cfg_attr(feature = "serde", derive(serde::Deserialize))]
#[cfg_attr(feature = "serde", serde(from = "(T, T)"))]
#[cfg_attr(
feature = "serde",
serde(bound(deserialize = "T: serde::Deserialize<'de> + Ord"))
)]
pub struct OrderedPair<T>(T, T);
impl<T> core::ops::Index<bool> for OrderedPair<T> {
type Output = T;
#[inline]
fn index(&self, index: bool) -> &T {
if index {
&self.1
} else {
&self.0
}
}
}
impl<T: Ord> From<(T, T)> for OrderedPair<T> {
fn from(mut x: (T, T)) -> Self {
if x.0 > x.1 {
core::mem::swap(&mut x.0, &mut x.1);
}
let (a, b) = x;
Self(a, b)
}
}
impl<T> From<OrderedPair<T>> for (T, T) {
#[inline(always)]
fn from(OrderedPair(a, b): OrderedPair<T>) -> (T, T) {
(a, b)
}
}
impl<'a, T> From<&'a OrderedPair<T>> for (&'a T, &'a T) {
#[inline(always)]
fn from(OrderedPair(a, b): &'a OrderedPair<T>) -> (&'a T, &'a T) {
(a, b)
}
}
#[cfg(feature = "serde")]
impl<T: serde::Serialize> serde::Serialize for OrderedPair<T> {
#[inline]
fn serialize<S: serde::Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
use serde::ser::SerializeTuple;
let mut tuple = match serializer.serialize_tuple(2) {
Ok(x) => x,
Err(e) => return Err(e),
};
match tuple.serialize_element(&self.0) {
Ok(x) => x,
Err(e) => return Err(e),
}
match tuple.serialize_element(&self.1) {
Ok(x) => x,
Err(e) => return Err(e),
}
tuple.end()
}
}
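// A minimal usage sketch (illustration only, not part of the original file):
// construction via `From` normalizes the ordering, and `Index<bool>` selects the
// smaller (`false`) or larger (`true`) element, which is why call sites elsewhere
// in this commit can replace field access `.0` with `[false]`.
#[cfg(test)]
#[test]
fn ordered_pair_is_order_insensitive() {
let a = OrderedPair::from((2u32, 1u32));
let b = OrderedPair::from((1u32, 2u32));
assert_eq!(a, b);
assert_eq!(a[false], 1);
assert_eq!(a[true], 2);
}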

View File

@ -0,0 +1,207 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
//
// SPDX-License-Identifier: MIT
//
//! per-node planar arrangement structures
//! * `NI`... type of node indices
//! * `EP`... type of etched path descriptor
use crate::{
mayrev::MaybeReversed,
utils::{handle_lifo_relaxed, rotate_iter},
RelaxedPath,
};
use alloc::vec::Vec;
/// Data about the other end of a path going through a planar arrangement
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
#[cfg_attr(
any(test, feature = "serde"),
derive(serde::Deserialize, serde::Serialize)
)]
pub struct OtherEnd {
/// the index / position of the target section
pub section_idx: usize,
/// the position to be used inside the target section
pub insert_pos: usize,
}
/// Find the appropriate other-end position (in `stop`) of a path
/// that starts between `pos - 1` and `pos` in `start`
/// (exactly at `pos` if `already_inserted_at_start` is set).
/// Returns `Option<(start_idx, stop_data)>`.
///
/// ## Edge cases
/// Note that this function "works" when `start == stop`, but to resolve the
/// resulting ambiguity it introduces a bias into further routes (currently,
/// new routes are inserted before rather than after this one),
/// so this case should be avoided entirely.
///
/// ## Failure
/// If the input data is invalid:
/// * the `start` or `stop` doesn't exist, or
/// * the `NI`s are non-unique (the node indices yielded by `this` must never contain duplicates), or
/// * the `[EP]`'s aren't LIFO-ordered as expected, or
/// * the `pos` is out of bounds,
///
/// this function returns `None`.
pub fn find_other_end<'a, NI, EP, CT, Iter, EPB>(
this: Iter,
start: &NI,
pos: usize,
already_inserted_at_start: bool,
stop: &NI,
) -> Option<(usize, OtherEnd)>
where
Iter: Clone
+ Iterator<Item = (NI, MaybeReversed<&'a EPB, RelaxedPath<EP, CT>>)>
+ core::iter::ExactSizeIterator,
NI: Eq,
EP: Clone + Eq + 'a,
EPB: core::borrow::Borrow<[RelaxedPath<EP, CT>]> + 'a,
{
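// bracket-matching stack: holds paths that have been entered but whose other
// side has not been seen yet during the counter-clockwise sweep
// (relies on the LIFO ordering invariant described above)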
let mut stack = Vec::new();
// iteration in counter-clockwise order
let (start_idx, mut it) = rotate_iter(
this.map(|(i, j)| (i, j.as_ref())).enumerate(),
move |(_, (i, _))| i == start,
);
// 1. handle start
{
let (_, (_, start_eps)) = it.next().unwrap();
if start == stop {
return Some((
start_idx,
OtherEnd {
section_idx: start_idx,
insert_pos: pos + 1,
},
));
}
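// replay the paths registered after `pos` in the start sector
// so that they take part in the bracket matching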
for i in start_eps
.as_ref()
.iter()
.skip(pos + usize::from(already_inserted_at_start))
{
handle_lifo_relaxed(&mut stack, i);
}
}
// 2. handle rest
for (nni, (ni, eps)) in it {
if &ni == stop {
// find insertion point (one of `eps.len()+1` positions)
return if stack.is_empty() {
Some((
start_idx,
OtherEnd {
section_idx: nni,
insert_pos: 0,
},
))
} else {
for (n, i) in eps.as_ref().iter().enumerate() {
handle_lifo_relaxed(&mut stack, i);
if stack.is_empty() {
return Some((
start_idx,
OtherEnd {
section_idx: nni,
insert_pos: n + 1,
},
));
}
}
None
};
} else {
for i in eps.as_ref().iter() {
handle_lifo_relaxed(&mut stack, i);
}
}
}
None
}
/// Find the appropriate other ends of a path
/// that starts between `pos - 1` and `pos` in `start`
/// (exactly at `pos` if `already_inserted_at_start` is set).
/// Returns `Option<(start_idx, iter)>`, where `iter` yields `(stop_ni, stop_data)` pairs.
///
/// ## Edge cases
/// Note that this function won't report possible entries with `start_idx == stop_idx`.
///
/// ## Failure
/// If the input data is invalid:
/// * the `start` doesn't exist, or
/// * the `NI`s are non-unique (the node indices yielded by `this` must never contain duplicates), or
/// * the `[EP]`'s aren't LIFO-ordered as expected, or
/// * the `pos` is out of bounds,
///
/// this function returns `None`.
pub fn find_all_other_ends<'a, NI, EP, CT, Iter, EPB>(
this: Iter,
start: &'a NI,
pos: usize,
already_inserted_at_start: bool,
) -> Option<(usize, impl Iterator<Item = (NI, OtherEnd)> + 'a)>
where
Iter: Clone
+ Iterator<Item = (NI, MaybeReversed<&'a EPB, RelaxedPath<EP, CT>>)>
+ core::iter::ExactSizeIterator
+ 'a,
NI: Clone + Eq,
EP: Clone + Eq + 'a,
CT: 'a,
EPB: core::borrow::Borrow<[RelaxedPath<EP, CT>]> + 'a,
{
let mut stack = Vec::new();
// iteration in counter-clockwise order
let (start_idx, mut it) = rotate_iter(this.enumerate(), move |(_, (i, _))| i == start);
// 1. handle start
{
let (_, (st_, start_eps)) = it.next().unwrap();
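// sanity check: after rotation the iterator must begin at `start`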
if &st_ != start {
panic!();
}
for i in start_eps
.iter()
.skip(pos + usize::from(already_inserted_at_start))
{
handle_lifo_relaxed(&mut stack, i);
}
}
// 2. handle rest
Some((
start_idx,
it.filter_map(move |(section_idx, (ni, eps))| {
// find possible insertion point
// (at most one of `eps.len()+1` positions)
let mut pos = if stack.is_empty() { Some(0) } else { None };
for (n, i) in eps.iter().enumerate() {
handle_lifo_relaxed(&mut stack, i);
if pos.is_none() && stack.is_empty() {
pos = Some(n + 1);
}
}
pos.map(|insert_pos| {
(
ni,
OtherEnd {
section_idx,
insert_pos,
},
)
})
}),
))
}
#[cfg(test)]
mod tests;

View File

@ -0,0 +1,5 @@
---
source: src/planarr.rs
expression: "(tmp.0, tmp.1.collect::<Vec<_>>())"
---
[1, [[2, {"section_idx": 2, "insert_pos": 0}], [0, {"section_idx": 0, "insert_pos": 1}]]]

View File

@ -0,0 +1,45 @@
---
source: src/planarr.rs
expression: s.0
---
[
[
0,
[
{
"Normal": "c"
},
{
"Normal": "b"
},
{
"Normal": "a"
}
]
],
[
1,
[
{
"Normal": "a"
},
{
"Normal": "b"
},
{
"Normal": "d"
}
]
],
[
2,
[
{
"Normal": "d"
},
{
"Normal": "c"
}
]
]
]

View File

@ -0,0 +1,5 @@
---
source: src/planarr.rs
expression: "(tmp.0, tmp.1.collect::<Vec<_>>())"
---
[0, [[1, {"section_idx": 1, "insert_pos": 2}], [2, {"section_idx": 2, "insert_pos": 0}]]]

View File

@ -0,0 +1,45 @@
---
source: src/planarr.rs
expression: s.0
---
[
[
0,
[
{
"Normal": "c"
},
{
"Normal": "b"
},
{
"Normal": "a"
}
]
],
[
1,
[
{
"Normal": "a"
},
{
"Normal": "b"
},
{
"Normal": "d"
}
]
],
[
2,
[
{
"Normal": "c"
},
{
"Normal": "d"
}
]
]
]

View File

@ -0,0 +1,150 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
//
// SPDX-License-Identifier: MIT
//
//! per-node planar arrangement structures
//! * `NI`... type of node indices
//! * `EP`... type of etched path descriptor
extern crate std;
use super::*;
use alloc::{boxed::Box, sync::Arc};
use insta::assert_compact_json_snapshot;
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
#[cfg_attr(feature = "serde", derive(serde::Deserialize, serde::Serialize))]
struct PlanarArrangement<NI, EP, CT>(
/// counter-clockwise (CCW) ordered sectors, containing CCW ordered paths
pub Box<[(NI, Arc<[RelaxedPath<EP, CT>]>)]>,
);
impl<NI: Clone, EP, CT> PlanarArrangement<NI, EP, CT> {
pub fn from_node_indices(idxs: impl Iterator<Item = NI>) -> Self {
let empty_edge: Arc<[RelaxedPath<EP, CT>]> = Arc::from(Vec::new().into_boxed_slice());
Self(idxs.map(|i| (i, empty_edge.clone())).collect())
}
pub fn as_mut(&mut self) -> PlanarArrangementRefMut<'_, NI, EP, CT> {
PlanarArrangementRefMut(
self.0
.iter_mut()
.map(|(i, j)| (i.clone(), MaybeReversed::new(j)))
.collect(),
)
}
}
struct PlanarArrangementRefMut<'a, NI, EP, CT>(
/// counter-clockwise (CCW) ordered sectors, containing CCW ordered paths
pub Box<
[(
NI,
MaybeReversed<&'a mut Arc<[RelaxedPath<EP, CT>]>, RelaxedPath<EP, CT>>,
)],
>,
);
impl<NI: Clone + Eq, EP: Clone + Eq, CT> PlanarArrangementRefMut<'_, NI, EP, CT> {
/// See [`find_other_end`].
#[inline(always)]
pub fn find_other_end(
&self,
start: &NI,
pos: usize,
already_inserted_at_start: bool,
stop: &NI,
) -> Option<(usize, OtherEnd)> {
find_other_end(
self.0.iter().map(|(i, j)| (i.clone(), j.as_ref())),
start,
pos,
already_inserted_at_start,
stop,
)
}
/// See [`find_all_other_ends`].
#[inline(always)]
pub fn find_all_other_ends<'a>(
&'a self,
start: &'a NI,
pos: usize,
already_inserted_at_start: bool,
) -> Option<(usize, impl Iterator<Item = (NI, OtherEnd)> + 'a)> {
find_all_other_ends(
self.0.iter().map(|(i, j)| (i.clone(), j.as_ref())),
start,
pos,
already_inserted_at_start,
)
}
/// Inserts a path into the current sector arrangement:
/// the path starts at position `pos_start` in sector `start`; the appropriate
/// other-end position in sector `stop` is then found and the path is inserted there as well.
///
/// See also [`find_other_end`]
/// (which does implement the search, look there for failure and edge cases).
///
/// If given valid input `self`, this function won't make `self` invalid.
///
/// On success, the result is the inverted stop position
/// (which can be passed to the next `insert_path` call on the neighboring `stop` node).
pub fn insert_path(
&mut self,
start: &NI,
pos_start: usize,
stop: &NI,
path: EP,
) -> Result<usize, EP>
where
CT: Clone,
{
match self.find_other_end(start, pos_start, false, stop) {
None => Err(path),
Some((idx_start, stop_data)) => {
let path = RelaxedPath::Normal(path);
self.0[idx_start].1.with_borrow_mut(|mut j| {
j.insert(pos_start, path.clone());
});
self.0[stop_data.section_idx].1.with_borrow_mut(|mut j| {
j.insert(stop_data.insert_pos, path);
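// report the stop-side insertion position counted from the opposite end,
// so it can be chained into an `insert_path` call on the neighboring node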
Ok(j.len() - stop_data.insert_pos - 1)
})
}
}
}
}
#[test]
fn simple00() {
let mut s = PlanarArrangement::<_, _, ()>::from_node_indices(0..3);
let mut s_ = s.as_mut();
assert_eq!(s_.insert_path(&0, 0, &1, 'a'), Ok(0));
assert_eq!(s_.insert_path(&0, 0, &1, 'b'), Ok(0));
{
let tmp = s_.find_all_other_ends(&0, 0, false).unwrap();
assert_compact_json_snapshot!((tmp.0, tmp.1.collect::<Vec<_>>()));
}
assert_eq!(s_.insert_path(&0, 0, &2, 'c'), Ok(0));
{
let tmp = s_.find_all_other_ends(&1, 2, false).unwrap();
assert_compact_json_snapshot!((tmp.0, tmp.1.collect::<Vec<_>>()));
}
assert_eq!(s_.insert_path(&1, 2, &2, 'd'), Ok(1));
assert_compact_json_snapshot!(s.0);
}
#[test]
fn simple01() {
let mut s = PlanarArrangement::<_, _, ()>::from_node_indices(0..3);
let mut s_ = s.as_mut();
s_.0[2].1.reversed = true;
assert_eq!(s_.insert_path(&0, 0, &1, 'a'), Ok(0));
assert_eq!(s_.insert_path(&0, 0, &1, 'b'), Ok(0));
assert_eq!(s_.insert_path(&0, 0, &2, 'c'), Ok(0));
assert_eq!(s_.insert_path(&1, 2, &2, 'd'), Ok(1));
assert_compact_json_snapshot!(s.0);
}

View File

@ -0,0 +1,50 @@
// SPDX-FileCopyrightText: 2025 Topola contributors
//
// SPDX-License-Identifier: MIT
use crate::RelaxedPath;
use alloc::vec::Vec;
fn handle_lifo<EP: Clone + Eq>(stack: &mut Vec<EP>, item: &EP) {
if stack.last() == Some(item) {
stack.pop();
} else {
stack.push(item.clone());
}
}
pub fn handle_lifo_relaxed<EP: Clone + Eq, CT>(stack: &mut Vec<EP>, item: &RelaxedPath<EP, CT>) {
match item {
RelaxedPath::Normal(item) => handle_lifo(stack, item),
RelaxedPath::Weak(_) => {}
}
}
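// A minimal sketch (illustration only, not part of the original file):
// `handle_lifo` performs bracket-style matching, so pushing the same path twice
// in LIFO order cancels out, while an unmatched path stays on the stack.
#[cfg(test)]
#[test]
fn handle_lifo_matches_like_brackets() {
let mut stack = Vec::new();
for item in ['a', 'b', 'b', 'a'] {
handle_lifo(&mut stack, &item);
}
assert!(stack.is_empty());
handle_lifo(&mut stack, &'c');
assert_eq!(stack, ['c']);
}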
/// Rotates a finite iterator so that it starts at the first item for which `is_start` returns true, and also returns that item's index
pub fn rotate_iter<Item, Iter, F>(iter: Iter, is_start: F) -> (usize, impl Iterator<Item = Item>)
where
Iter: Clone + Iterator<Item = Item>,
F: Clone + Fn(&Item) -> bool,
{
use peeking_take_while::PeekableExt;
let not_is_start = move |i: &Item| !is_start(i);
let mut it_first = iter.clone().peekable();
let start_idx = it_first
.by_ref()
.peeking_take_while(not_is_start.clone())
.count();
(start_idx, it_first.chain(iter.take_while(not_is_start)))
}
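// A minimal sketch (illustration only, not part of the original file): rotating
// `[3, 1, 4, 1, 5]` around the first element equal to 4 yields start index 2 and
// visits every element exactly once, starting at the match.
#[cfg(test)]
#[test]
fn rotate_iter_starts_at_match() {
let (start_idx, it) = rotate_iter([3, 1, 4, 1, 5].into_iter(), |&x| x == 4);
assert_eq!(start_idx, 2);
assert_eq!(it.collect::<Vec<_>>(), [4, 1, 5, 3, 1]);
}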
pub fn euclidean_distance<Scalar>(a: &spade::Point2<Scalar>, b: &spade::Point2<Scalar>) -> Scalar
where
Scalar: num_traits::Float,
{
let delta = spade::Point2 {
x: a.x - b.x,
y: a.y - b.y,
};
delta.y.hypot(delta.x)
}

View File

@ -0,0 +1,86 @@
// SPDX-FileCopyrightText: 2024 Topola contributors
//
// SPDX-License-Identifier: MIT
#![cfg(feature = "serde")]
use pie::{DualIndex, NavmeshIndex, RelaxedPath};
use planar_incr_embed as pie;
#[derive(
Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord, serde::Deserialize, serde::Serialize,
)]
enum PrimitiveIndex {
FixedDot(usize),
}
#[derive(
Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord, serde::Deserialize, serde::Serialize,
)]
enum LayoutNodeIndex {
Primitive(PrimitiveIndex),
}
#[derive(Clone)]
struct MyNavmeshBase;
type EtchedPath = pie::navmesh::EdgeIndex<LayoutNodeIndex>;
impl pie::NavmeshBase for MyNavmeshBase {
type PrimalNodeIndex = LayoutNodeIndex;
type EtchedPath = EtchedPath;
type GapComment = ();
type Scalar = f64;
}
type Navmesh = pie::navmesh::Navmesh<MyNavmeshBase>;
#[test]
fn tht_3pin_xlr_to_tht_3pin_xlr() {
// the intermediate navmesh
let navmesh_ser: pie::navmesh::NavmeshSer<MyNavmeshBase> = ron::from_str(
&std::fs::read_to_string("tests/tht_3pin_xlr_to_tht_3pin_xlr/navmesh_intermed.ron")
.unwrap(),
)
.unwrap();
let mut navmesh: Navmesh = navmesh_ser.into();
/*
let _goals: Vec<EtchedPath> = ron::from_str(
&std::fs::read_to_string("tests/tht_3pin_xlr_to_tht_3pin_xlr/goals.ron").unwrap(),
)
.unwrap();
*/
// execute the path of the second goal "2 -> 8"
let p2 = LayoutNodeIndex::Primitive(PrimitiveIndex::FixedDot(2));
let p8 = LayoutNodeIndex::Primitive(PrimitiveIndex::FixedDot(8));
let label = RelaxedPath::Normal(EtchedPath::from((p2, p8)));
let p2 = NavmeshIndex::Primal(p2);
let d5 = NavmeshIndex::Dual(DualIndex::Inner(5));
let d6 = NavmeshIndex::Dual(DualIndex::Inner(6));
let next_pos = navmesh
.as_ref()
.planarr_find_other_end(&d6, &p2, 0, false, &d5)
.unwrap()
.1;
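// thread the path through node d6: record it on both incident edges
// ((d6, p2) and (d6, d5)), then carry the inverted stop position over to the
// neighboring node d5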
let mut tmp = navmesh.as_mut();
tmp.edge_data_mut(d6, p2).unwrap().with_borrow_mut(|mut j| {
j.insert(0, label.clone());
});
let next_pos = tmp.edge_data_mut(d6, d5).unwrap().with_borrow_mut(|mut j| {
j.insert(next_pos.insert_pos, label);
j.len() - next_pos.insert_pos - 1
});
insta::assert_json_snapshot!(navmesh
.as_ref()
.planarr_find_all_other_ends(&d5, &d6, next_pos, true)
.unwrap()
.1
.collect::<Vec<_>>());
}

View File

@ -0,0 +1,43 @@
---
source: tests/complex.rs
expression: "navmesh.as_ref().planarr_find_all_other_ends(&d5, &d6, next_pos,\ntrue).unwrap().1.collect::<Vec<_>>()"
---
[
[
{
"Primal": {
"Primitive": {
"FixedDot": 0
}
}
},
{
"section_idx": 5,
"insert_pos": 0
}
],
[
{
"Dual": {
"Inner": 2
}
},
{
"section_idx": 2,
"insert_pos": 1
}
],
[
{
"Primal": {
"Primitive": {
"FixedDot": 4
}
}
},
{
"section_idx": 3,
"insert_pos": 0
}
]
]

View File

@ -0,0 +1,5 @@
[
(Primitive(FixedDot(0)), Primitive(FixedDot(6))),
(Primitive(FixedDot(2)), Primitive(FixedDot(8))),
(Primitive(FixedDot(4)), Primitive(FixedDot(10)))
]

View File

@ -0,0 +1,233 @@
(
nodes: {
Dual(Inner(1)): (
neighs: [
Primal(Primitive(FixedDot(2))),
Dual(Inner(6)),
Primal(Primitive(FixedDot(4))),
Dual(Inner(2)),
Primal(Primitive(FixedDot(10))),
Dual(Inner(3)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Dual(Inner(2)): (
neighs: [
Dual(Inner(5)),
Primal(Primitive(FixedDot(8))),
Dual(Inner(4)),
Primal(Primitive(FixedDot(10))),
Dual(Inner(1)),
Primal(Primitive(FixedDot(4))),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Dual(Inner(3)): (
neighs: [
Primal(Primitive(FixedDot(2))),
Dual(Inner(1)),
Primal(Primitive(FixedDot(10))),
Dual(Inner(4)),
Primal(Primitive(FixedDot(6))),
Dual(Outer(13)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Dual(Inner(4)): (
neighs: [
Primal(Primitive(FixedDot(10))),
Dual(Inner(2)),
Primal(Primitive(FixedDot(8))),
Dual(Outer(15)),
Primal(Primitive(FixedDot(6))),
Dual(Inner(3)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Dual(Inner(5)): (
neighs: [
Dual(Outer(19)),
Primal(Primitive(FixedDot(8))),
Dual(Inner(2)),
Primal(Primitive(FixedDot(4))),
Dual(Inner(6)),
Primal(Primitive(FixedDot(0))),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Dual(Inner(6)): (
neighs: [
Primal(Primitive(FixedDot(0))),
Dual(Inner(5)),
Primal(Primitive(FixedDot(4))),
Dual(Inner(1)),
Primal(Primitive(FixedDot(2))),
Dual(Outer(21)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Dual(Outer(13)): (
neighs: [
Dual(Outer(21)),
Dual(Inner(3)),
Primal(Primitive(FixedDot(6))),
Dual(Outer(15)),
],
pos: (x: 0.0, y: 0.0),
open_direction: Some((
x: -0.000000000014551915228366852,
y: -13970.0,
)),
),
Dual(Outer(15)): (
neighs: [
Primal(Primitive(FixedDot(8))),
Dual(Outer(19)),
Dual(Outer(13)),
Dual(Inner(4)),
],
pos: (x: 0.0, y: 0.0),
open_direction: Some((
x: -7620.0,
y: 635.0,
)),
),
Dual(Outer(19)): (
neighs: [
Dual(Outer(15)),
Dual(Inner(5)),
Primal(Primitive(FixedDot(0))),
Dual(Outer(21)),
],
pos: (x: 0.0, y: 0.0),
open_direction: Some((
x: 0.000000000014551915228366852,
y: 13970.0,
)),
),
Dual(Outer(21)): (
neighs: [
Dual(Outer(19)),
Dual(Inner(6)),
Primal(Primitive(FixedDot(2))),
Dual(Outer(13)),
],
pos: (x: 0.0, y: 0.0),
open_direction: Some((
x: 7620.0,
y: -635.0,
)),
),
Primal(Primitive(FixedDot(0))): (
neighs: [
Dual(Outer(19)),
Dual(Inner(5)),
Dual(Inner(6)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Primal(Primitive(FixedDot(2))): (
neighs: [
Dual(Outer(21)),
Dual(Inner(6)),
Dual(Inner(1)),
Dual(Inner(3)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Primal(Primitive(FixedDot(4))): (
neighs: [
Dual(Inner(6)),
Dual(Inner(5)),
Dual(Inner(2)),
Dual(Inner(1)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Primal(Primitive(FixedDot(6))): (
neighs: [
Dual(Inner(4)),
Dual(Outer(13)),
Dual(Inner(3)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Primal(Primitive(FixedDot(8))): (
neighs: [
Dual(Inner(2)),
Dual(Inner(5)),
Dual(Outer(15)),
Dual(Inner(4)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
Primal(Primitive(FixedDot(10))): (
neighs: [
Dual(Inner(2)),
Dual(Inner(4)),
Dual(Inner(3)),
Dual(Inner(1)),
],
pos: (x: 0.0, y: 0.0),
open_direction: None,
),
},
edges: {
(Dual(Inner(1)), Dual(Inner(2))): ((), []),
(Dual(Inner(1)), Dual(Inner(3))): ((), []),
(Dual(Inner(1)), Dual(Inner(6))): ((), []),
(Dual(Inner(1)), Primal(Primitive(FixedDot(2)))): ((), []),
(Dual(Inner(1)), Primal(Primitive(FixedDot(4)))): ((), []),
(Dual(Inner(1)), Primal(Primitive(FixedDot(10)))): ((), []),
(Dual(Inner(2)), Dual(Inner(4))): ((), [
Normal((Primitive(FixedDot(0)), Primitive(FixedDot(6)))),
]),
(Dual(Inner(2)), Dual(Inner(5))): ((), [
Normal((Primitive(FixedDot(0)), Primitive(FixedDot(6)))),
]),
(Dual(Inner(2)), Primal(Primitive(FixedDot(4)))): ((), []),
(Dual(Inner(2)), Primal(Primitive(FixedDot(8)))): ((), []),
(Dual(Inner(2)), Primal(Primitive(FixedDot(10)))): ((), []),
(Dual(Inner(3)), Dual(Inner(4))): ((), []),
(Dual(Inner(3)), Dual(Outer(13))): ((), []),
(Dual(Inner(3)), Primal(Primitive(FixedDot(2)))): ((), []),
(Dual(Inner(3)), Primal(Primitive(FixedDot(6)))): ((), []),
(Dual(Inner(3)), Primal(Primitive(FixedDot(10)))): ((), []),
(Dual(Inner(4)), Dual(Outer(15))): ((), []),
(Dual(Inner(4)), Primal(Primitive(FixedDot(6)))): ((), [
Normal((Primitive(FixedDot(0)), Primitive(FixedDot(6)))),
]),
(Dual(Inner(4)), Primal(Primitive(FixedDot(8)))): ((), []),
(Dual(Inner(4)), Primal(Primitive(FixedDot(10)))): ((), []),
(Dual(Inner(5)), Dual(Inner(6))): ((), []),
(Dual(Inner(5)), Dual(Outer(19))): ((), []),
(Dual(Inner(5)), Primal(Primitive(FixedDot(0)))): ((), [
Normal((Primitive(FixedDot(0)), Primitive(FixedDot(6)))),
]),
(Dual(Inner(5)), Primal(Primitive(FixedDot(4)))): ((), []),
(Dual(Inner(5)), Primal(Primitive(FixedDot(8)))): ((), []),
(Dual(Inner(6)), Dual(Outer(21))): ((), []),
(Dual(Inner(6)), Primal(Primitive(FixedDot(0)))): ((), []),
(Dual(Inner(6)), Primal(Primitive(FixedDot(2)))): ((), []),
(Dual(Inner(6)), Primal(Primitive(FixedDot(4)))): ((), []),
(Dual(Outer(13)), Dual(Outer(15))): ((), []),
(Dual(Outer(13)), Dual(Outer(21))): ((), []),
(Dual(Outer(13)), Primal(Primitive(FixedDot(6)))): ((), []),
(Dual(Outer(15)), Dual(Outer(19))): ((), []),
(Dual(Outer(15)), Primal(Primitive(FixedDot(8)))): ((), []),
(Dual(Outer(19)), Dual(Outer(21))): ((), []),
(Dual(Outer(19)), Primal(Primitive(FixedDot(0)))): ((), []),
(Dual(Outer(21)), Primal(Primitive(FixedDot(2)))): ((), []),
}
)

View File

@ -0,0 +1,59 @@
(
nodes:{
Dual(Inner(1)):(neighs:[Primal(Primitive(FixedDot(2))),Dual(Inner(6)),Primal(Primitive(FixedDot(4))),Dual(Inner(2)),Primal(Primitive(FixedDot(10))),Dual(Inner(3))],open_direction:None),
Dual(Inner(2)):(neighs:[Dual(Inner(5)),Primal(Primitive(FixedDot(8))),Dual(Inner(4)),Primal(Primitive(FixedDot(10))),Dual(Inner(1)),Primal(Primitive(FixedDot(4)))],open_direction:None),
Dual(Inner(3)):(neighs:[Primal(Primitive(FixedDot(2))),Dual(Inner(1)),Primal(Primitive(FixedDot(10))),Dual(Inner(4)),Primal(Primitive(FixedDot(6))),Dual(Outer(13))],open_direction:None),
Dual(Inner(4)):(neighs:[Primal(Primitive(FixedDot(10))),Dual(Inner(2)),Primal(Primitive(FixedDot(8))),Dual(Outer(15)),Primal(Primitive(FixedDot(6))),Dual(Inner(3))],open_direction:None),
Dual(Inner(5)):(neighs:[Dual(Outer(19)),Primal(Primitive(FixedDot(8))),Dual(Inner(2)),Primal(Primitive(FixedDot(4))),Dual(Inner(6)),Primal(Primitive(FixedDot(0)))],open_direction:None),
Dual(Inner(6)):(neighs:[Primal(Primitive(FixedDot(0))),Dual(Inner(5)),Primal(Primitive(FixedDot(4))),Dual(Inner(1)),Primal(Primitive(FixedDot(2))),Dual(Outer(21))],open_direction:None),
Dual(Outer(13)):(neighs:[Dual(Outer(21)),Dual(Inner(3)),Primal(Primitive(FixedDot(6))),Dual(Outer(15))],open_direction:Some((x:-0.000000000014551915228366852,y:-13970.0))),
Dual(Outer(15)):(neighs:[Primal(Primitive(FixedDot(8))),Dual(Outer(19)),Dual(Outer(13)),Dual(Inner(4))],open_direction:Some((x:-7620.0,y:635.0))),
Dual(Outer(19)):(neighs:[Dual(Outer(15)),Dual(Inner(5)),Primal(Primitive(FixedDot(0))),Dual(Outer(21))],open_direction:Some((x:0.000000000014551915228366852,y:13970.0))),
Dual(Outer(21)):(neighs:[Dual(Outer(19)),Dual(Inner(6)),Primal(Primitive(FixedDot(2))),Dual(Outer(13))],open_direction:Some((x:7620.0,y:-635.0))),
Primal(Primitive(FixedDot(0))):(neighs:[Dual(Outer(19)),Dual(Inner(5)),Dual(Inner(6))],open_direction:None),
Primal(Primitive(FixedDot(2))):(neighs:[Dual(Outer(21)),Dual(Inner(6)),Dual(Inner(1)),Dual(Inner(3))],open_direction:None),
Primal(Primitive(FixedDot(4))):(neighs:[Dual(Inner(6)),Dual(Inner(5)),Dual(Inner(2)),Dual(Inner(1))],open_direction:None),
Primal(Primitive(FixedDot(6))):(neighs:[Dual(Inner(4)),Dual(Outer(13)),Dual(Inner(3))],open_direction:None),
Primal(Primitive(FixedDot(8))):(neighs:[Dual(Inner(2)),Dual(Inner(5)),Dual(Outer(15)),Dual(Inner(4))],open_direction:None),
Primal(Primitive(FixedDot(10))):(neighs:[Dual(Inner(2)),Dual(Inner(4)),Dual(Inner(3)),Dual(Inner(1))],open_direction:None)
},
edges:{
(Dual(Inner(1)),Dual(Inner(2))):(paths:[],weight:13342.493438320249),
(Dual(Inner(1)),Dual(Inner(3))):(paths:[],weight:854.5080618564726),
(Dual(Inner(1)),Dual(Inner(6))):(paths:[],weight:9463.450638488148),
(Dual(Inner(1)),Primal(Primitive(FixedDot(2)))):(paths:[],weight:7255.50060560296),
(Dual(Inner(1)),Primal(Primitive(FixedDot(4)))):(paths:[],weight:7255.5006056029515),
(Dual(Inner(1)),Primal(Primitive(FixedDot(10)))):(paths:[],weight:7255.5006056029415),
(Dual(Inner(2)),Dual(Inner(4))):(paths:[(Primitive(FixedDot(0)),Primitive(FixedDot(6)))],weight:9463.450638488148),
(Dual(Inner(2)),Dual(Inner(5))):(paths:[(Primitive(FixedDot(0)),Primitive(FixedDot(6)))],weight:854.5080618564726),
(Dual(Inner(2)),Primal(Primitive(FixedDot(4)))):(paths:[(Primitive(FixedDot(4)),Primitive(FixedDot(10)))],weight:7255.5006056029415),
(Dual(Inner(2)),Primal(Primitive(FixedDot(8)))):(paths:[],weight:7255.50060560296),
(Dual(Inner(2)),Primal(Primitive(FixedDot(10)))):(paths:[(Primitive(FixedDot(4)),Primitive(FixedDot(10)))],weight:7255.5006056029515),
(Dual(Inner(3)),Dual(Inner(4))):(paths:[],weight:9794.094419218618),
(Dual(Inner(3)),Dual(Outer(13))):(paths:[],weight:10315.419947506583),
(Dual(Inner(3)),Primal(Primitive(FixedDot(2)))):(paths:[],weight:7883.284871174122),
(Dual(Inner(3)),Primal(Primitive(FixedDot(6)))):(paths:[],weight:7883.284871174128),
(Dual(Inner(3)),Primal(Primitive(FixedDot(10)))):(paths:[],weight:7883.284871174124),
(Dual(Inner(4)),Dual(Outer(15))):(paths:[],weight:7945.163814846131),
(Dual(Inner(4)),Primal(Primitive(FixedDot(6)))):(paths:[(Primitive(FixedDot(0)),Primitive(FixedDot(6)))],weight:3834.860957550632),
(Dual(Inner(4)),Primal(Primitive(FixedDot(8)))):(paths:[],weight:3834.8609575506225),
(Dual(Inner(4)),Primal(Primitive(FixedDot(10)))):(paths:[],weight:3834.8609575506225),
(Dual(Inner(5)),Dual(Inner(6))):(paths:[(Primitive(FixedDot(2)),Primitive(FixedDot(8)))],weight:9794.094419218618),
(Dual(Inner(5)),Dual(Outer(19))):(paths:[],weight:10315.419947506569),
(Dual(Inner(5)),Primal(Primitive(FixedDot(0)))):(paths:[(Primitive(FixedDot(0)),Primitive(FixedDot(6)))],weight:7883.284871174128),
(Dual(Inner(5)),Primal(Primitive(FixedDot(4)))):(paths:[],weight:7883.284871174124),
(Dual(Inner(5)),Primal(Primitive(FixedDot(8)))):(paths:[(Primitive(FixedDot(2)),Primitive(FixedDot(8)))],weight:7883.284871174122),
(Dual(Inner(6)),Dual(Outer(21))):(paths:[],weight:7945.163814846131),
(Dual(Inner(6)),Primal(Primitive(FixedDot(0)))):(paths:[],weight:3834.860957550632),
(Dual(Inner(6)),Primal(Primitive(FixedDot(2)))):(paths:[(Primitive(FixedDot(2)),Primitive(FixedDot(8)))],weight:3834.8609575506225),
(Dual(Inner(6)),Primal(Primitive(FixedDot(4)))):(paths:[],weight:3834.8609575506225),
(Dual(Outer(13)),Dual(Outer(15))):(paths:[],weight:23307.614233335862),
(Dual(Outer(13)),Dual(Outer(21))):(paths:[],weight:22729.54093795121),
(Dual(Outer(13)),Primal(Primitive(FixedDot(6)))):(paths:[],weight:15618.934822836045),
(Dual(Outer(15)),Dual(Outer(19))):(paths:[],weight:22729.5409379512),
(Dual(Outer(15)),Primal(Primitive(FixedDot(8)))):(paths:[],weight:8548.949131326026),
(Dual(Outer(19)),Dual(Outer(21))):(paths:[],weight:23307.61423333585),
(Dual(Outer(19)),Primal(Primitive(FixedDot(0)))):(paths:[],weight:15618.93482283603),
(Dual(Outer(21)),Primal(Primitive(FixedDot(2)))):(paths:[],weight:8548.949131326026)
}
)

View File

@ -178,19 +178,11 @@ impl SpecctraMesadata {
}
impl AccessRules for SpecctraMesadata {
fn clearance(&self, conditions1: &Conditions, conditions2: &Conditions) -> f64 {
let (Some(net1), Some(net2)) = (conditions1.maybe_net, conditions2.maybe_net) else {
return 0.0;
};
fn clearance(&self, conditions1: &Conditions<'_>, conditions2: &Conditions<'_>) -> f64 {
let clr1 = self.get_rule(conditions1.net).clearance;
let clr2 = self.get_rule(conditions2.net).clearance;
let clr1 = self.get_rule(net1).clearance;
let clr2 = self.get_rule(net2).clearance;
if clr1 > clr2 {
clr1
} else {
clr2
}
f64::max(clr1, clr2)
}
fn largest_clearance(&self, _maybe_net: Option<usize>) -> f64 {

View File

@ -2,18 +2,35 @@
//
// SPDX-License-Identifier: MIT
pub trait GetConditions {
fn conditions(&self) -> Conditions;
use std::borrow::Cow;
pub trait GetConditions<'a> {
fn conditions(self) -> Option<Conditions<'a>>;
}
#[derive(Debug, Default)]
pub struct Conditions {
pub maybe_net: Option<usize>,
pub maybe_region: Option<String>,
pub maybe_layer: Option<String>,
#[derive(
Clone,
Debug,
Default,
PartialEq,
Eq,
PartialOrd,
Ord,
Hash,
serde::Deserialize,
serde::Serialize,
)]
pub struct Conditions<'a> {
pub net: usize,
#[serde(borrow)]
pub maybe_region: Option<Cow<'a, str>>,
#[serde(borrow)]
pub maybe_layer: Option<Cow<'a, str>>,
}
pub trait AccessRules {
fn clearance(&self, conditions1: &Conditions, conditions2: &Conditions) -> f64;
fn clearance(&self, conditions1: &Conditions<'_>, conditions2: &Conditions<'_>) -> f64;
fn largest_clearance(&self, net: Option<usize>) -> f64;
}

View File

@ -43,7 +43,7 @@ impl MeasureLengthExecutionStepper {
let mut length = 0.0;
for selector in self.selection.selectors() {
let band = autorouter.board.bandname_band(&selector.band).unwrap().0;
let band = autorouter.board.bandname_band(&selector.band).unwrap()[false];
length += band.ref_(autorouter.board.layout().drawing()).length();
}

View File

@ -41,7 +41,7 @@ impl RemoveBandsExecutionStepper {
let mut edit = LayoutEdit::new();
for selector in self.selection.selectors() {
let band = autorouter.board.bandname_band(&selector.band).unwrap().0;
let band = autorouter.board.bandname_band(&selector.band).unwrap()[false];
autorouter.board.layout_mut().remove_band(&mut edit, band);
}
Ok(Some(edit))

View File

@ -8,11 +8,9 @@
pub use specctra_core::mesadata::AccessMesadata;
use std::{cmp::Ordering, collections::BTreeMap};
use bimap::BiBTreeMap;
use derive_getters::Getters;
use serde::{Deserialize, Serialize};
use std::collections::BTreeMap;
use crate::{
drawing::{
@ -33,21 +31,7 @@ use crate::{
};
/// Represents a band between two pins.
#[derive(Clone, Debug, Deserialize, Eq, PartialOrd, Ord, PartialEq, Serialize)]
pub struct BandName(String, String);
impl BandName {
/// Creates a new [`BandName`] and manages their order.
///
/// This function ensures that the two pin names are sorted in lexicographical order, so that the smaller name always comes first.
pub fn new(pinname1: String, pinname2: String) -> Self {
if pinname1.cmp(&pinname2) == Ordering::Greater {
BandName(pinname2, pinname1)
} else {
BandName(pinname1, pinname2)
}
}
}
pub type BandName = planar_incr_embed::navmesh::OrderedPair<String>;
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub enum ResolvedSelector<'a> {
@ -284,14 +268,17 @@ impl<M: AccessMesadata> Board<M> {
.unwrap()
.to_string();
self.band_bandname
.insert(band, BandName::new(source_pinname, target_pinname));
.insert(band, BandName::from((source_pinname, target_pinname)));
}
/// Finds a band between two pin names.
pub fn band_between_pins(&self, pinname1: &str, pinname2: &str) -> Option<BandUid> {
self.band_bandname
// note: it doesn't matter in what order pinnames are given, the constructor sorts them
.get_by_right(&BandName::new(pinname1.to_string(), pinname2.to_string()))
.get_by_right(&BandName::from((
pinname1.to_string(),
pinname2.to_string(),
)))
.copied()
}

View File

@ -2,7 +2,6 @@
//
// SPDX-License-Identifier: MIT
use core::{cmp, hash};
use enum_dispatch::enum_dispatch;
use petgraph::stable_graph::NodeIndex;
@ -20,54 +19,10 @@ use super::{
Drawing,
};
#[derive(Clone, Copy, Debug)]
pub struct BandUid(pub BandTermsegIndex, pub BandTermsegIndex);
impl BandUid {
pub fn new(first_seg1: BandTermsegIndex, first_seg2: BandTermsegIndex) -> Self {
if first_seg1.petgraph_index() <= first_seg2.petgraph_index() {
BandUid(first_seg1, first_seg2)
} else {
BandUid(first_seg2, first_seg1)
}
}
}
impl PartialEq for BandUid {
fn eq(&self, other: &Self) -> bool {
self.0.petgraph_index() == other.0.petgraph_index()
&& self.1.petgraph_index() == other.1.petgraph_index()
}
}
impl Eq for BandUid {}
impl hash::Hash for BandUid {
fn hash<H: hash::Hasher>(&self, state: &mut H) {
self.0.petgraph_index().hash(state);
self.1.petgraph_index().hash(state);
}
}
impl cmp::PartialOrd for BandUid {
fn partial_cmp(&self, other: &Self) -> Option<cmp::Ordering> {
Some(self.cmp(other))
}
}
impl cmp::Ord for BandUid {
fn cmp(&self, other: &Self) -> cmp::Ordering {
use cmp::Ordering as O;
match self.0.petgraph_index().cmp(&other.0.petgraph_index()) {
O::Less => O::Less,
O::Greater => O::Greater,
O::Equal => self.1.petgraph_index().cmp(&other.1.petgraph_index()),
}
}
}
pub type BandUid = planar_incr_embed::navmesh::OrderedPair<BandTermsegIndex>;
#[enum_dispatch(GetPetgraphIndex)]
#[derive(Clone, Copy, Debug, Eq, Ord, PartialEq, PartialOrd)]
#[derive(Clone, Copy, Debug, Eq, Ord, PartialEq, PartialOrd, Hash)]
pub enum BandTermsegIndex {
Straight(LoneLooseSegIndex),
Bended(SeqLooseSegIndex),

View File

@ -27,10 +27,10 @@ pub trait Collect {
impl<CW: Copy, R: AccessRules> Collect for Drawing<CW, R> {
fn loose_band_uid(&self, start_loose: LooseIndex) -> BandUid {
BandUid::new(
BandUid::from((
self.loose_band_first_seg(start_loose),
self.loose_band_last_seg(start_loose),
)
))
}
fn bend_bow(&self, bend: LooseBendIndex) -> Vec<PrimitiveIndex> {

View File

@ -981,8 +981,12 @@ impl<CW: Copy, R: AccessRules> Drawing<CW, R> {
let epsilon = 1.0;
inflated_shape = node.primitive(self).shape().inflate(
(self.rules.clearance(&conditions, &infringee_conditions) - epsilon)
.clamp(0.0, f64::INFINITY),
match (&conditions, infringee_conditions) {
(None, _) | (_, None) => 0.0,
(Some(lhs), Some(rhs)) => {
(self.rules.clearance(lhs, &rhs) - epsilon).clamp(0.0, f64::INFINITY)
}
},
);
inflated_shape

View File

@ -97,7 +97,8 @@ impl<CW: Copy, R: AccessRules> Guide for Drawing<CW, R> {
width: f64,
) -> Result<(Line, Line), NoTangents> {
let from_circle = self.head_circle(head, width);
let to_circle = self.dot_circle(around, width, &self.conditions(head.face().into()));
let to_circle =
self.dot_circle(around, width, self.conditions(head.face().into()).as_ref());
let from_cw = self.head_cw(head);
let tangents: Vec<Line> =
@ -113,16 +114,17 @@ impl<CW: Copy, R: AccessRules> Guide for Drawing<CW, R> {
width: f64,
) -> Result<Line, NoTangents> {
let from_circle = self.head_circle(head, width);
let to_circle = self.dot_circle(around, width, &self.conditions(head.face().into()));
let to_circle =
self.dot_circle(around, width, self.conditions(head.face().into()).as_ref());
let from_cw = self.head_cw(head);
math::tangent_segment(from_circle, from_cw, to_circle, Some(cw))
}
fn head_around_dot_offset(&self, head: &Head, around: DotIndex, _width: f64) -> f64 {
self.rules().clearance(
&self.conditions(around.into()),
&self.conditions(head.face().into()),
self.clearance(
self.conditions(around.into()).as_ref(),
self.conditions(head.face().into()).as_ref(),
)
}
@ -133,7 +135,8 @@ impl<CW: Copy, R: AccessRules> Guide for Drawing<CW, R> {
width: f64,
) -> Result<(Line, Line), NoTangents> {
let from_circle = self.head_circle(head, width);
let to_circle = self.bend_circle(around, width, &self.conditions(head.face().into()));
let to_circle =
self.bend_circle(around, width, self.conditions(head.face().into()).as_ref());
let from_cw = self.head_cw(head);
let tangents: Vec<Line> =
@ -149,16 +152,17 @@ impl<CW: Copy, R: AccessRules> Guide for Drawing<CW, R> {
width: f64,
) -> Result<Line, NoTangents> {
let from_circle = self.head_circle(head, width);
let to_circle = self.bend_circle(around, width, &self.conditions(head.face().into()));
let to_circle =
self.bend_circle(around, width, self.conditions(head.face().into()).as_ref());
let from_cw = self.head_cw(head);
math::tangent_segment(from_circle, from_cw, to_circle, Some(cw))
}
fn head_around_bend_offset(&self, head: &Head, around: BendIndex, _width: f64) -> f64 {
self.rules().clearance(
&self.conditions(head.face().into()),
&self.conditions(around.into()),
self.clearance(
self.conditions(head.face().into()).as_ref(),
self.conditions(around.into()).as_ref(),
)
}
@ -196,18 +200,37 @@ impl<CW: Copy, R: AccessRules> Guide for Drawing<CW, R> {
}
trait GuidePrivate {
fn clearance(&self, lhs: Option<&Conditions<'_>>, rhs: Option<&Conditions<'_>>) -> f64;
fn head_circle(&self, head: &Head, width: f64) -> Circle;
fn bend_circle(&self, bend: BendIndex, width: f64, guide_conditions: &Conditions) -> Circle;
fn bend_circle(
&self,
bend: BendIndex,
width: f64,
guide_conditions: Option<&Conditions<'_>>,
) -> Circle;
fn dot_circle(&self, dot: DotIndex, width: f64, guide_conditions: &Conditions) -> Circle;
fn dot_circle(
&self,
dot: DotIndex,
width: f64,
guide_conditions: Option<&Conditions<'_>>,
) -> Circle;
fn rear(&self, head: CaneHead) -> DotIndex;
fn conditions(&self, node: PrimitiveIndex) -> Conditions;
fn conditions(&self, node: PrimitiveIndex) -> Option<Conditions<'_>>;
}
impl<CW: Copy, R: AccessRules> GuidePrivate for Drawing<CW, R> {
fn clearance(&self, lhs: Option<&Conditions<'_>>, rhs: Option<&Conditions<'_>>) -> f64 {
match (lhs, rhs) {
(None, _) | (_, None) => 0.0,
(Some(lhs), Some(rhs)) => self.rules().clearance(lhs, rhs),
}
}
fn head_circle(&self, head: &Head, width: f64) -> Circle {
match *head {
Head::Bare(head) => Circle {
@ -216,19 +239,28 @@ impl<CW: Copy, R: AccessRules> GuidePrivate for Drawing<CW, R> {
},
Head::Cane(head) => {
if let Some(inner) = self.primitive(head.cane.bend).inner() {
self.bend_circle(inner.into(), width, &self.conditions(head.face().into()))
self.bend_circle(
inner.into(),
width,
self.conditions(head.face().into()).as_ref(),
)
} else {
self.dot_circle(
self.primitive(head.cane.bend).core().into(),
width,
&self.conditions(head.face().into()),
self.conditions(head.face().into()).as_ref(),
)
}
}
}
}
fn bend_circle(&self, bend: BendIndex, width: f64, guide_conditions: &Conditions) -> Circle {
fn bend_circle(
&self,
bend: BendIndex,
width: f64,
guide_conditions: Option<&Conditions<'_>>,
) -> Circle {
let outer_circle = match bend.primitive(self).shape() {
PrimitiveShape::Bend(shape) => shape.outer_circle(),
_ => unreachable!(),
@ -238,21 +270,22 @@ impl<CW: Copy, R: AccessRules> GuidePrivate for Drawing<CW, R> {
pos: outer_circle.pos,
r: outer_circle.r
+ width / 2.0
+ self
.rules()
.clearance(&self.conditions(bend.into()), guide_conditions),
+ self.clearance(self.conditions(bend.into()).as_ref(), guide_conditions),
}
}
fn dot_circle(&self, dot: DotIndex, width: f64, guide_conditions: &Conditions) -> Circle {
fn dot_circle(
&self,
dot: DotIndex,
width: f64,
guide_conditions: Option<&Conditions<'_>>,
) -> Circle {
let shape = dot.primitive(self).shape();
Circle {
pos: shape.center(),
r: shape.width() / 2.0
+ width / 2.0
+ self
.rules()
.clearance(&self.conditions(dot.into()), guide_conditions),
+ self.clearance(self.conditions(dot.into()).as_ref(), guide_conditions),
}
}
@ -261,7 +294,7 @@ impl<CW: Copy, R: AccessRules> GuidePrivate for Drawing<CW, R> {
.other_joint(head.cane.dot.into())
}
fn conditions(&self, node: PrimitiveIndex) -> Conditions {
fn conditions(&self, node: PrimitiveIndex) -> Option<Conditions<'_>> {
node.primitive(self).conditions()
}
}

View File

@ -12,13 +12,12 @@ use crate::{
dot::{DotIndex, LooseDotIndex},
graph::{MakePrimitive, PrimitiveIndex},
primitive::{GetJoints, LoneLooseSeg, LooseBend, LooseDot, Primitive, SeqLooseSeg},
rules::AccessRules,
seg::{LoneLooseSegIndex, SeqLooseSegIndex},
},
graph::GetPetgraphIndex,
};
use super::rules::AccessRules;
#[enum_dispatch]
pub trait GetPrevNextLoose {
fn next_loose(&self, maybe_prev: Option<LooseIndex>) -> Option<LooseIndex>;

View File

@ -10,6 +10,7 @@ use crate::{
bend::{BendIndex, FixedBendWeight, LooseBendIndex, LooseBendWeight},
dot::{DotIndex, DotWeight, FixedDotIndex, FixedDotWeight, LooseDotIndex, LooseDotWeight},
graph::{GetMaybeNet, PrimitiveIndex, PrimitiveWeight},
rules::{AccessRules, Conditions, GetConditions},
seg::{FixedSegWeight, LoneLooseSegWeight, SegIndex, SeqLooseSegIndex, SeqLooseSegWeight},
Drawing,
},
@ -17,8 +18,6 @@ use crate::{
graph::{GenericIndex, GetPetgraphIndex},
};
use specctra_core::rules::{AccessRules, Conditions, GetConditions};
#[enum_dispatch]
pub trait GetDrawing<'a, R: AccessRules> {
fn drawing(&self) -> &Drawing<impl Copy, R>;
@ -171,16 +170,16 @@ pub enum Primitive<'a, CW: Copy, R: AccessRules> {
LooseBend(LooseBend<'a, CW, R>),
}
impl<'a, CW: Copy, R: AccessRules> specctra_core::rules::GetConditions for Primitive<'a, CW, R> {
fn conditions(&self) -> specctra_core::rules::Conditions {
impl<'a, CW: Copy, R: AccessRules> GetConditions<'a> for &Primitive<'a, CW, R> {
fn conditions(self) -> Option<Conditions<'a>> {
match self {
Self::FixedDot(x) => x.conditions(),
Self::LooseDot(x) => x.conditions(),
Self::FixedSeg(x) => x.conditions(),
Self::LoneLooseSeg(x) => x.conditions(),
Self::SeqLooseSeg(x) => x.conditions(),
Self::FixedBend(x) => x.conditions(),
Self::LooseBend(x) => x.conditions(),
Primitive::FixedDot(x) => x.conditions(),
Primitive::LooseDot(x) => x.conditions(),
Primitive::FixedSeg(x) => x.conditions(),
Primitive::LoneLooseSeg(x) => x.conditions(),
Primitive::SeqLooseSeg(x) => x.conditions(),
Primitive::FixedBend(x) => x.conditions(),
Primitive::LooseBend(x) => x.conditions(),
}
}
}
@ -240,16 +239,16 @@ where
}
}
impl<'a, W, CW: Copy, R: AccessRules> GetConditions for GenericPrimitive<'a, W, CW, R>
impl<'a, W, CW: Copy, R: AccessRules> GetConditions<'a> for &GenericPrimitive<'a, W, CW, R>
where
GenericPrimitive<'a, W, CW, R>: GetMaybeNet,
{
fn conditions(&self) -> Conditions {
Conditions {
maybe_net: self.maybe_net(),
maybe_region: Some("A".to_string()),
maybe_layer: Some("F.Cu".to_string()),
}
fn conditions(self) -> Option<Conditions<'a>> {
self.maybe_net().map(|net| Conditions {
net,
maybe_region: Some("A".into()),
maybe_layer: Some("F.Cu".into()),
})
}
}

View File

@ -80,6 +80,13 @@ impl<W> Ord for GenericIndex<W> {
}
}
impl<W> core::hash::Hash for GenericIndex<W> {
#[inline]
fn hash<H: core::hash::Hasher>(&self, state: &mut H) {
self.node_index.hash(state);
}
}
impl<W> GetPetgraphIndex for GenericIndex<W> {
#[inline]
fn petgraph_index(&self) -> NodeIndex<usize> {

View File

@ -205,7 +205,7 @@ pub fn assert_band_length(
rel_err: f64,
) {
let band = board.band_between_pins(source_pin, target_pin).unwrap();
let band_length = band.0.ref_(board.layout().drawing()).length();
let band_length = band[false].ref_(board.layout().drawing()).length();
assert!(
(band_length - expected_length).abs() < expected_length * rel_err,
"band_length = {}, expected_length = {}, epsilon = {}",