commit e18997ad9a4924615484ec5ed7c551fb41e3bd04
parent 51f7e4b8e7e44266271278a494031450e6b5813b
Author: Antoine Amarilli <a3nm@a3nm.net>
Date: Mon, 26 May 2025 12:22:16 +0200
commit with codex
Diffstat:
21 files changed, 154 insertions(+), 3 deletions(-)
diff --git a/aggregation b/aggregation
@@ -5,3 +5,5 @@
See also: [quantile], [median], [mean], [sum]
Up: [database_theory]
+
+Aliases: aggregate
diff --git a/bounded_language b/bounded_language
@@ -0,0 +1,7 @@
+# Bounded language
+
+A [formal_language] L is *bounded* if there are words w_1, ..., w_k such that L is included in w_1^* ... w_k^*
+
+Up: [formal_language]
+
+See also: [polyslender], [CFL]
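To make the shape concrete, here is a small Python sketch (illustration only, not part of the note): it tests whether a single word fits the pattern w_1^* ... w_k^*. For instance, the non-regular language {a^n b^n} is bounded since it is included in a^* b^*.

```python
# Illustrative only: test whether a word belongs to w_1^* w_2^* ... w_k^*
# (this checks the shape for one word; it does not decide boundedness of a language)
import re

def in_bounded_shape(word, ws):
    pattern = "".join("(?:" + re.escape(w) + ")*" for w in ws)
    return re.fullmatch(pattern, word) is not None

assert in_bounded_shape("aaabbb", ["a", "b"])      # a^3 b^3 is in a^* b^*
assert not in_bounded_shape("ababab", ["a", "b"])  # (ab)^3 is not in a^* b^*
```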
diff --git a/c2 b/c2
@@ -12,3 +12,5 @@ Turns out to be the expressive power of [Weisfeiler_Leman]
subclass: [gc2]
Up: [logic]
+
+See also: [FO2_plus_C]
diff --git a/color_refinement_power b/color_refinement_power
@@ -0,0 +1,5 @@
+# Color refinement power
+
+Color refinement works on "almost all graphs": it yields a canonical labeling, and hence decides [graph_isomorphism], on almost all graphs, cf [babai1980random]
+
+Up: [weisfeiler_leman]
diff --git a/conjunction b/conjunction
@@ -2,6 +2,7 @@
- [conjunctive_query]
- [conjunctive_normal_form]
+- [imbalanced_conjunction]
Up: [boolean_operator]
diff --git a/context_free_grammar_unambiguous b/context_free_grammar_unambiguous
@@ -8,4 +8,4 @@ Up: [unambiguity], [context_free_grammar]
See also: [inherently_ambiguous]
-Aliases: unambiguous CFG, unambiguous CFGs
+Aliases: unambiguous CFG, unambiguous CFGs, CFG unambiguous, uCFG, uCFGs
diff --git a/fo2 b/fo2
@@ -1,6 +1,6 @@
# Fo2
-[FOk] with two variables
+[FOk] with [two_variables]
extensions:
- [c2] with [counting_quantifier]
diff --git a/fok b/fok
@@ -21,3 +21,5 @@
- finding the best k: [fok_rewriting]
Up: [first_order_logic]
+
+See also: [two_variables]
diff --git a/gnn_recurrent b/gnn_recurrent
@@ -0,0 +1,9 @@
+# GNN recurrent
+
+A [GNN] that consists of a single layer applied repeatedly: only one [combination_function], one [aggregation_function] for the [neighbors], and one [aggregation_function] for [global_aggregation]
+
+For termination, we can designate a component of the node state with which every node indicates whether its computation is finished; when computation is finished for every node, we apply the [readout_function]
+
+[GNN_recurrent_expressiveness]
+
+Up: [fmt_2025_martin_grohe]
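A hypothetical sketch of such a recurrent GNN in Python (the function names, and the convention that the last component of the node state is the termination flag, are illustrative assumptions, not part of the note; global aggregation is omitted for brevity):

```python
import numpy as np

def recurrent_gnn(adj, states, combine, aggregate, readout, max_rounds=100):
    """adj: list of neighbor lists; states: (n, d) array; the last state component
    is the assumed termination flag; combine/aggregate/readout are user-supplied."""
    n, d = states.shape
    for _ in range(max_rounds):
        # the same single layer (one aggregation + one combination function) every round
        agg = np.stack([aggregate(states[adj[v]]) if adj[v] else np.zeros(d)
                        for v in range(n)])
        states = combine(states, agg)
        if np.all(states[:, -1] > 0.5):  # every node signals that it is finished
            break
    return readout(states)
```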
diff --git a/graph_connected b/graph_connected
@@ -0,0 +1,9 @@
+# Graph connected
+
+A [graph] ([directed_graph] or [undirected_graph]) having only one [connected_component]
+
+See also: [strongly_connected_graph]
+
+Up: [graph]
+
+Aliases: connected graph, connected graphs
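As a small illustration (assuming, as in the weakly_connected_graph note, that connectedness of a directed graph refers to the underlying undirected graph), a graph is connected exactly when a breadth-first search from any vertex reaches all vertices:

```python
from collections import deque

def is_connected(n, edges):
    """Vertices 0..n-1; edge direction is ignored, so for directed graphs
    this checks that there is only one weakly connected component."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    if n == 0:
        return True
    seen, queue = {0}, deque([0])
    while queue:
        u = queue.popleft()
        for w in adj[u] - seen:
            seen.add(w)
            queue.append(w)
    return len(seen) == n

assert is_connected(3, [(0, 1), (1, 2)])
assert not is_connected(3, [(0, 1)])
```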
diff --git a/graph_neural_networks b/graph_neural_networks
@@ -0,0 +1,25 @@
+# Graph neural networks (GNN)
+
+A kind of [distributed_algorithm] on a [graph], related to [message_passing], to compute a [function]
+
+- initially every node of the graph is mapped to a vector of [real_numbers]
+- at every step, you [aggregate] coordinatewise the states of the [neighbors], and then apply a [combination_function] to the previous node state and the aggregated state of the neighbors (see the sketch below)
+ - the [aggregation] function can be [sum], [mean], or [max] (coordinatewise)
+ - the [combination_function] is computed by a [feedforward_neural_network] (multilayer [perceptron])
+
+Extensions:
+- [global_aggregation]: add a value, obtained by aggregating the states of all nodes, which is passed to the [combination_function]
+- [random_initialization]: include in the initial label of the nodes one [random] number
+
+The network computes a value at every node. We can also use a [readout_function] to get a final value for the entire graph: it is computed by a [feedforward_neural_network] from the result of applying an [aggregation_function] to the states of all nodes
+
+Observation:
+- On two [graphs] that are [graph_isomorphic], the value of the function computed by a GNN is the same
+
+- [graph_neural_network_expressiveness]
+
+Other model: [GNN_recurrent]
+
+Up: [graph], [neural_network]
+
+Aliases: graph neural network, GNN, GNNs
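The sketch announced above: one message-passing round in Python with coordinatewise sum aggregation and, as the combination function, a one-layer network with ReLU activation (the parameter names W_self, W_agg, bias are illustrative; a real combination function would be a multilayer perceptron):

```python
import numpy as np

def gnn_round(adj, states, W_self, W_agg, bias):
    """One round: adj maps each node to its neighbor list, states is an (n, d) array."""
    n, d = states.shape
    agg = np.zeros((n, d))
    for v, neighbors in adj.items():
        if neighbors:
            agg[v] = states[neighbors].sum(axis=0)  # coordinatewise sum (could be mean or max)
    return np.maximum(0.0, states @ W_self + agg @ W_agg + bias)  # ReLU of an affine combination

def readout(states):
    """Final value for the whole graph: here, sum all node states;
    a real readout would then apply a feedforward network to this aggregate."""
    return states.sum(axis=0)
```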
diff --git a/imbalanced_conjunction b/imbalanced_conjunction
@@ -0,0 +1,5 @@
+# Imbalanced conjunction
+
+[razgon2025decision]
+
+Up: [conjunction]
diff --git a/information_extraction_declarative b/information_extraction_declarative
@@ -1,5 +1,7 @@
# Information extraction declarative
- [spanner]
+- [annotation_transducer]
+- general architecture: [schmid2025general]
Up: [information_extraction], [declarative]
diff --git a/neural_network b/neural_network
@@ -0,0 +1,11 @@
+# Neural network
+
+- [neural_network_compression]
+- [neural_network_architecture]
+- [graph_neural_network]
+ - [message_passing_graph_neural_network]
+- [neural_network_expressiveness]
+- [neural_network_feedforward]
+ - [genann] small library for [neural_network_feedforward] in [c]
+
+Up: [machine_learning]
diff --git a/neural_network_expressiveness b/neural_network_expressiveness
@@ -0,0 +1,6 @@
+# Neural network expressiveness
+
+- [pablo_keynote]
+- [svete2024efficiently]
+
+Up: [neural_network], [expressiveness]
diff --git a/neural_network_feedforward b/neural_network_feedforward
@@ -0,0 +1,9 @@
+# Neural network feedforward
+
+A [DAG] of weighted nodes and edges: every node gets a value, which is the result of applying an [activation_function] (typically [ReLu]) to the sum of the node weight and of a [linear_combination] of the values of its predecessor nodes weighted by the weights of the edges
+
+- [universal_approximation_theorem]
+
+Up: [neural_network]
+
+Aliases: feedforward neural network
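A minimal sketch of this evaluation in Python (the names and the tiny example network are made up for illustration): nodes carry their weight (bias), edges carry weights, and values are propagated in topological order of the DAG.

```python
def evaluate(biases, edges, inputs, activation=lambda x: max(0.0, x)):
    """biases: {node: node weight}; edges: {(src, dst): weight}; inputs: {source node: value}."""
    preds = {v: [] for v in biases}
    for (u, v), w in edges.items():
        preds[v].append((u, w))
    values = dict(inputs)
    remaining = [v for v in biases if v not in values]
    while remaining:  # naive topological-order evaluation of the DAG
        for v in list(remaining):
            if all(u in values for u, _ in preds[v]):
                values[v] = activation(biases[v] + sum(w * values[u] for u, w in preds[v]))
                remaining.remove(v)
    return values

# one hidden node computing relu(0.5*x1 + 0.5*x2 - 0.2)
vals = evaluate({"x1": 0.0, "x2": 0.0, "h": -0.2},
                {("x1", "h"): 0.5, ("x2", "h"): 0.5},
                {"x1": 1.0, "x2": 0.0})
assert abs(vals["h"] - 0.3) < 1e-9
```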
diff --git a/query_containment_problem b/query_containment_problem
@@ -0,0 +1,9 @@
+# Query containment problem
+
+[Computational_problem] of deciding [query_containment]
+
+Up: [query_containment]
+
+Aliases: QCP
+
+See also: [bag_query_containment]
diff --git a/randomness b/randomness
@@ -16,3 +16,5 @@
Up: [mathematics]
See also: [probabilities]
+
+Aliases: random
diff --git a/relu b/relu
@@ -0,0 +1,7 @@
+# ReLu
+
+rectified linear function: relu(x) = max(0, x)
+
+Up: [function], [machine_learning]
+
+See also: [activation_function], [neural_network], [max]
diff --git a/weakly_connected_graph b/weakly_connected_graph
@@ -4,4 +4,4 @@ A [directed_graph] which is [weakly_connected], i.e., has only one [weakly_conne
Up: [directed_graph]
-See also: [strongly_connected_graph]
+See also: [strongly_connected_graph], [graph_connected]
diff --git a/weisfeiler_leman b/weisfeiler_leman
@@ -0,0 +1,38 @@
+# Weisfeiler-Leman algorithm
+
+https://en.wikipedia.org/wiki/Weisfeiler_Leman_graph_isomorphism_test
+
+aka "color refinement"
+
+- initially all vertices have the same color
+- then, at each round, re-color the vertices based on the multiset of the colors of their neighbors
+- two vertices with different colors can never be mapped to each other by a [graph_isomorphism]
+- in fact you don't need to remember all the colors: you just iteratively refine a partition of the vertices (see the sketch at the end of this note)
+ - |V| rounds of refinement at most until convergence
+ - but can be implemented in O(|G| log |G|) where |G| = |V| + |E|
+ - related to [automaton_minimization]
+- the coloring can be made canonical (cf [graph_canonical_labeling])
+- of course if all vertices have the same [degree] then this does not distinguish anything
+ - simple example: one 6-[cycle] vs two disjoint [triangles]
+
+## Power
+
+[color_refinement_power]
+
+## Expressiveness
+
+[color_refinement_expressive_power]
+
+## Multidimensional version
+
+[weisfeiler_leman_multidimensional]
+
+## Other
+
+- [color_refinement_query_evaluation]
+
+Up: [graph]
+
+See also: [graph_isomorphism], [proof_complexity], [machine_learning], [games], [logic], [fok], [fok_counting], [homomorphism_count]
+
+Aliases: color refinement, color refinement algorithm
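To make the refinement loop above concrete, here is a minimal color refinement sketch in Python (a naive illustration, not the O(|G| log |G|) implementation); on the 6-cycle and on two disjoint triangles it indeed ends with a single color, as noted above.

```python
def color_refinement(adj):
    """adj: dict mapping each vertex to its list of neighbors; returns the stable coloring."""
    colors = {v: 0 for v in adj}  # initially all vertices have the same color
    while True:
        # re-color by (old color, multiset of the neighbors' colors)
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v]))) for v in adj}
        palette = {sig: i for i, sig in enumerate(sorted(set(sigs.values())))}
        new_colors = {v: palette[sigs[v]] for v in adj}
        if new_colors == colors:  # converged (after at most |V| rounds)
            return colors
        colors = new_colors

cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
# both graphs are 2-regular, so refinement does not distinguish any vertices
assert set(color_refinement(cycle6).values()) == {0}
assert set(color_refinement(triangles).values()) == {0}
```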