networkx-1.11/0000755000175000017500000000000012653231455013201 5ustar aricaric00000000000000
networkx-1.11/setup.cfg0000644000175000017500000000027012653231455015021 0ustar aricaric00000000000000
[nosetests]
verbosity = 0
detailed-errors = 1
with-doctest = 1
match = (?:^|[\b_\./-])[Tt]est(?!ing)

[wheel]
universal = 1

[egg_info]
tag_build =
tag_date = 0
tag_svn_revision = 0

networkx-1.11/networkx.egg-info/0000755000175000017500000000000012653231454016553 5ustar aricaric00000000000000
networkx-1.11/networkx.egg-info/PKG-INFO0000644000175000017500000000273412653231454017656 0ustar aricaric00000000000000
Metadata-Version: 1.1
Name: networkx
Version: 1.11
Summary: Python package for creating and manipulating graphs and networks
Home-page: http://networkx.github.io/
Author: NetworkX Developers
Author-email: networkx-discuss@googlegroups.com
License: BSD
Download-URL: https://pypi.python.org/pypi/networkx/
Description: NetworkX is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks.
Keywords: Networks,Graph Theory,Mathematics,network,graph,discrete mathematics,math
Platform: Linux
Platform: Mac OSX
Platform: Windows
Platform: Unix
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Scientific/Engineering :: Bio-Informatics
Classifier: Topic :: Scientific/Engineering :: Information Analysis
Classifier: Topic :: Scientific/Engineering :: Mathematics
Classifier: Topic :: Scientific/Engineering :: Physics
networkx-1.11/networkx.egg-info/SOURCES.txt0000644000175000017500000005711412653231454020447 0ustar aricaric00000000000000
INSTALL.txt LICENSE.txt MANIFEST.in README.rst setup.cfg setup.py doc/Makefile doc/gh-pages.py doc/gitwash_dumper.py doc/make_examples_rst.py doc/make_gallery.py doc/requirements.txt doc/rst_templates/autosummary/base.rst doc/rst_templates/autosummary/class.rst doc/rst_templates/autosummary/function.rst doc/rst_templates/autosummary/module.rst doc/source/bibliography.rst doc/source/conf.py doc/source/download.rst doc/source/index.rst doc/source/install.rst doc/source/overview.rst doc/source/test.rst doc/source/developer/index.rst doc/source/developer/gitwash/branch_dropdown.png doc/source/developer/gitwash/configure_git.rst doc/source/developer/gitwash/development_workflow.rst doc/source/developer/gitwash/following_latest.rst doc/source/developer/gitwash/forking_button.png doc/source/developer/gitwash/forking_hell.rst
doc/source/developer/gitwash/git_development.rst doc/source/developer/gitwash/git_install.rst doc/source/developer/gitwash/git_intro.rst doc/source/developer/gitwash/git_links.inc doc/source/developer/gitwash/git_resources.rst doc/source/developer/gitwash/index.rst doc/source/developer/gitwash/known_projects.inc doc/source/developer/gitwash/links.inc doc/source/developer/gitwash/maintainer_workflow.rst doc/source/developer/gitwash/patching.rst doc/source/developer/gitwash/pull_button.png doc/source/developer/gitwash/set_up_fork.rst doc/source/developer/gitwash/this_project.inc doc/source/reference/algorithms.approximation.rst doc/source/reference/algorithms.assortativity.rst doc/source/reference/algorithms.bipartite.rst doc/source/reference/algorithms.block.rst doc/source/reference/algorithms.boundary.rst doc/source/reference/algorithms.centrality.rst doc/source/reference/algorithms.chordal.rst doc/source/reference/algorithms.clique.rst doc/source/reference/algorithms.clustering.rst doc/source/reference/algorithms.coloring.rst doc/source/reference/algorithms.community.rst doc/source/reference/algorithms.component.rst doc/source/reference/algorithms.connectivity.rst doc/source/reference/algorithms.core.rst doc/source/reference/algorithms.cycles.rst doc/source/reference/algorithms.dag.rst doc/source/reference/algorithms.distance_measures.rst doc/source/reference/algorithms.distance_regular.rst doc/source/reference/algorithms.dominance.rst doc/source/reference/algorithms.dominating.rst doc/source/reference/algorithms.euler.rst doc/source/reference/algorithms.flow.rst doc/source/reference/algorithms.graphical.rst doc/source/reference/algorithms.hierarchy.rst doc/source/reference/algorithms.hybrid.rst doc/source/reference/algorithms.isolates.rst doc/source/reference/algorithms.isomorphism.rst doc/source/reference/algorithms.isomorphism.vf2.rst doc/source/reference/algorithms.link_analysis.rst doc/source/reference/algorithms.link_prediction.rst 
doc/source/reference/algorithms.matching.rst doc/source/reference/algorithms.minors.rst doc/source/reference/algorithms.mis.rst doc/source/reference/algorithms.mst.rst doc/source/reference/algorithms.operators.rst doc/source/reference/algorithms.rich_club.rst doc/source/reference/algorithms.rst doc/source/reference/algorithms.shortest_paths.rst doc/source/reference/algorithms.simple_paths.rst doc/source/reference/algorithms.swap.rst doc/source/reference/algorithms.traversal.rst doc/source/reference/algorithms.tree.rst doc/source/reference/algorithms.triads.rst doc/source/reference/algorithms.vitality.rst doc/source/reference/api_0.99.rst doc/source/reference/api_1.0.rst doc/source/reference/api_1.10.rst doc/source/reference/api_1.11.rst doc/source/reference/api_1.4.rst doc/source/reference/api_1.5.rst doc/source/reference/api_1.6.rst doc/source/reference/api_1.7.rst doc/source/reference/api_1.8.rst doc/source/reference/api_1.9.rst doc/source/reference/api_changes.rst doc/source/reference/citing.rst doc/source/reference/classes.digraph.rst doc/source/reference/classes.graph.rst doc/source/reference/classes.multidigraph.rst doc/source/reference/classes.multigraph.rst doc/source/reference/classes.rst doc/source/reference/convert.rst doc/source/reference/credits.rst doc/source/reference/drawing.rst doc/source/reference/exceptions.rst doc/source/reference/functions.rst doc/source/reference/generators.rst doc/source/reference/glossary.rst doc/source/reference/history.rst doc/source/reference/index.rst doc/source/reference/introduction.rst doc/source/reference/legal.rst doc/source/reference/linalg.rst doc/source/reference/news.rst doc/source/reference/pdf_reference.rst doc/source/reference/readwrite.adjlist.rst doc/source/reference/readwrite.edgelist.rst doc/source/reference/readwrite.gexf.rst doc/source/reference/readwrite.gml.rst doc/source/reference/readwrite.gpickle.rst doc/source/reference/readwrite.graphml.rst doc/source/reference/readwrite.json_graph.rst 
doc/source/reference/readwrite.leda.rst doc/source/reference/readwrite.multiline_adjlist.rst doc/source/reference/readwrite.nx_shp.rst doc/source/reference/readwrite.pajek.rst doc/source/reference/readwrite.rst doc/source/reference/readwrite.sparsegraph6.rst doc/source/reference/readwrite.yaml.rst doc/source/reference/relabel.rst doc/source/reference/utils.rst doc/source/static/art1.png doc/source/static/networkx.css doc/source/static/trac.css doc/source/static/force/force.css doc/source/templates/index.html doc/source/templates/indexsidebar.html doc/source/templates/layout.html doc/source/tutorial/index.rst doc/source/tutorial/tutorial.rst doc/sphinxext/LICENSE.txt doc/sphinxext/customroles.py examples/3d_drawing/mayavi2_spring.py examples/advanced/eigenvalues.py examples/advanced/heavy_metal_umlaut.py examples/advanced/iterated_dynamical_systems.py examples/advanced/parallel_betweenness.py examples/algorithms/blockmodel.py examples/algorithms/davis_club.py examples/algorithms/hartford_drug.edgelist examples/algorithms/krackhardt_centrality.py examples/algorithms/rcm.py examples/basic/properties.py examples/basic/read_write.py examples/drawing/atlas.py examples/drawing/chess_masters.py examples/drawing/chess_masters_WCC.pgn.bz2 examples/drawing/circular_tree.py examples/drawing/degree_histogram.py examples/drawing/edge_colormap.py examples/drawing/ego_graph.py examples/drawing/four_grids.py examples/drawing/giant_component.py examples/drawing/house_with_colors.py examples/drawing/knuth_miles.py examples/drawing/knuth_miles.txt.gz examples/drawing/labels_and_colors.py examples/drawing/lanl_routes.edgelist examples/drawing/lanl_routes.py examples/drawing/node_colormap.py examples/drawing/random_geometric_graph.py examples/drawing/sampson.py examples/drawing/sampson_data.zip examples/drawing/simple_path.py examples/drawing/unix_email.mbox examples/drawing/unix_email.py examples/drawing/weighted_graph.py examples/graph/atlas.py examples/graph/atlas2.py 
examples/graph/degree_sequence.py examples/graph/erdos_renyi.py examples/graph/expected_degree_sequence.py examples/graph/football.py examples/graph/karate_club.py examples/graph/knuth_miles.py examples/graph/knuth_miles.txt.gz examples/graph/napoleon_russian_campaign.py examples/graph/roget.py examples/graph/roget_dat.txt.gz examples/graph/unix_email.mbox examples/graph/unix_email.py examples/graph/words.py examples/graph/words_dat.txt.gz examples/javascript/force.py examples/javascript/http_server.py examples/multigraph/chess_masters.py examples/multigraph/chess_masters_WCC.pgn.bz2 examples/pygraphviz/pygraphviz_attributes.py examples/pygraphviz/pygraphviz_draw.py examples/pygraphviz/pygraphviz_simple.py examples/pygraphviz/write_dotfile.py examples/subclass/antigraph.py examples/subclass/printgraph.py networkx/__init__.py networkx/convert.py networkx/convert_matrix.py networkx/exception.py networkx/relabel.py networkx/release.py networkx/version.py networkx.egg-info/PKG-INFO networkx.egg-info/SOURCES.txt networkx.egg-info/dependency_links.txt networkx.egg-info/not-zip-safe networkx.egg-info/requires.txt networkx.egg-info/top_level.txt networkx/algorithms/__init__.py networkx/algorithms/block.py networkx/algorithms/boundary.py networkx/algorithms/clique.py networkx/algorithms/cluster.py networkx/algorithms/core.py networkx/algorithms/cycles.py networkx/algorithms/dag.py networkx/algorithms/distance_measures.py networkx/algorithms/distance_regular.py networkx/algorithms/dominance.py networkx/algorithms/dominating.py networkx/algorithms/euler.py networkx/algorithms/graphical.py networkx/algorithms/hierarchy.py networkx/algorithms/hybrid.py networkx/algorithms/isolate.py networkx/algorithms/link_prediction.py networkx/algorithms/matching.py networkx/algorithms/minors.py networkx/algorithms/mis.py networkx/algorithms/mst.py networkx/algorithms/richclub.py networkx/algorithms/simple_paths.py networkx/algorithms/smetric.py networkx/algorithms/swap.py 
networkx/algorithms/triads.py networkx/algorithms/vitality.py networkx/algorithms/approximation/__init__.py networkx/algorithms/approximation/clique.py networkx/algorithms/approximation/clustering_coefficient.py networkx/algorithms/approximation/connectivity.py networkx/algorithms/approximation/dominating_set.py networkx/algorithms/approximation/independent_set.py networkx/algorithms/approximation/kcomponents.py networkx/algorithms/approximation/matching.py networkx/algorithms/approximation/ramsey.py networkx/algorithms/approximation/vertex_cover.py networkx/algorithms/approximation/tests/test_approx_clust_coeff.py networkx/algorithms/approximation/tests/test_clique.py networkx/algorithms/approximation/tests/test_connectivity.py networkx/algorithms/approximation/tests/test_dominating_set.py networkx/algorithms/approximation/tests/test_independent_set.py networkx/algorithms/approximation/tests/test_kcomponents.py networkx/algorithms/approximation/tests/test_matching.py networkx/algorithms/approximation/tests/test_ramsey.py networkx/algorithms/approximation/tests/test_vertex_cover.py networkx/algorithms/assortativity/__init__.py networkx/algorithms/assortativity/connectivity.py networkx/algorithms/assortativity/correlation.py networkx/algorithms/assortativity/mixing.py networkx/algorithms/assortativity/neighbor_degree.py networkx/algorithms/assortativity/pairs.py networkx/algorithms/assortativity/tests/base_test.py networkx/algorithms/assortativity/tests/test_connectivity.py networkx/algorithms/assortativity/tests/test_correlation.py networkx/algorithms/assortativity/tests/test_mixing.py networkx/algorithms/assortativity/tests/test_neighbor_degree.py networkx/algorithms/assortativity/tests/test_pairs.py networkx/algorithms/bipartite/__init__.py networkx/algorithms/bipartite/basic.py networkx/algorithms/bipartite/centrality.py networkx/algorithms/bipartite/cluster.py networkx/algorithms/bipartite/edgelist.py networkx/algorithms/bipartite/generators.py 
networkx/algorithms/bipartite/matching.py networkx/algorithms/bipartite/matrix.py networkx/algorithms/bipartite/projection.py networkx/algorithms/bipartite/redundancy.py networkx/algorithms/bipartite/spectral.py networkx/algorithms/bipartite/tests/test_basic.py networkx/algorithms/bipartite/tests/test_centrality.py networkx/algorithms/bipartite/tests/test_cluster.py networkx/algorithms/bipartite/tests/test_edgelist.py networkx/algorithms/bipartite/tests/test_generators.py networkx/algorithms/bipartite/tests/test_matching.py networkx/algorithms/bipartite/tests/test_matrix.py networkx/algorithms/bipartite/tests/test_project.py networkx/algorithms/bipartite/tests/test_redundancy.py networkx/algorithms/bipartite/tests/test_spectral_bipartivity.py networkx/algorithms/centrality/__init__.py networkx/algorithms/centrality/betweenness.py networkx/algorithms/centrality/betweenness_subset.py networkx/algorithms/centrality/closeness.py networkx/algorithms/centrality/communicability_alg.py networkx/algorithms/centrality/current_flow_betweenness.py networkx/algorithms/centrality/current_flow_betweenness_subset.py networkx/algorithms/centrality/current_flow_closeness.py networkx/algorithms/centrality/degree_alg.py networkx/algorithms/centrality/dispersion.py networkx/algorithms/centrality/eigenvector.py networkx/algorithms/centrality/flow_matrix.py networkx/algorithms/centrality/harmonic.py networkx/algorithms/centrality/katz.py networkx/algorithms/centrality/load.py networkx/algorithms/centrality/tests/test_betweenness_centrality.py networkx/algorithms/centrality/tests/test_betweenness_centrality_subset.py networkx/algorithms/centrality/tests/test_closeness_centrality.py networkx/algorithms/centrality/tests/test_communicability.py networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality.py networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality_subset.py networkx/algorithms/centrality/tests/test_current_flow_closeness.py 
networkx/algorithms/centrality/tests/test_degree_centrality.py networkx/algorithms/centrality/tests/test_dispersion.py networkx/algorithms/centrality/tests/test_eigenvector_centrality.py networkx/algorithms/centrality/tests/test_harmonic_centrality.py networkx/algorithms/centrality/tests/test_katz_centrality.py networkx/algorithms/centrality/tests/test_load_centrality.py networkx/algorithms/chordal/__init__.py networkx/algorithms/chordal/chordal_alg.py networkx/algorithms/chordal/tests/test_chordal.py networkx/algorithms/coloring/__init__.py networkx/algorithms/coloring/greedy_coloring.py networkx/algorithms/coloring/greedy_coloring_with_interchange.py networkx/algorithms/coloring/tests/test_coloring.py networkx/algorithms/community/__init__.py networkx/algorithms/community/kclique.py networkx/algorithms/community/tests/test_kclique.py networkx/algorithms/components/__init__.py networkx/algorithms/components/attracting.py networkx/algorithms/components/biconnected.py networkx/algorithms/components/connected.py networkx/algorithms/components/semiconnected.py networkx/algorithms/components/strongly_connected.py networkx/algorithms/components/weakly_connected.py networkx/algorithms/components/tests/test_attracting.py networkx/algorithms/components/tests/test_biconnected.py networkx/algorithms/components/tests/test_connected.py networkx/algorithms/components/tests/test_semiconnected.py networkx/algorithms/components/tests/test_strongly_connected.py networkx/algorithms/components/tests/test_subgraph_copies.py networkx/algorithms/components/tests/test_weakly_connected.py networkx/algorithms/connectivity/__init__.py networkx/algorithms/connectivity/connectivity.py networkx/algorithms/connectivity/cuts.py networkx/algorithms/connectivity/kcomponents.py networkx/algorithms/connectivity/kcutsets.py networkx/algorithms/connectivity/stoerwagner.py networkx/algorithms/connectivity/utils.py networkx/algorithms/connectivity/tests/test_connectivity.py 
networkx/algorithms/connectivity/tests/test_cuts.py networkx/algorithms/connectivity/tests/test_kcomponents.py networkx/algorithms/connectivity/tests/test_kcutsets.py networkx/algorithms/connectivity/tests/test_stoer_wagner.py networkx/algorithms/flow/__init__.py networkx/algorithms/flow/capacityscaling.py networkx/algorithms/flow/edmondskarp.py networkx/algorithms/flow/maxflow.py networkx/algorithms/flow/mincost.py networkx/algorithms/flow/networksimplex.py networkx/algorithms/flow/preflowpush.py networkx/algorithms/flow/shortestaugmentingpath.py networkx/algorithms/flow/utils.py networkx/algorithms/flow/tests/gl1.gpickle.bz2 networkx/algorithms/flow/tests/gw1.gpickle.bz2 networkx/algorithms/flow/tests/netgen-2.gpickle.bz2 networkx/algorithms/flow/tests/test_maxflow.py networkx/algorithms/flow/tests/test_maxflow_large_graph.py networkx/algorithms/flow/tests/test_mincost.py networkx/algorithms/flow/tests/wlm3.gpickle.bz2 networkx/algorithms/isomorphism/__init__.py networkx/algorithms/isomorphism/isomorph.py networkx/algorithms/isomorphism/isomorphvf2.py networkx/algorithms/isomorphism/matchhelpers.py networkx/algorithms/isomorphism/vf2userfunc.py networkx/algorithms/isomorphism/tests/iso_r01_s80.A99 networkx/algorithms/isomorphism/tests/iso_r01_s80.B99 networkx/algorithms/isomorphism/tests/si2_b06_m200.A99 networkx/algorithms/isomorphism/tests/si2_b06_m200.B99 networkx/algorithms/isomorphism/tests/test_isomorphism.py networkx/algorithms/isomorphism/tests/test_isomorphvf2.py networkx/algorithms/isomorphism/tests/test_vf2userfunc.py networkx/algorithms/link_analysis/__init__.py networkx/algorithms/link_analysis/hits_alg.py networkx/algorithms/link_analysis/pagerank_alg.py networkx/algorithms/link_analysis/tests/test_hits.py networkx/algorithms/link_analysis/tests/test_pagerank.py networkx/algorithms/operators/__init__.py networkx/algorithms/operators/all.py networkx/algorithms/operators/binary.py networkx/algorithms/operators/product.py 
networkx/algorithms/operators/unary.py networkx/algorithms/operators/tests/test_all.py networkx/algorithms/operators/tests/test_binary.py networkx/algorithms/operators/tests/test_product.py networkx/algorithms/operators/tests/test_unary.py networkx/algorithms/shortest_paths/__init__.py networkx/algorithms/shortest_paths/astar.py networkx/algorithms/shortest_paths/dense.py networkx/algorithms/shortest_paths/generic.py networkx/algorithms/shortest_paths/unweighted.py networkx/algorithms/shortest_paths/weighted.py networkx/algorithms/shortest_paths/tests/test_astar.py networkx/algorithms/shortest_paths/tests/test_dense.py networkx/algorithms/shortest_paths/tests/test_dense_numpy.py networkx/algorithms/shortest_paths/tests/test_generic.py networkx/algorithms/shortest_paths/tests/test_unweighted.py networkx/algorithms/shortest_paths/tests/test_weighted.py networkx/algorithms/tests/test_block.py networkx/algorithms/tests/test_boundary.py networkx/algorithms/tests/test_clique.py networkx/algorithms/tests/test_cluster.py networkx/algorithms/tests/test_core.py networkx/algorithms/tests/test_cycles.py networkx/algorithms/tests/test_dag.py networkx/algorithms/tests/test_distance_measures.py networkx/algorithms/tests/test_distance_regular.py networkx/algorithms/tests/test_dominance.py networkx/algorithms/tests/test_dominating.py networkx/algorithms/tests/test_euler.py networkx/algorithms/tests/test_graphical.py networkx/algorithms/tests/test_hierarchy.py networkx/algorithms/tests/test_hybrid.py networkx/algorithms/tests/test_link_prediction.py networkx/algorithms/tests/test_matching.py networkx/algorithms/tests/test_minors.py networkx/algorithms/tests/test_mis.py networkx/algorithms/tests/test_mst.py networkx/algorithms/tests/test_richclub.py networkx/algorithms/tests/test_simple_paths.py networkx/algorithms/tests/test_smetric.py networkx/algorithms/tests/test_swap.py networkx/algorithms/tests/test_triads.py networkx/algorithms/tests/test_vitality.py 
networkx/algorithms/traversal/__init__.py networkx/algorithms/traversal/breadth_first_search.py networkx/algorithms/traversal/depth_first_search.py networkx/algorithms/traversal/edgedfs.py networkx/algorithms/traversal/tests/test_bfs.py networkx/algorithms/traversal/tests/test_dfs.py networkx/algorithms/traversal/tests/test_edgedfs.py networkx/algorithms/tree/__init__.py networkx/algorithms/tree/branchings.py networkx/algorithms/tree/recognition.py networkx/algorithms/tree/tests/test_branchings.py networkx/algorithms/tree/tests/test_recognition.py networkx/classes/__init__.py networkx/classes/digraph.py networkx/classes/function.py networkx/classes/graph.py networkx/classes/multidigraph.py networkx/classes/multigraph.py networkx/classes/ordered.py networkx/classes/tests/historical_tests.py networkx/classes/tests/test_digraph.py networkx/classes/tests/test_digraph_historical.py networkx/classes/tests/test_function.py networkx/classes/tests/test_graph.py networkx/classes/tests/test_graph_historical.py networkx/classes/tests/test_multidigraph.py networkx/classes/tests/test_multigraph.py networkx/classes/tests/test_ordered.py networkx/classes/tests/test_special.py networkx/classes/tests/test_timing.py networkx/classes/tests/timingclasses.py networkx/drawing/__init__.py networkx/drawing/layout.py networkx/drawing/nx_agraph.py networkx/drawing/nx_pydot.py networkx/drawing/nx_pylab.py networkx/drawing/tests/test_agraph.py networkx/drawing/tests/test_layout.py networkx/drawing/tests/test_pydot.py networkx/drawing/tests/test_pylab.py networkx/external/__init__.py networkx/generators/__init__.py networkx/generators/atlas.py networkx/generators/classic.py networkx/generators/community.py networkx/generators/degree_seq.py networkx/generators/directed.py networkx/generators/ego.py networkx/generators/expanders.py networkx/generators/geometric.py networkx/generators/intersection.py networkx/generators/line.py networkx/generators/nonisomorphic_trees.py 
networkx/generators/random_clustered.py networkx/generators/random_graphs.py networkx/generators/small.py networkx/generators/social.py networkx/generators/stochastic.py networkx/generators/threshold.py networkx/generators/tests/test_atlas.py networkx/generators/tests/test_classic.py networkx/generators/tests/test_community.py networkx/generators/tests/test_degree_seq.py networkx/generators/tests/test_directed.py networkx/generators/tests/test_ego.py networkx/generators/tests/test_expanders.py networkx/generators/tests/test_geometric.py networkx/generators/tests/test_intersection.py networkx/generators/tests/test_line.py networkx/generators/tests/test_nonisomorphic_trees.py networkx/generators/tests/test_random_clustered.py networkx/generators/tests/test_random_graphs.py networkx/generators/tests/test_small.py networkx/generators/tests/test_stochastic.py networkx/generators/tests/test_threshold.py networkx/linalg/__init__.py networkx/linalg/algebraicconnectivity.py networkx/linalg/attrmatrix.py networkx/linalg/graphmatrix.py networkx/linalg/laplacianmatrix.py networkx/linalg/modularitymatrix.py networkx/linalg/spectrum.py networkx/linalg/tests/test_algebraic_connectivity.py networkx/linalg/tests/test_graphmatrix.py networkx/linalg/tests/test_laplacian.py networkx/linalg/tests/test_modularity.py networkx/linalg/tests/test_spectrum.py networkx/readwrite/__init__.py networkx/readwrite/adjlist.py networkx/readwrite/edgelist.py networkx/readwrite/gexf.py networkx/readwrite/gml.py networkx/readwrite/gpickle.py networkx/readwrite/graph6.py networkx/readwrite/graphml.py networkx/readwrite/leda.py networkx/readwrite/multiline_adjlist.py networkx/readwrite/nx_shp.py networkx/readwrite/nx_yaml.py networkx/readwrite/p2g.py networkx/readwrite/pajek.py networkx/readwrite/sparse6.py networkx/readwrite/json_graph/__init__.py networkx/readwrite/json_graph/adjacency.py networkx/readwrite/json_graph/node_link.py networkx/readwrite/json_graph/tree.py 
networkx/readwrite/json_graph/tests/test_adjacency.py networkx/readwrite/json_graph/tests/test_node_link.py networkx/readwrite/json_graph/tests/test_tree.py networkx/readwrite/tests/test_adjlist.py networkx/readwrite/tests/test_edgelist.py networkx/readwrite/tests/test_gexf.py networkx/readwrite/tests/test_gml.py networkx/readwrite/tests/test_gpickle.py networkx/readwrite/tests/test_graph6.py networkx/readwrite/tests/test_graphml.py networkx/readwrite/tests/test_leda.py networkx/readwrite/tests/test_p2g.py networkx/readwrite/tests/test_pajek.py networkx/readwrite/tests/test_shp.py networkx/readwrite/tests/test_sparse6.py networkx/readwrite/tests/test_yaml.py networkx/testing/__init__.py networkx/testing/utils.py networkx/testing/tests/test_utils.py networkx/tests/__init__.py networkx/tests/benchmark.py networkx/tests/test.py networkx/tests/test_convert.py networkx/tests/test_convert_numpy.py networkx/tests/test_convert_pandas.py networkx/tests/test_convert_scipy.py networkx/tests/test_exceptions.py networkx/tests/test_relabel.py networkx/utils/__init__.py networkx/utils/contextmanagers.py networkx/utils/decorators.py networkx/utils/heaps.py networkx/utils/misc.py networkx/utils/random_sequence.py networkx/utils/rcm.py networkx/utils/union_find.py networkx/utils/tests/test.txt networkx/utils/tests/test_contextmanager.py networkx/utils/tests/test_decorators.py networkx/utils/tests/test_heaps.py networkx/utils/tests/test_misc.py networkx/utils/tests/test_random_sequence.py networkx/utils/tests/test_rcm.py networkx/utils/tests/test_unionfind.pynetworkx-1.11/networkx.egg-info/requires.txt0000644000175000017500000000002112653231454021144 0ustar aricaric00000000000000decorator>=3.4.0 networkx-1.11/networkx.egg-info/not-zip-safe0000644000175000017500000000000112637544506021010 0ustar aricaric00000000000000 networkx-1.11/networkx.egg-info/dependency_links.txt0000644000175000017500000000000112653231454022621 0ustar aricaric00000000000000 
networkx-1.11/networkx.egg-info/top_level.txt0000644000175000017500000000001112653231454021275 0ustar aricaric00000000000000
networkx
networkx-1.11/PKG-INFO0000644000175000017500000000273412653231455014304 0ustar aricaric00000000000000
Metadata-Version: 1.1
Name: networkx
Version: 1.11
Summary: Python package for creating and manipulating graphs and networks
Home-page: http://networkx.github.io/
Author: NetworkX Developers
Author-email: networkx-discuss@googlegroups.com
License: BSD
Download-URL: https://pypi.python.org/pypi/networkx/
Description: NetworkX is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks.
Keywords: Networks,Graph Theory,Mathematics,network,graph,discrete mathematics,math
Platform: Linux
Platform: Mac OSX
Platform: Windows
Platform: Unix
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Scientific/Engineering :: Bio-Informatics
Classifier: Topic :: Scientific/Engineering :: Information Analysis
Classifier: Topic :: Scientific/Engineering :: Mathematics
Classifier: Topic :: Scientific/Engineering :: Physics
networkx-1.11/networkx/0000755000175000017500000000000012653231454015061 5ustar aricaric00000000000000
networkx-1.11/networkx/exception.py0000644000175000017500000000344412637544450017443 0ustar aricaric00000000000000
# -*- coding: utf-8 -*-
"""
**********
Exceptions
**********

Base exceptions and errors for NetworkX.
"""
__author__ = """Aric Hagberg (hagberg@lanl.gov)\nPieter Swart (swart@lanl.gov)\nDan Schult(dschult@colgate.edu)\nLoïc Séguin-C. """
#    Copyright (C) 2004-2015 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
#

# Exception handling

# the root of all Exceptions
class NetworkXException(Exception):
    """Base class for exceptions in NetworkX."""

class NetworkXError(NetworkXException):
    """Exception for a serious error in NetworkX"""

class NetworkXPointlessConcept(NetworkXException):
    """Harary, F. and Read, R. "Is the Null Graph a Pointless
    Concept?"  In Graphs and Combinatorics Conference, George
    Washington University.  New York: Springer-Verlag, 1973.
    """

class NetworkXAlgorithmError(NetworkXException):
    """Exception for unexpected termination of algorithms."""

class NetworkXUnfeasible(NetworkXAlgorithmError):
    """Exception raised by algorithms trying to solve a problem
    instance that has no feasible solution."""

class NetworkXNoPath(NetworkXUnfeasible):
    """Exception for algorithms that should return a path when running
    on graphs where such a path does not exist."""

class NetworkXNoCycle(NetworkXUnfeasible):
    """Exception for algorithms that should return a cycle when running
    on graphs where such a cycle does not exist."""

class NetworkXUnbounded(NetworkXAlgorithmError):
    """Exception raised by algorithms trying to solve a maximization
    or a minimization problem instance that is unbounded."""

class NetworkXNotImplemented(NetworkXException):
    """Exception raised by algorithms not implemented for a type of graph."""

networkx-1.11/networkx/linalg/0000755000175000017500000000000012653231454016327 5ustar aricaric00000000000000
networkx-1.11/networkx/linalg/spectrum.py0000644000175000017500000000535112637544450020554 0ustar aricaric00000000000000
"""
Eigenvalue spectrum of graphs.
"""
#    Copyright (C) 2004-2015 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
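The exception classes in exception.py form a single hierarchy rooted at NetworkXException, so callers can catch at whatever level of specificity they need. A minimal standalone sketch of that idea — the relevant classes are reproduced inline so the snippet runs without networkx installed, and `shortest_path_length` here is a toy stand-in, not the real networkx function:

```python
# Minimal reproduction of part of the hierarchy above; in real code
# you would `from networkx import NetworkXNoPath, NetworkXUnfeasible`.
class NetworkXException(Exception):
    """Base class for exceptions in NetworkX."""

class NetworkXAlgorithmError(NetworkXException):
    """Exception for unexpected termination of algorithms."""

class NetworkXUnfeasible(NetworkXAlgorithmError):
    """The problem instance has no feasible solution."""

class NetworkXNoPath(NetworkXUnfeasible):
    """No path exists between the requested nodes."""

def shortest_path_length(edges, source, target):
    # Toy stand-in for a path algorithm: it only knows direct edges.
    if (source, target) in edges:
        return 1
    raise NetworkXNoPath("no path between %s and %s" % (source, target))

# Because NetworkXNoPath subclasses NetworkXUnfeasible (and ultimately
# NetworkXException), a handler for the broader class also catches it.
try:
    shortest_path_length({(1, 2)}, 1, 3)
except NetworkXUnfeasible as err:
    print("caught:", err)
```

The design choice here is that algorithm code always raises the most specific class, while library users choose how coarsely to catch.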
import networkx as nx
__author__ = "\n".join(['Aric Hagberg',
                        'Pieter Swart (swart@lanl.gov)',
                        'Dan Schult(dschult@colgate.edu)',
                        'Jean-Gabriel Young (jean.gabriel.young@gmail.com)'])
__all__ = ['laplacian_spectrum', 'adjacency_spectrum', 'modularity_spectrum']


def laplacian_spectrum(G, weight='weight'):
    """Return eigenvalues of the Laplacian of G

    Parameters
    ----------
    G : graph
       A NetworkX graph

    weight : string or None, optional (default='weight')
       The edge data key used to compute each value in the matrix.
       If None, then each edge has weight 1.

    Returns
    -------
    evals : NumPy array
      Eigenvalues

    Notes
    -----
    For MultiGraph/MultiDiGraph, the edges weights are summed.
    See to_numpy_matrix for other options.

    See Also
    --------
    laplacian_matrix
    """
    from scipy.linalg import eigvalsh
    return eigvalsh(nx.laplacian_matrix(G, weight=weight).todense())


def adjacency_spectrum(G, weight='weight'):
    """Return eigenvalues of the adjacency matrix of G.

    Parameters
    ----------
    G : graph
       A NetworkX graph

    weight : string or None, optional (default='weight')
       The edge data key used to compute each value in the matrix.
       If None, then each edge has weight 1.

    Returns
    -------
    evals : NumPy array
      Eigenvalues

    Notes
    -----
    For MultiGraph/MultiDiGraph, the edges weights are summed.
    See to_numpy_matrix for other options.

    See Also
    --------
    adjacency_matrix
    """
    from scipy.linalg import eigvals
    return eigvals(nx.adjacency_matrix(G, weight=weight).todense())


def modularity_spectrum(G):
    """Return eigenvalues of the modularity matrix of G.

    Parameters
    ----------
    G : Graph
       A NetworkX Graph or DiGraph

    Returns
    -------
    evals : NumPy array
      Eigenvalues

    See Also
    --------
    modularity_matrix

    References
    ----------
    .. [1] M. E. J. Newman, "Modularity and community structure in networks",
       Proc. Natl. Acad. Sci. USA, vol. 103, pp. 8577-8582, 2006.
    """
    from scipy.linalg import eigvals
    if G.is_directed():
        return eigvals(nx.directed_modularity_matrix(G))
    else:
        return eigvals(nx.modularity_matrix(G))


# fixture for nose tests
def setup_module(module):
    from nose import SkipTest
    try:
        import scipy.linalg
    except:
        raise SkipTest("scipy.linalg not available")

networkx-1.11/networkx/linalg/attrmatrix.py0000644000175000017500000003717212637544500021103 0ustar aricaric00000000000000
"""
Functions for constructing matrix-like objects from graph attributes.
"""
__all__ = ['attr_matrix', 'attr_sparse_matrix']

import networkx as nx


def _node_value(G, node_attr):
    """Returns a function that returns a value from G.node[u].

    We return a function expecting a node as its sole argument. Then, in the
    simplest scenario, the returned function will return G.node[u][node_attr].
    However, we also handle the case when `node_attr` is None or when it is a
    function itself.

    Parameters
    ----------
    G : graph
        A NetworkX graph

    node_attr : {None, str, callable}
        Specification of how the value of the node attribute should be
        obtained from the node attribute dictionary.

    Returns
    -------
    value : function
        A function expecting a node as its sole argument. The function will
        return a value from G.node[u] that depends on `edge_attr`.

    """
    if node_attr is None:
        value = lambda u: u
    elif not hasattr(node_attr, '__call__'):
        # assume it is a key for the node attribute dictionary
        value = lambda u: G.node[u][node_attr]
    else:
        # Advanced:  Allow users to specify something else.
        #
        # For example,
        #     node_attr = lambda u: G.node[u].get('size', .5) * 3
        #
        value = node_attr

    return value


def _edge_value(G, edge_attr):
    """Returns a function that returns a value from G[u][v].

    Suppose there exists an edge between u and v.  Then we return a function
    expecting u and v as arguments.  For Graph and DiGraph, G[u][v] is
    the edge attribute dictionary, and the function (essentially) returns
    G[u][v][edge_attr].
However, we also handle cases when `edge_attr` is None and when it is a function itself. For MultiGraph and MultiDiGraph, G[u][v] is a dictionary of all edges between u and v. In this case, the returned function sums the value of `edge_attr` for every edge between u and v. Parameters ---------- G : graph A NetworkX graph edge_attr : {None, str, callable} Specification of how the value of the edge attribute should be obtained from the edge attribute dictionary, G[u][v]. For multigraphs, G[u][v] is a dictionary of all the edges between u and v. This allows for special treatment of multiedges. Returns ------- value : function A function expecting two nodes as parameters. The nodes should represent the from- and to- node of an edge. The function will return a value from G[u][v] that depends on `edge_attr`. """ if edge_attr is None: # topological count of edges if G.is_multigraph(): value = lambda u,v: len(G[u][v]) else: value = lambda u,v: 1 elif not hasattr(edge_attr, '__call__'): # assume it is a key for the edge attribute dictionary if edge_attr == 'weight': # provide a default value if G.is_multigraph(): value = lambda u,v: sum([d.get(edge_attr, 1) for d in G[u][v].values()]) else: value = lambda u,v: G[u][v].get(edge_attr, 1) else: # otherwise, the edge attribute MUST exist for each edge if G.is_multigraph(): value = lambda u,v: sum([d[edge_attr] for d in G[u][v].values()]) else: value = lambda u,v: G[u][v][edge_attr] else: # Advanced: Allow users to specify something else. 
        #
        # Alternative default value:
        #     edge_attr = lambda u,v: G[u][v].get('thickness', .5)
        #
        # Function on an attribute:
        #     edge_attr = lambda u,v: abs(G[u][v]['weight'])
        #
        # Handle Multi(Di)Graphs differently:
        #     edge_attr = lambda u,v: numpy.prod([d['size'] for d in G[u][v].values()])
        #
        # Ignore multiple edges:
        #     edge_attr = lambda u,v: 1 if len(G[u][v]) else 0
        #
        value = edge_attr

    return value

def attr_matrix(G, edge_attr=None, node_attr=None,
                normalized=False, rc_order=None, dtype=None, order=None):
    """Returns a NumPy matrix using attributes from G.

    If only `G` is passed in, then the adjacency matrix is constructed.

    Let A be a discrete set of values for the node attribute `node_attr`.
    Then the elements of A represent the rows and columns of the constructed
    matrix. Now, iterate through every edge e=(u,v) in `G` and consider the
    value of the edge attribute `edge_attr`. If ua and va are the values of
    the node attribute `node_attr` for u and v, respectively, then the value
    of the edge attribute is added to the matrix element at (ua, va).

    Parameters
    ----------
    G : graph
        The NetworkX graph used to construct the NumPy matrix.

    edge_attr : str, optional
        Each element of the matrix represents a running total of the
        specified edge attribute for edges whose node attributes correspond
        to the rows/cols of the matrix. The attribute must be present for
        all edges in the graph. If no attribute is specified, then we
        just count the number of edges whose node attributes correspond
        to the matrix element.

    node_attr : str, optional
        Each row and column in the matrix represents a particular value
        of the node attribute. The attribute must be present for all nodes
        in the graph. Note, the values of this attribute should be reliably
        hashable. So, float values are not recommended. If no attribute is
        specified, then the rows and columns will be the nodes of the graph.

    normalized : bool, optional
        If True, then each row is normalized by the summation of its values.
rc_order : list, optional A list of the node attribute values. This list specifies the ordering of rows and columns of the array. If no ordering is provided, then the ordering will be random (and also, a return value). Other Parameters ---------------- dtype : NumPy data-type, optional A valid NumPy dtype used to initialize the array. Keep in mind certain dtypes can yield unexpected results if the array is to be normalized. The parameter is passed to numpy.zeros(). If unspecified, the NumPy default is used. order : {'C', 'F'}, optional Whether to store multidimensional data in C- or Fortran-contiguous (row- or column-wise) order in memory. This parameter is passed to numpy.zeros(). If unspecified, the NumPy default is used. Returns ------- M : NumPy matrix The attribute matrix. ordering : list If `rc_order` was specified, then only the matrix is returned. However, if `rc_order` was None, then the ordering used to construct the matrix is returned as well. Examples -------- Construct an adjacency matrix: >>> G = nx.Graph() >>> G.add_edge(0,1,thickness=1,weight=3) >>> G.add_edge(0,2,thickness=2) >>> G.add_edge(1,2,thickness=3) >>> nx.attr_matrix(G, rc_order=[0,1,2]) matrix([[ 0., 1., 1.], [ 1., 0., 1.], [ 1., 1., 0.]]) Alternatively, we can obtain the matrix describing edge thickness. >>> nx.attr_matrix(G, edge_attr='thickness', rc_order=[0,1,2]) matrix([[ 0., 1., 2.], [ 1., 0., 3.], [ 2., 3., 0.]]) We can also color the nodes and ask for the probability distribution over all edges (u,v) describing: Pr(v has color Y | u has color X) >>> G.node[0]['color'] = 'red' >>> G.node[1]['color'] = 'red' >>> G.node[2]['color'] = 'blue' >>> rc = ['red', 'blue'] >>> nx.attr_matrix(G, node_attr='color', normalized=True, rc_order=rc) matrix([[ 0.33333333, 0.66666667], [ 1. , 0. 
]]) For example, the above tells us that for all edges (u,v): Pr( v is red | u is red) = 1/3 Pr( v is blue | u is red) = 2/3 Pr( v is red | u is blue) = 1 Pr( v is blue | u is blue) = 0 Finally, we can obtain the total weights listed by the node colors. >>> nx.attr_matrix(G, edge_attr='weight', node_attr='color', rc_order=rc) matrix([[ 3., 2.], [ 2., 0.]]) Thus, the total weight over all edges (u,v) with u and v having colors: (red, red) is 3 # the sole contribution is from edge (0,1) (red, blue) is 2 # contributions from edges (0,2) and (1,2) (blue, red) is 2 # same as (red, blue) since graph is undirected (blue, blue) is 0 # there are no edges with blue endpoints """ try: import numpy as np except ImportError: raise ImportError( "attr_matrix() requires numpy: http://scipy.org/ ") edge_value = _edge_value(G, edge_attr) node_value = _node_value(G, node_attr) if rc_order is None: ordering = list(set([node_value(n) for n in G])) else: ordering = rc_order N = len(ordering) undirected = not G.is_directed() index = dict(zip(ordering, range(N))) M = np.zeros((N,N), dtype=dtype, order=order) seen = set([]) for u,nbrdict in G.adjacency_iter(): for v in nbrdict: # Obtain the node attribute values. i, j = index[node_value(u)], index[node_value(v)] if v not in seen: M[i,j] += edge_value(u,v) if undirected: M[j,i] = M[i,j] if undirected: seen.add(u) if normalized: M /= M.sum(axis=1).reshape((N,1)) M = np.asmatrix(M) if rc_order is None: return M, ordering else: return M def attr_sparse_matrix(G, edge_attr=None, node_attr=None, normalized=False, rc_order=None, dtype=None): """Returns a SciPy sparse matrix using attributes from G. If only `G` is passed in, then the adjacency matrix is constructed. Let A be a discrete set of values for the node attribute `node_attr`. Then the elements of A represent the rows and columns of the constructed matrix. Now, iterate through every edge e=(u,v) in `G` and consider the value of the edge attribute `edge_attr`. 
    If ua and va are the values of the node attribute `node_attr` for u and
    v, respectively, then the value of the edge attribute is added to the
    matrix element at (ua, va).

    Parameters
    ----------
    G : graph
        The NetworkX graph used to construct the sparse matrix.

    edge_attr : str, optional
        Each element of the matrix represents a running total of the
        specified edge attribute for edges whose node attributes correspond
        to the rows/cols of the matrix. The attribute must be present for
        all edges in the graph. If no attribute is specified, then we
        just count the number of edges whose node attributes correspond
        to the matrix element.

    node_attr : str, optional
        Each row and column in the matrix represents a particular value
        of the node attribute. The attribute must be present for all nodes
        in the graph. Note, the values of this attribute should be reliably
        hashable. So, float values are not recommended. If no attribute is
        specified, then the rows and columns will be the nodes of the graph.

    normalized : bool, optional
        If True, then each row is normalized by the summation of its values.

    rc_order : list, optional
        A list of the node attribute values. This list specifies the ordering
        of rows and columns of the array. If no ordering is provided, then
        the ordering will be random (and also, a return value).

    Other Parameters
    ----------------
    dtype : NumPy data-type, optional
        A valid NumPy dtype used to initialize the array. Keep in mind certain
        dtypes can yield unexpected results if the array is to be normalized.
        The parameter is passed to numpy.zeros(). If unspecified, the NumPy
        default is used.

    Returns
    -------
    M : SciPy sparse matrix
        The attribute matrix.

    ordering : list
        If `rc_order` was specified, then only the matrix is returned.
        However, if `rc_order` was None, then the ordering used to construct
        the matrix is returned as well.
Examples -------- Construct an adjacency matrix: >>> G = nx.Graph() >>> G.add_edge(0,1,thickness=1,weight=3) >>> G.add_edge(0,2,thickness=2) >>> G.add_edge(1,2,thickness=3) >>> M = nx.attr_sparse_matrix(G, rc_order=[0,1,2]) >>> M.todense() matrix([[ 0., 1., 1.], [ 1., 0., 1.], [ 1., 1., 0.]]) Alternatively, we can obtain the matrix describing edge thickness. >>> M = nx.attr_sparse_matrix(G, edge_attr='thickness', rc_order=[0,1,2]) >>> M.todense() matrix([[ 0., 1., 2.], [ 1., 0., 3.], [ 2., 3., 0.]]) We can also color the nodes and ask for the probability distribution over all edges (u,v) describing: Pr(v has color Y | u has color X) >>> G.node[0]['color'] = 'red' >>> G.node[1]['color'] = 'red' >>> G.node[2]['color'] = 'blue' >>> rc = ['red', 'blue'] >>> M = nx.attr_sparse_matrix(G, node_attr='color', \ normalized=True, rc_order=rc) >>> M.todense() matrix([[ 0.33333333, 0.66666667], [ 1. , 0. ]]) For example, the above tells us that for all edges (u,v): Pr( v is red | u is red) = 1/3 Pr( v is blue | u is red) = 2/3 Pr( v is red | u is blue) = 1 Pr( v is blue | u is blue) = 0 Finally, we can obtain the total weights listed by the node colors. 
>>> M = nx.attr_sparse_matrix(G, edge_attr='weight',\ node_attr='color', rc_order=rc) >>> M.todense() matrix([[ 3., 2.], [ 2., 0.]]) Thus, the total weight over all edges (u,v) with u and v having colors: (red, red) is 3 # the sole contribution is from edge (0,1) (red, blue) is 2 # contributions from edges (0,2) and (1,2) (blue, red) is 2 # same as (red, blue) since graph is undirected (blue, blue) is 0 # there are no edges with blue endpoints """ try: import numpy as np from scipy import sparse except ImportError: raise ImportError( "attr_sparse_matrix() requires scipy: http://scipy.org/ ") edge_value = _edge_value(G, edge_attr) node_value = _node_value(G, node_attr) if rc_order is None: ordering = list(set([node_value(n) for n in G])) else: ordering = rc_order N = len(ordering) undirected = not G.is_directed() index = dict(zip(ordering, range(N))) M = sparse.lil_matrix((N,N), dtype=dtype) seen = set([]) for u,nbrdict in G.adjacency_iter(): for v in nbrdict: # Obtain the node attribute values. 
i, j = index[node_value(u)], index[node_value(v)] if v not in seen: M[i,j] += edge_value(u,v) if undirected: M[j,i] = M[i,j] if undirected: seen.add(u) if normalized: norms = np.asarray(M.sum(axis=1)).ravel() for i,norm in enumerate(norms): M[i,:] /= norm if rc_order is None: return M, ordering else: return M # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except: raise SkipTest("NumPy not available") try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/linalg/__init__.py0000644000175000017500000000067712637544450020457 0ustar aricaric00000000000000from networkx.linalg.attrmatrix import * import networkx.linalg.attrmatrix from networkx.linalg.spectrum import * import networkx.linalg.spectrum from networkx.linalg.graphmatrix import * import networkx.linalg.graphmatrix from networkx.linalg.laplacianmatrix import * import networkx.linalg.laplacianmatrix from networkx.linalg.algebraicconnectivity import * from networkx.linalg.modularitymatrix import * import networkx.linalg.modularitymatrix networkx-1.11/networkx/linalg/modularitymatrix.py0000644000175000017500000001051612637544500022323 0ustar aricaric00000000000000"""Modularity matrix of graphs. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from __future__ import division import networkx as nx from networkx.utils import not_implemented_for __author__ = "\n".join(['Aric Hagberg ', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult (dschult@colgate.edu)', 'Jean-Gabriel Young (Jean.gabriel.young@gmail.com)']) __all__ = ['modularity_matrix', 'directed_modularity_matrix'] @not_implemented_for('directed') @not_implemented_for('multigraph') def modularity_matrix(G, nodelist=None): """Return the modularity matrix of G. 
    The modularity matrix is the matrix B = A - <A>, where A is the adjacency
    matrix and <A> is the average adjacency matrix, assuming that the graph
    is described by the configuration model.

    More specifically, the element B_ij of B is defined as

        B_ij = A_ij - k_i k_j / (2 * m)

    where k_i is the degree of node i, k_j is the degree of node j, and m is
    the number of edges in the graph.

    Parameters
    ----------
    G : Graph
        A NetworkX graph

    nodelist : list, optional
        The rows and columns are ordered according to the nodes in nodelist.
        If nodelist is None, then the ordering is produced by G.nodes().

    Returns
    -------
    B : Numpy matrix
        The modularity matrix of G.

    Examples
    --------
    >>> import networkx as nx
    >>> k = [3, 2, 2, 1, 0]
    >>> G = nx.havel_hakimi_graph(k)
    >>> B = nx.modularity_matrix(G)

    See Also
    --------
    to_numpy_matrix
    adjacency_matrix
    laplacian_matrix
    directed_modularity_matrix

    References
    ----------
    .. [1] M. E. J. Newman, "Modularity and community structure in networks",
       Proc. Natl. Acad. Sci. USA, vol. 103, pp. 8577-8582, 2006.
    """
    if nodelist is None:
        nodelist = G.nodes()
    A = nx.to_scipy_sparse_matrix(G, nodelist=nodelist, format='csr')
    k = A.sum(axis=1)
    m = G.number_of_edges()
    # Expected adjacency matrix
    X = k * k.transpose() / (2 * m)
    return A - X

@not_implemented_for('undirected')
@not_implemented_for('multigraph')
def directed_modularity_matrix(G, nodelist=None):
    """Return the directed modularity matrix of G.

    The modularity matrix is the matrix B = A - <A>, where A is the adjacency
    matrix and <A> is the expected adjacency matrix, assuming that the graph
    is described by the configuration model.

    More specifically, the element B_ij of B is defined as

        B_ij = A_ij - k_i(out) k_j(in) / m

    where k_i(out) is the out-degree of node i, and k_j(in) is the in-degree
    of node j, with m the number of edges in the graph.

    Parameters
    ----------
    G : DiGraph
        A NetworkX DiGraph

    nodelist : list, optional
        The rows and columns are ordered according to the nodes in nodelist.
        If nodelist is None, then the ordering is produced by G.nodes().
Returns ------- B : Numpy matrix The modularity matrix of G. Examples -------- >>> import networkx as nx >>> G = nx.DiGraph() >>> G.add_edges_from(((1,2), (1,3), (3,1), (3,2), (3,5), (4,5), (4,6), ... (5,4), (5,6), (6,4))) >>> B = nx.directed_modularity_matrix(G) Notes ----- NetworkX defines the element A_ij of the adjacency matrix as 1 if there is a link going from node i to node j. Leicht and Newman use the opposite definition. This explains the different expression for B_ij. See Also -------- to_numpy_matrix adjacency_matrix laplacian_matrix modularity_matrix References ---------- .. [1] E. A. Leicht, M. E. J. Newman, "Community structure in directed networks", Phys. Rev Lett., vol. 100, no. 11, p. 118703, 2008. """ if nodelist is None: nodelist = G.nodes() A = nx.to_scipy_sparse_matrix(G, nodelist=nodelist, format='csr') k_in = A.sum(axis=0) k_out = A.sum(axis=1) m = G.number_of_edges() # Expected adjacency matrix X = k_out * k_in / m return A - X # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy import scipy except: raise SkipTest("NumPy not available") networkx-1.11/networkx/linalg/tests/0000755000175000017500000000000012653231454017471 5ustar aricaric00000000000000networkx-1.11/networkx/linalg/tests/test_modularity.py0000644000175000017500000000453012637544450023302 0ustar aricaric00000000000000from nose import SkipTest import networkx as nx from networkx.generators.degree_seq import havel_hakimi_graph class TestModularity(object): numpy = 1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global numpy global scipy global assert_equal global assert_almost_equal try: import numpy import scipy from numpy.testing import assert_equal, assert_almost_equal except ImportError: raise SkipTest('SciPy not available.') def setUp(self): deg = [3, 2, 2, 1, 0] self.G = havel_hakimi_graph(deg) # Graph used as an example in Sec. 
4.1 of Langville and Meyer, # "Google's PageRank and Beyond". (Used for test_directed_laplacian) self.DG = nx.DiGraph() self.DG.add_edges_from(((1,2), (1,3), (3,1), (3,2), (3,5), (4,5), (4,6), (5,4), (5,6), (6,4))) def test_modularity(self): "Modularity matrix" B = numpy.matrix([[-1.125, 0.25 , 0.25 , 0.625, 0. ], [ 0.25 , -0.5 , 0.5 , -0.25 , 0. ], [ 0.25 , 0.5 , -0.5 , -0.25 , 0. ], [ 0.625, -0.25 , -0.25 , -0.125, 0. ], [ 0. , 0. , 0. , 0. , 0. ]]) permutation = [4, 0, 1, 2, 3] assert_equal(nx.modularity_matrix(self.G), B) assert_equal(nx.modularity_matrix(self.G, nodelist=permutation), B[numpy.ix_(permutation, permutation)]) def test_directed_modularity(self): "Directed Modularity matrix" B = numpy.matrix([[-0.2, 0.6, 0.8, -0.4, -0.4, -0.4], [ 0. , 0. , 0. , 0. , 0. , 0. ], [ 0.7, 0.4, -0.3, -0.6, 0.4, -0.6], [-0.2, -0.4, -0.2, -0.4, 0.6, 0.6], [-0.2, -0.4, -0.2, 0.6, -0.4, 0.6], [-0.1, -0.2, -0.1, 0.8, -0.2, -0.2]]) node_permutation = [5, 1, 2, 3, 4, 6] idx_permutation = [4, 0, 1, 2, 3, 5] assert_equal(nx.directed_modularity_matrix(self.DG), B) assert_equal(nx.directed_modularity_matrix(self.DG, nodelist=node_permutation), B[numpy.ix_(idx_permutation, idx_permutation)]) networkx-1.11/networkx/linalg/tests/test_algebraic_connectivity.py0000644000175000017500000002504612637544450025625 0ustar aricaric00000000000000from contextlib import contextmanager from math import sqrt import networkx as nx from nose import SkipTest from nose.tools import * methods = ('tracemin_pcg', 'tracemin_chol', 'tracemin_lu', 'lanczos', 'lobpcg') try: from numpy.random import get_state, seed, set_state, shuffle @contextmanager def save_random_state(): state = get_state() try: yield finally: set_state(state) def preserve_random_state(func): def wrapper(*args, **kwargs): with save_random_state(): seed(1234567890) return func(*args, **kwargs) wrapper.__name__ = func.__name__ return wrapper except ImportError: @contextmanager def save_random_state(): yield def preserve_random_state(func): 
return func def check_eigenvector(A, l, x): nx = numpy.linalg.norm(x) # Check zeroness. assert_not_almost_equal(nx, 0) y = A * x ny = numpy.linalg.norm(y) # Check collinearity. assert_almost_equal(numpy.dot(x, y), nx * ny) # Check eigenvalue. assert_almost_equal(ny, l * nx) class TestAlgebraicConnectivity(object): numpy = 1 @classmethod def setupClass(cls): global numpy try: import numpy.linalg import scipy.sparse except ImportError: raise SkipTest('SciPy not available.') @preserve_random_state def test_directed(self): G = nx.DiGraph() for method in self._methods: assert_raises(nx.NetworkXNotImplemented, nx.algebraic_connectivity, G, method=method) assert_raises(nx.NetworkXNotImplemented, nx.fiedler_vector, G, method=method) @preserve_random_state def test_null_and_singleton(self): G = nx.Graph() for method in self._methods: assert_raises(nx.NetworkXError, nx.algebraic_connectivity, G, method=method) assert_raises(nx.NetworkXError, nx.fiedler_vector, G, method=method) G.add_edge(0, 0) for method in self._methods: assert_raises(nx.NetworkXError, nx.algebraic_connectivity, G, method=method) assert_raises(nx.NetworkXError, nx.fiedler_vector, G, method=method) @preserve_random_state def test_disconnected(self): G = nx.Graph() G.add_nodes_from(range(2)) for method in self._methods: assert_equal(nx.algebraic_connectivity(G), 0) assert_raises(nx.NetworkXError, nx.fiedler_vector, G, method=method) G.add_edge(0, 1, weight=0) for method in self._methods: assert_equal(nx.algebraic_connectivity(G), 0) assert_raises(nx.NetworkXError, nx.fiedler_vector, G, method=method) @preserve_random_state def test_unrecognized_method(self): G = nx.path_graph(4) assert_raises(nx.NetworkXError, nx.algebraic_connectivity, G, method='unknown') assert_raises(nx.NetworkXError, nx.fiedler_vector, G, method='unknown') @preserve_random_state def test_two_nodes(self): G = nx.Graph() G.add_edge(0, 1, weight=1) A = nx.laplacian_matrix(G) for method in self._methods: 
assert_almost_equal(nx.algebraic_connectivity( G, tol=1e-12, method=method), 2) x = nx.fiedler_vector(G, tol=1e-12, method=method) check_eigenvector(A, 2, x) G = nx.MultiGraph() G.add_edge(0, 0, spam=1e8) G.add_edge(0, 1, spam=1) G.add_edge(0, 1, spam=-2) A = -3 * nx.laplacian_matrix(G, weight='spam') for method in self._methods: assert_almost_equal(nx.algebraic_connectivity( G, weight='spam', tol=1e-12, method=method), 6) x = nx.fiedler_vector(G, weight='spam', tol=1e-12, method=method) check_eigenvector(A, 6, x) @preserve_random_state def test_path(self): G = nx.path_graph(8) A = nx.laplacian_matrix(G) sigma = 2 - sqrt(2 + sqrt(2)) for method in self._methods: assert_almost_equal(nx.algebraic_connectivity( G, tol=1e-12, method=method), sigma) x = nx.fiedler_vector(G, tol=1e-12, method=method) check_eigenvector(A, sigma, x) @preserve_random_state def test_cycle(self): G = nx.cycle_graph(8) A = nx.laplacian_matrix(G) sigma = 2 - sqrt(2) for method in self._methods: assert_almost_equal(nx.algebraic_connectivity( G, tol=1e-12, method=method), sigma) x = nx.fiedler_vector(G, tol=1e-12, method=method) check_eigenvector(A, sigma, x) @preserve_random_state def test_buckminsterfullerene(self): G = nx.Graph( [(1, 10), (1, 41), (1, 59), (2, 12), (2, 42), (2, 60), (3, 6), (3, 43), (3, 57), (4, 8), (4, 44), (4, 58), (5, 13), (5, 56), (5, 57), (6, 10), (6, 31), (7, 14), (7, 56), (7, 58), (8, 12), (8, 32), (9, 23), (9, 53), (9, 59), (10, 15), (11, 24), (11, 53), (11, 60), (12, 16), (13, 14), (13, 25), (14, 26), (15, 27), (15, 49), (16, 28), (16, 50), (17, 18), (17, 19), (17, 54), (18, 20), (18, 55), (19, 23), (19, 41), (20, 24), (20, 42), (21, 31), (21, 33), (21, 57), (22, 32), (22, 34), (22, 58), (23, 24), (25, 35), (25, 43), (26, 36), (26, 44), (27, 51), (27, 59), (28, 52), (28, 60), (29, 33), (29, 34), (29, 56), (30, 51), (30, 52), (30, 53), (31, 47), (32, 48), (33, 45), (34, 46), (35, 36), (35, 37), (36, 38), (37, 39), (37, 49), (38, 40), (38, 50), (39, 40), (39, 51), (40, 
52), (41, 47), (42, 48), (43, 49), (44, 50), (45, 46), (45, 54), (46, 55), (47, 54), (48, 55)]) for normalized in (False, True): if not normalized: A = nx.laplacian_matrix(G) sigma = 0.2434017461399311 else: A = nx.normalized_laplacian_matrix(G) sigma = 0.08113391537997749 for method in methods: try: assert_almost_equal(nx.algebraic_connectivity( G, normalized=normalized, tol=1e-12, method=method), sigma) x = nx.fiedler_vector(G, normalized=normalized, tol=1e-12, method=method) check_eigenvector(A, sigma, x) except nx.NetworkXError as e: if e.args not in (('Cholesky solver unavailable.',), ('LU solver unavailable.',)): raise _methods = ('tracemin', 'lanczos', 'lobpcg') class TestSpectralOrdering(object): numpy = 1 @classmethod def setupClass(cls): global numpy try: import numpy.linalg import scipy.sparse except ImportError: raise SkipTest('SciPy not available.') @preserve_random_state def test_nullgraph(self): for graph in (nx.Graph, nx.DiGraph, nx.MultiGraph, nx.MultiDiGraph): G = graph() assert_raises(nx.NetworkXError, nx.spectral_ordering, G) @preserve_random_state def test_singleton(self): for graph in (nx.Graph, nx.DiGraph, nx.MultiGraph, nx.MultiDiGraph): G = graph() G.add_node('x') assert_equal(nx.spectral_ordering(G), ['x']) G.add_edge('x', 'x', weight=33) G.add_edge('x', 'x', weight=33) assert_equal(nx.spectral_ordering(G), ['x']) @preserve_random_state def test_unrecognized_method(self): G = nx.path_graph(4) assert_raises(nx.NetworkXError, nx.spectral_ordering, G, method='unknown') @preserve_random_state def test_three_nodes(self): G = nx.Graph() G.add_weighted_edges_from([(1, 2, 1), (1, 3, 2), (2, 3, 1)], weight='spam') for method in self._methods: order = nx.spectral_ordering(G, weight='spam', method=method) assert_equal(set(order), set(G)) ok_(set([1, 3]) in (set(order[:-1]), set(order[1:]))) G = nx.MultiDiGraph() G.add_weighted_edges_from([(1, 2, 1), (1, 3, 2), (2, 3, 1), (2, 3, 2)]) for method in self._methods: order = nx.spectral_ordering(G, 
method=method) assert_equal(set(order), set(G)) ok_(set([2, 3]) in (set(order[:-1]), set(order[1:]))) @preserve_random_state def test_path(self): path = list(range(10)) shuffle(path) G = nx.Graph() G.add_path(path) for method in self._methods: order = nx.spectral_ordering(G, method=method) ok_(order in [path, list(reversed(path))]) @preserve_random_state def test_disconnected(self): G = nx.Graph() G.add_path(range(0, 10, 2)) G.add_path(range(1, 10, 2)) for method in self._methods: order = nx.spectral_ordering(G, method=method) assert_equal(set(order), set(G)) seqs = [list(range(0, 10, 2)), list(range(8, -1, -2)), list(range(1, 10, 2)), list(range(9, -1, -2))] ok_(order[:5] in seqs) ok_(order[5:] in seqs) @preserve_random_state def test_cycle(self): path = list(range(10)) G = nx.Graph() G.add_path(path, weight=5) G.add_edge(path[-1], path[0], weight=1) A = nx.laplacian_matrix(G).todense() for normalized in (False, True): for method in methods: try: order = nx.spectral_ordering(G, normalized=normalized, method=method) except nx.NetworkXError as e: if e.args not in (('Cholesky solver unavailable.',), ('LU solver unavailable.',)): raise else: if not normalized: ok_(order in [[1, 2, 0, 3, 4, 5, 6, 9, 7, 8], [8, 7, 9, 6, 5, 4, 3, 0, 2, 1]]) else: ok_(order in [[1, 2, 3, 0, 4, 5, 9, 6, 7, 8], [8, 7, 6, 9, 5, 4, 0, 3, 2, 1]]) _methods = ('tracemin', 'lanczos', 'lobpcg') networkx-1.11/networkx/linalg/tests/test_laplacian.py0000644000175000017500000001252512637544500023034 0ustar aricaric00000000000000from nose import SkipTest import networkx as nx from networkx.generators.degree_seq import havel_hakimi_graph class TestLaplacian(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global numpy global scipy global assert_equal global assert_almost_equal try: import numpy import scipy from numpy.testing import assert_equal,assert_almost_equal except ImportError: raise SkipTest('SciPy not available.') def 
setUp(self): deg=[3,2,2,1,0] self.G=havel_hakimi_graph(deg) self.WG=nx.Graph( (u,v,{'weight':0.5,'other':0.3}) for (u,v) in self.G.edges_iter() ) self.WG.add_node(4) self.MG=nx.MultiGraph(self.G) # Graph with selfloops self.Gsl = self.G.copy() for node in self.Gsl.nodes(): self.Gsl.add_edge(node, node) def test_laplacian(self): "Graph Laplacian" NL=numpy.array([[ 3, -1, -1, -1, 0], [-1, 2, -1, 0, 0], [-1, -1, 2, 0, 0], [-1, 0, 0, 1, 0], [ 0, 0, 0, 0, 0]]) WL=0.5*NL OL=0.3*NL assert_equal(nx.laplacian_matrix(self.G).todense(),NL) assert_equal(nx.laplacian_matrix(self.MG).todense(),NL) assert_equal(nx.laplacian_matrix(self.G,nodelist=[0,1]).todense(), numpy.array([[ 1, -1],[-1, 1]])) assert_equal(nx.laplacian_matrix(self.WG).todense(),WL) assert_equal(nx.laplacian_matrix(self.WG,weight=None).todense(),NL) assert_equal(nx.laplacian_matrix(self.WG,weight='other').todense(),OL) def test_normalized_laplacian(self): "Generalized Graph Laplacian" GL=numpy.array([[ 1.00, -0.408, -0.408, -0.577, 0.00], [-0.408, 1.00, -0.50, 0.00 , 0.00], [-0.408, -0.50, 1.00, 0.00, 0.00], [-0.577, 0.00, 0.00, 1.00, 0.00], [ 0.00, 0.00, 0.00, 0.00, 0.00]]) Lsl = numpy.array([[ 0.75 , -0.2887, -0.2887, -0.3536, 0.], [-0.2887, 0.6667, -0.3333, 0. , 0.], [-0.2887, -0.3333, 0.6667, 0. , 0.], [-0.3536, 0. , 0. , 0.5 , 0.], [ 0. , 0. , 0. , 0. , 0.]]) assert_almost_equal(nx.normalized_laplacian_matrix(self.G).todense(), GL,decimal=3) assert_almost_equal(nx.normalized_laplacian_matrix(self.MG).todense(), GL,decimal=3) assert_almost_equal(nx.normalized_laplacian_matrix(self.WG).todense(), GL,decimal=3) assert_almost_equal(nx.normalized_laplacian_matrix(self.WG,weight='other').todense(), GL,decimal=3) assert_almost_equal(nx.normalized_laplacian_matrix(self.Gsl).todense(), Lsl, decimal=3) def test_directed_laplacian(self): "Directed Laplacian" # Graph used as an example in Sec. 4.1 of Langville and Meyer, # "Google's PageRank and Beyond". 
The graph contains dangling nodes, so # the pagerank random walk is selected by directed_laplacian G = nx.DiGraph() G.add_edges_from(((1,2), (1,3), (3,1), (3,2), (3,5), (4,5), (4,6), (5,4), (5,6), (6,4))) GL = numpy.array([[ 0.9833, -0.2941, -0.3882, -0.0291, -0.0231, -0.0261], [-0.2941, 0.8333, -0.2339, -0.0536, -0.0589, -0.0554], [-0.3882, -0.2339, 0.9833, -0.0278, -0.0896, -0.0251], [-0.0291, -0.0536, -0.0278, 0.9833, -0.4878, -0.6675], [-0.0231, -0.0589, -0.0896, -0.4878, 0.9833, -0.2078], [-0.0261, -0.0554, -0.0251, -0.6675, -0.2078, 0.9833]]) assert_almost_equal(nx.directed_laplacian_matrix(G, alpha=0.9), GL, decimal=3) # Make the graph strongly connected, so we can use a random and lazy walk G.add_edges_from((((2,5), (6,1)))) GL = numpy.array([[ 1. , -0.3062, -0.4714, 0. , 0. , -0.3227], [-0.3062, 1. , -0.1443, 0. , -0.3162, 0. ], [-0.4714, -0.1443, 1. , 0. , -0.0913, 0. ], [ 0. , 0. , 0. , 1. , -0.5 , -0.5 ], [ 0. , -0.3162, -0.0913, -0.5 , 1. , -0.25 ], [-0.3227, 0. , 0. , -0.5 , -0.25 , 1. ]]) assert_almost_equal(nx.directed_laplacian_matrix(G, walk_type='random'), GL, decimal=3) GL = numpy.array([[ 0.5 , -0.1531, -0.2357, 0. , 0. , -0.1614], [-0.1531, 0.5 , -0.0722, 0. , -0.1581, 0. ], [-0.2357, -0.0722, 0.5 , 0. , -0.0456, 0. ], [ 0. , 0. , 0. , 0.5 , -0.25 , -0.25 ], [ 0. , -0.1581, -0.0456, -0.25 , 0.5 , -0.125 ], [-0.1614, 0. , 0. 
, -0.25 , -0.125 , 0.5 ]]) assert_almost_equal(nx.directed_laplacian_matrix(G, walk_type='lazy'), GL, decimal=3) networkx-1.11/networkx/linalg/tests/test_spectrum.py0000644000175000017500000000365212637544500022753 0ustar aricaric00000000000000from nose import SkipTest import networkx as nx from networkx.generators.degree_seq import havel_hakimi_graph class TestSpectrum(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global numpy global assert_equal global assert_almost_equal try: import numpy import scipy from numpy.testing import assert_equal,assert_almost_equal except ImportError: raise SkipTest('SciPy not available.') def setUp(self): deg=[3,2,2,1,0] self.G=havel_hakimi_graph(deg) self.P=nx.path_graph(3) self.WG=nx.Graph( (u,v,{'weight':0.5,'other':0.3}) for (u,v) in self.G.edges_iter() ) self.WG.add_node(4) self.DG=nx.DiGraph() self.DG.add_path([0,1,2]) def test_laplacian_spectrum(self): "Laplacian eigenvalues" evals=numpy.array([0, 0, 1, 3, 4]) e=sorted(nx.laplacian_spectrum(self.G)) assert_almost_equal(e,evals) e=sorted(nx.laplacian_spectrum(self.WG,weight=None)) assert_almost_equal(e,evals) e=sorted(nx.laplacian_spectrum(self.WG)) assert_almost_equal(e,0.5*evals) e=sorted(nx.laplacian_spectrum(self.WG,weight='other')) assert_almost_equal(e,0.3*evals) def test_adjacency_spectrum(self): "Adjacency eigenvalues" evals=numpy.array([-numpy.sqrt(2), 0, numpy.sqrt(2)]) e=sorted(nx.adjacency_spectrum(self.P)) assert_almost_equal(e,evals) def test_modularity_spectrum(self): "Modularity eigenvalues" evals=numpy.array([-1.5, 0., 0.]) e=sorted(nx.modularity_spectrum(self.P)) assert_almost_equal(e,evals) # Directed modularity eigenvalues evals=numpy.array([-0.5, 0., 0.]) e=sorted(nx.modularity_spectrum(self.DG)) assert_almost_equal(e,evals) networkx-1.11/networkx/linalg/tests/test_graphmatrix.py0000644000175000017500000001030412637544500023427 0ustar aricaric00000000000000from nose import SkipTest import 
networkx as nx from networkx.generators.degree_seq import havel_hakimi_graph class TestGraphMatrix(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global numpy global assert_equal global assert_almost_equal try: import numpy import scipy from numpy.testing import assert_equal,assert_almost_equal except ImportError: raise SkipTest('SciPy not available.') def setUp(self): deg=[3,2,2,1,0] self.G=havel_hakimi_graph(deg) self.OI=numpy.array([[-1, -1, -1, 0], [1, 0, 0, -1], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 0]]) self.A=numpy.array([[0, 1, 1, 1, 0], [1, 0, 1, 0, 0], [1, 1, 0, 0, 0], [1, 0, 0, 0, 0], [0, 0, 0, 0, 0]]) self.WG=nx.Graph( (u,v,{'weight':0.5,'other':0.3}) for (u,v) in self.G.edges_iter() ) self.WG.add_node(4) self.WA=numpy.array([[0 , 0.5, 0.5, 0.5, 0], [0.5, 0 , 0.5, 0 , 0], [0.5, 0.5, 0 , 0 , 0], [0.5, 0 , 0 , 0 , 0], [0 , 0 , 0 , 0 , 0]]) self.MG=nx.MultiGraph(self.G) self.MG2=self.MG.copy() self.MG2.add_edge(0,1) self.MG2A=numpy.array([[0, 2, 1, 1, 0], [2, 0, 1, 0, 0], [1, 1, 0, 0, 0], [1, 0, 0, 0, 0], [0, 0, 0, 0, 0]]) self.MGOI=numpy.array([[-1, -1, -1, -1, 0], [1, 1, 0, 0, -1], [0, 0, 1, 0, 1], [0, 0, 0, 1, 0], [0, 0, 0, 0, 0]]) def test_incidence_matrix(self): "Conversion to incidence matrix" assert_equal(nx.incidence_matrix(self.G,oriented=True).todense(),self.OI) assert_equal(nx.incidence_matrix(self.G).todense(),numpy.abs(self.OI)) assert_equal(nx.incidence_matrix(self.MG,oriented=True).todense(),self.OI) assert_equal(nx.incidence_matrix(self.MG).todense(),numpy.abs(self.OI)) assert_equal(nx.incidence_matrix(self.MG2,oriented=True).todense(),self.MGOI) assert_equal(nx.incidence_matrix(self.MG2).todense(),numpy.abs(self.MGOI)) assert_equal(nx.incidence_matrix(self.WG,oriented=True).todense(),self.OI) assert_equal(nx.incidence_matrix(self.WG).todense(),numpy.abs(self.OI)) assert_equal(nx.incidence_matrix(self.WG,oriented=True, weight='weight').todense(),0.5*self.OI) 
assert_equal(nx.incidence_matrix(self.WG,weight='weight').todense(), numpy.abs(0.5*self.OI)) assert_equal(nx.incidence_matrix(self.WG,oriented=True,weight='other').todense(), 0.3*self.OI) WMG=nx.MultiGraph(self.WG) WMG.add_edge(0,1,attr_dict={'weight':0.5,'other':0.3}) assert_equal(nx.incidence_matrix(WMG,weight='weight').todense(), numpy.abs(0.5*self.MGOI)) assert_equal(nx.incidence_matrix(WMG,weight='weight',oriented=True).todense(), 0.5*self.MGOI) assert_equal(nx.incidence_matrix(WMG,weight='other',oriented=True).todense(), 0.3*self.MGOI) def test_adjacency_matrix(self): "Conversion to adjacency matrix" assert_equal(nx.adj_matrix(self.G).todense(),self.A) assert_equal(nx.adj_matrix(self.MG).todense(),self.A) assert_equal(nx.adj_matrix(self.MG2).todense(),self.MG2A) assert_equal(nx.adj_matrix(self.G,nodelist=[0,1]).todense(),self.A[:2,:2]) assert_equal(nx.adj_matrix(self.WG).todense(),self.WA) assert_equal(nx.adj_matrix(self.WG,weight=None).todense(),self.A) assert_equal(nx.adj_matrix(self.MG2,weight=None).todense(),self.MG2A) assert_equal(nx.adj_matrix(self.WG,weight='other').todense(),0.6*self.WA) networkx-1.11/networkx/linalg/graphmatrix.py0000644000175000017500000001264712637544500021242 0ustar aricaric00000000000000""" Adjacency matrix and incidence matrix of graphs. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) __all__ = ['incidence_matrix', 'adj_matrix', 'adjacency_matrix', ] def incidence_matrix(G, nodelist=None, edgelist=None, oriented=False, weight=None): """Return incidence matrix of G. The incidence matrix assigns each row to a node and each column to an edge. For a standard incidence matrix a 1 appears wherever a row's node is incident on the column's edge. 
For an oriented incidence matrix each edge is assigned an orientation (arbitrarily for undirected and aligning to direction for directed). A -1 appears for the tail of an edge and 1 for the head of the edge. The elements are zero otherwise. Parameters ---------- G : graph A NetworkX graph nodelist : list, optional (default= all nodes in G) The rows are ordered according to the nodes in nodelist. If nodelist is None, then the ordering is produced by G.nodes(). edgelist : list, optional (default= all edges in G) The columns are ordered according to the edges in edgelist. If edgelist is None, then the ordering is produced by G.edges(). oriented: bool, optional (default=False) If True, matrix elements are +1 or -1 for the head or tail node respectively of each edge. If False, +1 occurs at both nodes. weight : string or None, optional (default=None) The edge data key used to provide each value in the matrix. If None, then each edge has weight 1. Edge weights, if used, should be positive so that the orientation can provide the sign. Returns ------- A : SciPy sparse matrix The incidence matrix of G. Notes ----- For MultiGraph/MultiDiGraph, the edges in edgelist should be (u,v,key) 3-tuples. "Networks are the best discrete model for so many problems in applied mathematics" [1]_. References ---------- .. 
[1] Gil Strang, Network applications: A = incidence matrix,
       http://academicearth.org/lectures/network-applications-incidence-matrix
    """
    import scipy.sparse
    if nodelist is None:
        nodelist = G.nodes()
    if edgelist is None:
        if G.is_multigraph():
            edgelist = G.edges(keys=True)
        else:
            edgelist = G.edges()
    A = scipy.sparse.lil_matrix((len(nodelist), len(edgelist)))
    node_index = dict((node, i) for i, node in enumerate(nodelist))
    for ei, e in enumerate(edgelist):
        (u, v) = e[:2]
        if u == v:
            continue  # self loops give zero column
        try:
            ui = node_index[u]
            vi = node_index[v]
        except KeyError:
            raise nx.NetworkXError('node %s or %s in edgelist '
                                   'but not in nodelist' % (u, v))
        if weight is None:
            wt = 1
        else:
            if G.is_multigraph():
                ekey = e[2]
                wt = G[u][v][ekey].get(weight, 1)
            else:
                wt = G[u][v].get(weight, 1)
        if oriented:
            A[ui, ei] = -wt
            A[vi, ei] = wt
        else:
            A[ui, ei] = wt
            A[vi, ei] = wt
    return A.asformat('csc')

def adjacency_matrix(G, nodelist=None, weight='weight'):
    """Return adjacency matrix of G.

    Parameters
    ----------
    G : graph
       A NetworkX graph

    nodelist : list, optional
       The rows and columns are ordered according to the nodes in nodelist.
       If nodelist is None, then the ordering is produced by G.nodes().

    weight : string or None, optional (default='weight')
       The edge data key used to provide each value in the matrix.
       If None, then each edge has weight 1.

    Returns
    -------
    A : SciPy sparse matrix
       Adjacency matrix representation of G.

    Notes
    -----
    For directed graphs, entry i,j corresponds to an edge from i to j.

    If you want a pure Python adjacency matrix representation try
    networkx.convert.to_dict_of_dicts which will return a
    dictionary-of-dictionaries format that can be addressed as a
    sparse matrix.

    For MultiGraph/MultiDiGraph with parallel edges the weights are summed.
    See to_numpy_matrix for other options.

    The convention used for self-loop edges in graphs is to assign the
    diagonal matrix entry value to the edge weight attribute
    (or the number 1 if the edge has no weight attribute).
If the alternate convention of doubling the edge weight is desired the resulting Scipy sparse matrix can be modified as follows: >>> import scipy as sp >>> G = nx.Graph([(1,1)]) >>> A = nx.adjacency_matrix(G) >>> print(A.todense()) [[1]] >>> A.setdiag(A.diagonal()*2) >>> print(A.todense()) [[2]] See Also -------- to_numpy_matrix to_scipy_sparse_matrix to_dict_of_dicts """ return nx.to_scipy_sparse_matrix(G,nodelist=nodelist,weight=weight) adj_matrix=adjacency_matrix # fixture for nose tests def setup_module(module): from nose import SkipTest try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/linalg/laplacianmatrix.py0000644000175000017500000001711112637544500022054 0ustar aricaric00000000000000"""Laplacian matrix of graphs. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.utils import not_implemented_for __author__ = "\n".join(['Aric Hagberg ', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult (dschult@colgate.edu)', 'Alejandro Weinstein ']) __all__ = ['laplacian_matrix', 'normalized_laplacian_matrix', 'directed_laplacian_matrix'] @not_implemented_for('directed') def laplacian_matrix(G, nodelist=None, weight='weight'): """Return the Laplacian matrix of G. The graph Laplacian is the matrix L = D - A, where A is the adjacency matrix and D is the diagonal matrix of node degrees. Parameters ---------- G : graph A NetworkX graph nodelist : list, optional The rows and columns are ordered according to the nodes in nodelist. If nodelist is None, then the ordering is produced by G.nodes(). weight : string or None, optional (default='weight') The edge data key used to compute each value in the matrix. If None, then each edge has weight 1. Returns ------- L : SciPy sparse matrix The Laplacian matrix of G. Notes ----- For MultiGraph/MultiDiGraph, the edges weights are summed. 
See Also -------- to_numpy_matrix normalized_laplacian_matrix """ import scipy.sparse if nodelist is None: nodelist = G.nodes() A = nx.to_scipy_sparse_matrix(G, nodelist=nodelist, weight=weight, format='csr') n,m = A.shape diags = A.sum(axis=1) D = scipy.sparse.spdiags(diags.flatten(), [0], m, n, format='csr') return D - A @not_implemented_for('directed') def normalized_laplacian_matrix(G, nodelist=None, weight='weight'): r"""Return the normalized Laplacian matrix of G. The normalized graph Laplacian is the matrix .. math:: N = D^{-1/2} L D^{-1/2} where `L` is the graph Laplacian and `D` is the diagonal matrix of node degrees. Parameters ---------- G : graph A NetworkX graph nodelist : list, optional The rows and columns are ordered according to the nodes in nodelist. If nodelist is None, then the ordering is produced by G.nodes(). weight : string or None, optional (default='weight') The edge data key used to compute each value in the matrix. If None, then each edge has weight 1. Returns ------- N : NumPy matrix The normalized Laplacian matrix of G. Notes ----- For MultiGraph/MultiDiGraph, the edges weights are summed. See to_numpy_matrix for other options. If the Graph contains selfloops, D is defined as diag(sum(A,1)), where A is the adjacency matrix [2]_. See Also -------- laplacian_matrix References ---------- .. [1] Fan Chung-Graham, Spectral Graph Theory, CBMS Regional Conference Series in Mathematics, Number 92, 1997. .. [2] Steve Butler, Interlacing For Weighted Graphs Using The Normalized Laplacian, Electronic Journal of Linear Algebra, Volume 16, pp. 90-98, March 2007. 
""" import scipy import scipy.sparse if nodelist is None: nodelist = G.nodes() A = nx.to_scipy_sparse_matrix(G, nodelist=nodelist, weight=weight, format='csr') n,m = A.shape diags = A.sum(axis=1).flatten() D = scipy.sparse.spdiags(diags, [0], m, n, format='csr') L = D - A with scipy.errstate(divide='ignore'): diags_sqrt = 1.0/scipy.sqrt(diags) diags_sqrt[scipy.isinf(diags_sqrt)] = 0 DH = scipy.sparse.spdiags(diags_sqrt, [0], m, n, format='csr') return DH.dot(L.dot(DH)) ############################################################################### # Code based on # https://bitbucket.org/bedwards/networkx-community/src/370bd69fc02f/networkx/algorithms/community/ @not_implemented_for('undirected') @not_implemented_for('multigraph') def directed_laplacian_matrix(G, nodelist=None, weight='weight', walk_type=None, alpha=0.95): r"""Return the directed Laplacian matrix of G. The graph directed Laplacian is the matrix .. math:: L = I - (\Phi^{1/2} P \Phi^{-1/2} + \Phi^{-1/2} P^T \Phi^{1/2} ) / 2 where `I` is the identity matrix, `P` is the transition matrix of the graph, and `\Phi` a matrix with the Perron vector of `P` in the diagonal and zeros elsewhere. Depending on the value of walk_type, `P` can be the transition matrix induced by a random walk, a lazy random walk, or a random walk with teleportation (PageRank). Parameters ---------- G : DiGraph A NetworkX graph nodelist : list, optional The rows and columns are ordered according to the nodes in nodelist. If nodelist is None, then the ordering is produced by G.nodes(). weight : string or None, optional (default='weight') The edge data key used to compute each value in the matrix. If None, then each edge has weight 1. walk_type : string or None, optional (default=None) If None, `P` is selected depending on the properties of the graph. Otherwise is one of 'random', 'lazy', or 'pagerank' alpha : real (1 - alpha) is the teleportation probability used with pagerank Returns ------- L : NumPy array Normalized Laplacian of G. 
Raises
    ------
    NetworkXError
        If NumPy cannot be imported

    NetworkXNotImplemented
        If G is not a DiGraph

    Notes
    -----
    Only implemented for DiGraphs

    See Also
    --------
    laplacian_matrix

    References
    ----------
    .. [1] Fan Chung (2005).
       Laplacians and the Cheeger inequality for directed graphs.
       Annals of Combinatorics, 9(1), 2005
    """
    import scipy as sp
    from scipy.sparse import identity, spdiags, linalg
    if walk_type is None:
        if nx.is_strongly_connected(G):
            if nx.is_aperiodic(G):
                walk_type = "random"
            else:
                walk_type = "lazy"
        else:
            walk_type = "pagerank"

    M = nx.to_scipy_sparse_matrix(G, nodelist=nodelist, weight=weight,
                                  dtype=float)
    n, m = M.shape
    if walk_type in ["random", "lazy"]:
        DI = spdiags(1.0 / sp.array(M.sum(axis=1).flat), [0], n, n)
        if walk_type == "random":
            P = DI * M
        else:
            I = identity(n)
            P = (I + DI * M) / 2.0
    elif walk_type == "pagerank":
        if not (0 < alpha < 1):
            raise nx.NetworkXError('alpha must be between 0 and 1')
        # this is using a dense representation
        M = M.todense()
        # add constant to dangling nodes' row
        dangling = sp.where(M.sum(axis=1) == 0)
        for d in dangling[0]:
            M[d] = 1.0 / n
        # normalize
        M = M / M.sum(axis=1)
        P = alpha * M + (1 - alpha) / n
    else:
        raise nx.NetworkXError("walk_type must be random, lazy, or pagerank")

    evals, evecs = linalg.eigs(P.T, k=1)
    v = evecs.flatten().real
    p = v / v.sum()
    sqrtp = sp.sqrt(p)
    Q = spdiags(sqrtp, [0], n, n) * P * spdiags(1.0 / sqrtp, [0], n, n)
    I = sp.identity(len(G))
    return I - (Q + Q.T) / 2.0

# fixture for nose tests
def setup_module(module):
    from nose import SkipTest
    try:
        import numpy
    except ImportError:
        raise SkipTest("NumPy not available")
networkx-1.11/networkx/linalg/algebraicconnectivity.py0000644000175000017500000004311012637544500023251 0ustar aricaric00000000000000# -*- coding: utf-8 -*-
"""
Algebraic connectivity and Fiedler vectors of undirected graphs.
"""

__author__ = """ysitu """

# Copyright (C) 2014 ysitu
# All rights reserved.
# BSD license.
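The Laplacian routines above all build on the same definition, L = D - A, where D is the diagonal matrix of degrees. A minimal dependency-free sketch of that definition — `dense_laplacian` is a hypothetical helper written for illustration, not part of the NetworkX API, and it assumes a simple undirected graph with unit edge weights:

```python
def dense_laplacian(n, edges):
    """Return L = D - A as nested lists for a graph on nodes 0..n-1.

    `edges` is an iterable of (u, v) pairs; weights are taken as 1.
    """
    L = [[0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1  # each endpoint gains one degree (the D part)
        L[v][v] += 1
        L[u][v] -= 1  # off-diagonal entries are -A
        L[v][u] -= 1
    return L

# Same graph as the havel_hakimi_graph([3, 2, 2, 1, 0]) test fixture above:
L = dense_laplacian(5, [(0, 1), (0, 2), (0, 3), (1, 2)])
# L matches the NL matrix used in test_laplacian; every row sums to zero.
```

Because each edge adds +1 to two diagonal entries and -1 to two off-diagonal entries, every row of L sums to zero, which is why 0 is always a Laplacian eigenvalue (compare the `laplacian_spectrum` test values `[0, 0, 1, 3, 4]`).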
from functools import partial import networkx as nx from networkx.utils import not_implemented_for from networkx.utils import reverse_cuthill_mckee_ordering from re import compile try: from numpy import (array, asmatrix, asarray, dot, matrix, ndarray, ones, reshape, sqrt, zeros) from numpy.linalg import norm, qr from numpy.random import normal from scipy.linalg import eigh, inv from scipy.sparse import csc_matrix, spdiags from scipy.sparse.linalg import eigsh, lobpcg __all__ = ['algebraic_connectivity', 'fiedler_vector', 'spectral_ordering'] except ImportError: __all__ = [] try: from scipy.linalg.blas import dasum, daxpy, ddot except ImportError: if __all__: # Make sure the imports succeeded. # Use minimal replacements if BLAS is unavailable from SciPy. dasum = partial(norm, ord=1) ddot = dot def daxpy(x, y, a): y += a * x return y _tracemin_method = compile('^tracemin(?:_(.*))?$') class _PCGSolver(object): """Preconditioned conjugate gradient method. """ def __init__(self, A, M): self._A = A self._M = M or (lambda x: x.copy()) def solve(self, B, tol): B = asarray(B) X = ndarray(B.shape, order='F') for j in range(B.shape[1]): X[:, j] = self._solve(B[:, j], tol) return X def _solve(self, b, tol): A = self._A M = self._M tol *= dasum(b) # Initialize. x = zeros(b.shape) r = b.copy() z = M(r) rz = ddot(r, z) p = z.copy() # Iterate. while True: Ap = A(p) alpha = rz / ddot(p, Ap) x = daxpy(p, x, a=alpha) r = daxpy(Ap, r, a=-alpha) if dasum(r) < tol: return x z = M(r) beta = ddot(r, z) beta, rz = beta / rz, beta p = daxpy(p, z, a=beta) class _CholeskySolver(object): """Cholesky factorization. """ def __init__(self, A): if not self._cholesky: raise nx.NetworkXError('Cholesky solver unavailable.') self._chol = self._cholesky(A) def solve(self, B): return self._chol(B) try: from scikits.sparse.cholmod import cholesky _cholesky = cholesky except ImportError: _cholesky = None class _LUSolver(object): """LU factorization. 
""" def __init__(self, A): if not self._splu: raise nx.NetworkXError('LU solver unavailable.') self._LU = self._splu(A) def solve(self, B): B = asarray(B) X = ndarray(B.shape, order='F') for j in range(B.shape[1]): X[:, j] = self._LU.solve(B[:, j]) return X try: from scipy.sparse.linalg import splu _splu = partial(splu, permc_spec='MMD_AT_PLUS_A', diag_pivot_thresh=0., options={'Equil': True, 'SymmetricMode': True}) except ImportError: _splu = None def _preprocess_graph(G, weight): """Compute edge weights and eliminate zero-weight edges. """ if G.is_directed(): H = nx.MultiGraph() H.add_nodes_from(G) H.add_weighted_edges_from(((u, v, e.get(weight, 1.)) for u, v, e in G.edges_iter(data=True) if u != v), weight=weight) G = H if not G.is_multigraph(): edges = ((u, v, abs(e.get(weight, 1.))) for u, v, e in G.edges_iter(data=True) if u != v) else: edges = ((u, v, sum(abs(e.get(weight, 1.)) for e in G[u][v].values())) for u, v in G.edges_iter() if u != v) H = nx.Graph() H.add_nodes_from(G) H.add_weighted_edges_from((u, v, e) for u, v, e in edges if e != 0) return H def _rcm_estimate(G, nodelist): """Estimate the Fiedler vector using the reverse Cuthill-McKee ordering. """ G = G.subgraph(nodelist) order = reverse_cuthill_mckee_ordering(G) n = len(nodelist) index = dict(zip(nodelist, range(n))) x = ndarray(n, dtype=float) for i, u in enumerate(order): x[index[u]] = i x -= (n - 1) / 2. return x def _tracemin_fiedler(L, X, normalized, tol, method): """Compute the Fiedler vector of L using the TraceMIN-Fiedler algorithm. """ n = X.shape[0] if normalized: # Form the normalized Laplacian matrix and determine the eigenvector of # its nullspace. e = sqrt(L.diagonal()) D = spdiags(1. / e, [0], n, n, format='csr') L = D * L * D e *= 1. / norm(e, 2) if not normalized: def project(X): """Make X orthogonal to the nullspace of L. """ X = asarray(X) for j in range(X.shape[1]): X[:, j] -= X[:, j].sum() / n else: def project(X): """Make X orthogonal to the nullspace of L. 
""" X = asarray(X) for j in range(X.shape[1]): X[:, j] -= dot(X[:, j], e) * e if method is None: method = 'pcg' if method == 'pcg': # See comments below for the semantics of P and D. def P(x): x -= asarray(x * X * X.T)[0, :] if not normalized: x -= x.sum() / n else: x = daxpy(e, x, a=-ddot(x, e)) return x solver = _PCGSolver(lambda x: P(L * P(x)), lambda x: D * x) elif method == 'chol' or method == 'lu': # Convert A to CSC to suppress SparseEfficiencyWarning. A = csc_matrix(L, dtype=float, copy=True) # Force A to be nonsingular. Since A is the Laplacian matrix of a # connected graph, its rank deficiency is one, and thus one diagonal # element needs to modified. Changing to infinity forces a zero in the # corresponding element in the solution. i = (A.indptr[1:] - A.indptr[:-1]).argmax() A[i, i] = float('inf') solver = (_CholeskySolver if method == 'chol' else _LUSolver)(A) else: raise nx.NetworkXError('unknown linear system solver.') # Initialize. Lnorm = abs(L).sum(axis=1).flatten().max() project(X) W = asmatrix(ndarray(X.shape, order='F')) while True: # Orthonormalize X. X = qr(X)[0] # Compute interation matrix H. W[:, :] = L * X H = X.T * W sigma, Y = eigh(H, overwrite_a=True) # Compute the Ritz vectors. X *= Y # Test for convergence exploiting the fact that L * X == W * Y. res = dasum(W * asmatrix(Y)[:, 0] - sigma[0] * X[:, 0]) / Lnorm if res < tol: break # Depending on the linear solver to be used, two mathematically # equivalent formulations are used. if method == 'pcg': # Compute X = X - (P * L * P) \ (P * L * X) where # P = I - [e X] * [e X]' is a projection onto the orthogonal # complement of [e X]. W *= Y # L * X == W * Y W -= (W.T * X * X.T).T project(W) # Compute the diagonal of P * L * P as a Jacobi preconditioner. D = L.diagonal().astype(float) D += 2. * (asarray(X) * asarray(W)).sum(axis=1) D += (asarray(X) * asarray(X * (W.T * X))).sum(axis=1) D[D < tol * Lnorm] = 1. D = 1. 
/ D # Since TraceMIN is globally convergent, the relative residual can # be loose. X -= solver.solve(W, 0.1) else: # Compute X = L \ X / (X' * (L \ X)). L \ X can have an arbitrary # projection on the nullspace of L, which will be eliminated. W[:, :] = solver.solve(X) project(W) X = (inv(W.T * X) * W.T).T # Preserves Fortran storage order. return sigma, asarray(X) def _get_fiedler_func(method): """Return a function that solves the Fiedler eigenvalue problem. """ match = _tracemin_method.match(method) if match: method = match.group(1) def find_fiedler(L, x, normalized, tol): q = 2 if method == 'pcg' else min(4, L.shape[0] - 1) X = asmatrix(normal(size=(q, L.shape[0]))).T sigma, X = _tracemin_fiedler(L, X, normalized, tol, method) return sigma[0], X[:, 0] elif method == 'lanczos' or method == 'lobpcg': def find_fiedler(L, x, normalized, tol): L = csc_matrix(L, dtype=float) n = L.shape[0] if normalized: D = spdiags(1. / sqrt(L.diagonal()), [0], n, n, format='csc') L = D * L * D if method == 'lanczos' or n < 10: # Avoid LOBPCG when n < 10 due to # https://github.com/scipy/scipy/issues/3592 # https://github.com/scipy/scipy/pull/3594 sigma, X = eigsh(L, 2, which='SM', tol=tol, return_eigenvectors=True) return sigma[1], X[:, 1] else: X = asarray(asmatrix(x).T) M = spdiags(1. / L.diagonal(), [0], n, n) Y = ones(n) if normalized: Y /= D.diagonal() sigma, X = lobpcg(L, X, M=M, Y=asmatrix(Y).T, tol=tol, maxiter=n, largest=False) return sigma[0], X[:, 0] else: raise nx.NetworkXError("unknown method '%s'." % method) return find_fiedler @not_implemented_for('directed') def algebraic_connectivity(G, weight='weight', normalized=False, tol=1e-8, method='tracemin'): """Return the algebraic connectivity of an undirected graph. The algebraic connectivity of a connected undirected graph is the second smallest eigenvalue of its Laplacian matrix. Parameters ---------- G : NetworkX graph An undirected graph. weight : object, optional The data key used to determine the weight of each edge. 
If None, then each edge has unit weight. Default value: None. normalized : bool, optional Whether the normalized Laplacian matrix is used. Default value: False. tol : float, optional Tolerance of relative residual in eigenvalue computation. Default value: 1e-8. method : string, optional Method of eigenvalue computation. It should be one of 'tracemin' (TraceMIN), 'lanczos' (Lanczos iteration) and 'lobpcg' (LOBPCG). Default value: 'tracemin'. The TraceMIN algorithm uses a linear system solver. The following values allow specifying the solver to be used. =============== ======================================== Value Solver =============== ======================================== 'tracemin_pcg' Preconditioned conjugate gradient method 'tracemin_chol' Cholesky factorization 'tracemin_lu' LU factorization =============== ======================================== Returns ------- algebraic_connectivity : float Algebraic connectivity. Raises ------ NetworkXNotImplemented If G is directed. NetworkXError If G has less than two nodes. Notes ----- Edge weights are interpreted by their absolute values. For MultiGraph's, weights of parallel edges are summed. Zero-weighted edges are ignored. To use Cholesky factorization in the TraceMIN algorithm, the :samp:`scikits.sparse` package must be installed. See Also -------- laplacian_matrix """ if len(G) < 2: raise nx.NetworkXError('graph has less than two nodes.') G = _preprocess_graph(G, weight) if not nx.is_connected(G): return 0. L = nx.laplacian_matrix(G) if L.shape[0] == 2: return 2. * L[0, 0] if not normalized else 2. find_fiedler = _get_fiedler_func(method) x = None if method != 'lobpcg' else _rcm_estimate(G, G) return find_fiedler(L, x, normalized, tol)[0] @not_implemented_for('directed') def fiedler_vector(G, weight='weight', normalized=False, tol=1e-8, method='tracemin'): """Return the Fiedler vector of a connected undirected graph. 
The Fiedler vector of a connected undirected graph is the eigenvector
    corresponding to the second smallest eigenvalue of the Laplacian matrix
    of the graph.

    Parameters
    ----------
    G : NetworkX graph
        An undirected graph.

    weight : object, optional
        The data key used to determine the weight of each edge. If None,
        then each edge has unit weight. Default value: 'weight'.

    normalized : bool, optional
        Whether the normalized Laplacian matrix is used.
        Default value: False.

    tol : float, optional
        Tolerance of relative residual in eigenvalue computation.
        Default value: 1e-8.

    method : string, optional
        Method of eigenvalue computation. It should be one of 'tracemin'
        (TraceMIN), 'lanczos' (Lanczos iteration) and 'lobpcg' (LOBPCG).
        Default value: 'tracemin'.

        The TraceMIN algorithm uses a linear system solver. The following
        values allow specifying the solver to be used.

        =============== ========================================
        Value           Solver
        =============== ========================================
        'tracemin_pcg'  Preconditioned conjugate gradient method
        'tracemin_chol' Cholesky factorization
        'tracemin_lu'   LU factorization
        =============== ========================================

    Returns
    -------
    fiedler_vector : NumPy array of floats.
        Fiedler vector.

    Raises
    ------
    NetworkXNotImplemented
        If G is directed.

    NetworkXError
        If G has less than two nodes or is not connected.

    Notes
    -----
    Edge weights are interpreted by their absolute values. For MultiGraph's,
    weights of parallel edges are summed. Zero-weighted edges are ignored.

    To use Cholesky factorization in the TraceMIN algorithm, the
    :samp:`scikits.sparse` package must be installed.
See Also -------- laplacian_matrix """ if len(G) < 2: raise nx.NetworkXError('graph has less than two nodes.') G = _preprocess_graph(G, weight) if not nx.is_connected(G): raise nx.NetworkXError('graph is not connected.') if len(G) == 2: return array([1., -1.]) find_fiedler = _get_fiedler_func(method) L = nx.laplacian_matrix(G) x = None if method != 'lobpcg' else _rcm_estimate(G, G) return find_fiedler(L, x, normalized, tol)[1] def spectral_ordering(G, weight='weight', normalized=False, tol=1e-8, method='tracemin'): """Compute the spectral_ordering of a graph. The spectral ordering of a graph is an ordering of its nodes where nodes in the same weakly connected components appear contiguous and ordered by their corresponding elements in the Fiedler vector of the component. Parameters ---------- G : NetworkX graph A graph. weight : object, optional The data key used to determine the weight of each edge. If None, then each edge has unit weight. Default value: None. normalized : bool, optional Whether the normalized Laplacian matrix is used. Default value: False. tol : float, optional Tolerance of relative residual in eigenvalue computation. Default value: 1e-8. method : string, optional Method of eigenvalue computation. It should be one of 'tracemin' (TraceMIN), 'lanczos' (Lanczos iteration) and 'lobpcg' (LOBPCG). Default value: 'tracemin'. The TraceMIN algorithm uses a linear system solver. The following values allow specifying the solver to be used. =============== ======================================== Value Solver =============== ======================================== 'tracemin_pcg' Preconditioned conjugate gradient method 'tracemin_chol' Cholesky factorization 'tracemin_lu' LU factorization =============== ======================================== Returns ------- spectral_ordering : NumPy array of floats. Spectral ordering of nodes. Raises ------ NetworkXError If G is empty. Notes ----- Edge weights are interpreted by their absolute values. 
For MultiGraph's, weights of parallel edges are summed. Zero-weighted edges are ignored. To use Cholesky factorization in the TraceMIN algorithm, the :samp:`scikits.sparse` package must be installed. See Also -------- laplacian_matrix """ if len(G) == 0: raise nx.NetworkXError('graph is empty.') G = _preprocess_graph(G, weight) find_fiedler = _get_fiedler_func(method) order = [] for component in nx.connected_components(G): size = len(component) if size > 2: L = nx.laplacian_matrix(G, component) x = None if method != 'lobpcg' else _rcm_estimate(G, component) fiedler = find_fiedler(L, x, normalized, tol)[1] order.extend( u for x, c, u in sorted(zip(fiedler, range(size), component))) else: order.extend(component) return order # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy import scipy.sparse except ImportError: raise SkipTest('SciPy not available.') networkx-1.11/networkx/algorithms/0000755000175000017500000000000012653231454017232 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/bipartite/0000755000175000017500000000000012653231454021215 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/bipartite/centrality.py0000644000175000017500000001771312637544500023757 0ustar aricaric00000000000000#-*- coding: utf-8 -*- # Copyright (C) 2011 by # Jordi Torrents # Aric Hagberg # All rights reserved. # BSD license. import networkx as nx __author__ = """\n""".join(['Jordi Torrents ', 'Aric Hagberg (hagberg@lanl.gov)']) __all__=['degree_centrality', 'betweenness_centrality', 'closeness_centrality'] def degree_centrality(G, nodes): r"""Compute the degree centrality for nodes in a bipartite network. The degree centrality for a node `v` is the fraction of nodes connected to it. Parameters ---------- G : graph A bipartite network nodes : list or container Container with all nodes in one bipartite node set. 
Returns
    -------
    centrality : dictionary
       Dictionary keyed by node with bipartite degree centrality as the value.

    See Also
    --------
    betweenness_centrality,
    closeness_centrality,
    sets,
    is_bipartite

    Notes
    -----
    The nodes input parameter must contain all nodes in one bipartite node
    set, but the dictionary returned contains all nodes from both bipartite
    node sets.

    For unipartite networks, the degree centrality values are normalized by
    dividing by the maximum possible degree (which is `n-1` where `n` is the
    number of nodes in G).

    In the bipartite case, the maximum possible degree of a node in a
    bipartite node set is the number of nodes in the opposite node set [1]_.
    The degree centrality for a node `v` in the bipartite sets `U` with
    `n` nodes and `V` with `m` nodes is

    .. math::

        d_{v} = \frac{deg(v)}{m}, \mbox{for} v \in U ,

        d_{v} = \frac{deg(v)}{n}, \mbox{for} v \in V ,

    where `deg(v)` is the degree of node `v`.

    References
    ----------
    .. [1] Borgatti, S.P. and Halgin, D. In press. "Analyzing Affiliation
       Networks". In Carrington, P. and Scott, J. (eds) The Sage Handbook
       of Social Network Analysis. Sage Publications.
       http://www.steveborgatti.com/papers/bhaffiliations.pdf
    """
    top = set(nodes)
    bottom = set(G) - top
    s = 1.0 / len(bottom)
    centrality = dict((n, d * s) for n, d in G.degree_iter(top))
    s = 1.0 / len(top)
    centrality.update(dict((n, d * s) for n, d in G.degree_iter(bottom)))
    return centrality

def betweenness_centrality(G, nodes):
    r"""Compute betweenness centrality for nodes in a bipartite network.

    Betweenness centrality of a node `v` is the sum of the fraction of
    all-pairs shortest paths that pass through `v`.

    Values of betweenness are normalized by the maximum possible value
    which for bipartite graphs is limited by the relative size of the
    two node sets [1]_.

    Let `n` be the number of nodes in the node set `U` and `m` be the
    number of nodes in the node set `V`, then nodes in `U` are
    normalized by dividing by

    ..
math:: \frac{1}{2} [m^2 (s + 1)^2 + m (s + 1)(2t - s - 1) - t (2s - t + 3)] , where .. math:: s = (n - 1) \div m , t = (n - 1) \mod m , and nodes in `V` are normalized by dividing by .. math:: \frac{1}{2} [n^2 (p + 1)^2 + n (p + 1)(2r - p - 1) - r (2p - r + 3)] , where, .. math:: p = (m - 1) \div n , r = (m - 1) \mod n . Parameters ---------- G : graph A bipartite graph nodes : list or container Container with all nodes in one bipartite node set. Returns ------- betweenness : dictionary Dictionary keyed by node with bipartite betweenness centrality as the value. See Also -------- degree_centrality, closeness_centrality, sets, is_bipartite Notes ----- The nodes input parameter must contain all nodes in one bipartite node set, but the dictionary returned contains all nodes from both node sets. References ---------- .. [1] Borgatti, S.P. and Halgin, D. In press. "Analyzing Affiliation Networks". In Carrington, P. and Scott, J. (eds) The Sage Handbook of Social Network Analysis. Sage Publications. http://www.steveborgatti.com/papers/bhaffiliations.pdf """ top = set(nodes) bottom = set(G) - top n = float(len(top)) m = float(len(bottom)) s = (n-1) // m t = (n-1) % m bet_max_top = (((m**2)*((s+1)**2))+ (m*(s+1)*(2*t-s-1))- (t*((2*s)-t+3)))/2.0 p = (m-1) // n r = (m-1) % n bet_max_bot = (((n**2)*((p+1)**2))+ (n*(p+1)*(2*r-p-1))- (r*((2*p)-r+3)))/2.0 betweenness = nx.betweenness_centrality(G, normalized=False, weight=None) for node in top: betweenness[node]/=bet_max_top for node in bottom: betweenness[node]/=bet_max_bot return betweenness def closeness_centrality(G, nodes, normalized=True): r"""Compute the closeness centrality for nodes in a bipartite network. The closeness of a node is the distance to all other nodes in the graph or in the case that the graph is not connected to all other nodes in the connected component containing that node. Parameters ---------- G : graph A bipartite network nodes : list or container Container with all nodes in one bipartite node set. 
normalized : bool, optional If True (default) normalize by connected component size. Returns ------- closeness : dictionary Dictionary keyed by node with bipartite closeness centrality as the value. See Also -------- betweenness_centrality, degree_centrality, sets, is_bipartite Notes ----- The nodes input parameter must contain all nodes in one bipartite node set, but the dictionary returned contains all nodes from both node sets. Closeness centrality is normalized by the minimum distance possible. In the bipartite case the minimum distance for a node in one bipartite node set is 1 from all nodes in the other node set and 2 from all other nodes in its own set [1]_. Thus the closeness centrality for node `v` in the two bipartite sets `U` with `n` nodes and `V` with `m` nodes is .. math:: c_{v} = \frac{m + 2(n - 1)}{d}, \mbox{for } v \in U, c_{v} = \frac{n + 2(m - 1)}{d}, \mbox{for } v \in V, where `d` is the sum of the distances from `v` to all other nodes. Higher values of closeness indicate higher centrality. As in the unipartite case, setting normalized=True causes the values to be normalized further by `(n-1)/(size(G)-1)` where `n` is the number of nodes in the connected part of the graph containing the node. If the graph is not completely connected, this algorithm computes the closeness centrality for each connected part separately. References ---------- .. [1] Borgatti, S.P. and Halgin, D. In press. "Analyzing Affiliation Networks". In Carrington, P. and Scott, J. (eds) The Sage Handbook of Social Network Analysis. Sage Publications.
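A pure-Python sketch of the unnormalized bipartite closeness formula just described, using breadth-first search for the distances (hypothetical helper names, not the NetworkX implementation; assumes a connected graph given as an adjacency dict):

```python
from collections import deque

def bfs_lengths(adj, source):
    """Shortest-path lengths from source by breadth-first search."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

def bipartite_closeness(adj, top):
    """Unnormalized bipartite closeness: best possible distance sum
    over the actual distance sum, per the formula above."""
    top = set(top)
    n, m = len(top), len(adj) - len(top)
    closeness = {}
    for v in adj:
        totsp = sum(bfs_lengths(adj, v).values())
        best = (m + 2 * (n - 1)) if v in top else (n + 2 * (m - 1))
        closeness[v] = best / totsp if totsp > 0 else 0.0
    return closeness

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # path graph 0-1-2-3
print(bipartite_closeness(adj, [0, 2]))
```

On this path graph the central nodes 1 and 2 attain the maximum value 1.0, while the endpoints score 2/3.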
http://www.steveborgatti.com/papers/bhaffiliations.pdf """ closeness={} path_length=nx.single_source_shortest_path_length top = set(nodes) bottom = set(G) - top n = float(len(top)) m = float(len(bottom)) for node in top: sp=path_length(G,node) totsp=sum(sp.values()) if totsp > 0.0 and len(G) > 1: closeness[node]= (m + 2*(n-1)) / totsp if normalized: s=(len(sp)-1.0) / ( len(G) - 1 ) closeness[node] *= s else: closeness[node]=0.0 for node in bottom: sp=path_length(G,node) totsp=sum(sp.values()) if totsp > 0.0 and len(G) > 1: closeness[node]= (n + 2*(m-1)) / totsp if normalized: s=(len(sp)-1.0) / ( len(G) - 1 ) closeness[node] *= s else: closeness[node]=0.0 return closeness networkx-1.11/networkx/algorithms/bipartite/basic.py0000644000175000017500000001366612637544500022665 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ ========================== Bipartite Graph Algorithms ========================== """ # Copyright (C) 2013-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx __author__ = """\n""".join(['Jordi Torrents ', 'Aric Hagberg ']) __all__ = [ 'is_bipartite', 'is_bipartite_node_set', 'color', 'sets', 'density', 'degrees'] def color(G): """Returns a two-coloring of the graph. Raises an exception if the graph is not bipartite. Parameters ---------- G : NetworkX graph Returns ------- color : dictionary A dictionary keyed by node with a 1 or 0 as data for each node color. Raises ------ NetworkXError if the graph is not two-colorable.
Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.path_graph(4) >>> c = bipartite.color(G) >>> print(c) {0: 1, 1: 0, 2: 1, 3: 0} You can use this to set a node attribute indicating the bipartite set: >>> nx.set_node_attributes(G, 'bipartite', c) >>> print(G.node[0]['bipartite']) 1 >>> print(G.node[1]['bipartite']) 0 """ if G.is_directed(): import itertools def neighbors(v): return itertools.chain.from_iterable([G.predecessors_iter(v), G.successors_iter(v)]) else: neighbors=G.neighbors_iter color = {} for n in G: # handle disconnected graphs if n in color or len(G[n])==0: # skip isolates continue queue = [n] color[n] = 1 # nodes seen with color (1 or 0) while queue: v = queue.pop() c = 1 - color[v] # opposite color of node v for w in neighbors(v): if w in color: if color[w] == color[v]: raise nx.NetworkXError("Graph is not bipartite.") else: color[w] = c queue.append(w) # color isolates with 0 color.update(dict.fromkeys(nx.isolates(G),0)) return color def is_bipartite(G): """ Returns True if graph G is bipartite, False if not. Parameters ---------- G : NetworkX graph Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.path_graph(4) >>> print(bipartite.is_bipartite(G)) True See Also -------- color, is_bipartite_node_set """ try: color(G) return True except nx.NetworkXError: return False def is_bipartite_node_set(G,nodes): """Returns True if nodes and G/nodes are a bipartition of G. Parameters ---------- G : NetworkX graph nodes: list or container Check if nodes are one of the bipartite sets. Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.path_graph(4) >>> X = set([1,3]) >>> bipartite.is_bipartite_node_set(G,X) True Notes ----- For connected graphs the bipartite sets are unique. This function handles disconnected graphs.
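The two-coloring idea behind these checks can be sketched compactly; on a disconnected graph each component is colored independently, which is exactly why the bipartite sets are not unique. A standalone sketch on adjacency dicts (not the library's `color`):

```python
def two_color(adj):
    """Iterative two-coloring of an undirected graph given as an
    adjacency dict; raises ValueError on an odd cycle."""
    color = {}
    for start in adj:          # restart in each connected component
        if start in color:
            continue
        color[start] = 0
        stack = [start]
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w not in color:
                    color[w] = 1 - color[u]
                    stack.append(w)
                elif color[w] == color[u]:
                    raise ValueError("graph is not bipartite")
    return color

# Two disjoint edges: both {0, 2} and {0, 3} are valid "one side" sets.
adj = {0: [1], 1: [0], 2: [3], 3: [2]}
print(two_color(adj))
```

Either endpoint of the second edge may end up in either color class, so a node-set membership test like `is_bipartite_node_set` must check each component separately.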
""" S=set(nodes) for CC in nx.connected_component_subgraphs(G): X,Y=sets(CC) if not ( (X.issubset(S) and Y.isdisjoint(S)) or (Y.issubset(S) and X.isdisjoint(S)) ): return False return True def sets(G): """Returns bipartite node sets of graph G. Raises an exception if the graph is not bipartite. Parameters ---------- G : NetworkX graph Returns ------- (X,Y) : two-tuple of sets One set of nodes for each part of the bipartite graph. Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.path_graph(4) >>> X, Y = bipartite.sets(G) >>> list(X) [0, 2] >>> list(Y) [1, 3] See Also -------- color """ c = color(G) X = set(n for n in c if c[n]) # c[n] == 1 Y = set(n for n in c if not c[n]) # c[n] == 0 return (X, Y) def density(B, nodes): """Return density of bipartite graph B. Parameters ---------- G : NetworkX graph nodes: list or container Nodes in one set of the bipartite graph. Returns ------- d : float The bipartite density Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.complete_bipartite_graph(3,2) >>> X=set([0,1,2]) >>> bipartite.density(G,X) 1.0 >>> Y=set([3,4]) >>> bipartite.density(G,Y) 1.0 See Also -------- color """ n=len(B) m=nx.number_of_edges(B) nb=len(nodes) nt=n-nb if m==0: # includes cases n==0 and n==1 d=0.0 else: if B.is_directed(): d=m/(2.0*float(nb*nt)) else: d= m/float(nb*nt) return d def degrees(B, nodes, weight=None): """Return the degrees of the two node sets in the bipartite graph B. Parameters ---------- G : NetworkX graph nodes: list or container Nodes in one set of the bipartite graph. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- (degX,degY) : tuple of dictionaries The degrees of the two bipartite sets as dictionaries keyed by node. 
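The bipartite density computation above amounts to dividing the edge count by the `nb * nt` edges possible between the two sets (with the directed case divided by a further factor of 2, as in the source). A dependency-free sketch of that arithmetic (hypothetical helper, not the NetworkX API):

```python
def bipartite_density(num_edges, nb, nt, directed=False):
    """Density of a bipartite graph with nb and nt nodes in its two
    sets: edges divided by the nb * nt possible (2 * nb * nt if
    directed), mirroring the formula in `density` above."""
    if num_edges == 0:  # covers the empty and single-node cases
        return 0.0
    d = num_edges / (nb * nt)
    return d / 2.0 if directed else d

# Complete bipartite graph K_{3,2}: 6 edges, 3 * 2 possible.
print(bipartite_density(6, 3, 2))  # 1.0
```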
Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.complete_bipartite_graph(3,2) >>> Y=set([3,4]) >>> degX,degY=bipartite.degrees(G,Y) >>> degX {0: 2, 1: 2, 2: 2} See Also -------- color, density """ bottom=set(nodes) top=set(B)-bottom return (B.degree(top,weight),B.degree(bottom,weight)) networkx-1.11/networkx/algorithms/bipartite/matrix.py0000644000175000017500000001473512637544500023106 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ ==================== Biadjacency matrices ==================== """ # Copyright (C) 2013-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import itertools from networkx.convert import _prep_create_using from networkx.convert_matrix import _generate_weighted_edges import networkx as nx __author__ = """\n""".join(['Jordi Torrents ', 'Aric Hagberg ']) __all__ = ['biadjacency_matrix','from_biadjacency_matrix'] def biadjacency_matrix(G, row_order, column_order=None, dtype=None, weight='weight', format='csr'): r"""Return the biadjacency matrix of the bipartite graph G. Let `G = (U, V, E)` be a bipartite graph with node sets `U = u_{1},...,u_{r}` and `V = v_{1},...,v_{s}`. The biadjacency matrix [1]_ is the `r` x `s` matrix `B` in which `b_{i,j} = 1` if, and only if, `(u_i, v_j) \in E`. If the parameter `weight` is not `None` and matches the name of an edge attribute, its value is used instead of 1. Parameters ---------- G : graph A NetworkX graph row_order : list of nodes The rows of the matrix are ordered according to the list of nodes. column_order : list, optional The columns of the matrix are ordered according to the list of nodes. If column_order is None, then the ordering of columns is arbitrary. dtype : NumPy data-type, optional A valid NumPy dtype used to initialize the array. If None, then the NumPy default is used. weight : string or None, optional (default='weight') The edge data key used to provide each value in the matrix. 
If None, then each edge has weight 1. format : str in {'bsr', 'csr', 'csc', 'coo', 'lil', 'dia', 'dok'} The type of the matrix to be returned (default 'csr'). For some algorithms different implementations of sparse matrices can perform better. See [2]_ for details. Returns ------- M : SciPy sparse matrix Biadjacency matrix representation of the bipartite graph G. Notes ----- No attempt is made to check that the input graph is bipartite. For directed bipartite graphs only successors are considered as neighbors. To obtain an adjacency matrix with ones (or weight values) for both predecessors and successors you have to generate two biadjacency matrices where the rows of one of them are the columns of the other, and then add one to the transpose of the other. See Also -------- adjacency_matrix from_biadjacency_matrix References ---------- .. [1] http://en.wikipedia.org/wiki/Adjacency_matrix#Adjacency_matrix_of_a_bipartite_graph .. [2] Scipy Dev. References, "Sparse Matrices", http://docs.scipy.org/doc/scipy/reference/sparse.html """ from scipy import sparse nlen = len(row_order) if nlen == 0: raise nx.NetworkXError("row_order is empty list") if len(row_order) != len(set(row_order)): msg = "Ambiguous ordering: `row_order` contained duplicates." raise nx.NetworkXError(msg) if column_order is None: column_order = list(set(G) - set(row_order)) mlen = len(column_order) if len(column_order) != len(set(column_order)): msg = "Ambiguous ordering: `column_order` contained duplicates." 
raise nx.NetworkXError(msg) row_index = dict(zip(row_order, itertools.count())) col_index = dict(zip(column_order, itertools.count())) if G.number_of_edges() == 0: row,col,data=[],[],[] else: row,col,data = zip(*((row_index[u],col_index[v],d.get(weight,1)) for u,v,d in G.edges_iter(row_order,data=True) if u in row_index and v in col_index)) M = sparse.coo_matrix((data,(row,col)), shape=(nlen,mlen), dtype=dtype) try: return M.asformat(format) except AttributeError: raise nx.NetworkXError("Unknown sparse matrix format: %s"%format) def from_biadjacency_matrix(A, create_using=None, edge_attribute='weight'): r"""Creates a new bipartite graph from a biadjacency matrix given as a SciPy sparse matrix. Parameters ---------- A: scipy sparse matrix A biadjacency matrix representation of a graph create_using: NetworkX graph Use specified graph for result. The default is Graph() edge_attribute: string Name of edge attribute to store matrix numeric value. The data will have the same type as the matrix entry (int, float, (real,imag)). Notes ----- The nodes are labeled with the attribute `bipartite` set to an integer 0 or 1 representing membership in part 0 or part 1 of the bipartite graph. If `create_using` is an instance of :class:`networkx.MultiGraph` or :class:`networkx.MultiDiGraph` and the entries of `A` are of type ``int``, then this function returns a multigraph (of the same type as `create_using`) with parallel edges. In this case, `edge_attribute` will be ignored. See Also -------- biadjacency_matrix from_numpy_matrix References ---------- [1] http://en.wikipedia.org/wiki/Adjacency_matrix#Adjacency_matrix_of_a_bipartite_graph """ G = _prep_create_using(create_using) n, m = A.shape # Make sure we get even the isolated nodes of the graph. G.add_nodes_from(range(n), bipartite=0) G.add_nodes_from(range(n,n+m), bipartite=1) # Create an iterable over (u, v, w) triples and for each triple, add an # edge from u to v with weight w. 
triples = ((u, n+v, d) for (u, v, d) in _generate_weighted_edges(A)) # If the entries in the adjacency matrix are integers and the graph is a # multigraph, then create parallel edges, each with weight 1, for each # entry in the adjacency matrix. Otherwise, create one edge for each # positive entry in the adjacency matrix and set the weight of that edge to # be the entry in the matrix. if A.dtype.kind in ('i', 'u') and G.is_multigraph(): chain = itertools.chain.from_iterable triples = chain(((u, v, 1) for d in range(w)) for (u, v, w) in triples) G.add_weighted_edges_from(triples, weight=edge_attribute) return G # fixture for nose tests def setup_module(module): from nose import SkipTest try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/bipartite/spectral.py0000644000175000017500000000475212637544500023415 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Spectral bipartivity measure. """ import networkx as nx __author__ = """Aric Hagberg (hagberg@lanl.gov)""" # Copyright (C) 2011 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['spectral_bipartivity'] def spectral_bipartivity(G, nodes=None, weight='weight'): """Returns the spectral bipartivity. Parameters ---------- G : NetworkX graph nodes : list or container optional(default is all nodes) Nodes to return value of spectral bipartivity contribution. weight : string or None optional (default = 'weight') Edge data key to use for edge weights. If None, weights set to 1. Returns ------- sb : float or dict A single number if the keyword nodes is not specified, or a dictionary keyed by node with the spectral bipartivity contribution of that node as the value. Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.path_graph(4) >>> bipartite.spectral_bipartivity(G) 1.0 Notes ----- This implementation uses Numpy (dense) matrices which are not efficient for storing large sparse graphs. 
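Because the trace of a matrix function equals the sum of that function applied to the eigenvalues, the spectral bipartivity above can equivalently be computed from the adjacency eigenvalues alone: sb = Σ cosh(λ_j) / Σ exp(λ_j). A dependency-free sketch with eigenvalues supplied by hand (illustrative only; the library computes matrix exponentials with SciPy instead):

```python
import math

# sb = sum_j cosh(lambda_j) / sum_j exp(lambda_j), where lambda_j are
# the adjacency-matrix eigenvalues (trace identity for matrix functions).
def spectral_bipartivity_from_eigs(eigs):
    return sum(math.cosh(l) for l in eigs) / sum(math.exp(l) for l in eigs)

# A single edge has eigenvalues +1 and -1; bipartite graphs give sb = 1.
print(spectral_bipartivity_from_eigs([1.0, -1.0]))
# A triangle (eigenvalues 2, -1, -1) is not bipartite, so sb < 1.
print(spectral_bipartivity_from_eigs([2.0, -1.0, -1.0]))
```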
See Also -------- color References ---------- .. [1] E. Estrada and J. A. Rodríguez-Velázquez, "Spectral measures of bipartivity in complex networks", PhysRev E 72, 046105 (2005) """ try: import scipy.linalg except ImportError: raise ImportError('spectral_bipartivity() requires SciPy: ', 'http://scipy.org/') nodelist = G.nodes() # ordering of nodes in matrix A = nx.to_numpy_matrix(G, nodelist, weight=weight) expA = scipy.linalg.expm(A) expmA = scipy.linalg.expm(-A) coshA = 0.5 * (expA + expmA) if nodes is None: # return single number for entire graph return coshA.diagonal().sum() / expA.diagonal().sum() else: # contribution for individual nodes index = dict(zip(nodelist, range(len(nodelist)))) sb = {} for n in nodes: i = index[n] sb[n] = coshA[i, i] / expA[i, i] return sb def setup_module(module): """Fixture for nose tests.""" from nose import SkipTest try: import numpy except: raise SkipTest("NumPy not available") try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/bipartite/projection.py0000644000175000017500000003757012637544500023760 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """One-mode (unipartite) projections of bipartite graphs. """ import networkx as nx # Copyright (C) 2011 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __author__ = """\n""".join(['Aric Hagberg ', 'Jordi Torrents ']) __all__ = ['project', 'projected_graph', 'weighted_projected_graph', 'collaboration_weighted_projected_graph', 'overlap_weighted_projected_graph', 'generic_weighted_projected_graph'] def projected_graph(B, nodes, multigraph=False): r"""Returns the projection of B onto one of its node sets. Returns the graph G that is the projection of the bipartite graph B onto the specified nodes. They retain their attributes and are connected in G if they have a common neighbor in B. Parameters ---------- B : NetworkX graph The input graph should be bipartite. 
nodes : list or iterable Nodes to project onto (the "bottom" nodes). multigraph: bool (default=False) If True return a multigraph where the multiple edges represent multiple shared neighbors. The edge key in the multigraph is assigned the label of the shared neighbor. Returns ------- Graph : NetworkX graph or multigraph A graph that is the projection onto the given nodes. Examples -------- >>> from networkx.algorithms import bipartite >>> B = nx.path_graph(4) >>> G = bipartite.projected_graph(B, [1,3]) >>> print(G.nodes()) [1, 3] >>> print(G.edges()) [(1, 3)] If nodes `a` and `b` are connected through both nodes 1 and 2 then building a multigraph results in two edges in the projection onto [`a`,`b`]: >>> B = nx.Graph() >>> B.add_edges_from([('a', 1), ('b', 1), ('a', 2), ('b', 2)]) >>> G = bipartite.projected_graph(B, ['a', 'b'], multigraph=True) >>> print([sorted((u,v)) for u,v in G.edges()]) [['a', 'b'], ['a', 'b']] Notes ----- No attempt is made to verify that the input graph B is bipartite. Returns a simple graph that is the projection of the bipartite graph B onto the set of nodes given in list nodes. If multigraph=True then a multigraph is returned with an edge for every shared neighbor. Directed graphs are allowed as input. The output will also then be a directed graph with edges if there is a directed path between the nodes. The graph and node properties are (shallow) copied to the projected graph.
See Also -------- is_bipartite, is_bipartite_node_set, sets, weighted_projected_graph, collaboration_weighted_projected_graph, overlap_weighted_projected_graph, generic_weighted_projected_graph """ if B.is_multigraph(): raise nx.NetworkXError("not defined for multigraphs") if B.is_directed(): directed=True if multigraph: G=nx.MultiDiGraph() else: G=nx.DiGraph() else: directed=False if multigraph: G=nx.MultiGraph() else: G=nx.Graph() G.graph.update(B.graph) G.add_nodes_from((n,B.node[n]) for n in nodes) for u in nodes: nbrs2=set((v for nbr in B[u] for v in B[nbr])) -set([u]) if multigraph: for n in nbrs2: if directed: links=set(B[u]) & set(B.pred[n]) else: links=set(B[u]) & set(B[n]) for l in links: if not G.has_edge(u,n,l): G.add_edge(u,n,key=l) else: G.add_edges_from((u,n) for n in nbrs2) return G def weighted_projected_graph(B, nodes, ratio=False): r"""Returns a weighted projection of B onto one of its node sets. The weighted projected graph is the projection of the bipartite network B onto the specified nodes with weights representing the number of shared neighbors or the ratio between actual shared neighbors and possible shared neighbors if ratio=True [1]_. The nodes retain their attributes and are connected in the resulting graph if they have an edge to a common node in the original graph. Parameters ---------- B : NetworkX graph The input graph should be bipartite. nodes : list or iterable Nodes to project onto (the "bottom" nodes). ratio: Bool (default=False) If True, edge weight is the ratio between actual shared neighbors and possible shared neighbors. If False, edge weight is the number of shared neighbors. Returns ------- Graph : NetworkX graph A graph that is the projection onto the given nodes.
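A pure-Python sketch of the shared-neighbor weighting just described, for undirected graphs stored as adjacency dicts (`weighted_projection` is a hypothetical helper for illustration, not the NetworkX implementation):

```python
def weighted_projection(adj, nodes, ratio=False):
    """Project onto `nodes`; each projected edge is weighted by the
    number of shared neighbors, or by that count divided by the size
    of the opposite node set when ratio=True."""
    nodes = set(nodes)
    n_top = len(adj) - len(nodes)
    edges = {}
    for u in nodes:
        unbrs = set(adj[u])
        # Second neighbors of u within the projected set.
        second = {w for nbr in unbrs for w in adj[nbr]} - {u}
        for v in second:
            common = unbrs & set(adj[v])
            edges[frozenset((u, v))] = (
                len(common) / n_top if ratio else len(common))
    return edges

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # path graph 0-1-2-3
print(weighted_projection(adj, [1, 3]))              # weight 1
print(weighted_projection(adj, [1, 3], ratio=True))  # weight 0.5
```

This reproduces the doctest values above: projecting the path graph onto `[1, 3]` yields one edge of weight 1, or 0.5 with `ratio=True`.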
Examples -------- >>> from networkx.algorithms import bipartite >>> B = nx.path_graph(4) >>> G = bipartite.weighted_projected_graph(B, [1,3]) >>> print(G.nodes()) [1, 3] >>> print(G.edges(data=True)) [(1, 3, {'weight': 1})] >>> G = bipartite.weighted_projected_graph(B, [1,3], ratio=True) >>> print(G.edges(data=True)) [(1, 3, {'weight': 0.5})] Notes ----- No attempt is made to verify that the input graph B is bipartite. The graph and node properties are (shallow) copied to the projected graph. See Also -------- is_bipartite, is_bipartite_node_set, sets, collaboration_weighted_projected_graph, overlap_weighted_projected_graph, generic_weighted_projected_graph, projected_graph References ---------- .. [1] Borgatti, S.P. and Halgin, D. In press. "Analyzing Affiliation Networks". In Carrington, P. and Scott, J. (eds) The Sage Handbook of Social Network Analysis. Sage Publications. """ if B.is_multigraph(): raise nx.NetworkXError("not defined for multigraphs") if B.is_directed(): pred=B.pred G=nx.DiGraph() else: pred=B.adj G=nx.Graph() G.graph.update(B.graph) G.add_nodes_from((n,B.node[n]) for n in nodes) n_top = float(len(B) - len(nodes)) for u in nodes: unbrs = set(B[u]) nbrs2 = set((n for nbr in unbrs for n in B[nbr])) - set([u]) for v in nbrs2: vnbrs = set(pred[v]) common = unbrs & vnbrs if not ratio: weight = len(common) else: weight = len(common) / n_top G.add_edge(u,v,weight=weight) return G def collaboration_weighted_projected_graph(B, nodes): r"""Newman's weighted projection of B onto one of its node sets. The collaboration weighted projection is the projection of the bipartite network B onto the specified nodes with weights assigned using Newman's collaboration model [1]_: .. math:: w_{v,u} = \sum_{w} \frac{\delta_{v}^{w} \delta_{u}^{w}}{k_w - 1} where `v` and `u` are nodes from the same bipartite node set, and `w` is a node of the opposite node set.
The value `k_w` is the degree of node `w` in the bipartite network and `\delta_{v}^{w}` is 1 if node `v` is linked to node `w` in the original bipartite graph or 0 otherwise. The nodes retain their attributes and are connected in the resulting graph if they have an edge to a common node in the original bipartite graph. Parameters ---------- B : NetworkX graph The input graph should be bipartite. nodes : list or iterable Nodes to project onto (the "bottom" nodes). Returns ------- Graph : NetworkX graph A graph that is the projection onto the given nodes. Examples -------- >>> from networkx.algorithms import bipartite >>> B = nx.path_graph(5) >>> B.add_edge(1,5) >>> G = bipartite.collaboration_weighted_projected_graph(B, [0, 2, 4, 5]) >>> print(G.nodes()) [0, 2, 4, 5] >>> for edge in G.edges(data=True): print(edge) ... (0, 2, {'weight': 0.5}) (0, 5, {'weight': 0.5}) (2, 4, {'weight': 1.0}) (2, 5, {'weight': 0.5}) Notes ----- No attempt is made to verify that the input graph B is bipartite. The graph and node properties are (shallow) copied to the projected graph. See Also -------- is_bipartite, is_bipartite_node_set, sets, weighted_projected_graph, overlap_weighted_projected_graph, generic_weighted_projected_graph, projected_graph References ---------- .. [1] Scientific collaboration networks: II. Shortest paths, weighted networks, and centrality, M. E. J. Newman, Phys. Rev. E 64, 016132 (2001).
""" if B.is_multigraph(): raise nx.NetworkXError("not defined for multigraphs") if B.is_directed(): pred=B.pred G=nx.DiGraph() else: pred=B.adj G=nx.Graph() G.graph.update(B.graph) G.add_nodes_from((n,B.node[n]) for n in nodes) for u in nodes: unbrs = set(B[u]) nbrs2 = set((n for nbr in unbrs for n in B[nbr])) - set([u]) for v in nbrs2: vnbrs = set(pred[v]) common = unbrs & vnbrs weight = sum([1.0/(len(B[n]) - 1) for n in common if len(B[n])>1]) G.add_edge(u,v,weight=weight) return G def overlap_weighted_projected_graph(B, nodes, jaccard=True): r"""Overlap weighted projection of B onto one of its node sets. The overlap weighted projection is the projection of the bipartite network B onto the specified nodes with weights representing the Jaccard index between the neighborhoods of the two nodes in the original bipartite network [1]_: .. math:: w_{v,u} = \frac{|N(u) \cap N(v)|}{|N(u) \cup N(v)|} or if the parameter 'jaccard' is False, the number of common neighbors divided by the minimum of the two nodes' degrees in the original bipartite graph [1]_: .. math:: w_{v,u} = \frac{|N(u) \cap N(v)|}{min(|N(u)|,|N(v)|)} The nodes retain their attributes and are connected in the resulting graph if they have an edge to a common node in the original bipartite graph. Parameters ---------- B : NetworkX graph The input graph should be bipartite. nodes : list or iterable Nodes to project onto (the "bottom" nodes). jaccard: Bool (default=True) If True, use the Jaccard index as the edge weight; otherwise use the overlap coefficient. Returns ------- Graph : NetworkX graph A graph that is the projection onto the given nodes.
Examples -------- >>> from networkx.algorithms import bipartite >>> B = nx.path_graph(5) >>> G = bipartite.overlap_weighted_projected_graph(B, [0, 2, 4]) >>> print(G.nodes()) [0, 2, 4] >>> print(G.edges(data=True)) [(0, 2, {'weight': 0.5}), (2, 4, {'weight': 0.5})] >>> G = bipartite.overlap_weighted_projected_graph(B, [0, 2, 4], jaccard=False) >>> print(G.edges(data=True)) [(0, 2, {'weight': 1.0}), (2, 4, {'weight': 1.0})] Notes ----- No attempt is made to verify that the input graph B is bipartite. The graph and node properties are (shallow) copied to the projected graph. See Also -------- is_bipartite, is_bipartite_node_set, sets, weighted_projected_graph, collaboration_weighted_projected_graph, generic_weighted_projected_graph, projected_graph References ---------- .. [1] Borgatti, S.P. and Halgin, D. In press. Analyzing Affiliation Networks. In Carrington, P. and Scott, J. (eds) The Sage Handbook of Social Network Analysis. Sage Publications. """ if B.is_multigraph(): raise nx.NetworkXError("not defined for multigraphs") if B.is_directed(): pred=B.pred G=nx.DiGraph() else: pred=B.adj G=nx.Graph() G.graph.update(B.graph) G.add_nodes_from((n,B.node[n]) for n in nodes) for u in nodes: unbrs = set(B[u]) nbrs2 = set((n for nbr in unbrs for n in B[nbr])) - set([u]) for v in nbrs2: vnbrs = set(pred[v]) if jaccard: weight = float(len(unbrs & vnbrs)) / len(unbrs | vnbrs) else: weight = float(len(unbrs & vnbrs)) / min(len(unbrs),len(vnbrs)) G.add_edge(u,v,weight=weight) return G def generic_weighted_projected_graph(B, nodes, weight_function=None): r"""Weighted projection of B with a user-specified weight function. The bipartite network B is projected on to the specified nodes with weights computed by a user-specified function. This function must accept as a parameter the neighborhood sets of two nodes and return an integer or a float. The nodes retain their attributes and are connected in the resulting graph if they have an edge to a common node in the original graph. 
Parameters ---------- B : NetworkX graph The input graph should be bipartite. nodes : list or iterable Nodes to project onto (the "bottom" nodes). weight_function: function This function must accept as parameters the same input graph that is passed to this function, and two nodes; and return an integer or a float. The default function computes the number of shared neighbors. Returns ------- Graph : NetworkX graph A graph that is the projection onto the given nodes. Examples -------- >>> from networkx.algorithms import bipartite >>> # Define some custom weight functions >>> def jaccard(G, u, v): ... unbrs = set(G[u]) ... vnbrs = set(G[v]) ... return float(len(unbrs & vnbrs)) / len(unbrs | vnbrs) ... >>> def my_weight(G, u, v, weight='weight'): ... w = 0 ... for nbr in set(G[u]) & set(G[v]): ... w += G.edge[u][nbr].get(weight, 1) + G.edge[v][nbr].get(weight, 1) ... return w ... >>> # A complete bipartite graph with 4 nodes and 4 edges >>> B = nx.complete_bipartite_graph(2,2) >>> # Add some arbitrary weight to the edges >>> for i,(u,v) in enumerate(B.edges()): ... B.edge[u][v]['weight'] = i + 1 ... >>> for edge in B.edges(data=True): ... print(edge) ... (0, 2, {'weight': 1}) (0, 3, {'weight': 2}) (1, 2, {'weight': 3}) (1, 3, {'weight': 4}) >>> # Without specifying a function, the weight is equal to # shared partners >>> G = bipartite.generic_weighted_projected_graph(B, [0, 1]) >>> print(G.edges(data=True)) [(0, 1, {'weight': 2})] >>> # To specify a custom weight function use the weight_function parameter >>> G = bipartite.generic_weighted_projected_graph(B, [0, 1], weight_function=jaccard) >>> print(G.edges(data=True)) [(0, 1, {'weight': 1.0})] >>> G = bipartite.generic_weighted_projected_graph(B, [0, 1], weight_function=my_weight) >>> print(G.edges(data=True)) [(0, 1, {'weight': 10})] Notes ----- No attempt is made to verify that the input graph B is bipartite. The graph and node properties are (shallow) copied to the projected graph.
See Also -------- is_bipartite, is_bipartite_node_set, sets, weighted_projected_graph, collaboration_weighted_projected_graph, overlap_weighted_projected_graph, projected_graph """ if B.is_multigraph(): raise nx.NetworkXError("not defined for multigraphs") if B.is_directed(): pred=B.pred G=nx.DiGraph() else: pred=B.adj G=nx.Graph() if weight_function is None: def weight_function(G, u, v): # Notice that we use set(pred[v]) for handling the directed case. return len(set(G[u]) & set(pred[v])) G.graph.update(B.graph) G.add_nodes_from((n,B.node[n]) for n in nodes) for u in nodes: nbrs2 = set((n for nbr in set(B[u]) for n in B[nbr])) - set([u]) for v in nbrs2: weight = weight_function(B, u, v) G.add_edge(u,v,weight=weight) return G def project(B, nodes, create_using=None): return projected_graph(B, nodes) networkx-1.11/networkx/algorithms/bipartite/matching.py0000644000175000017500000003645612637544450023404 0ustar aricaric00000000000000# matching.py - bipartite graph maximum matching algorithms # # Copyright 2015 Jeffrey Finkelstein . # # This file is part of NetworkX. # # NetworkX is distributed under a BSD license; see LICENSE.txt for more # information. # # This module uses material from the Wikipedia article Hopcroft--Karp algorithm # , accessed on # January 3, 2015, which is released under the Creative Commons # Attribution-Share-Alike License 3.0 # . That article includes # pseudocode, which has been translated into the corresponding Python code. # # Portions of this module use code from David Eppstein's Python Algorithms and # Data Structures (PADS) library, which is dedicated to the public domain (for # proof, see ). """Provides functions for computing a maximum cardinality matching in a bipartite graph. If you don't care about the particular implementation of the maximum matching algorithm, simply use the :func:`maximum_matching`. If you do care, you can import one of the named maximum matching algorithms directly. 
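The module below implements Hopcroft--Karp; for comparison, a maximum bipartite matching can also be found with the simpler single-augmenting-path approach (Kuhn's algorithm), which gives the same matching size with a worse O(V·E) bound. A self-contained sketch on adjacency dicts mapping left nodes to their right neighbors (hypothetical helper, not this module's API):

```python
def kuhn_matching(adj, left):
    """Maximum bipartite matching by repeatedly searching for an
    augmenting path from each unmatched left node (Kuhn's algorithm)."""
    match = {}  # right node -> matched left node

    def try_augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                # v is free, or its current partner can be re-matched.
                if v not in match or try_augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    for u in left:
        try_augment(u, set())
    # Report the matching keyed from both sides, as maximum_matching() does.
    result = {u: v for v, u in match.items()}
    result.update(match)
    return result

# Complete bipartite graph K_{2,3}: left nodes 0-1, right nodes 2-4.
adj = {0: [2, 3, 4], 1: [2, 3, 4]}
matching = kuhn_matching(adj, [0, 1])
print(len(matching) // 2)  # 2 matched pairs
```

Like `maximum_matching`, the returned dictionary maps matched vertices in both directions, so its length is twice the number of matched pairs.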
For example, to find a maximum matching in the complete bipartite graph with two vertices on the left and three vertices on the right: >>> import networkx as nx >>> G = nx.complete_bipartite_graph(2, 3) >>> left, right = nx.bipartite.sets(G) >>> list(left) [0, 1] >>> list(right) [2, 3, 4] >>> nx.bipartite.maximum_matching(G) {0: 2, 1: 3, 2: 0, 3: 1} The dictionary returned by :func:`maximum_matching` includes a mapping for vertices in both the left and right vertex sets. """ import collections import itertools from networkx.algorithms.bipartite import sets as bipartite_sets __all__ = ['maximum_matching', 'hopcroft_karp_matching', 'eppstein_matching', 'to_vertex_cover'] INFINITY = float('inf') def hopcroft_karp_matching(G): """Returns the maximum cardinality matching of the bipartite graph `G`. Parameters ---------- G : NetworkX graph Undirected bipartite graph Returns ------- matches : dictionary The matching is returned as a dictionary, `matches`, such that ``matches[v] == w`` if node ``v`` is matched to node ``w``. Unmatched nodes do not occur as a key in mate. Notes ----- This function is implemented with the `Hopcroft--Karp matching algorithm `_ for bipartite graphs. See Also -------- eppstein_matching References ---------- .. [1] John E. Hopcroft and Richard M. Karp. "An n^{5 / 2} Algorithm for Maximum Matchings in Bipartite Graphs" In: **SIAM Journal of Computing** 2.4 (1973), pp. 225--231. . """ # First we define some auxiliary search functions. # # If you are a human reading these auxiliary search functions, the "global" # variables `leftmatches`, `rightmatches`, `distances`, etc. are defined # below the functions, so that they are initialized close to the initial # invocation of the search functions. 
def breadth_first_search(): for v in left: if leftmatches[v] is None: distances[v] = 0 queue.append(v) else: distances[v] = INFINITY distances[None] = INFINITY while queue: v = queue.popleft() if distances[v] < distances[None]: for u in G[v]: if distances[rightmatches[u]] is INFINITY: distances[rightmatches[u]] = distances[v] + 1 queue.append(rightmatches[u]) return distances[None] is not INFINITY def depth_first_search(v): if v is not None: for u in G[v]: if distances[rightmatches[u]] == distances[v] + 1: if depth_first_search(rightmatches[u]): rightmatches[u] = v leftmatches[v] = u return True distances[v] = INFINITY return False return True # Initialize the "global" variables that maintain state during the search. left, right = bipartite_sets(G) leftmatches = {v: None for v in left} rightmatches = {v: None for v in right} distances = {} queue = collections.deque() # Implementation note: this counter is incremented as pairs are matched but # it is currently not used elsewhere in the computation. num_matched_pairs = 0 while breadth_first_search(): for v in left: if leftmatches[v] is None: if depth_first_search(v): num_matched_pairs += 1 # Strip the entries matched to `None`. leftmatches = {k: v for k, v in leftmatches.items() if v is not None} rightmatches = {k: v for k, v in rightmatches.items() if v is not None} # At this point, the left matches and the right matches are inverses of one # another. In other words, # # leftmatches == {v, k for k, v in rightmatches.items()} # # Finally, we combine both the left matches and right matches. return dict(itertools.chain(leftmatches.items(), rightmatches.items())) def eppstein_matching(G): """Returns the maximum cardinality matching of the bipartite graph `G`. Parameters ---------- G : NetworkX graph Undirected bipartite graph Returns ------- matches : dictionary The matching is returned as a dictionary, `matches`, such that ``matches[v] == w`` if node ``v`` is matched to node ``w``. 
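The BFS/DFS phases above can be condensed into a standalone sketch that works on a plain adjacency dict (left vertex mapped to a list of right neighbors) instead of a NetworkX graph. `hopcroft_karp` below is an illustrative reimplementation under those assumptions, not the library function:

```python
import collections

INF = float('inf')

def hopcroft_karp(left_adj):
    """Maximum bipartite matching via Hopcroft--Karp.

    `left_adj` maps each left vertex to an iterable of right vertices.
    Returns a dict mapping matched left vertices to their right partners.
    """
    matchL = {u: None for u in left_adj}
    matchR = {}
    for nbrs in left_adj.values():
        for v in nbrs:
            matchR.setdefault(v, None)
    dist = {}

    def bfs():
        # Layer the graph starting from unmatched left vertices.
        queue = collections.deque()
        for u in left_adj:
            if matchL[u] is None:
                dist[u] = 0
                queue.append(u)
            else:
                dist[u] = INF
        dist[None] = INF  # `None` stands for "unmatched right vertex reached"
        while queue:
            u = queue.popleft()
            if dist[u] < dist[None]:
                for v in left_adj[u]:
                    w = matchR[v]  # left partner of v, or None
                    if dist[w] == INF:
                        dist[w] = dist[u] + 1
                        if w is not None:
                            queue.append(w)
        return dist[None] != INF  # an augmenting path exists

    def dfs(u):
        # Follow the layering to augment along a shortest alternating path.
        if u is None:
            return True
        for v in left_adj[u]:
            if dist[matchR[v]] == dist[u] + 1 and dfs(matchR[v]):
                matchR[v] = u
                matchL[u] = v
                return True
        dist[u] = INF
        return False

    while bfs():
        for u in left_adj:
            if matchL[u] is None:
                dfs(u)
    return {u: v for u, v in matchL.items() if v is not None}
```

On `K_{2,3}` with left vertices 0 and 1 this finds a matching of size two, just as in the module docstring's example.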
Unmatched nodes do not occur as a key in `matches`. Notes ----- This function is implemented with David Eppstein's version of the Hopcroft--Karp algorithm (see :func:`hopcroft_karp_matching`), which originally appeared in the `Python Algorithms and Data Structures library (PADS) `_. See Also -------- hopcroft_karp_matching """ # initialize greedy matching (redundant, but faster than full search) matching = {} for u in G: for v in G[u]: if v not in matching: matching[v] = u break while True: # structure residual graph into layers # pred[u] gives the neighbor in the previous layer for u in U # preds[v] gives a list of neighbors in the previous layer for v in V # unmatched gives a list of unmatched vertices in final layer of V, # and is also used as a flag value for pred[u] when u is in the first # layer preds = {} unmatched = [] pred = {u: unmatched for u in G} for v in matching: del pred[matching[v]] layer = list(pred) # repeatedly extend layering structure by another pair of layers while layer and not unmatched: newLayer = {} for u in layer: for v in G[u]: if v not in preds: newLayer.setdefault(v, []).append(u) layer = [] for v in newLayer: preds[v] = newLayer[v] if v in matching: layer.append(matching[v]) pred[matching[v]] = v else: unmatched.append(v) # did we finish layering without finding any alternating paths? if not unmatched: unlayered = {} for u in G: # TODO Why is extra inner loop necessary? for v in G[u]: if v not in preds: unlayered[v] = None # TODO Originally, this function returned a three-tuple: # # return (matching, list(pred), list(unlayered)) # # For some reason, the documentation for this function # indicated that the second and third elements of the returned # three-tuple would be the vertices in the left and right vertex # sets, respectively, that are also in the maximum independent set.
# However, what I think the author meant was that the second # element is the list of vertices that were unmatched and the third # element was the list of vertices that were matched. Since that # seems to be the case, they don't really need to be returned, # since that information can be inferred from the matching # dictionary. return matching # recursively search backward through layers to find alternating paths # recursion returns true if found path, false otherwise def recurse(v): if v in preds: L = preds.pop(v) for u in L: if u in pred: pu = pred.pop(u) if pu is unmatched or recurse(pu): matching[v] = u return True return False for v in unmatched: recurse(v) def _is_connected_by_alternating_path(G, v, matching, targets): """Returns ``True`` if and only if the vertex `v` is connected to one of the target vertices by an alternating path in `G`. An *alternating path* is a path in which every other edge is in the specified maximum matching (and the remaining edges in the path are not in the matching). An alternating path may have matched edges in the even positions or in the odd positions, as long as the edges alternate between 'matched' and 'unmatched'. `G` is an undirected bipartite NetworkX graph. `v` is a vertex in `G`. `matching` is a dictionary representing a maximum matching in `G`, as returned by, for example, :func:`maximum_matching`. `targets` is a set of vertices. """ # Get the set of matched edges and the set of unmatched edges. Only include # one version of each undirected edge (for example, include edge (1, 2) but # not edge (2, 1)). matched_edges = {(u, v) for u, v in matching.items() if u <= v} unmatched_edges = set(G.edges()) - matched_edges def _alternating_dfs(u, depth, along_matched=True): """Returns ``True`` if and only if `u` is connected to one of the targets by an alternating path. `u` is a vertex in the graph `G`. `depth` specifies the maximum recursion depth of the depth-first search. 
If `along_matched` is ``True``, this step of the depth-first search will continue only through edges in the given matching. Otherwise, it will continue only through edges *not* in the given matching. """ # Base case 1: u is one of the target vertices. `u` is connected to one # of the target vertices by an alternating path of length zero. if u in targets: return True # Base case 2: we have exceeded our allowed depth. In this case, we # have looked at a path of length `n`, so looking any further won't # help. if depth < 0: return False # Determine which set of edges to look across. valid_edges = matched_edges if along_matched else unmatched_edges for v in G[u]: # Consider only those neighbors connected via a valid edge. if (u, v) in valid_edges or (v, u) in valid_edges: # Recursively perform a depth-first search starting from the # neighbor. Decrement the depth limit and switch which set of # edges will be valid for next time. If any neighbor leads to a # target, a path exists; otherwise keep trying the remaining # neighbors. if _alternating_dfs(v, depth - 1, not along_matched): return True # If no neighbor leads to a target vertex, simply say that no path # exists. return False # Check for alternating paths starting with edges in the matching, then # check for alternating paths starting with edges not in the # matching. Initiate the depth-first search with the current depth equal to # the number of nodes in the graph. return (_alternating_dfs(v, len(G), along_matched=True) or _alternating_dfs(v, len(G), along_matched=False)) def _connected_by_alternating_paths(G, matching, targets): """Returns the set of vertices that are connected to one of the target vertices by an alternating path in `G`. An *alternating path* is a path in which every other edge is in the specified maximum matching (and the remaining edges in the path are not in the matching). An alternating path may have matched edges in the even positions or in the odd positions, as long as the edges alternate between 'matched' and 'unmatched'.
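A recursive alternating-path search like the one described here can hit Python's recursion limit on long paths. The following iterative sketch (plain dict-of-sets adjacency, hypothetical name, not the library helper) tracks the parity of the next edge explicitly and memoizes visited (vertex, parity) states so it always terminates:

```python
def alternating_reachable(adj, source, matching, targets):
    """Return True if `source` reaches any vertex in `targets` along a path
    whose edges alternate matched/unmatched, starting with either kind.

    adj: dict mapping each vertex to a set of neighbors.
    matching: dict containing both directions of each matched pair,
    i.e. matching[u] == v and matching[v] == u.
    """
    matched = {frozenset((u, v)) for u, v in matching.items()}
    # Try paths whose first edge is matched, then paths starting unmatched.
    for first_matched in (True, False):
        # State: (vertex, whether the NEXT edge must be matched).
        stack = [(source, first_matched)]
        seen = {(source, first_matched)}
        while stack:
            u, need_matched = stack.pop()
            # A path of length zero counts, as in the base case above.
            if u in targets:
                return True
            for v in adj[u]:
                if (frozenset((u, v)) in matched) == need_matched:
                    state = (v, not need_matched)
                    if state not in seen:
                        seen.add(state)
                        stack.append(state)
    return False
```

Because each (vertex, parity) state is visited at most once, no explicit depth limit is needed.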
`G` is an undirected bipartite NetworkX graph. `matching` is a dictionary representing a maximum matching in `G`, as returned by, for example, :func:`maximum_matching`. `targets` is a set of vertices. """ # TODO This can be parallelized. return {v for v in G if _is_connected_by_alternating_path(G, v, matching, targets)} def to_vertex_cover(G, matching): """Returns the minimum vertex cover corresponding to the given maximum matching of the bipartite graph `G`. Parameters ---------- G : NetworkX graph Undirected bipartite graph matching : dictionary A dictionary whose keys are vertices in `G` and whose values are the distinct neighbors comprising the maximum matching for `G`, as returned by, for example, :func:`maximum_matching`. The dictionary *must* represent the maximum matching. Returns ------- vertex_cover : :class:`set` The minimum vertex cover in `G`. Notes ----- This function is implemented using the procedure guaranteed by `Konig's theorem `_, which proves an equivalence between a maximum matching and a minimum vertex cover in bipartite graphs. Since a minimum vertex cover is the complement of a maximum independent set for any graph, one can compute the maximum independent set of a bipartite graph this way: >>> import networkx as nx >>> G = nx.complete_bipartite_graph(2, 3) >>> matching = nx.bipartite.maximum_matching(G) >>> vertex_cover = nx.bipartite.to_vertex_cover(G, matching) >>> independent_set = set(G) - vertex_cover >>> print(list(independent_set)) [2, 3, 4] """ # This is a Python implementation of the algorithm described at # . L, R = bipartite_sets(G) # Let U be the set of unmatched vertices in the left vertex set. unmatched_vertices = set(G) - set(matching) U = unmatched_vertices & L # Let Z be the set of vertices that are either in U or are connected to U # by alternating paths. Z = _connected_by_alternating_paths(G, matching, U) # At this point, every edge either has a right endpoint in Z or a left # endpoint not in Z. 
This gives us the vertex cover. return (L - Z) | (R & Z) #: Returns the maximum cardinality matching in the given bipartite graph. #: #: This function is simply an alias for :func:`hopcroft_karp_matching`. maximum_matching = hopcroft_karp_matching networkx-1.11/networkx/algorithms/bipartite/cluster.py0000644000175000017500000001560212637544450023261 0ustar aricaric00000000000000#-*- coding: utf-8 -*- # Copyright (C) 2011 by # Jordi Torrents # Aric Hagberg # All rights reserved. # BSD license. import itertools import networkx as nx __author__ = """\n""".join(['Jordi Torrents ', 'Aric Hagberg (hagberg@lanl.gov)']) __all__ = [ 'clustering', 'average_clustering', 'latapy_clustering', 'robins_alexander_clustering'] # functions for computing clustering of pairs def cc_dot(nu,nv): return float(len(nu & nv))/len(nu | nv) def cc_max(nu,nv): return float(len(nu & nv))/max(len(nu),len(nv)) def cc_min(nu,nv): return float(len(nu & nv))/min(len(nu),len(nv)) modes={'dot':cc_dot, 'min':cc_min, 'max':cc_max} def latapy_clustering(G, nodes=None, mode='dot'): r"""Compute a bipartite clustering coefficient for nodes. The bipartite clustering coefficient is a measure of local density of connections defined as [1]_: .. math:: c_u = \frac{\sum_{v \in N(N(u))} c_{uv} }{|N(N(u))|} where `N(N(u))` are the second order neighbors of `u` in `G` excluding `u`, and `c_{uv}` is the pairwise clustering coefficient between nodes `u` and `v`. The mode selects the function for `c_{uv}` which can be: `dot`: .. math:: c_{uv}=\frac{|N(u)\cap N(v)|}{|N(u) \cup N(v)|} `min`: .. math:: c_{uv}=\frac{|N(u)\cap N(v)|}{min(|N(u)|,|N(v)|)} `max`: .. math:: c_{uv}=\frac{|N(u)\cap N(v)|}{max(|N(u)|,|N(v)|)} Parameters ---------- G : graph A bipartite graph nodes : list or iterable (optional) Compute bipartite clustering for these nodes. The default is all nodes in G. mode : string The pairwise bipartite clustering method to be used in the computation. It must be "dot", "max", or "min".
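König's construction used by `to_vertex_cover` can be reproduced end to end on plain dictionaries. The sketch below (a hypothetical helper, not the NetworkX API) finds Z by a breadth-first walk over alternating paths, leaving the left set along unmatched edges and returning along matched ones, then forms the cover `(L - Z) | (R & Z)`:

```python
def konig_vertex_cover(adj, left, matching):
    """Minimum vertex cover of a bipartite graph from a maximum matching.

    adj: dict vertex -> set of neighbors; left: the left vertex set;
    matching: dict containing both directions of each matched pair.
    """
    right = set(adj) - set(left)
    # U: unmatched vertices in the left set.
    unmatched = {u for u in left if u not in matching}
    # Z: vertices reachable from U by alternating paths (BFS).
    Z = set(unmatched)
    frontier = list(unmatched)
    while frontier:
        u = frontier.pop()
        for v in adj[u]:
            # Leave L along unmatched edges, return from R along matched ones.
            if u in left and matching.get(u) != v:
                if v not in Z:
                    Z.add(v)
                    frontier.append(v)
            elif u in right and matching.get(u) == v:
                if v not in Z:
                    Z.add(v)
                    frontier.append(v)
    # Konig's theorem: this set covers every edge and has minimum size.
    return (set(left) - Z) | (right & Z)
```

For `K_{2,3}` with the matching {0: 2, 1: 3} this yields the cover {0, 1}, whose complement {2, 3, 4} is the maximum independent set shown in the docstring example.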
Returns ------- clustering : dictionary A dictionary keyed by node with the clustering coefficient value. Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.path_graph(4) # path graphs are bipartite >>> c = bipartite.clustering(G) >>> c[0] 0.5 >>> c = bipartite.clustering(G,mode='min') >>> c[0] 1.0 See Also -------- robins_alexander_clustering square_clustering average_clustering References ---------- .. [1] Latapy, Matthieu, Clémence Magnien, and Nathalie Del Vecchio (2008). Basic notions for the analysis of large two-mode networks. Social Networks 30(1), 31--48. """ if not nx.algorithms.bipartite.is_bipartite(G): raise nx.NetworkXError("Graph is not bipartite") try: cc_func = modes[mode] except KeyError: raise nx.NetworkXError(\ "Mode for bipartite clustering must be: dot, min or max") if nodes is None: nodes = G ccs = {} for v in nodes: cc = 0.0 nbrs2=set([u for nbr in G[v] for u in G[nbr]])-set([v]) for u in nbrs2: cc += cc_func(set(G[u]),set(G[v])) if cc > 0.0: # len(nbrs2)>0 cc /= len(nbrs2) ccs[v] = cc return ccs clustering = latapy_clustering def average_clustering(G, nodes=None, mode='dot'): r"""Compute the average bipartite clustering coefficient. A clustering coefficient for the whole graph is the average, .. math:: C = \frac{1}{n}\sum_{v \in G} c_v, where `n` is the number of nodes in `G`. Similar measures for the two bipartite sets can be defined [1]_ .. math:: C_X = \frac{1}{|X|}\sum_{v \in X} c_v, where `X` is a bipartite set of `G`. Parameters ---------- G : graph a bipartite graph nodes : list or iterable, optional A container of nodes to use in computing the average. The nodes should be either the entire graph (the default) or one of the bipartite sets. mode : string The pairwise bipartite clustering method. It must be "dot", "max", or "min". Returns ------- clustering : float The average bipartite clustering for the given set of nodes or the entire graph if no nodes are specified.
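The pairwise coefficients `cc_dot`, `cc_min`, and `cc_max` and the per-node averaging can be sketched standalone on a dict-of-sets adjacency. `latapy_cc` below is a hypothetical name for illustration, not the library function:

```python
def latapy_cc(adj, v, mode="dot"):
    """Bipartite clustering coefficient of v (Latapy et al.), standalone.

    adj maps each vertex to a set of neighbors.
    """
    pair_cc = {
        "dot": lambda nu, nv: len(nu & nv) / len(nu | nv),
        "min": lambda nu, nv: len(nu & nv) / min(len(nu), len(nv)),
        "max": lambda nu, nv: len(nu & nv) / max(len(nu), len(nv)),
    }[mode]
    # Second-order neighbors of v, excluding v itself.
    nbrs2 = {u for nbr in adj[v] for u in adj[nbr]} - {v}
    if not nbrs2:
        return 0.0
    # Average the pairwise coefficient over all second-order neighbors.
    return sum(pair_cc(adj[u], adj[v]) for u in nbrs2) / len(nbrs2)
```

On the path graph 0-1-2-3 this reproduces the doctest values: 0.5 for node 0 in `dot` mode and 1.0 in `min` mode.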
Examples -------- >>> from networkx.algorithms import bipartite >>> G=nx.star_graph(3) # star graphs are bipartite >>> bipartite.average_clustering(G) 0.75 >>> X,Y=bipartite.sets(G) >>> bipartite.average_clustering(G,X) 0.0 >>> bipartite.average_clustering(G,Y) 1.0 See Also -------- clustering Notes ----- The container of nodes passed to this function must contain all of the nodes in one of the bipartite sets ("top" or "bottom") in order to compute the correct average bipartite clustering coefficients. References ---------- .. [1] Latapy, Matthieu, Clémence Magnien, and Nathalie Del Vecchio (2008). Basic notions for the analysis of large two-mode networks. Social Networks 30(1), 31--48. """ if nodes is None: nodes=G ccs=latapy_clustering(G, nodes=nodes, mode=mode) return float(sum(ccs[v] for v in nodes))/len(nodes) def robins_alexander_clustering(G): r"""Compute the bipartite clustering of G. Robins and Alexander [1]_ defined bipartite clustering coefficient as four times the number of four cycles `C_4` divided by the number of three paths `L_3` in a bipartite graph: .. math:: CC_4 = \frac{4 * C_4}{L_3} Parameters ---------- G : graph a bipartite graph Returns ------- clustering : float The Robins and Alexander bipartite clustering for the input graph. Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.davis_southern_women_graph() >>> print(round(bipartite.robins_alexander_clustering(G), 3)) 0.468 See Also -------- latapy_clustering square_clustering References ---------- .. [1] Robins, G. and M. Alexander (2004). Small worlds among interlocking directors: Network structure and distance in bipartite graphs. Computational & Mathematical Organization Theory 10(1), 69–94. """ if G.order() < 4 or G.size() < 3: return 0 L_3 = _threepaths(G) if L_3 == 0: return 0 C_4 = _four_cycles(G) return (4. 
* C_4) / L_3 def _four_cycles(G): cycles = 0 for v in G: for u, w in itertools.combinations(G[v], 2): cycles += len((set(G[u]) & set(G[w])) - set([v])) return cycles / 4 def _threepaths(G): paths = 0 for v in G: for u in G[v]: for w in set(G[u]) - set([v]): paths += len(set(G[w]) - set([v, u])) # Divide by two because we count each three path twice # one for each possible starting point return paths / 2 networkx-1.11/networkx/algorithms/bipartite/redundancy.py0000644000175000017500000000755212637544450023741 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """Node redundancy for bipartite graphs.""" # Copyright (C) 2011 by # Jordi Torrents # Aric Hagberg # All rights reserved. # BSD license. from __future__ import division from itertools import combinations from networkx import NetworkXError __author__ = """\n""".join(['Jordi Torrents ', 'Aric Hagberg (hagberg@lanl.gov)']) __all__ = ['node_redundancy'] def node_redundancy(G, nodes=None): r"""Computes the node redundancy coefficients for the nodes in the bipartite graph ``G``. The redundancy coefficient of a node `v` is the fraction of pairs of neighbors of `v` that are both linked to other nodes. In a one-mode projection these nodes would be linked together even if `v` were not there. More formally, for any vertex `v`, the *redundancy coefficient of `v`* is defined by .. math:: rc(v) = \frac{|\{\{u, w\} \subseteq N(v), \: \exists v' \neq v,\: (v',u) \in E\: \mathrm{and}\: (v',w) \in E\}|}{ \frac{|N(v)|(|N(v)|-1)}{2}}, where `N(v)` is the set of neighbors of `v` in ``G``. Parameters ---------- G : graph A bipartite graph nodes : list or iterable (optional) Compute redundancy for these nodes. The default is all nodes in G. Returns ------- redundancy : dictionary A dictionary keyed by node with the node redundancy value. 
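The `4 * C_4 / L_3` computation can likewise be sketched without networkx. The counting mirrors `_four_cycles` and `_threepaths` above: each four-cycle is seen once from each of its four vertices, and each three-path once from each endpoint (names and dict-of-sets representation here are illustrative assumptions):

```python
from itertools import combinations

def robins_alexander_cc(adj):
    """Robins--Alexander bipartite clustering, standalone sketch.

    adj maps each vertex to a set of neighbors.
    """
    # Count four-cycles: for each vertex v and each pair of its neighbors
    # (u, w), every common neighbor of u and w other than v closes a cycle.
    cycles = 0
    for v in adj:
        for u, w in combinations(adj[v], 2):
            cycles += len((adj[u] & adj[w]) - {v})
    c4 = cycles // 4  # each cycle counted from all four of its vertices
    # Count three-paths v-u-w-x.
    paths = 0
    for v in adj:
        for u in adj[v]:
            for w in adj[u] - {v}:
                paths += len(adj[w] - {v, u})
    l3 = paths // 2  # each path counted once from each endpoint
    return 0 if l3 == 0 else 4 * c4 / l3
```

A single four-cycle has one `C_4` and four `L_3` paths, so its coefficient is 1.0, the maximum.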
Examples -------- Compute the redundancy coefficient of each node in a graph:: >>> import networkx as nx >>> from networkx.algorithms import bipartite >>> G = nx.cycle_graph(4) >>> rc = bipartite.node_redundancy(G) >>> rc[0] 1.0 Compute the average redundancy for the graph:: >>> import networkx as nx >>> from networkx.algorithms import bipartite >>> G = nx.cycle_graph(4) >>> rc = bipartite.node_redundancy(G) >>> sum(rc.values()) / len(G) 1.0 Compute the average redundancy for a set of nodes:: >>> import networkx as nx >>> from networkx.algorithms import bipartite >>> G = nx.cycle_graph(4) >>> rc = bipartite.node_redundancy(G) >>> nodes = [0, 2] >>> sum(rc[n] for n in nodes) / len(nodes) 1.0 Raises ------ NetworkXError If any of the nodes in the graph (or in ``nodes``, if specified) has (out-)degree less than two (which would result in division by zero, according to the definition of the redundancy coefficient). References ---------- .. [1] Latapy, Matthieu, Clémence Magnien, and Nathalie Del Vecchio (2008). Basic notions for the analysis of large two-mode networks. Social Networks 30(1), 31--48. """ if nodes is None: nodes = G if any(len(G[v]) < 2 for v in nodes): raise NetworkXError('Cannot compute redundancy coefficient for a node' ' that has fewer than two neighbors.') # TODO This can be trivially parallelized. return {v: _node_redundancy(G, v) for v in nodes} def _node_redundancy(G, v): """Returns the redundancy of the node ``v`` in the bipartite graph ``G``. If ``G`` is a graph with ``n`` nodes, the redundancy of a node is the ratio of the "overlap" of ``v`` to the maximum possible overlap of ``v`` according to its degree. The overlap of ``v`` is the number of pairs of neighbors that have mutual neighbors themselves, other than ``v``. ``v`` must have at least two neighbors in ``G``. """ n = len(G[v]) # TODO On Python 3, we could just use `G[u].keys() & G[w].keys()` instead # of instantiating the entire sets. 
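The redundancy coefficient reduces to counting neighbor pairs of `v` that share a common neighbor other than `v`. A minimal standalone sketch (dict-of-sets adjacency, hypothetical name) of the same ratio:

```python
from itertools import combinations

def redundancy(adj, v):
    """Redundancy coefficient of v: the fraction of pairs of neighbors of v
    that are both linked to some other vertex (standalone sketch).
    """
    neighbors = list(adj[v])
    n = len(neighbors)
    if n < 2:
        # Matches the library's NetworkXError for degree < 2.
        raise ValueError("redundancy undefined for a node of degree < 2")
    # Count pairs (u, w) with a common neighbor besides v.
    overlap = sum(
        1 for u, w in combinations(neighbors, 2) if (adj[u] & adj[w]) - {v}
    )
    # Normalize by the number of neighbor pairs, n * (n - 1) / 2.
    return 2 * overlap / (n * (n - 1))
```

On the 4-cycle every node has redundancy 1.0, matching the doctest above.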
overlap = sum(1 for (u, w) in combinations(G[v], 2) if (set(G[u]) & set(G[w])) - {v}) return (2 * overlap) / (n * (n - 1)) networkx-1.11/networkx/algorithms/bipartite/__init__.py0000644000175000017500000000646012637544500023335 0ustar aricaric00000000000000r""" This module provides functions and operations for bipartite graphs. Bipartite graphs `B = (U, V, E)` have two node sets `U,V` and edges in `E` that only connect nodes from opposite sets. It is common in the literature to use a spatial analogy referring to the two node sets as top and bottom nodes. The bipartite algorithms are not imported into the networkx namespace at the top level so the easiest way to use them is with: >>> import networkx as nx >>> from networkx.algorithms import bipartite NetworkX does not have a custom bipartite graph class but the Graph() or DiGraph() classes can be used to represent bipartite graphs. However, you have to keep track of which set each node belongs to, and make sure that there is no edge between nodes of the same set. The convention used in NetworkX is to use a node attribute named "bipartite" with values 0 or 1 to identify the sets each node belongs to. For example: >>> B = nx.Graph() >>> B.add_nodes_from([1,2,3,4], bipartite=0) # Add the node attribute "bipartite" >>> B.add_nodes_from(['a','b','c'], bipartite=1) >>> B.add_edges_from([(1,'a'), (1,'b'), (2,'b'), (2,'c'), (3,'c'), (4,'a')]) Many algorithms of the bipartite module of NetworkX require, as an argument, a container with all the nodes that belong to one set, in addition to the bipartite graph `B`. If `B` is connected, you can find the node sets using a two-coloring algorithm: >>> nx.is_connected(B) True >>> bottom_nodes, top_nodes = bipartite.sets(B) list(top_nodes) [1, 2, 3, 4] list(bottom_nodes) ['a', 'c', 'b'] However, if the input graph is not connected, there is more than one possible coloration.
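The two-coloring idea used by `bipartite.sets` can be sketched standalone for a connected graph: breadth-first search, assigning each newly reached node the opposite color of its parent (plain dict adjacency, hypothetical name, illustrative only):

```python
from collections import deque

def two_color_sets(adj):
    """Split a connected bipartite graph into its two node sets by BFS
    two-coloring. adj maps each vertex to a set of neighbors.

    Raises ValueError if an edge joins two same-colored nodes.
    """
    color = {}
    start = next(iter(adj))
    color[start] = 0
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in color:
                color[v] = 1 - color[u]
                queue.append(v)
            elif color[v] == color[u]:
                raise ValueError("graph is not bipartite")
    top = {n for n, c in color.items() if c == 0}
    return top, set(adj) - top
```

For the example graph `B` above this recovers {1, 2, 3, 4} and {'a', 'b', 'c'}; on a disconnected graph a per-component coloring would be needed, which is exactly why more than one coloration exists.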
Thus, the following result is correct: >>> B.remove_edge(2,'c') >>> nx.is_connected(B) False >>> bottom_nodes, top_nodes = bipartite.sets(B) list(top_nodes) [1, 2, 4, 'c'] list(bottom_nodes) ['a', 3, 'b'] Using the "bipartite" node attribute, you can easily get the two node sets: >>> top_nodes = set(n for n,d in B.nodes(data=True) if d['bipartite']==0) >>> bottom_nodes = set(B) - top_nodes list(top_nodes) [1, 2, 3, 4] list(bottom_nodes) ['a', 'c', 'b'] So you can easily use the bipartite algorithms that require, as an argument, a container with all nodes that belong to one node set: >>> print(round(bipartite.density(B, bottom_nodes),2)) 0.42 >>> G = bipartite.projected_graph(B, top_nodes) >>> G.edges() [(1, 2), (1, 4)] All bipartite graph generators in NetworkX build bipartite graphs with the "bipartite" node attribute. Thus, you can use the same approach: >>> RB = bipartite.random_graph(5, 7, 0.2) >>> RB_top = set(n for n,d in RB.nodes(data=True) if d['bipartite']==0) >>> RB_bottom = set(RB) - RB_top >>> list(RB_top) [0, 1, 2, 3, 4] >>> list(RB_bottom) [5, 6, 7, 8, 9, 10, 11] For other bipartite graph generators see the bipartite section of :doc:`generators`. """ from networkx.algorithms.bipartite.basic import * from networkx.algorithms.bipartite.centrality import * from networkx.algorithms.bipartite.cluster import * from networkx.algorithms.bipartite.edgelist import * from networkx.algorithms.bipartite.matching import * from networkx.algorithms.bipartite.matrix import * from networkx.algorithms.bipartite.projection import * from networkx.algorithms.bipartite.redundancy import * from networkx.algorithms.bipartite.spectral import * from networkx.algorithms.bipartite.generators import * networkx-1.11/networkx/algorithms/bipartite/edgelist.py0000644000175000017500000002635712637544500023405 0ustar aricaric00000000000000""" ********** Bipartite Edge Lists ********** Read and write NetworkX graphs as bipartite edge lists. 
Format ------ You can read or write three formats of edge lists with these functions. Node pairs with no data:: 1 2 Python dictionary as data:: 1 2 {'weight':7, 'color':'green'} Arbitrary data:: 1 2 7 green For each edge (u, v) the node u is assigned to part 0 and the node v to part 1. """ # Copyright (C) 2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['generate_edgelist', 'write_edgelist', 'parse_edgelist', 'read_edgelist'] import networkx as nx from networkx.utils import open_file, make_str, not_implemented_for from networkx.convert import _prep_create_using @open_file(1, mode='wb') def write_edgelist(G, path, comments="#", delimiter=' ', data=True, encoding = 'utf-8'): """Write a bipartite graph as a list of edges. Parameters ---------- G : Graph A NetworkX bipartite graph path : file or string File or filename to write. If a file is provided, it must be opened in 'wb' mode. Filenames ending in .gz or .bz2 will be compressed. comments : string, optional The character used to indicate the start of a comment delimiter : string, optional The string used to separate values. The default is whitespace. data : bool or list, optional If False write no edge data. If True write a string representation of the edge data dictionary.. If a list (or other iterable) is provided, write the keys specified in the list. encoding: string, optional Specify which encoding to use when writing file. 
Examples -------- >>> G=nx.path_graph(4) >>> G.add_nodes_from([0,2], bipartite=0) >>> G.add_nodes_from([1,3], bipartite=1) >>> nx.write_edgelist(G, "test.edgelist") >>> fh=open("test.edgelist",'wb') >>> nx.write_edgelist(G, fh) >>> nx.write_edgelist(G, "test.edgelist.gz") >>> nx.write_edgelist(G, "test.edgelist.gz", data=False) >>> G=nx.Graph() >>> G.add_edge(1,2,weight=7,color='red') >>> nx.write_edgelist(G,'test.edgelist',data=False) >>> nx.write_edgelist(G,'test.edgelist',data=['color']) >>> nx.write_edgelist(G,'test.edgelist',data=['color','weight']) See Also -------- write_edgelist() generate_edgelist() """ for line in generate_edgelist(G, delimiter, data): line += '\n' path.write(line.encode(encoding)) @not_implemented_for('directed') def generate_edgelist(G, delimiter=' ', data=True): """Generate a single line of the bipartite graph G in edge list format. Parameters ---------- G : NetworkX graph The graph is assumed to have node attribute `part` set to 0,1 representing the two graph parts delimiter : string, optional Separator for node labels data : bool or list of keys If False generate no edge data. If True use a dictionary representation of edge data. If a list of keys use a list of data values corresponding to the keys. Returns ------- lines : string Lines of data in adjlist format. Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.path_graph(4) >>> G.add_nodes_from([0,2], bipartite=0) >>> G.add_nodes_from([1,3], bipartite=1) >>> G[1][2]['weight'] = 3 >>> G[2][3]['capacity'] = 12 >>> for line in bipartite.generate_edgelist(G, data=False): ... print(line) 0 1 2 1 2 3 >>> for line in bipartite.generate_edgelist(G): ... print(line) 0 1 {} 2 1 {'weight': 3} 2 3 {'capacity': 12} >>> for line in bipartite.generate_edgelist(G,data=['weight']): ... 
print(line) 0 1 2 1 3 2 3 """ try: part0 = [n for n,d in G.node.items() if d['bipartite'] == 0] except: raise AttributeError("Missing node attribute `bipartite`") if data is True or data is False: for n in part0: for e in G.edges(n, data=data): yield delimiter.join(map(make_str,e)) else: for n in part0: for u,v,d in G.edges(n, data=True): e = [u,v] try: e.extend(d[k] for k in data) except KeyError: pass # missing data for this edge, should warn? yield delimiter.join(map(make_str,e)) def parse_edgelist(lines, comments='#', delimiter=None, create_using=None, nodetype=None, data=True): """Parse lines of an edge list representation of a bipartite graph. Parameters ---------- lines : list or iterator of strings Input data in edgelist format comments : string, optional Marker for comment lines delimiter : string, optional Separator for node labels create_using: NetworkX graph container, optional Use given NetworkX graph for holding nodes or edges. nodetype : Python type, optional Convert nodes to this type. data : bool or list of (label,type) tuples If False generate no edge data or if True use a dictionary representation of edge data or a list tuples specifying dictionary key names and types for edge data. Returns ------- G: NetworkX Graph The bipartite graph corresponding to lines Examples -------- Edgelist with no data: >>> from networkx.algorithms import bipartite >>> lines = ["1 2", ... "2 3", ... "3 4"] >>> G = bipartite.parse_edgelist(lines, nodetype = int) >>> sorted(G.nodes()) [1, 2, 3, 4] >>> sorted(G.nodes(data=True)) [(1, {'bipartite': 0}), (2, {'bipartite': 0}), (3, {'bipartite': 0}), (4, {'bipartite': 1})] >>> sorted(G.edges()) [(1, 2), (2, 3), (3, 4)] Edgelist with data in Python dictionary representation: >>> lines = ["1 2 {'weight':3}", ... "2 3 {'weight':27}", ... 
"3 4 {'weight':3.0}"] >>> G = bipartite.parse_edgelist(lines, nodetype = int) >>> sorted(G.nodes()) [1, 2, 3, 4] >>> sorted(G.edges(data = True)) [(1, 2, {'weight': 3}), (2, 3, {'weight': 27}), (3, 4, {'weight': 3.0})] Edgelist with data in a list: >>> lines = ["1 2 3", ... "2 3 27", ... "3 4 3.0"] >>> G = bipartite.parse_edgelist(lines, nodetype = int, data=(('weight',float),)) >>> sorted(G.nodes()) [1, 2, 3, 4] >>> sorted(G.edges(data = True)) [(1, 2, {'weight': 3.0}), (2, 3, {'weight': 27.0}), (3, 4, {'weight': 3.0})] See Also -------- """ from ast import literal_eval G = _prep_create_using(create_using) for line in lines: p=line.find(comments) if p>=0: line = line[:p] if not len(line): continue # split line, should have 2 or more s=line.strip().split(delimiter) if len(s)<2: continue u=s.pop(0) v=s.pop(0) d=s if nodetype is not None: try: u=nodetype(u) v=nodetype(v) except: raise TypeError("Failed to convert nodes %s,%s to type %s." %(u,v,nodetype)) if len(d)==0 or data is False: # no data or data type specified edgedata={} elif data is True: # no edge types specified try: # try to evaluate as dictionary edgedata=dict(literal_eval(' '.join(d))) except: raise TypeError( "Failed to convert edge data (%s) to dictionary."%(d)) else: # convert edge data to dictionary with specified keys and type if len(d)!=len(data): raise IndexError( "Edge data %s and data_keys %s are not the same length"% (d, data)) edgedata={} for (edge_key,edge_type),edge_value in zip(data,d): try: edge_value=edge_type(edge_value) except: raise TypeError( "Failed to convert %s data %s to type %s." %(edge_key, edge_value, edge_type)) edgedata.update({edge_key:edge_value}) G.add_node(u, bipartite=0) G.add_node(v, bipartite=1) G.add_edge(u, v, attr_dict=edgedata) return G @open_file(0,mode='rb') def read_edgelist(path, comments="#", delimiter=None, create_using=None, nodetype=None, data=True, edgetype=None, encoding='utf-8'): """Read a bipartite graph from a list of edges. 
Parameters ---------- path : file or string File or filename to read. If a file is provided, it must be opened in 'rb' mode. Filenames ending in .gz or .bz2 will be uncompressed. comments : string, optional The character used to indicate the start of a comment. delimiter : string, optional The string used to separate values. The default is whitespace. create_using : Graph container, optional, Use specified container to build graph. The default is networkx.Graph, an undirected graph. nodetype : int, float, str, Python type, optional Convert node data from strings to specified type data : bool or list of (label,type) tuples Tuples specifying dictionary key names and types for edge data edgetype : int, float, str, Python type, optional OBSOLETE Convert edge data from strings to specified type and use as 'weight' encoding: string, optional Specify which encoding to use when reading file. Returns ------- G : graph A networkx Graph or other type specified with create_using Examples -------- >>> from networkx.algorithms import bipartite >>> G = nx.path_graph(4) >>> G.add_nodes_from([0,2], bipartite=0) >>> G.add_nodes_from([1,3], bipartite=1) >>> bipartite.write_edgelist(G, "test.edgelist") >>> G = bipartite.read_edgelist("test.edgelist") >>> fh = open("test.edgelist", 'rb') >>> G = bipartite.read_edgelist(fh) >>> fh.close() >>> G=bipartite.read_edgelist("test.edgelist", nodetype=int) Edgelist with data in a list: >>> textline = '1 2 3' >>> fh = open('test.edgelist','w') >>> d = fh.write(textline) >>> fh.close() >>> G = bipartite.read_edgelist('test.edgelist', nodetype=int, data=(('weight',float),)) >>> G.nodes() [1, 2] >>> G.edges(data = True) [(1, 2, {'weight': 3.0})] See parse_edgelist() for more examples of formatting. See Also -------- parse_edgelist Notes ----- Since nodes must be hashable, the function nodetype must return hashable types (e.g. int, float, str, frozenset - or tuples of those, etc.) 
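The parsing conventions described in this module (node `u` assigned to part 0 and `v` to part 1, with later assignments overwriting earlier ones, as with repeated `add_node` calls) can be illustrated with a minimal standalone parser. The name and return shape below are assumptions for the sketch, not the library API:

```python
def parse_bipartite_edgelist(lines, nodetype=str, weighted=False):
    """Parse 'u v [weight]' lines into an adjacency dict plus a dict of
    bipartite part labels (0 for u, 1 for v), minimal sketch.
    """
    adj, part = {}, {}
    for line in lines:
        line = line.split('#', 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        fields = line.split()
        u, v = nodetype(fields[0]), nodetype(fields[1])
        data = {'weight': float(fields[2])} if weighted and len(fields) > 2 else {}
        # Later lines overwrite the label, mirroring repeated add_node calls.
        part[u] = 0
        part[v] = 1
        adj.setdefault(u, {})[v] = data
        adj.setdefault(v, {})[u] = data
    return adj, part
```

Feeding it the docstring's example lines "1 2", "2 3", "3 4" labels nodes 1, 2, 3 with part 0 and node 4 with part 1, the same result as `parse_edgelist`.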
    """
    lines = (line.decode(encoding) for line in path)
    return parse_edgelist(lines, comments=comments, delimiter=delimiter,
                          create_using=create_using, nodetype=nodetype,
                          data=data)

networkx-1.11/networkx/algorithms/bipartite/tests/test_project.py
#!/usr/bin/env python
from nose.tools import assert_equal
import networkx as nx
from networkx.algorithms import bipartite
from networkx.testing import assert_edges_equal, assert_nodes_equal


class TestBipartiteProject:

    def test_path_projected_graph(self):
        G = nx.path_graph(4)
        P = bipartite.projected_graph(G, [1, 3])
        assert_nodes_equal(P.nodes(), [1, 3])
        assert_edges_equal(P.edges(), [(1, 3)])
        P = bipartite.projected_graph(G, [0, 2])
        assert_nodes_equal(P.nodes(), [0, 2])
        assert_edges_equal(P.edges(), [(0, 2)])

    def test_path_projected_properties_graph(self):
        G = nx.path_graph(4)
        G.add_node(1, name='one')
        G.add_node(2, name='two')
        P = bipartite.projected_graph(G, [1, 3])
        assert_nodes_equal(P.nodes(), [1, 3])
        assert_edges_equal(P.edges(), [(1, 3)])
        assert_equal(P.node[1]['name'], G.node[1]['name'])
        P = bipartite.projected_graph(G, [0, 2])
        assert_nodes_equal(P.nodes(), [0, 2])
        assert_edges_equal(P.edges(), [(0, 2)])
        assert_equal(P.node[2]['name'], G.node[2]['name'])

    def test_path_collaboration_projected_graph(self):
        G = nx.path_graph(4)
        P = bipartite.collaboration_weighted_projected_graph(G, [1, 3])
        assert_nodes_equal(P.nodes(), [1, 3])
        assert_edges_equal(P.edges(), [(1, 3)])
        P[1][3]['weight'] = 1
        P = bipartite.collaboration_weighted_projected_graph(G, [0, 2])
        assert_nodes_equal(P.nodes(), [0, 2])
        assert_edges_equal(P.edges(), [(0, 2)])
        P[0][2]['weight'] = 1

    def test_directed_path_collaboration_projected_graph(self):
        G = nx.DiGraph()
        G.add_path(list(range(4)))
        P = bipartite.collaboration_weighted_projected_graph(G, [1, 3])
        assert_nodes_equal(P.nodes(), [1, 3])
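The parsing loop above boils down to three steps: strip comments, split each line on the delimiter, then convert the first two fields to nodes and coerce any remaining fields into an edge-data dict. A stripped-down stdlib sketch of that logic (illustrative only — `parse_edges` is a hypothetical name, not the NetworkX implementation, and it omits the bipartite attributes and length checks):

```python
from ast import literal_eval

def parse_edges(lines, comments='#', delimiter=None,
                nodetype=None, data=True):
    """Sketch of the parse_edgelist loop: strip comments, split,
    convert node labels and edge data."""
    edges = []
    for line in lines:
        p = line.find(comments)
        if p >= 0:
            line = line[:p]          # drop trailing comment
        s = line.strip().split(delimiter)
        if len(s) < 2:
            continue                 # need at least two node columns
        u, v, d = s[0], s[1], s[2:]
        if nodetype is not None:
            u, v = nodetype(u), nodetype(v)
        if not d or data is False:
            edgedata = {}            # no data columns, or data disabled
        elif data is True:
            # remaining columns form a dict literal
            edgedata = dict(literal_eval(' '.join(d)))
        else:
            # data is a sequence of (key, type) pairs
            edgedata = {key: typ(val)
                        for (key, typ), val in zip(data, d)}
        edges.append((u, v, edgedata))
    return edges

print(parse_edges(["1 2 {'weight': 3.0}", "# comment", "2 3"],
                  nodetype=int))
# [(1, 2, {'weight': 3.0}), (2, 3, {})]
```

The same three data modes drive the real function: `False` ignores trailing columns, `True` evaluates them as one dict literal, and a tuple of `(key, type)` pairs converts them column by column.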
assert_edges_equal(P.edges(),[(1,3)]) P[1][3]['weight']=1 P=bipartite.collaboration_weighted_projected_graph(G,[0,2]) assert_nodes_equal(P.nodes(),[0,2]) assert_edges_equal(P.edges(),[(0,2)]) P[0][2]['weight']=1 def test_path_weighted_projected_graph(self): G=nx.path_graph(4) P=bipartite.weighted_projected_graph(G,[1,3]) assert_nodes_equal(P.nodes(),[1,3]) assert_edges_equal(P.edges(),[(1,3)]) P[1][3]['weight']=1 P=bipartite.weighted_projected_graph(G,[0,2]) assert_nodes_equal(P.nodes(),[0,2]) assert_edges_equal(P.edges(),[(0,2)]) P[0][2]['weight']=1 def test_path_weighted_projected_directed_graph(self): G=nx.DiGraph() G.add_path(list(range(4))) P=bipartite.weighted_projected_graph(G,[1,3]) assert_nodes_equal(P.nodes(),[1,3]) assert_edges_equal(P.edges(),[(1,3)]) P[1][3]['weight']=1 P=bipartite.weighted_projected_graph(G,[0,2]) assert_nodes_equal(P.nodes(),[0,2]) assert_edges_equal(P.edges(),[(0,2)]) P[0][2]['weight']=1 def test_star_projected_graph(self): G=nx.star_graph(3) P=bipartite.projected_graph(G,[1,2,3]) assert_nodes_equal(P.nodes(),[1,2,3]) assert_edges_equal(P.edges(),[(1,2),(1,3),(2,3)]) P=bipartite.weighted_projected_graph(G,[1,2,3]) assert_nodes_equal(P.nodes(),[1,2,3]) assert_edges_equal(P.edges(),[(1,2),(1,3),(2,3)]) P=bipartite.projected_graph(G,[0]) assert_nodes_equal(P.nodes(),[0]) assert_edges_equal(P.edges(),[]) def test_project_multigraph(self): G=nx.Graph() G.add_edge('a',1) G.add_edge('b',1) G.add_edge('a',2) G.add_edge('b',2) P=bipartite.projected_graph(G,'ab') assert_edges_equal(P.edges(),[('a','b')]) P=bipartite.weighted_projected_graph(G,'ab') assert_edges_equal(P.edges(),[('a','b')]) P=bipartite.projected_graph(G,'ab',multigraph=True) assert_edges_equal(P.edges(),[('a','b'),('a','b')]) def test_project_collaboration(self): G=nx.Graph() G.add_edge('a',1) G.add_edge('b',1) G.add_edge('b',2) G.add_edge('c',2) G.add_edge('c',3) G.add_edge('c',4) G.add_edge('b',4) P=bipartite.collaboration_weighted_projected_graph(G,'abc') 
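The expected weights in the collaboration test above follow Newman's rule: each neighbor k shared by u and v contributes 1/(deg(k)-1). A self-contained sketch over plain neighbor sets (an illustration of the formula, not NetworkX's implementation):

```python
def collaboration_weight(adj, u, v):
    """Newman collaboration weight between two nodes of one bipartite
    set: sum over shared neighbors k of 1/(deg(k)-1), skipping
    degree-1 neighbors.  `adj` maps each node to a set of neighbors."""
    shared = adj[u] & adj[v]
    return sum(1.0 / (len(adj[k]) - 1)
               for k in shared if len(adj[k]) > 1)

# The graph from test_project_collaboration:
# edges a-1, b-1, b-2, c-2, c-3, c-4, b-4
adj = {'a': {1}, 'b': {1, 2, 4}, 'c': {2, 3, 4},
       1: {'a', 'b'}, 2: {'b', 'c'}, 3: {'c'}, 4: {'b', 'c'}}
print(collaboration_weight(adj, 'a', 'b'))  # 1.0 (share node 1, deg 2)
print(collaboration_weight(adj, 'b', 'c'))  # 2.0 (share nodes 2 and 4)
```

These reproduce the `weight` values asserted for the 'a'-'b' and 'b'-'c' edges of the projection.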
assert_equal(P['a']['b']['weight'],1) assert_equal(P['b']['c']['weight'],2) def test_directed_projection(self): G=nx.DiGraph() G.add_edge('A',1) G.add_edge(1,'B') G.add_edge('A',2) G.add_edge('B',2) P=bipartite.projected_graph(G,'AB') assert_edges_equal(P.edges(),[('A','B')]) P=bipartite.weighted_projected_graph(G,'AB') assert_edges_equal(P.edges(),[('A','B')]) assert_equal(P['A']['B']['weight'],1) P=bipartite.projected_graph(G,'AB',multigraph=True) assert_edges_equal(P.edges(),[('A','B')]) G=nx.DiGraph() G.add_edge('A',1) G.add_edge(1,'B') G.add_edge('A',2) G.add_edge(2,'B') P=bipartite.projected_graph(G,'AB') assert_edges_equal(P.edges(),[('A','B')]) P=bipartite.weighted_projected_graph(G,'AB') assert_edges_equal(P.edges(),[('A','B')]) assert_equal(P['A']['B']['weight'],2) P=bipartite.projected_graph(G,'AB',multigraph=True) assert_edges_equal(P.edges(),[('A','B'),('A','B')]) class TestBipartiteWeightedProjection: def setUp(self): # Tore Opsahl's example # http://toreopsahl.com/2009/05/01/projecting-two-mode-networks-onto-weighted-one-mode-networks/ self.G=nx.Graph() self.G.add_edge('A',1) self.G.add_edge('A',2) self.G.add_edge('B',1) self.G.add_edge('B',2) self.G.add_edge('B',3) self.G.add_edge('B',4) self.G.add_edge('B',5) self.G.add_edge('C',1) self.G.add_edge('D',3) self.G.add_edge('E',4) self.G.add_edge('E',5) self.G.add_edge('E',6) self.G.add_edge('F',6) # Graph based on figure 6 from Newman (2001) self.N=nx.Graph() self.N.add_edge('A',1) self.N.add_edge('A',2) self.N.add_edge('A',3) self.N.add_edge('B',1) self.N.add_edge('B',2) self.N.add_edge('B',3) self.N.add_edge('C',1) self.N.add_edge('D',1) self.N.add_edge('E',3) def test_project_weighted_shared(self): edges=[('A','B',2), ('A','C',1), ('B','C',1), ('B','D',1), ('B','E',2), ('E','F',1)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.weighted_projected_graph(self.G,'ABCDEF') assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): 
assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) edges=[('A','B',3), ('A','E',1), ('A','C',1), ('A','D',1), ('B','E',1), ('B','C',1), ('B','D',1), ('C','D',1)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.weighted_projected_graph(self.N,'ABCDE') assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) def test_project_weighted_newman(self): edges=[('A','B',1.5), ('A','C',0.5), ('B','C',0.5), ('B','D',1), ('B','E',2), ('E','F',1)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.collaboration_weighted_projected_graph(self.G,'ABCDEF') assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) edges=[('A','B',11/6.0), ('A','E',1/2.0), ('A','C',1/3.0), ('A','D',1/3.0), ('B','E',1/2.0), ('B','C',1/3.0), ('B','D',1/3.0), ('C','D',1/3.0)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.collaboration_weighted_projected_graph(self.N,'ABCDE') assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) def test_project_weighted_ratio(self): edges=[('A','B',2/6.0), ('A','C',1/6.0), ('B','C',1/6.0), ('B','D',1/6.0), ('B','E',2/6.0), ('E','F',1/6.0)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.weighted_projected_graph(self.G, 'ABCDEF', ratio=True) assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) edges=[('A','B',3/3.0), ('A','E',1/3.0), ('A','C',1/3.0), ('A','D',1/3.0), ('B','E',1/3.0), ('B','C',1/3.0), ('B','D',1/3.0), ('C','D',1/3.0)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.weighted_projected_graph(self.N, 'ABCDE', ratio=True) assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) def 
test_project_weighted_overlap(self): edges=[('A','B',2/2.0), ('A','C',1/1.0), ('B','C',1/1.0), ('B','D',1/1.0), ('B','E',2/3.0), ('E','F',1/1.0)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.overlap_weighted_projected_graph(self.G,'ABCDEF', jaccard=False) assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) edges=[('A','B',3/3.0), ('A','E',1/1.0), ('A','C',1/1.0), ('A','D',1/1.0), ('B','E',1/1.0), ('B','C',1/1.0), ('B','D',1/1.0), ('C','D',1/1.0)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.overlap_weighted_projected_graph(self.N,'ABCDE', jaccard=False) assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) def test_project_weighted_jaccard(self): edges=[('A','B',2/5.0), ('A','C',1/2.0), ('B','C',1/5.0), ('B','D',1/5.0), ('B','E',2/6.0), ('E','F',1/3.0)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.overlap_weighted_projected_graph(self.G,'ABCDEF') assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) edges=[('A','B',3/3.0), ('A','E',1/3.0), ('A','C',1/3.0), ('A','D',1/3.0), ('B','E',1/3.0), ('B','C',1/3.0), ('B','D',1/3.0), ('C','D',1/1.0)] Panswer=nx.Graph() Panswer.add_weighted_edges_from(edges) P=bipartite.overlap_weighted_projected_graph(self.N,'ABCDE') assert_edges_equal(P.edges(),Panswer.edges()) for u,v in P.edges(): assert_equal(P[u][v]['weight'],Panswer[u][v]['weight']) def test_generic_weighted_projected_graph_simple(self): def shared(G, u, v): return len(set(G[u]) & set(G[v])) B = nx.path_graph(5) G = bipartite.generic_weighted_projected_graph(B, [0, 2, 4], weight_function=shared) assert_nodes_equal(G.nodes(), [0, 2, 4]) assert_edges_equal(G.edges(data=True), [(0, 2, {'weight': 1}), (2, 4, {'weight': 1})] ) G = bipartite.generic_weighted_projected_graph(B, [0, 2, 4]) 
assert_nodes_equal(G.nodes(), [0, 2, 4]) assert_edges_equal(G.edges(data=True), [(0, 2, {'weight': 1}), (2, 4, {'weight': 1})] ) B = nx.DiGraph() B.add_path(list(range(5))) G = bipartite.generic_weighted_projected_graph(B, [0, 2, 4]) assert_nodes_equal(G.nodes(), [0, 2, 4]) assert_edges_equal(G.edges(data=True), [(0, 2, {'weight': 1}), (2, 4, {'weight': 1})] ) def test_generic_weighted_projected_graph_custom(self): def jaccard(G, u, v): unbrs = set(G[u]) vnbrs = set(G[v]) return float(len(unbrs & vnbrs)) / len(unbrs | vnbrs) def my_weight(G, u, v, weight='weight'): w = 0 for nbr in set(G[u]) & set(G[v]): w += G.edge[u][nbr].get(weight, 1) + G.edge[v][nbr].get(weight, 1) return w B = nx.bipartite.complete_bipartite_graph(2, 2) for i,(u,v) in enumerate(B.edges()): B.edge[u][v]['weight'] = i + 1 G = bipartite.generic_weighted_projected_graph(B, [0, 1], weight_function=jaccard) assert_edges_equal(G.edges(data=True), [(0, 1, {'weight': 1.0})]) G = bipartite.generic_weighted_projected_graph(B, [0, 1], weight_function=my_weight) assert_edges_equal(G.edges(data=True), [(0, 1, {'weight': 10})]) G = bipartite.generic_weighted_projected_graph(B, [0, 1]) assert_edges_equal(G.edges(data=True), [(0, 1, {'weight': 2})]) networkx-1.11/networkx/algorithms/bipartite/tests/test_matrix.py0000644000175000017500000000617712637544450025314 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx as nx from networkx.algorithms import bipartite from networkx.testing.utils import assert_edges_equal class TestBiadjacencyMatrix: @classmethod def setupClass(cls): global np, sp, sparse, np_assert_equal try: import numpy as np import scipy as sp import scipy.sparse as sparse np_assert_equal=np.testing.assert_equal except ImportError: raise SkipTest('SciPy sparse library not available.') def test_biadjacency_matrix_weight(self): G=nx.path_graph(5) G.add_edge(0,1,weight=2,other=4) X=[1,3] Y=[0,2,4] M = 
bipartite.biadjacency_matrix(G, X, weight='weight')
        assert_equal(M[0, 0], 2)
        M = bipartite.biadjacency_matrix(G, X, weight='other')
        assert_equal(M[0, 0], 4)

    def test_biadjacency_matrix(self):
        tops = [2, 5, 10]
        bots = [5, 10, 15]
        for i in range(len(tops)):
            G = bipartite.random_graph(tops[i], bots[i], 0.2)
            top = [n for n, d in G.nodes(data=True) if d['bipartite'] == 0]
            M = bipartite.biadjacency_matrix(G, top)
            assert_equal(M.shape[0], tops[i])
            assert_equal(M.shape[1], bots[i])

    def test_biadjacency_matrix_order(self):
        G = nx.path_graph(5)
        G.add_edge(0, 1, weight=2)
        X = [3, 1]
        Y = [4, 2, 0]
        M = bipartite.biadjacency_matrix(G, X, Y, weight='weight')
        assert_equal(M[1, 2], 2)

    @raises(nx.NetworkXError)
    def test_null_fail(self):
        bipartite.biadjacency_matrix(nx.Graph(), [])

    @raises(nx.NetworkXError)
    def test_empty_fail(self):
        bipartite.biadjacency_matrix(nx.Graph([(1, 0)]), [])

    @raises(nx.NetworkXError)
    def test_duplicate_row_fail(self):
        bipartite.biadjacency_matrix(nx.Graph([(1, 0)]), [1, 1])

    @raises(nx.NetworkXError)
    def test_duplicate_col_fail(self):
        bipartite.biadjacency_matrix(nx.Graph([(1, 0)]), [0], [1, 1])

    @raises(nx.NetworkXError)
    def test_format_keyword_fail(self):
        bipartite.biadjacency_matrix(nx.Graph([(1, 0)]), [0], format='foo')

    def test_from_biadjacency_roundtrip(self):
        B1 = nx.path_graph(5)
        M = bipartite.biadjacency_matrix(B1, [0, 2, 4])
        B2 = bipartite.from_biadjacency_matrix(M)
        assert_true(nx.is_isomorphic(B1, B2))

    def test_from_biadjacency_weight(self):
        M = sparse.csc_matrix([[1, 2], [0, 3]])
        B = bipartite.from_biadjacency_matrix(M)
        assert_edges_equal(B.edges(), [(0, 2), (0, 3), (1, 3)])
        B = bipartite.from_biadjacency_matrix(M, edge_attribute='weight')
        e = [(0, 2, {'weight': 1}), (0, 3, {'weight': 2}),
             (1, 3, {'weight': 3})]
        assert_edges_equal(B.edges(data=True), e)

    def test_from_biadjacency_multigraph(self):
        M = sparse.csc_matrix([[1, 2], [0, 3]])
        B = bipartite.from_biadjacency_matrix(M,
create_using=nx.MultiGraph()) assert_edges_equal(B.edges(),[(0,2),(0,3),(0,3),(1,3),(1,3),(1,3)]) networkx-1.11/networkx/algorithms/bipartite/tests/test_cluster.py0000644000175000017500000000522512637544450025462 0ustar aricaric00000000000000import networkx as nx from nose.tools import * from networkx.algorithms.bipartite.cluster import cc_dot,cc_min,cc_max import networkx.algorithms.bipartite as bipartite def test_pairwise_bipartite_cc_functions(): # Test functions for different kinds of bipartite clustering coefficients # between pairs of nodes using 3 example graphs from figure 5 p. 40 # Latapy et al (2008) G1 = nx.Graph([(0,2),(0,3),(0,4),(0,5),(0,6),(1,5),(1,6),(1,7)]) G2 = nx.Graph([(0,2),(0,3),(0,4),(1,3),(1,4),(1,5)]) G3 = nx.Graph([(0,2),(0,3),(0,4),(0,5),(0,6),(1,5),(1,6),(1,7),(1,8),(1,9)]) result = {0:[1/3.0, 2/3.0, 2/5.0], 1:[1/2.0, 2/3.0, 2/3.0], 2:[2/8.0, 2/5.0, 2/5.0]} for i, G in enumerate([G1, G2, G3]): assert(bipartite.is_bipartite(G)) assert(cc_dot(set(G[0]), set(G[1])) == result[i][0]) assert(cc_min(set(G[0]), set(G[1])) == result[i][1]) assert(cc_max(set(G[0]), set(G[1])) == result[i][2]) def test_star_graph(): G=nx.star_graph(3) # all modes are the same answer={0:0,1:1,2:1,3:1} assert_equal(bipartite.clustering(G,mode='dot'),answer) assert_equal(bipartite.clustering(G,mode='min'),answer) assert_equal(bipartite.clustering(G,mode='max'),answer) @raises(nx.NetworkXError) def test_not_bipartite(): bipartite.clustering(nx.complete_graph(4)) @raises(nx.NetworkXError) def test_bad_mode(): bipartite.clustering(nx.path_graph(4),mode='foo') def test_path_graph(): G=nx.path_graph(4) answer={0:0.5,1:0.5,2:0.5,3:0.5} assert_equal(bipartite.clustering(G,mode='dot'),answer) assert_equal(bipartite.clustering(G,mode='max'),answer) answer={0:1,1:1,2:1,3:1} assert_equal(bipartite.clustering(G,mode='min'),answer) def test_average_path_graph(): G=nx.path_graph(4) assert_equal(bipartite.average_clustering(G,mode='dot'),0.5) 
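The three pairwise clustering coefficients exercised by these tests differ only in the denominator applied to the shared neighborhood of a node pair. A sketch of the Latapy et al. (2008) definitions over plain sets (illustrative only, not the library's `cc_dot`/`cc_min`/`cc_max` code):

```python
def cc_dot(nu, nv):
    # |N(u) & N(v)| / |N(u) | N(v)|
    return len(nu & nv) / float(len(nu | nv))

def cc_min(nu, nv):
    # |N(u) & N(v)| / min(|N(u)|, |N(v)|)
    return len(nu & nv) / float(min(len(nu), len(nv)))

def cc_max(nu, nv):
    # |N(u) & N(v)| / max(|N(u)|, |N(v)|)
    return len(nu & nv) / float(max(len(nu), len(nv)))

# G1 from test_pairwise_bipartite_cc_functions:
# N(0) = {2,3,4,5,6}, N(1) = {5,6,7}
n0, n1 = {2, 3, 4, 5, 6}, {5, 6, 7}
print(cc_dot(n0, n1))  # 2/6
print(cc_min(n0, n1))  # 2/3
print(cc_max(n0, n1))  # 2/5
```

These match the first row of the `result` table in the test (1/3, 2/3, 2/5 for G1).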
assert_equal(bipartite.average_clustering(G,mode='max'),0.5) assert_equal(bipartite.average_clustering(G,mode='min'),1) def test_ra_clustering_davis(): G = nx.davis_southern_women_graph() cc4 = round(bipartite.robins_alexander_clustering(G), 3) assert_equal(cc4, 0.468) def test_ra_clustering_square(): G = nx.path_graph(4) G.add_edge(0, 3) assert_equal(bipartite.robins_alexander_clustering(G), 1.0) def test_ra_clustering_zero(): G = nx.Graph() assert_equal(bipartite.robins_alexander_clustering(G), 0) G.add_nodes_from(range(4)) assert_equal(bipartite.robins_alexander_clustering(G), 0) G.add_edges_from([(0,1),(2,3),(3,4)]) assert_equal(bipartite.robins_alexander_clustering(G), 0) G.add_edge(1,2) assert_equal(bipartite.robins_alexander_clustering(G), 0) networkx-1.11/networkx/algorithms/bipartite/tests/test_generators.py0000644000175000017500000001577112637544500026155 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from networkx import * from networkx.algorithms.bipartite.generators import * """Generators - Bipartite ---------------------- """ class TestGeneratorsBipartite(): def test_complete_bipartite_graph(self): G=complete_bipartite_graph(0,0) assert_true(is_isomorphic( G, null_graph() )) for i in [1, 5]: G=complete_bipartite_graph(i,0) assert_true(is_isomorphic( G, empty_graph(i) )) G=complete_bipartite_graph(0,i) assert_true(is_isomorphic( G, empty_graph(i) )) G=complete_bipartite_graph(2,2) assert_true(is_isomorphic( G, cycle_graph(4) )) G=complete_bipartite_graph(1,5) assert_true(is_isomorphic( G, star_graph(5) )) G=complete_bipartite_graph(5,1) assert_true(is_isomorphic( G, star_graph(5) )) # complete_bipartite_graph(m1,m2) is a connected graph with # m1+m2 nodes and m1*m2 edges for m1, m2 in [(5, 11), (7, 3)]: G=complete_bipartite_graph(m1,m2) assert_equal(number_of_nodes(G), m1 + m2) assert_equal(number_of_edges(G), m1 * m2) assert_raises(networkx.exception.NetworkXError, complete_bipartite_graph, 7, 3, create_using=DiGraph()) 
mG=complete_bipartite_graph(7, 3, create_using=MultiGraph()) assert_equal(mG.edges(), G.edges()) def test_configuration_model(self): aseq=[3,3,3,3] bseq=[2,2,2,2,2] assert_raises(networkx.exception.NetworkXError, configuration_model, aseq, bseq) aseq=[3,3,3,3] bseq=[2,2,2,2,2,2] G=configuration_model(aseq,bseq) assert_equal(sorted(G.degree().values()), [2, 2, 2, 2, 2, 2, 3, 3, 3, 3]) aseq=[2,2,2,2,2,2] bseq=[3,3,3,3] G=configuration_model(aseq,bseq) assert_equal(sorted(G.degree().values()), [2, 2, 2, 2, 2, 2, 3, 3, 3, 3]) aseq=[2,2,2,1,1,1] bseq=[3,3,3] G=configuration_model(aseq,bseq) assert_equal(sorted(G.degree().values()), [1, 1, 1, 2, 2, 2, 3, 3, 3]) GU=project(Graph(G),range(len(aseq))) assert_equal(GU.number_of_nodes(), 6) GD=project(Graph(G),range(len(aseq),len(aseq)+len(bseq))) assert_equal(GD.number_of_nodes(), 3) assert_raises(networkx.exception.NetworkXError, configuration_model, aseq, bseq, create_using=DiGraph()) def test_havel_hakimi_graph(self): aseq=[3,3,3,3] bseq=[2,2,2,2,2] assert_raises(networkx.exception.NetworkXError, havel_hakimi_graph, aseq, bseq) bseq=[2,2,2,2,2,2] G=havel_hakimi_graph(aseq,bseq) assert_equal(sorted(G.degree().values()), [2, 2, 2, 2, 2, 2, 3, 3, 3, 3]) aseq=[2,2,2,2,2,2] bseq=[3,3,3,3] G=havel_hakimi_graph(aseq,bseq) assert_equal(sorted(G.degree().values()), [2, 2, 2, 2, 2, 2, 3, 3, 3, 3]) GU=project(Graph(G),range(len(aseq))) assert_equal(GU.number_of_nodes(), 6) GD=project(Graph(G),range(len(aseq),len(aseq)+len(bseq))) assert_equal(GD.number_of_nodes(), 4) assert_raises(networkx.exception.NetworkXError, havel_hakimi_graph, aseq, bseq, create_using=DiGraph()) def test_reverse_havel_hakimi_graph(self): aseq=[3,3,3,3] bseq=[2,2,2,2,2] assert_raises(networkx.exception.NetworkXError, reverse_havel_hakimi_graph, aseq, bseq) bseq=[2,2,2,2,2,2] G=reverse_havel_hakimi_graph(aseq,bseq) assert_equal(sorted(G.degree().values()), [2, 2, 2, 2, 2, 2, 3, 3, 3, 3]) aseq=[2,2,2,2,2,2] bseq=[3,3,3,3] G=reverse_havel_hakimi_graph(aseq,bseq) 
assert_equal(sorted(G.degree().values()),
                     [2, 2, 2, 2, 2, 2, 3, 3, 3, 3])
        aseq = [2, 2, 2, 1, 1, 1]
        bseq = [3, 3, 3]
        G = reverse_havel_hakimi_graph(aseq, bseq)
        assert_equal(sorted(G.degree().values()),
                     [1, 1, 1, 2, 2, 2, 3, 3, 3])
        GU = project(Graph(G), range(len(aseq)))
        assert_equal(GU.number_of_nodes(), 6)
        GD = project(Graph(G), range(len(aseq), len(aseq) + len(bseq)))
        assert_equal(GD.number_of_nodes(), 3)
        assert_raises(networkx.exception.NetworkXError,
                      reverse_havel_hakimi_graph, aseq, bseq,
                      create_using=DiGraph())

    def test_alternating_havel_hakimi_graph(self):
        aseq = [3, 3, 3, 3]
        bseq = [2, 2, 2, 2, 2]
        assert_raises(networkx.exception.NetworkXError,
                      alternating_havel_hakimi_graph, aseq, bseq)
        bseq = [2, 2, 2, 2, 2, 2]
        G = alternating_havel_hakimi_graph(aseq, bseq)
        assert_equal(sorted(G.degree().values()),
                     [2, 2, 2, 2, 2, 2, 3, 3, 3, 3])
        aseq = [2, 2, 2, 2, 2, 2]
        bseq = [3, 3, 3, 3]
        G = alternating_havel_hakimi_graph(aseq, bseq)
        assert_equal(sorted(G.degree().values()),
                     [2, 2, 2, 2, 2, 2, 3, 3, 3, 3])
        aseq = [2, 2, 2, 1, 1, 1]
        bseq = [3, 3, 3]
        G = alternating_havel_hakimi_graph(aseq, bseq)
        assert_equal(sorted(G.degree().values()),
                     [1, 1, 1, 2, 2, 2, 3, 3, 3])
        GU = project(Graph(G), range(len(aseq)))
        assert_equal(GU.number_of_nodes(), 6)
        GD = project(Graph(G), range(len(aseq), len(aseq) + len(bseq)))
        assert_equal(GD.number_of_nodes(), 3)
        assert_raises(networkx.exception.NetworkXError,
                      alternating_havel_hakimi_graph, aseq, bseq,
                      create_using=DiGraph())

    def test_preferential_attachment(self):
        aseq = [3, 2, 1, 1]
        G = preferential_attachment_graph(aseq, 0.5)
        assert_raises(networkx.exception.NetworkXError,
                      preferential_attachment_graph, aseq, 0.5,
                      create_using=DiGraph())

    def test_random_graph(self):
        n = 10
        m = 20
        G = random_graph(n, m, 0.9)
        assert_equal(len(G), 30)
        assert_true(is_bipartite(G))
        X, Y = nx.algorithms.bipartite.sets(G)
        assert_equal(set(range(n)), X)
        assert_equal(set(range(n, n + m)), Y)

    def test_random_digraph(self):
        n = 10
        m = 20
        G = random_graph(n, m, 0.9, directed=True)
        assert_equal(len(G), 30)
        assert_true(is_bipartite(G))
        X, Y = nx.algorithms.bipartite.sets(G)
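These generator tests validate their output with `bipartite.sets`, which recovers the two node sets by two-coloring the graph. A breadth-first-search sketch of that coloring over a plain adjacency dict (illustrative only — `two_color` is a hypothetical name, not the NetworkX code):

```python
from collections import deque

def two_color(adj):
    """Two-color a graph by BFS; raise ValueError on an odd cycle
    (i.e. when the graph is not bipartite)."""
    color = {}
    for start in adj:
        if start in color:
            continue                      # new connected component
        color[start] = 0
        q = deque([start])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in color:
                    color[w] = 1 - color[u]  # opposite side
                    q.append(w)
                elif color[w] == color[u]:
                    raise ValueError("graph is not bipartite")
    return color

# Path 0-1-2-3 splits into {0, 2} and {1, 3}.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
c = two_color(path)
print(sorted(n for n in c if c[n] == 0))  # [0, 2]
```

The two color classes are exactly the X and Y sets the assertions above compare against `set(range(n))` and `set(range(n, n + m))`.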
assert_equal(set(range(n)),X) assert_equal(set(range(n,n+m)),Y) def test_gnmk_random_graph(self): n = 10 m = 20 edges = 100 G = gnmk_random_graph(n, m, edges) assert_equal(len(G),30) assert_true(is_bipartite(G)) X,Y=nx.algorithms.bipartite.sets(G) print(X) assert_equal(set(range(n)),X) assert_equal(set(range(n,n+m)),Y) assert_equal(edges, len(G.edges())) networkx-1.11/networkx/algorithms/bipartite/tests/test_centrality.py0000644000175000017500000001375712637544450026170 0ustar aricaric00000000000000from nose.tools import * import networkx as nx from networkx.algorithms import bipartite class TestBipartiteCentrality(object): def setUp(self): self.P4 = nx.path_graph(4) self.K3 = nx.complete_bipartite_graph(3,3) self.C4 = nx.cycle_graph(4) self.davis = nx.davis_southern_women_graph() self.top_nodes = [n for n,d in self.davis.nodes(data=True) if d['bipartite']==0] def test_degree_centrality(self): d = bipartite.degree_centrality(self.P4, [1,3]) answer = {0: 0.5, 1: 1.0, 2: 1.0, 3: 0.5} assert_equal(d, answer) d = bipartite.degree_centrality(self.K3, [0,1,2]) answer = {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0, 4: 1.0, 5: 1.0} assert_equal(d, answer) d = bipartite.degree_centrality(self.C4, [0,2]) answer = {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0} assert_equal(d,answer) def test_betweenness_centrality(self): c = bipartite.betweenness_centrality(self.P4, [1,3]) answer = {0: 0.0, 1: 1.0, 2: 1.0, 3: 0.0} assert_equal(c, answer) c = bipartite.betweenness_centrality(self.K3, [0,1,2]) answer = {0: 0.125, 1: 0.125, 2: 0.125, 3: 0.125, 4: 0.125, 5: 0.125} assert_equal(c, answer) c = bipartite.betweenness_centrality(self.C4, [0,2]) answer = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25} assert_equal(c, answer) def test_closeness_centrality(self): c = bipartite.closeness_centrality(self.P4, [1,3]) answer = {0: 2.0/3, 1: 1.0, 2: 1.0, 3:2.0/3} assert_equal(c, answer) c = bipartite.closeness_centrality(self.K3, [0,1,2]) answer = {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0, 4: 1.0, 5: 1.0} assert_equal(c, answer) c = 
bipartite.closeness_centrality(self.C4, [0,2]) answer = {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0} assert_equal(c, answer) G = nx.Graph() G.add_node(0) G.add_node(1) c = bipartite.closeness_centrality(G, [0]) assert_equal(c, {1: 0.0}) c = bipartite.closeness_centrality(G, [1]) assert_equal(c, {1: 0.0}) def test_davis_degree_centrality(self): G = self.davis deg = bipartite.degree_centrality(G, self.top_nodes) answer = {'E8':0.78, 'E9':0.67, 'E7':0.56, 'Nora Fayette':0.57, 'Evelyn Jefferson':0.57, 'Theresa Anderson':0.57, 'E6':0.44, 'Sylvia Avondale':0.50, 'Laura Mandeville':0.50, 'Brenda Rogers':0.50, 'Katherina Rogers':0.43, 'E5':0.44, 'Helen Lloyd':0.36, 'E3':0.33, 'Ruth DeSand':0.29, 'Verne Sanderson':0.29, 'E12':0.33, 'Myra Liddel':0.29, 'E11':0.22, 'Eleanor Nye':0.29, 'Frances Anderson':0.29, 'Pearl Oglethorpe':0.21, 'E4':0.22, 'Charlotte McDowd':0.29, 'E10':0.28, 'Olivia Carleton':0.14, 'Flora Price':0.14, 'E2':0.17, 'E1':0.17, 'Dorothy Murchison':0.14, 'E13':0.17, 'E14':0.17} for node, value in answer.items(): assert_almost_equal(value, deg[node], places=2) def test_davis_betweenness_centrality(self): G = self.davis bet = bipartite.betweenness_centrality(G, self.top_nodes) answer = {'E8':0.24, 'E9':0.23, 'E7':0.13, 'Nora Fayette':0.11, 'Evelyn Jefferson':0.10, 'Theresa Anderson':0.09, 'E6':0.07, 'Sylvia Avondale':0.07, 'Laura Mandeville':0.05, 'Brenda Rogers':0.05, 'Katherina Rogers':0.05, 'E5':0.04, 'Helen Lloyd':0.04, 'E3':0.02, 'Ruth DeSand':0.02, 'Verne Sanderson':0.02, 'E12':0.02, 'Myra Liddel':0.02, 'E11':0.02, 'Eleanor Nye':0.01, 'Frances Anderson':0.01, 'Pearl Oglethorpe':0.01, 'E4':0.01, 'Charlotte McDowd':0.01, 'E10':0.01, 'Olivia Carleton':0.01, 'Flora Price':0.01, 'E2':0.00, 'E1':0.00, 'Dorothy Murchison':0.00, 'E13':0.00, 'E14':0.00} for node, value in answer.items(): assert_almost_equal(value, bet[node], places=2) def test_davis_closeness_centrality(self): G = self.davis clos = bipartite.closeness_centrality(G, self.top_nodes) answer = {'E8':0.85, 
'E9':0.79, 'E7':0.73, 'Nora Fayette':0.80, 'Evelyn Jefferson':0.80, 'Theresa Anderson':0.80, 'E6':0.69, 'Sylvia Avondale':0.77, 'Laura Mandeville':0.73, 'Brenda Rogers':0.73, 'Katherina Rogers':0.73, 'E5':0.59, 'Helen Lloyd':0.73, 'E3':0.56, 'Ruth DeSand':0.71, 'Verne Sanderson':0.71, 'E12':0.56, 'Myra Liddel':0.69, 'E11':0.54, 'Eleanor Nye':0.67, 'Frances Anderson':0.67, 'Pearl Oglethorpe':0.67, 'E4':0.54, 'Charlotte McDowd':0.60, 'E10':0.55, 'Olivia Carleton':0.59, 'Flora Price':0.59, 'E2':0.52, 'E1':0.52, 'Dorothy Murchison':0.65, 'E13':0.52, 'E14':0.52} for node, value in answer.items(): assert_almost_equal(value, clos[node], places=2) networkx-1.11/networkx/algorithms/bipartite/tests/test_matching.py0000644000175000017500000000641212637544450025572 0ustar aricaric00000000000000# test_matching.py - unit tests for bipartite matching algorithms # # Copyright 2015 Jeffrey Finkelstein . # # This file is part of NetworkX. # # NetworkX is distributed under a BSD license; see LICENSE.txt for more # information. """Unit tests for the :mod:`networkx.algorithms.bipartite.matching` module.""" import itertools import networkx as nx from networkx.algorithms.bipartite.matching import eppstein_matching from networkx.algorithms.bipartite.matching import hopcroft_karp_matching from networkx.algorithms.bipartite.matching import maximum_matching from networkx.algorithms.bipartite.matching import to_vertex_cover class TestMatching(): """Tests for bipartite matching algorithms.""" def setup(self): """Creates a bipartite graph for use in testing matching algorithms. The bipartite graph has a maximum cardinality matching that leaves vertex 1 and vertex 10 unmatched. The first six numbers are the left vertices and the next six numbers are the right vertices. 
""" edges = [(0, 7), (0, 8), (2, 6), (2, 9), (3, 8), (4, 8), (4, 9), (5, 11)] self.graph = nx.Graph() self.graph.add_nodes_from(range(12)) self.graph.add_edges_from(edges) def check_match(self, matching): """Asserts that the matching is what we expect from the bipartite graph constructed in the :meth:`setup` fixture. """ # For the sake of brevity, rename `matching` to `M`. M = matching matched_vertices = frozenset(itertools.chain(*M.items())) # Assert that the maximum number of vertices (10) is matched. assert matched_vertices == frozenset(range(12)) - {1, 10} # Assert that no vertex appears in two edges, or in other words, that # the matching (u, v) and (v, u) both appear in the matching # dictionary. assert all(u == M[M[u]] for u in range(12) if u in M) def check_vertex_cover(self, vertices): """Asserts that the given set of vertices is the vertex cover we expected from the bipartite graph constructed in the :meth:`setup` fixture. """ # By Konig's theorem, the number of edges in a maximum matching equals # the number of vertices in a minimum vertex cover. assert len(vertices) == 5 # Assert that the set is truly a vertex cover. for (u, v) in self.graph.edges(): assert u in vertices or v in vertices # TODO Assert that the vertices are the correct ones. def test_eppstein_matching(self): """Tests that David Eppstein's implementation of the Hopcroft--Karp algorithm produces a maximum cardinality matching. """ self.check_match(eppstein_matching(self.graph)) def test_hopcroft_karp_matching(self): """Tests that the Hopcroft--Karp algorithm produces a maximum cardinality matching in a bipartite graph. 
""" self.check_match(hopcroft_karp_matching(self.graph)) def test_to_vertex_cover(self): """Test for converting a maximum matching to a minimum vertex cover.""" matching = maximum_matching(self.graph) vertex_cover = to_vertex_cover(self.graph, matching) self.check_vertex_cover(vertex_cover) networkx-1.11/networkx/algorithms/bipartite/tests/test_edgelist.py0000644000175000017500000001563512637544500025603 0ustar aricaric00000000000000""" Unit tests for bipartite edgelists. """ from nose.tools import assert_equal, assert_raises, assert_not_equal, raises import io import tempfile import os import networkx as nx from networkx.testing import (assert_edges_equal, assert_nodes_equal, assert_graphs_equal) from networkx.algorithms import bipartite class TestEdgelist: def setUp(self): self.G=nx.Graph(name="test") e=[('a','b'),('b','c'),('c','d'),('d','e'),('e','f'),('a','f')] self.G.add_edges_from(e) self.G.add_nodes_from(['a','c','e'],bipartite=0) self.G.add_nodes_from(['b','d','f'],bipartite=1) self.G.add_node('g',bipartite=0) self.DG=nx.DiGraph(self.G) self.MG=nx.MultiGraph() self.MG.add_edges_from([(1,2),(1,2),(1,2)]) self.MG.add_node(1,bipartite=0) self.MG.add_node(2,bipartite=1) def test_read_edgelist_1(self): s = b"""\ # comment line 1 2 # comment line 2 3 """ bytesIO = io.BytesIO(s) G = bipartite.read_edgelist(bytesIO,nodetype=int) assert_edges_equal(G.edges(),[(1,2),(2,3)]) def test_read_edgelist_3(self): s = b"""\ # comment line 1 2 {'weight':2.0} # comment line 2 3 {'weight':3.0} """ bytesIO = io.BytesIO(s) G = bipartite.read_edgelist(bytesIO,nodetype=int,data=False) assert_edges_equal(G.edges(),[(1,2),(2,3)]) bytesIO = io.BytesIO(s) G = bipartite.read_edgelist(bytesIO,nodetype=int,data=True) assert_edges_equal(G.edges(data=True), [(1,2,{'weight':2.0}),(2,3,{'weight':3.0})]) def test_write_edgelist_1(self): fh=io.BytesIO() G=nx.Graph() G.add_edges_from([(1,2),(2,3)]) G.add_node(1,bipartite=0) G.add_node(2,bipartite=1) G.add_node(3,bipartite=0) 
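The expected byte strings in the write tests below encode one edge per line in three shapes, depending on the `data` argument. A sketch of that line format as the assertions imply it (a hypothetical `format_edge_line` helper, not the library's writer):

```python
def format_edge_line(u, v, d, data):
    """One edge as one text line:
    data=False -> 'u v'
    data=True  -> 'u v {attr dict repr}'
    data=[keys]-> 'u v val1 val2 ...' (values only, in key order)"""
    if data is False:
        return '%s %s' % (u, v)
    if data is True:
        return '%s %s %r' % (u, v, d)
    return ' '.join([str(u), str(v)] + [str(d[k]) for k in data])

print(format_edge_line(1, 2, {'weight': 2.0}, True))
# 1 2 {'weight': 2.0}
print(format_edge_line(1, 2, {'weight': 2.0}, ['weight']))
# 1 2 2.0
```

These correspond to the `b"1 2\n..."`, `b"1 2 {'weight': 2.0}\n..."`, and `b"1 2 2.0\n..."` outputs asserted in test_write_edgelist_1 through _4.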
bipartite.write_edgelist(G,fh,data=False) fh.seek(0) assert_equal(fh.read(),b"1 2\n3 2\n") def test_write_edgelist_2(self): fh=io.BytesIO() G=nx.Graph() G.add_edges_from([(1,2),(2,3)]) G.add_node(1,bipartite=0) G.add_node(2,bipartite=1) G.add_node(3,bipartite=0) bipartite.write_edgelist(G,fh,data=True) fh.seek(0) assert_equal(fh.read(),b"1 2 {}\n3 2 {}\n") def test_write_edgelist_3(self): fh=io.BytesIO() G=nx.Graph() G.add_edge(1,2,weight=2.0) G.add_edge(2,3,weight=3.0) G.add_node(1,bipartite=0) G.add_node(2,bipartite=1) G.add_node(3,bipartite=0) bipartite.write_edgelist(G,fh,data=True) fh.seek(0) assert_equal(fh.read(),b"1 2 {'weight': 2.0}\n3 2 {'weight': 3.0}\n") def test_write_edgelist_4(self): fh=io.BytesIO() G=nx.Graph() G.add_edge(1,2,weight=2.0) G.add_edge(2,3,weight=3.0) G.add_node(1,bipartite=0) G.add_node(2,bipartite=1) G.add_node(3,bipartite=0) bipartite.write_edgelist(G,fh,data=[('weight')]) fh.seek(0) assert_equal(fh.read(),b"1 2 2.0\n3 2 3.0\n") def test_unicode(self): G = nx.Graph() try: # Python 3.x name1 = chr(2344) + chr(123) + chr(6543) name2 = chr(5543) + chr(1543) + chr(324) except ValueError: # Python 2.6+ name1 = unichr(2344) + unichr(123) + unichr(6543) name2 = unichr(5543) + unichr(1543) + unichr(324) G.add_edge(name1, 'Radiohead', attr_dict={name2: 3}) G.add_node(name1,bipartite=0) G.add_node('Radiohead',bipartite=1) fd, fname = tempfile.mkstemp() bipartite.write_edgelist(G, fname) H = bipartite.read_edgelist(fname) assert_graphs_equal(G, H) os.close(fd) os.unlink(fname) def test_latin1_error(self): G = nx.Graph() try: # Python 3.x name1 = chr(2344) + chr(123) + chr(6543) name2 = chr(5543) + chr(1543) + chr(324) except ValueError: # Python 2.6+ name1 = unichr(2344) + unichr(123) + unichr(6543) name2 = unichr(5543) + unichr(1543) + unichr(324) G.add_edge(name1, 'Radiohead', attr_dict={name2: 3}) G.add_node(name1,bipartite=0) G.add_node('Radiohead',bipartite=1) fd, fname = tempfile.mkstemp() assert_raises(UnicodeEncodeError, 
bipartite.write_edgelist, G, fname, encoding = 'latin-1') os.close(fd) os.unlink(fname) def test_latin1(self): G = nx.Graph() try: # Python 3.x blurb = chr(1245) # just to trigger the exception name1 = 'Bj' + chr(246) + 'rk' name2 = chr(220) + 'ber' except ValueError: # Python 2.6+ name1 = 'Bj' + unichr(246) + 'rk' name2 = unichr(220) + 'ber' G.add_edge(name1, 'Radiohead', attr_dict={name2: 3}) G.add_node(name1,bipartite=0) G.add_node('Radiohead',bipartite=1) fd, fname = tempfile.mkstemp() bipartite.write_edgelist(G, fname, encoding = 'latin-1') H = bipartite.read_edgelist(fname, encoding = 'latin-1') assert_graphs_equal(G, H) os.close(fd) os.unlink(fname) def test_edgelist_graph(self): G=self.G (fd,fname)=tempfile.mkstemp() bipartite.write_edgelist(G,fname) H=bipartite.read_edgelist(fname) H2=bipartite.read_edgelist(fname) assert_not_equal(H,H2) # they should be different graphs G.remove_node('g') # isolated nodes are not written in edgelist assert_nodes_equal(H.nodes(),G.nodes()) assert_edges_equal(H.edges(),G.edges()) os.close(fd) os.unlink(fname) def test_edgelist_integers(self): G=nx.convert_node_labels_to_integers(self.G) (fd,fname)=tempfile.mkstemp() bipartite.write_edgelist(G,fname) H=bipartite.read_edgelist(fname,nodetype=int) # isolated nodes are not written in edgelist G.remove_nodes_from(nx.isolates(G)) assert_nodes_equal(H.nodes(),G.nodes()) assert_edges_equal(H.edges(),G.edges()) os.close(fd) os.unlink(fname) def test_edgelist_multigraph(self): G=self.MG (fd,fname)=tempfile.mkstemp() bipartite.write_edgelist(G,fname) H=bipartite.read_edgelist(fname,nodetype=int,create_using=nx.MultiGraph()) H2=bipartite.read_edgelist(fname,nodetype=int,create_using=nx.MultiGraph()) assert_not_equal(H,H2) # they should be different graphs assert_nodes_equal(H.nodes(),G.nodes()) assert_edges_equal(H.edges(),G.edges()) os.close(fd) os.unlink(fname) @raises(nx.NetworkXNotImplemented) def test_digraph_fail(self): bytesIO = io.BytesIO() 
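Several of the round-trip tests above delete isolated nodes before comparing (`G.remove_node('g')`, `G.remove_nodes_from(nx.isolates(G))`). The reason is inherent to the format: an edge list stores edges only, so a degree-zero node cannot survive a write/read cycle. A tiny illustration of the information loss:

```python
edges = [('a', 'b'), ('b', 'c')]
nodes = {'a', 'b', 'c', 'g'}          # 'g' is isolated

serialized = "".join("%s %s\n" % e for e in edges)
recovered = set()
for line in serialized.splitlines():
    u, v = line.split()
    recovered.update((u, v))

lost = nodes - recovered              # nodes the edgelist format cannot represent
```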
bipartite.write_edgelist(nx.DiGraph(),bytesIO) @raises(AttributeError) def test_attribute_fail(self): G = nx.path_graph(4) bytesIO = io.BytesIO() bipartite.write_edgelist(G,bytesIO) networkx-1.11/networkx/algorithms/bipartite/tests/test_basic.py0000644000175000017500000000752212637544500025060 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest from nose.plugins.attrib import attr import networkx as nx from networkx.algorithms import bipartite class TestBipartiteBasic: def test_is_bipartite(self): assert_true(bipartite.is_bipartite(nx.path_graph(4))) assert_true(bipartite.is_bipartite(nx.DiGraph([(1,0)]))) assert_false(bipartite.is_bipartite(nx.complete_graph(3))) def test_bipartite_color(self): G=nx.path_graph(4) c=bipartite.color(G) assert_equal(c,{0: 1, 1: 0, 2: 1, 3: 0}) @raises(nx.NetworkXError) def test_not_bipartite_color(self): c=bipartite.color(nx.complete_graph(4)) def test_bipartite_directed(self): G = bipartite.random_graph(10, 10, 0.1, directed=True) assert_true(bipartite.is_bipartite(G)) def test_bipartite_sets(self): G=nx.path_graph(4) X,Y=bipartite.sets(G) assert_equal(X,set([0,2])) assert_equal(Y,set([1,3])) def test_is_bipartite_node_set(self): G=nx.path_graph(4) assert_true(bipartite.is_bipartite_node_set(G,[0,2])) assert_true(bipartite.is_bipartite_node_set(G,[1,3])) assert_false(bipartite.is_bipartite_node_set(G,[1,2])) G.add_path([10,20]) assert_true(bipartite.is_bipartite_node_set(G,[0,2,10])) assert_true(bipartite.is_bipartite_node_set(G,[0,2,20])) assert_true(bipartite.is_bipartite_node_set(G,[1,3,10])) assert_true(bipartite.is_bipartite_node_set(G,[1,3,20])) def test_bipartite_density(self): G=nx.path_graph(5) X,Y=bipartite.sets(G) density=float(len(G.edges()))/(len(X)*len(Y)) assert_equal(bipartite.density(G,X),density) D = nx.DiGraph(G.edges()) assert_equal(bipartite.density(D,X),density/2.0) assert_equal(bipartite.density(nx.Graph(),{}),0.0) def test_bipartite_degrees(self): 
G=nx.path_graph(5) X=set([1,3]) Y=set([0,2,4]) u,d=bipartite.degrees(G,Y) assert_equal(u,{1:2,3:2}) assert_equal(d,{0:1,2:2,4:1}) def test_bipartite_weighted_degrees(self): G=nx.path_graph(5) G.add_edge(0,1,weight=0.1,other=0.2) X=set([1,3]) Y=set([0,2,4]) u,d=bipartite.degrees(G,Y,weight='weight') assert_equal(u,{1:1.1,3:2}) assert_equal(d,{0:0.1,2:2,4:1}) u,d=bipartite.degrees(G,Y,weight='other') assert_equal(u,{1:1.2,3:2}) assert_equal(d,{0:0.2,2:2,4:1}) @attr('numpy') def test_biadjacency_matrix_weight(self): try: import scipy except ImportError: raise SkipTest('SciPy not available.') G=nx.path_graph(5) G.add_edge(0,1,weight=2,other=4) X=[1,3] Y=[0,2,4] M = bipartite.biadjacency_matrix(G,X,weight='weight') assert_equal(M[0,0], 2) M = bipartite.biadjacency_matrix(G, X, weight='other') assert_equal(M[0,0], 4) @attr('numpy') def test_biadjacency_matrix(self): try: import scipy except ImportError: raise SkipTest('SciPy not available.') tops = [2,5,10] bots = [5,10,15] for i in range(len(tops)): G = bipartite.random_graph(tops[i], bots[i], 0.2) top = [n for n,d in G.nodes(data=True) if d['bipartite']==0] M = bipartite.biadjacency_matrix(G, top) assert_equal(M.shape[0],tops[i]) assert_equal(M.shape[1],bots[i]) @attr('numpy') def test_biadjacency_matrix_order(self): try: import scipy except ImportError: raise SkipTest('SciPy not available.') G=nx.path_graph(5) G.add_edge(0,1,weight=2) X=[3,1] Y=[4,2,0] M = bipartite.biadjacency_matrix(G,X,Y,weight='weight') assert_equal(M[1,2], 2) networkx-1.11/networkx/algorithms/bipartite/tests/test_spectral_bipartivity.py0000644000175000017500000000472412637544450030247 0ustar aricaric00000000000000# -*- coding: utf-8 -*- from nose import SkipTest from nose.tools import * import networkx as nx from networkx.algorithms.bipartite import spectral_bipartivity as sb # Examples from Figure 1 # E. Estrada and J. A. 
Rodríguez-Velázquez, "Spectral measures of # bipartivity in complex networks", PhysRev E 72, 046105 (2005) class TestSpectralBipartivity(object): @classmethod def setupClass(cls): global scipy global assert_equal global assert_almost_equal try: import scipy.linalg except ImportError: raise SkipTest('SciPy not available.') def test_star_like(self): # star-like G=nx.star_graph(2) G.add_edge(1,2) assert_almost_equal(sb(G),0.843,places=3) G=nx.star_graph(3) G.add_edge(1,2) assert_almost_equal(sb(G),0.871,places=3) G=nx.star_graph(4) G.add_edge(1,2) assert_almost_equal(sb(G),0.890,places=3) def k23_like(self): # K2,3-like G=nx.complete_bipartite_graph(2,3) G.add_edge(0,1) assert_almost_equal(sb(G),0.769,places=3) G=nx.complete_bipartite_graph(2,3) G.add_edge(2,4) assert_almost_equal(sb(G),0.829,places=3) G=nx.complete_bipartite_graph(2,3) G.add_edge(2,4) G.add_edge(3,4) assert_almost_equal(sb(G),0.731,places=3) G=nx.complete_bipartite_graph(2,3) G.add_edge(0,1) G.add_edge(2,4) assert_almost_equal(sb(G),0.692,places=3) G=nx.complete_bipartite_graph(2,3) G.add_edge(2,4) G.add_edge(3,4) G.add_edge(0,1) assert_almost_equal(sb(G),0.645,places=3) G=nx.complete_bipartite_graph(2,3) G.add_edge(2,4) G.add_edge(3,4) G.add_edge(2,3) assert_almost_equal(sb(G),0.645,places=3) G=nx.complete_bipartite_graph(2,3) G.add_edge(2,4) G.add_edge(3,4) G.add_edge(2,3) G.add_edge(0,1) assert_almost_equal(sb(G),0.597,places=3) def test_single_nodes(self): # single nodes G=nx.complete_bipartite_graph(2, 3) G.add_edge(2,4) sbn=sb(G,nodes=[1,2]) assert_almost_equal(sbn[1],0.85,places=2) assert_almost_equal(sbn[2],0.77,places=2) G=nx.complete_bipartite_graph(2,3) G.add_edge(0,1) sbn=sb(G,nodes=[1,2]) assert_almost_equal(sbn[1],0.73,places=2) assert_almost_equal(sbn[2],0.82,places=2) networkx-1.11/networkx/algorithms/bipartite/tests/test_redundancy.py0000644000175000017500000000230712637544450026133 0ustar aricaric00000000000000# test_redundancy.py - unit tests for the bipartite.redundancy module # # 
Copyright 2015 Jeffrey Finkelstein . # # This file is part of NetworkX. # # NetworkX is distributed under a BSD license; see LICENSE.txt for more # information. """Unit tests for the :mod:`networkx.algorithms.bipartite.redundancy` module. """ from __future__ import division from nose.tools import assert_equal from nose.tools import assert_true from nose.tools import raises from networkx import cycle_graph from networkx import NetworkXError from networkx.algorithms.bipartite import complete_bipartite_graph from networkx.algorithms.bipartite import node_redundancy def test_no_redundant_nodes(): G = complete_bipartite_graph(2, 2) rc = node_redundancy(G) assert_true(all(redundancy == 1 for redundancy in rc.values())) def test_redundant_nodes(): G = cycle_graph(6) edge = {0, 3} G.add_edge(*edge) redundancy = node_redundancy(G) for v in edge: assert_equal(redundancy[v], 2 / 3) for v in set(G) - edge: assert_equal(redundancy[v], 1) @raises(NetworkXError) def test_not_enough_neighbors(): G = complete_bipartite_graph(1, 2) node_redundancy(G) networkx-1.11/networkx/algorithms/bipartite/generators.py0000644000175000017500000004447312637544450023761 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Generators and functions for bipartite graphs. """ # Copyright (C) 2006-2011 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import math import random import networkx from functools import reduce import networkx as nx __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) __all__=['configuration_model', 'havel_hakimi_graph', 'reverse_havel_hakimi_graph', 'alternating_havel_hakimi_graph', 'preferential_attachment_graph', 'random_graph', 'gnmk_random_graph', 'complete_bipartite_graph', ] def complete_bipartite_graph(n1, n2, create_using=None): """Return the complete bipartite graph `K_{n_1,n_2}`. 
Composed of two partitions with `n_1` nodes in the first and `n_2` nodes in the second. Each node in the first is connected to each node in the second. Parameters ---------- n1 : integer Number of nodes for node set A. n2 : integer Number of nodes for node set B. create_using : NetworkX graph instance, optional Return graph of this type. Notes ----- Node labels are the integers 0 to `n_1 + n_2 - 1`. The nodes are assigned the attribute 'bipartite' with the value 0 or 1 to indicate which bipartite set the node belongs to. """ if create_using is None: G = nx.Graph() else: if create_using.is_directed(): raise nx.NetworkXError("Directed Graph not supported") G = create_using G.clear() top = set(range(n1)) bottom = set(range(n1, n1+n2)) G.add_nodes_from(top, bipartite=0) G.add_nodes_from(bottom, bipartite=1) G.add_edges_from((u, v) for u in top for v in bottom) G.graph['name'] = "complete_bipartite_graph(%d,%d)" % (n1, n2) return G def configuration_model(aseq, bseq, create_using=None, seed=None): """Return a random bipartite graph from two given degree sequences. Parameters ---------- aseq : list Degree sequence for node set A. bseq : list Degree sequence for node set B. create_using : NetworkX graph instance, optional Return graph of this type. seed : integer, optional Seed for random number generator. Nodes from the set A are connected to nodes in the set B by choosing randomly from the possible free stubs, one in A and one in B. Notes ----- The sum of the two sequences must be equal: sum(aseq)=sum(bseq) If no graph type is specified use MultiGraph with parallel edges. If you want a graph with no parallel edges use create_using=Graph() but then the resulting degree sequences might not be exact. The nodes are assigned the attribute 'bipartite' with the value 0 or 1 to indicate which bipartite set the node belongs to. This function is not imported in the main namespace. To use it you have to explicitly import the bipartite package. 
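The edge set built by `complete_bipartite_graph` above is just the Cartesian product of the two integer ranges. A standalone sketch of the same labeling scheme (the helper name is made up for illustration):

```python
def complete_bipartite_edges(n1, n2):
    # Top nodes are 0..n1-1, bottom nodes are n1..n1+n2-1,
    # mirroring the integer labeling described in the docstring.
    top = range(n1)
    bottom = range(n1, n1 + n2)
    return [(u, v) for u in top for v in bottom]

edges = complete_bipartite_edges(2, 3)   # K_{2,3}: 2 * 3 = 6 edges
```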
""" if create_using is None: create_using=networkx.MultiGraph() elif create_using.is_directed(): raise networkx.NetworkXError(\ "Directed Graph not supported") G=networkx.empty_graph(0,create_using) if not seed is None: random.seed(seed) # length and sum of each sequence lena=len(aseq) lenb=len(bseq) suma=sum(aseq) sumb=sum(bseq) if not suma==sumb: raise networkx.NetworkXError(\ 'invalid degree sequences, sum(aseq)!=sum(bseq),%s,%s'\ %(suma,sumb)) G=_add_nodes_with_bipartite_label(G,lena,lenb) if max(aseq)==0: return G # done if no edges # build lists of degree-repeated vertex numbers stubs=[] stubs.extend([[v]*aseq[v] for v in range(0,lena)]) astubs=[] astubs=[x for subseq in stubs for x in subseq] stubs=[] stubs.extend([[v]*bseq[v-lena] for v in range(lena,lena+lenb)]) bstubs=[] bstubs=[x for subseq in stubs for x in subseq] # shuffle lists random.shuffle(astubs) random.shuffle(bstubs) G.add_edges_from([[astubs[i],bstubs[i]] for i in range(suma)]) G.name="bipartite_configuration_model" return G def havel_hakimi_graph(aseq, bseq, create_using=None): """Return a bipartite graph from two given degree sequences using a Havel-Hakimi style construction. Nodes from the set A are connected to nodes in the set B by connecting the highest degree nodes in set A to the highest degree nodes in set B until all stubs are connected. Parameters ---------- aseq : list Degree sequence for node set A. bseq : list Degree sequence for node set B. create_using : NetworkX graph instance, optional Return graph of this type. Notes ----- This function is not imported in the main namespace. To use it you have to explicitly import the bipartite package. The sum of the two sequences must be equal: sum(aseq)=sum(bseq) If no graph type is specified use MultiGraph with parallel edges. If you want a graph with no parallel edges use create_using=Graph() but then the resulting degree sequences might not be exact. 
The nodes are assigned the attribute 'bipartite' with the value 0 or 1 to indicate which bipartite set the node belongs to. """ if create_using is None: create_using=networkx.MultiGraph() elif create_using.is_directed(): raise networkx.NetworkXError(\ "Directed Graph not supported") G=networkx.empty_graph(0,create_using) # length of the each sequence naseq=len(aseq) nbseq=len(bseq) suma=sum(aseq) sumb=sum(bseq) if not suma==sumb: raise networkx.NetworkXError(\ 'invalid degree sequences, sum(aseq)!=sum(bseq),%s,%s'\ %(suma,sumb)) G=_add_nodes_with_bipartite_label(G,naseq,nbseq) if max(aseq)==0: return G # done if no edges # build list of degree-repeated vertex numbers astubs=[[aseq[v],v] for v in range(0,naseq)] bstubs=[[bseq[v-naseq],v] for v in range(naseq,naseq+nbseq)] astubs.sort() while astubs: (degree,u)=astubs.pop() # take of largest degree node in the a set if degree==0: break # done, all are zero # connect the source to largest degree nodes in the b set bstubs.sort() for target in bstubs[-degree:]: v=target[1] G.add_edge(u,v) target[0] -= 1 # note this updates bstubs too. if target[0]==0: bstubs.remove(target) G.name="bipartite_havel_hakimi_graph" return G def reverse_havel_hakimi_graph(aseq, bseq, create_using=None): """Return a bipartite graph from two given degree sequences using a Havel-Hakimi style construction. Nodes from set A are connected to nodes in the set B by connecting the highest degree nodes in set A to the lowest degree nodes in set B until all stubs are connected. Parameters ---------- aseq : list Degree sequence for node set A. bseq : list Degree sequence for node set B. create_using : NetworkX graph instance, optional Return graph of this type. Notes ----- This function is not imported in the main namespace. To use it you have to explicitly import the bipartite package. The sum of the two sequences must be equal: sum(aseq)=sum(bseq) If no graph type is specified use MultiGraph with parallel edges. 
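The greedy pairing in `havel_hakimi_graph` above, which always wires the largest remaining A-degree to the largest remaining B-stubs, can be sketched without any graph machinery (hypothetical helper, simple-graph output for these inputs):

```python
def bipartite_havel_hakimi_edges(aseq, bseq):
    # [remaining degree, node] pairs; labels follow the A-then-B convention.
    astubs = sorted([d, v] for v, d in enumerate(aseq))
    bstubs = sorted([d, len(aseq) + v] for v, d in enumerate(bseq))
    edges = []
    while astubs:
        degree, u = astubs.pop()          # largest remaining A degree
        if degree == 0:
            break
        bstubs.sort()
        for target in bstubs[-degree:]:   # the `degree` largest B stubs
            edges.append((u, target[1]))
            target[0] -= 1
        bstubs = [t for t in bstubs if t[0] > 0]
    return edges

edges = bipartite_havel_hakimi_edges([2, 1], [1, 1, 1])
```

Node 0 (degree 2) is wired to the two largest bottom stubs first, then node 1 takes the remaining one, exactly realizing both degree sequences.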
If you want a graph with no parallel edges use create_using=Graph() but then the resulting degree sequences might not be exact. The nodes are assigned the attribute 'bipartite' with the value 0 or 1 to indicate which bipartite set the node belongs to. """ if create_using is None: create_using=networkx.MultiGraph() elif create_using.is_directed(): raise networkx.NetworkXError(\ "Directed Graph not supported") G=networkx.empty_graph(0,create_using) # length of the each sequence lena=len(aseq) lenb=len(bseq) suma=sum(aseq) sumb=sum(bseq) if not suma==sumb: raise networkx.NetworkXError(\ 'invalid degree sequences, sum(aseq)!=sum(bseq),%s,%s'\ %(suma,sumb)) G=_add_nodes_with_bipartite_label(G,lena,lenb) if max(aseq)==0: return G # done if no edges # build list of degree-repeated vertex numbers astubs=[[aseq[v],v] for v in range(0,lena)] bstubs=[[bseq[v-lena],v] for v in range(lena,lena+lenb)] astubs.sort() bstubs.sort() while astubs: (degree,u)=astubs.pop() # take of largest degree node in the a set if degree==0: break # done, all are zero # connect the source to the smallest degree nodes in the b set for target in bstubs[0:degree]: v=target[1] G.add_edge(u,v) target[0] -= 1 # note this updates bstubs too. if target[0]==0: bstubs.remove(target) G.name="bipartite_reverse_havel_hakimi_graph" return G def alternating_havel_hakimi_graph(aseq, bseq,create_using=None): """Return a bipartite graph from two given degree sequences using an alternating Havel-Hakimi style construction. Nodes from the set A are connected to nodes in the set B by connecting the highest degree nodes in set A to alternatively the highest and the lowest degree nodes in set B until all stubs are connected. Parameters ---------- aseq : list Degree sequence for node set A. bseq : list Degree sequence for node set B. create_using : NetworkX graph instance, optional Return graph of this type. Notes ----- This function is not imported in the main namespace. 
To use it you have to explicitly import the bipartite package. The sum of the two sequences must be equal: sum(aseq)=sum(bseq) If no graph type is specified use MultiGraph with parallel edges. If you want a graph with no parallel edges use create_using=Graph() but then the resulting degree sequences might not be exact. The nodes are assigned the attribute 'bipartite' with the value 0 or 1 to indicate which bipartite set the node belongs to. """ if create_using is None: create_using=networkx.MultiGraph() elif create_using.is_directed(): raise networkx.NetworkXError(\ "Directed Graph not supported") G=networkx.empty_graph(0,create_using) # length of the each sequence naseq=len(aseq) nbseq=len(bseq) suma=sum(aseq) sumb=sum(bseq) if not suma==sumb: raise networkx.NetworkXError(\ 'invalid degree sequences, sum(aseq)!=sum(bseq),%s,%s'\ %(suma,sumb)) G=_add_nodes_with_bipartite_label(G,naseq,nbseq) if max(aseq)==0: return G # done if no edges # build list of degree-repeated vertex numbers astubs=[[aseq[v],v] for v in range(0,naseq)] bstubs=[[bseq[v-naseq],v] for v in range(naseq,naseq+nbseq)] while astubs: astubs.sort() (degree,u)=astubs.pop() # take of largest degree node in the a set if degree==0: break # done, all are zero bstubs.sort() small=bstubs[0:degree // 2] # add these low degree targets large=bstubs[(-degree+degree // 2):] # and these high degree targets stubs=[x for z in zip(large,small) for x in z] # combine, sorry if len(stubs) 1: raise networkx.NetworkXError("probability %s > 1"%(p)) G=networkx.empty_graph(0,create_using) if not seed is None: random.seed(seed) naseq=len(aseq) G=_add_nodes_with_bipartite_label(G,naseq,0) vv=[ [v]*aseq[v] for v in range(0,naseq)] while vv: while vv[0]: source=vv[0][0] vv[0].remove(source) if random.random() < p or G.number_of_nodes() == naseq: target=G.number_of_nodes() G.add_node(target,bipartite=1) G.add_edge(source,target) else: bb=[ [b]*G.degree(b) for b in range(naseq,G.number_of_nodes())] # flatten the list of lists 
into a list. bbstubs=reduce(lambda x,y: x+y, bb) # choose preferentially a bottom node. target=random.choice(bbstubs) G.add_node(target,bipartite=1) G.add_edge(source,target) vv.remove(vv[0]) G.name="bipartite_preferential_attachment_model" return G def random_graph(n, m, p, seed=None, directed=False): """Return a bipartite random graph. This is a bipartite version of the binomial (Erdős-Rényi) graph. Parameters ---------- n : int The number of nodes in the first bipartite set. m : int The number of nodes in the second bipartite set. p : float Probability for edge creation. seed : int, optional Seed for random number generator (default=None). directed : bool, optional (default=False) If True return a directed graph Notes ----- This function is not imported in the main namespace. To use it you have to explicitly import the bipartite package. The bipartite random graph algorithm chooses each of the n*m (undirected) or 2*nm (directed) possible edges with probability p. This algorithm is O(n+m) where m is the expected number of edges. The nodes are assigned the attribute 'bipartite' with the value 0 or 1 to indicate which bipartite set the node belongs to. See Also -------- gnp_random_graph, configuration_model References ---------- .. [1] Vladimir Batagelj and Ulrik Brandes, "Efficient generation of large random networks", Phys. Rev. E, 71, 036113, 2005. 
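The O(n+m) claim and the Batagelj–Brandes reference above rest on one trick: instead of testing all `n*m` candidate pairs, walk them in row-major order and jump ahead by a geometrically distributed gap after each accepted pair. A standalone sketch over the n-by-m index grid (undirected case only, fixed seed, made-up helper name):

```python
import math
import random

def bipartite_gnp_edges(n, m, p, seed=0):
    # Geometric skipping: each jump lands on the next successful pair directly,
    # so the cost is O(n + expected number of edges), not O(n * m).
    rng = random.Random(seed)
    edges = []
    lp = math.log(1.0 - p)
    v, w = 0, -1
    while v < n:
        lr = math.log(1.0 - rng.random())
        w = w + 1 + int(lr / lp)
        while w >= m and v < n:
            w -= m
            v += 1
        if v < n:
            edges.append((v, n + w))
    return edges

edges = bipartite_gnp_edges(5, 5, 0.5)
```

Because the grid position strictly increases on every step, no pair is visited twice and the loop always terminates.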
""" G=nx.Graph() G=_add_nodes_with_bipartite_label(G,n,m) if directed: G=nx.DiGraph(G) G.name="fast_gnp_random_graph(%s,%s,%s)"%(n,m,p) if not seed is None: random.seed(seed) if p <= 0: return G if p >= 1: return nx.complete_bipartite_graph(n,m) lp = math.log(1.0 - p) v = 0 w = -1 while v < n: lr = math.log(1.0 - random.random()) w = w + 1 + int(lr/lp) while w >= m and v < n: w = w - m v = v + 1 if v < n: G.add_edge(v, n+w) if directed: # use the same algorithm to # add edges from the "m" to "n" set v = 0 w = -1 while v < n: lr = math.log(1.0 - random.random()) w = w + 1 + int(lr/lp) while w>= m and v < n: w = w - m v = v + 1 if v < n: G.add_edge(n+w, v) return G def gnmk_random_graph(n, m, k, seed=None, directed=False): """Return a random bipartite graph G_{n,m,k}. Produces a bipartite graph chosen randomly out of the set of all graphs with n top nodes, m bottom nodes, and k edges. Parameters ---------- n : int The number of nodes in the first bipartite set. m : int The number of nodes in the second bipartite set. k : int The number of edges seed : int, optional Seed for random number generator (default=None). directed : bool, optional (default=False) If True return a directed graph Examples -------- from networkx.algorithms import bipartite G = bipartite.gnmk_random_graph(10,20,50) See Also -------- gnm_random_graph Notes ----- This function is not imported in the main namespace. To use it you have to explicitly import the bipartite package. If k > m * n then a complete bipartite graph is returned. This graph is a bipartite version of the `G_{nm}` random graph model. 
""" G = networkx.Graph() G=_add_nodes_with_bipartite_label(G,n,m) if directed: G=nx.DiGraph(G) G.name="bipartite_gnm_random_graph(%s,%s,%s)"%(n,m,k) if seed is not None: random.seed(seed) if n == 1 or m == 1: return G max_edges = n*m # max_edges for bipartite networks if k >= max_edges: # Maybe we should raise an exception here return networkx.complete_bipartite_graph(n, m, create_using=G) top = [n for n,d in G.nodes(data=True) if d['bipartite']==0] bottom = list(set(G) - set(top)) edge_count = 0 while edge_count < k: # generate random edge,u,v u = random.choice(top) v = random.choice(bottom) if v in G[u]: continue else: G.add_edge(u,v) edge_count += 1 return G def _add_nodes_with_bipartite_label(G, lena, lenb): G.add_nodes_from(range(0,lena+lenb)) b=dict(zip(range(0,lena),[0]*lena)) b.update(dict(zip(range(lena,lena+lenb),[1]*lenb))) nx.set_node_attributes(G,'bipartite',b) return G networkx-1.11/networkx/algorithms/dominance.py0000644000175000017500000000711112637544450021546 0ustar aricaric00000000000000""" Dominance algorithms. """ __author__ = 'ysitu ' # Copyright (C) 2014 ysitu # All rights reserved. # BSD license. from functools import reduce import networkx as nx from networkx.utils import not_implemented_for __all__ = ['immediate_dominators', 'dominance_frontiers'] @not_implemented_for('undirected') def immediate_dominators(G, start): """Returns the immediate dominators of all nodes of a directed graph. Parameters ---------- G : a DiGraph or MultiDiGraph The graph where dominance is to be computed. start : node The start node of dominance computation. Returns ------- idom : dict keyed by nodes A dict containing the immediate dominators of each node reachable from ``start``. Raises ------ NetworkXNotImplemented If ``G`` is undirected. NetworkXError If ``start`` is not in ``G``. Notes ----- Except for ``start``, the immediate dominators are the parents of their corresponding nodes in the dominator tree. 
Examples -------- >>> G = nx.DiGraph([(1, 2), (1, 3), (2, 5), (3, 4), (4, 5)]) >>> sorted(nx.immediate_dominators(G, 1).items()) [(1, 1), (2, 1), (3, 1), (4, 3), (5, 1)] References ---------- .. [1] K. D. Cooper, T. J. Harvey, and K. Kennedy. A simple, fast dominance algorithm. Software Practice & Experience, 4:110, 2001. """ if start not in G: raise nx.NetworkXError('start is not in G') idom = {start: start} order = list(nx.dfs_postorder_nodes(G, start)) dfn = {u: i for i, u in enumerate(order)} order.pop() order.reverse() def intersect(u, v): while u != v: while dfn[u] < dfn[v]: u = idom[u] while dfn[u] > dfn[v]: v = idom[v] return u changed = True while changed: changed = False for u in order: new_idom = reduce(intersect, (v for v in G.pred[u] if v in idom)) if u not in idom or idom[u] != new_idom: idom[u] = new_idom changed = True return idom def dominance_frontiers(G, start): """Returns the dominance frontiers of all nodes of a directed graph. Parameters ---------- G : a DiGraph or MultiDiGraph The graph where dominance is to be computed. start : node The start node of dominance computation. Returns ------- df : dict keyed by nodes A dict containing the dominance frontiers of each node reachable from ``start`` as lists. Raises ------ NetworkXNotImplemented If ``G`` is undirected. NetworkXError If ``start`` is not in ``G``. Examples -------- >>> G = nx.DiGraph([(1, 2), (1, 3), (2, 5), (3, 4), (4, 5)]) >>> sorted((u, sorted(df)) for u, df in nx.dominance_frontiers(G, 1).items()) [(1, []), (2, [5]), (3, [5]), (4, [5]), (5, [])] References ---------- .. [1] K. D. Cooper, T. J. Harvey, and K. Kennedy. A simple, fast dominance algorithm. Software Practice & Experience, 4:110, 2001. 
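Given an immediate-dominator map, the frontier computation that follows is a short walk up the dominator tree from each join point (a node with two or more predecessors). A sketch on the docstring's example graph, with the idom map written out by hand:

```python
# Example graph from the docstring: edges (1,2), (1,3), (2,5), (3,4), (4,5).
preds = {1: [], 2: [1], 3: [1], 4: [3], 5: [2, 4]}
idom = {1: 1, 2: 1, 3: 1, 4: 3, 5: 1}   # immediate dominators, start = 1

df = {u: set() for u in idom}
for u in idom:
    if len(preds[u]) >= 2:               # only join points contribute
        for v in preds[u]:
            # Walk from each predecessor up toward idom[u]; every node on
            # the way has u in its dominance frontier.
            while v != idom[u]:
                df[v].add(u)
                v = idom[v]

frontiers = {u: sorted(s) for u, s in df.items()}
```

Node 5 is the only join point here; walking up from its predecessors 2 and 4 puts 5 into the frontiers of 2, 3, and 4, matching the docstring example.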
""" idom = nx.immediate_dominators(G, start) df = {u: [] for u in idom} for u in idom: if len(G.pred[u]) - int(u in G.pred[u]) >= 2: p = set() for v in G.pred[u]: while v != idom[u] and v not in p: p.add(v) v = idom[v] p.discard(u) for v in p: df[v].append(u) return df networkx-1.11/networkx/algorithms/community/0000755000175000017500000000000012653231454021256 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/community/__init__.py0000644000175000017500000000006412637544500023370 0ustar aricaric00000000000000from networkx.algorithms.community.kclique import * networkx-1.11/networkx/algorithms/community/tests/0000755000175000017500000000000012653231454022420 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/community/tests/test_kclique.py0000644000175000017500000000415012637544450025473 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from itertools import combinations from networkx import k_clique_communities def test_overlaping_K5(): G = nx.Graph() G.add_edges_from(combinations(range(5), 2)) # Add a five clique G.add_edges_from(combinations(range(2,7), 2)) # Add another five clique c = list(nx.k_clique_communities(G, 4)) assert_equal(c,[frozenset([0, 1, 2, 3, 4, 5, 6])]) c= list(nx.k_clique_communities(G, 5)) assert_equal(set(c),set([frozenset([0,1,2,3,4]),frozenset([2,3,4,5,6])])) def test_isolated_K5(): G = nx.Graph() G.add_edges_from(combinations(range(0,5), 2)) # Add a five clique G.add_edges_from(combinations(range(5,10), 2)) # Add another five clique c= list(nx.k_clique_communities(G, 5)) assert_equal(set(c),set([frozenset([0,1,2,3,4]),frozenset([5,6,7,8,9])])) def test_zachary(): z = nx.karate_club_graph() # clique percolation with k=2 is just connected components zachary_k2_ground_truth = set([frozenset(z.nodes())]) zachary_k3_ground_truth = set([frozenset([0, 1, 2, 3, 7, 8, 12, 13, 14, 15, 17, 18, 19, 20, 21, 22, 23, 26, 27, 28, 29, 30, 31, 32, 33]), frozenset([0, 4, 5, 6, 10, 
16]), frozenset([24, 25, 31])]) zachary_k4_ground_truth = set([frozenset([0, 1, 2, 3, 7, 13]), frozenset([8, 32, 30, 33]), frozenset([32, 33, 29, 23])]) zachary_k5_ground_truth = set([frozenset([0, 1, 2, 3, 7, 13])]) zachary_k6_ground_truth = set([]) assert set(k_clique_communities(z, 2)) == zachary_k2_ground_truth assert set(k_clique_communities(z, 3)) == zachary_k3_ground_truth assert set(k_clique_communities(z, 4)) == zachary_k4_ground_truth assert set(k_clique_communities(z, 5)) == zachary_k5_ground_truth assert set(k_clique_communities(z, 6)) == zachary_k6_ground_truth @raises(nx.NetworkXError) def test_bad_k(): c = list(k_clique_communities(nx.Graph(),1)) networkx-1.11/networkx/algorithms/community/kclique.py0000644000175000017500000000524012637544450023273 0ustar aricaric00000000000000#-*- coding: utf-8 -*- # Copyright (C) 2011 by # Conrad Lee # Aric Hagberg # All rights reserved. # BSD license. from collections import defaultdict import networkx as nx __author__ = """\n""".join(['Conrad Lee ', 'Aric Hagberg ']) __all__ = ['k_clique_communities'] def k_clique_communities(G, k, cliques=None): """Find k-clique communities in graph using the percolation method. A k-clique community is the union of all cliques of size k that can be reached through adjacent (sharing k-1 nodes) k-cliques. Parameters ---------- G : NetworkX graph k : int Size of smallest clique cliques: list or generator Precomputed cliques (use networkx.find_cliques(G)) Returns ------- Yields sets of nodes, one for each k-clique community. Examples -------- >>> G = nx.complete_graph(5) >>> K5 = nx.convert_node_labels_to_integers(G,first_label=2) >>> G.add_edges_from(K5.edges()) >>> c = list(nx.k_clique_communities(G, 4)) >>> list(c[0]) [0, 1, 2, 3, 4, 5, 6] >>> list(nx.k_clique_communities(G, 6)) [] References ---------- .. 
[1] Gergely Palla, Imre Derényi, Illés Farkas1, and Tamás Vicsek, Uncovering the overlapping community structure of complex networks in nature and society Nature 435, 814-818, 2005, doi:10.1038/nature03607 """ if k < 2: raise nx.NetworkXError("k=%d, k must be greater than 1."%k) if cliques is None: cliques = nx.find_cliques(G) cliques = [frozenset(c) for c in cliques if len(c) >= k] # First index which nodes are in which cliques membership_dict = defaultdict(list) for clique in cliques: for node in clique: membership_dict[node].append(clique) # For each clique, see which adjacent cliques percolate perc_graph = nx.Graph() perc_graph.add_nodes_from(cliques) for clique in cliques: for adj_clique in _get_adjacent_cliques(clique, membership_dict): if len(clique.intersection(adj_clique)) >= (k - 1): perc_graph.add_edge(clique, adj_clique) # Connected components of clique graph with perc edges # are the percolated cliques for component in nx.connected_components(perc_graph): yield(frozenset.union(*component)) def _get_adjacent_cliques(clique, membership_dict): adjacent_cliques = set() for n in clique: for adj_clique in membership_dict[n]: if clique != adj_clique: adjacent_cliques.add(adj_clique) return adjacent_cliques networkx-1.11/networkx/algorithms/block.py0000644000175000017500000000771412637544450020714 0ustar aricaric00000000000000# encoding: utf-8 """ Functions for creating network blockmodels from node partitions. Created by Drew Conway Copyright (c) 2010. All rights reserved. """ __author__ = """\n""".join(['Drew Conway ', 'Aric Hagberg ']) __all__=['blockmodel'] import networkx as nx def blockmodel(G,partitions,multigraph=False): """Returns a reduced graph constructed using the generalized block modeling technique. The blockmodel technique collapses nodes into blocks based on a given partitioning of the node set. Each partition of nodes (block) is represented as a single node in the reduced graph. 
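The percolation step in `k_clique_communities` above reduces to a connected-components computation over cliques: two k-cliques are adjacent when they share k-1 nodes, and each component's union is one community. A sketch with the cliques of two overlapping K5s written out by hand (no `nx.find_cliques`), using a tiny union-find:

```python
from itertools import combinations

def percolate(cliques, k):
    # Keep cliques of size >= k, link pairs sharing >= k-1 nodes, union them.
    cliques = [c for c in cliques if len(c) >= k]
    parent = {c: c for c in cliques}

    def find(c):
        while parent[c] != c:
            c = parent[c]
        return c

    for c1, c2 in combinations(cliques, 2):
        if len(c1 & c2) >= k - 1:
            parent[find(c1)] = find(c2)

    comms = {}
    for c in cliques:
        comms.setdefault(find(c), set()).update(c)
    return sorted(map(sorted, comms.values()))

cliques = [frozenset(range(5)), frozenset(range(2, 7))]
merged = percolate(cliques, 4)   # overlap of 3 nodes >= k-1 = 3: one community
split = percolate(cliques, 5)    # overlap of 3 nodes < k-1 = 4: two communities
```

This reproduces the behavior checked in `test_overlaping_K5` above: the two K5s merge at k=4 and stay separate at k=5.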
Edges between nodes in the block graph are added according to the edges in the original graph. If the parameter multigraph is False (the default) a single edge is added with a weight equal to the sum of the edge weights between nodes in the original graph The default is a weight of 1 if weights are not specified. If the parameter multigraph is True then multiple edges are added each with the edge data from the original graph. Parameters ---------- G : graph A networkx Graph or DiGraph partitions : list of lists, or list of sets The partition of the nodes. Must be non-overlapping. multigraph : bool, optional If True return a MultiGraph with the edge data of the original graph applied to each corresponding edge in the new graph. If False return a Graph with the sum of the edge weights, or a count of the edges if the original graph is unweighted. Returns ------- blockmodel : a Networkx graph object Examples -------- >>> G=nx.path_graph(6) >>> partition=[[0,1],[2,3],[4,5]] >>> M=nx.blockmodel(G,partition) References ---------- .. [1] Patrick Doreian, Vladimir Batagelj, and Anuska Ferligoj "Generalized Blockmodeling",Cambridge University Press, 2004. 
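The weighted (non-multigraph) branch described above boils down to summing edge weights across block boundaries while dropping intra-block edges. A sketch on the docstring's path-graph example, with weight 1 per unweighted edge:

```python
# Path graph 0-1-2-3-4-5 collapsed under partition [[0,1],[2,3],[4,5]].
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
partition = [[0, 1], [2, 3], [4, 5]]

block_of = {n: i for i, block in enumerate(partition) for n in block}

quotient = {}
for u, v in edges:
    bu, bv = block_of[u], block_of[v]
    if bu == bv:                               # intra-block edges are dropped
        continue
    key = (min(bu, bv), max(bu, bv))
    quotient[key] = quotient.get(key, 0) + 1   # weight 1 per unweighted edge
```

Only (1,2) and (3,4) cross block boundaries, so the quotient is itself a path on the three blocks.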
""" # Create sets of node partitions part=list(map(set,partitions)) # Check for overlapping node partitions u=set() for p1,p2 in zip(part[:-1],part[1:]): u.update(p1) #if not u.isdisjoint(p2): # Python 2.6 required if len (u.intersection(p2))>0: raise nx.NetworkXException("Overlapping node partitions.") # Initialize blockmodel graph if multigraph: if G.is_directed(): M=nx.MultiDiGraph() else: M=nx.MultiGraph() else: if G.is_directed(): M=nx.DiGraph() else: M=nx.Graph() # Add nodes and properties to blockmodel # The blockmodel nodes are node-induced subgraphs of G # Label them with integers starting at 0 for i,p in enumerate(part): M.add_node(i) # The node-induced subgraph is stored as the node 'graph' attribute SG=G.subgraph(p) M.node[i]['graph']=SG M.node[i]['nnodes']=SG.number_of_nodes() M.node[i]['nedges']=SG.number_of_edges() M.node[i]['density']=nx.density(SG) # Create mapping between original node labels and new blockmodel node labels block_mapping={} for n in M: nodes_in_block=M.node[n]['graph'].nodes() block_mapping.update(dict.fromkeys(nodes_in_block,n)) # Add edges to block graph for u,v,d in G.edges(data=True): bmu=block_mapping[u] bmv=block_mapping[v] if bmu==bmv: # no self loops continue if multigraph: # For multigraphs add an edge for each edge in original graph M.add_edge(bmu,bmv,attr_dict=d) else: # For graphs and digraphs add single weighted edge weight=d.get('weight',1.0) # default to 1 if no weight specified if M.has_edge(bmu,bmv): M[bmu][bmv]['weight']+=weight else: M.add_edge(bmu,bmv,weight=weight) return M networkx-1.11/networkx/algorithms/hybrid.py0000644000175000017500000001370412637544500021073 0ustar aricaric00000000000000# coding: utf-8 """ Provides functions for finding and testing for locally `(k, l)`-connected graphs. """ __author__ = """Aric Hagberg (hagberg@lanl.gov)\nDan Schult (dschult@colgate.edu)""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
__all__ = ['kl_connected_subgraph', 'is_kl_connected'] import copy import networkx as nx def kl_connected_subgraph(G, k, l, low_memory=False, same_as_graph=False): """Returns the maximum locally `(k, l)`-connected subgraph of ``G``. A graph is locally `(k, l)`-connected if for each edge `(u, v)` in the graph there are at least `l` edge-disjoint paths of length at most `k` joining `u` to `v`. Parameters ---------- G : NetworkX graph The graph in which to find a maximum locally `(k, l)`-connected subgraph. k : integer The maximum length of paths to consider. A higher number means a looser connectivity requirement. l : integer The number of edge-disjoint paths. A higher number means a stricter connectivity requirement. low_memory : bool If this is ``True``, this function uses an algorithm that uses slightly more time but less memory. same_as_graph : bool If this is ``True`` then return a tuple of the form ``(H, is_same)``, where ``H`` is the maximum locally `(k, l)`-connected subgraph and ``is_same`` is a Boolean representing whether ``G`` is locally `(k, l)`-connected (and hence, whether ``H`` is simply a copy of the input graph ``G``). Returns ------- NetworkX graph or two-tuple If ``same_as_graph`` is ``True``, then this function returns a two-tuple as described above. Otherwise, it returns only the maximum locally `(k, l)`-connected subgraph. See also -------- is_kl_connected References ---------- .. [1] Chung, Fan and Linyuan Lu. "The Small World Phenomenon in Hybrid Power Law Graphs." *Complex Networks*. Springer Berlin Heidelberg, 2004. 89--104.
""" H=copy.deepcopy(G) # subgraph we construct by removing from G graphOK=True deleted_some=True # hack to start off the while loop while deleted_some: deleted_some=False for edge in H.edges(): (u,v)=edge ### Get copy of graph needed for this search if low_memory: verts=set([u,v]) for i in range(k): [verts.update(G.neighbors(w)) for w in verts.copy()] G2=G.subgraph(list(verts)) else: G2=copy.deepcopy(G) ### path=[u,v] cnt=0 accept=0 while path: cnt += 1 # Found a path if cnt>=l: accept=1 break # record edges along this graph prev=u for w in path: if prev!=w: G2.remove_edge(prev,w) prev=w # path=shortest_path(G2,u,v,k) # ??? should "Cutoff" be k+1? try: path=nx.shortest_path(G2,u,v) # ??? should "Cutoff" be k+1? except nx.NetworkXNoPath: path = False # No Other Paths if accept==0: H.remove_edge(u,v) deleted_some=True if graphOK: graphOK=False # We looked through all edges and removed none of them. # So, H is the maximal (k,l)-connected subgraph of G if same_as_graph: return (H,graphOK) return H def is_kl_connected(G, k, l, low_memory=False): """Returns ``True`` if and only if ``G`` is locally `(k, l)`-connected. A graph is locally `(k, l)`-connected if for each edge `(u, v)` in the graph there are at least `l` edge-disjoint paths of length at most `k` joining `u` to `v`. Parameters ---------- G : NetworkX graph The graph to test for local `(k, l)`-connectedness. k : integer The maximum length of paths to consider. A higher number means a looser connectivity requirement. l : integer The number of edge-disjoint paths. A higher number means a stricter connectivity requirement. low_memory : bool If this is ``True``, this function uses an algorithm that uses slightly more time but less memory. Returns ------- bool Whether the graph is locally `(k, l)`-connected subgraph. See also -------- kl_connected_subgraph References ---------- .. [1]: Chung, Fan and Linyuan Lu. "The Small World Phenomenon in Hybrid Power Law Graphs." *Complex Networks*. 
Springer Berlin Heidelberg, 2004. 89--104. """ graphOK=True for edge in G.edges(): (u,v)=edge ### Get copy of graph needed for this search if low_memory: verts=set([u,v]) for i in range(k): [verts.update(G.neighbors(w)) for w in verts.copy()] G2=G.subgraph(verts) else: G2=copy.deepcopy(G) ### path=[u,v] cnt=0 accept=0 while path: cnt += 1 # Found a path if cnt>=l: accept=1 break # record edges along this graph prev=u for w in path: if w!=prev: G2.remove_edge(prev,w) prev=w # path=shortest_path(G2,u,v,k) # ??? should "Cutoff" be k+1? try: path=nx.shortest_path(G2,u,v) # ??? should "Cutoff" be k+1? except nx.NetworkXNoPath: path = False # No Other Paths if accept==0: graphOK=False break # return status return graphOK networkx-1.11/networkx/algorithms/vitality.py0000644000175000017500000000467012637544500021461 0ustar aricaric00000000000000""" Vitality measures. """ # Copyright (C) 2012 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Renato Fabbri']) __all__ = ['closeness_vitality'] def weiner_index(G, weight=None): # compute sum of distances between all node pairs # (with optional weights) weiner=0.0 if weight is None: for n in G: path_length=nx.single_source_shortest_path_length(G,n) weiner+=sum(path_length.values()) else: for n in G: path_length=nx.single_source_dijkstra_path_length(G, n,weight=weight) weiner+=sum(path_length.values()) return weiner def closeness_vitality(G, weight=None): """Compute closeness vitality for nodes. Closeness vitality of a node is the change in the sum of distances between all node pairs when excluding that node. Parameters ---------- G : graph weight : None or string (optional) The name of the edge attribute used as weight. If None the edge weights are ignored. Returns ------- nodes : dictionary Dictionary with nodes as keys and closeness vitality as the value. 
Examples -------- >>> G=nx.cycle_graph(3) >>> nx.closeness_vitality(G) {0: 4.0, 1: 4.0, 2: 4.0} See Also -------- closeness_centrality() References ---------- .. [1] Ulrik Brandes, Sec. 3.6.2 in Network Analysis: Methodological Foundations, Springer, 2005. http://books.google.com/books?id=TTNhSm7HYrIC """ multigraph = G.is_multigraph() wig = weiner_index(G,weight) closeness_vitality = {} for n in G: # remove edges connected to node n and keep list of edges with data # could remove node n but it doesn't count anyway if multigraph: edges = G.edges(n,data=True,keys=True) if G.is_directed(): edges += G.in_edges(n,data=True,keys=True) else: edges = G.edges(n,data=True) if G.is_directed(): edges += G.in_edges(n,data=True) G.remove_edges_from(edges) closeness_vitality[n] = wig - weiner_index(G,weight) # add edges and data back to graph G.add_edges_from(edges) return closeness_vitality networkx-1.11/networkx/algorithms/traversal/0000755000175000017500000000000012653231454021235 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/traversal/depth_first_search.py0000644000175000017500000002034312637544500025452 0ustar aricaric00000000000000""" ================== Depth-first search ================== Basic algorithms for depth-first searching the nodes of a graph. Based on http://www.ics.uci.edu/~eppstein/PADS/DFS.py by D. Eppstein, July 2004. """ import networkx as nx from collections import defaultdict __author__ = """\n""".join(['Aric Hagberg ']) __all__ = ['dfs_edges', 'dfs_tree', 'dfs_predecessors', 'dfs_successors', 'dfs_preorder_nodes','dfs_postorder_nodes', 'dfs_labeled_edges'] def dfs_edges(G, source=None): """Produce edges in a depth-first-search (DFS). Parameters ---------- G : NetworkX graph source : node, optional Specify starting node for depth-first search and return edges in the component reachable from source. Returns ------- edges: generator A generator of edges in the depth-first-search. 
Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(list(nx.dfs_edges(G,0))) [(0, 1), (1, 2)] Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/DFS.py by D. Eppstein, July 2004. If a source is not specified then a source is chosen arbitrarily and repeatedly until all components in the graph are searched. """ if source is None: # produce edges for all components nodes = G else: # produce edges for components with source nodes = [source] visited=set() for start in nodes: if start in visited: continue visited.add(start) stack = [(start,iter(G[start]))] while stack: parent,children = stack[-1] try: child = next(children) if child not in visited: yield parent,child visited.add(child) stack.append((child,iter(G[child]))) except StopIteration: stack.pop() def dfs_tree(G, source): """Return oriented tree constructed from a depth-first-search from source. Parameters ---------- G : NetworkX graph source : node, optional Specify starting node for depth-first search. Returns ------- T : NetworkX DiGraph An oriented tree Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> T = nx.dfs_tree(G,0) >>> print(T.edges()) [(0, 1), (1, 2)] """ T = nx.DiGraph() if source is None: T.add_nodes_from(G) else: T.add_node(source) T.add_edges_from(dfs_edges(G,source)) return T def dfs_predecessors(G, source=None): """Return dictionary of predecessors in depth-first-search from source. Parameters ---------- G : NetworkX graph source : node, optional Specify starting node for depth-first search and return edges in the component reachable from source. Returns ------- pred: dict A dictionary with nodes as keys and predecessor nodes as values. Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(nx.dfs_predecessors(G,0)) {1: 0, 2: 1} Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/DFS.py by D. Eppstein, July 2004. 
If a source is not specified then a source is chosen arbitrarily and repeatedly until all components in the graph are searched. """ return dict((t,s) for s,t in dfs_edges(G,source=source)) def dfs_successors(G, source=None): """Return dictionary of successors in depth-first-search from source. Parameters ---------- G : NetworkX graph source : node, optional Specify starting node for depth-first search and return edges in the component reachable from source. Returns ------- succ: dict A dictionary with nodes as keys and list of successor nodes as values. Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(nx.dfs_successors(G,0)) {0: [1], 1: [2]} Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/DFS.py by D. Eppstein, July 2004. If a source is not specified then a source is chosen arbitrarily and repeatedly until all components in the graph are searched. """ d = defaultdict(list) for s,t in dfs_edges(G,source=source): d[s].append(t) return dict(d) def dfs_postorder_nodes(G,source=None): """Produce nodes in a depth-first-search post-ordering starting from source. Parameters ---------- G : NetworkX graph source : node, optional Specify starting node for depth-first search and return edges in the component reachable from source. Returns ------- nodes: generator A generator of nodes in a depth-first-search post-ordering. Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(list(nx.dfs_postorder_nodes(G,0))) [2, 1, 0] Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/DFS.py by D. Eppstein, July 2004. If a source is not specified then a source is chosen arbitrarily and repeatedly until all components in the graph are searched. 
""" post=(v for u,v,d in nx.dfs_labeled_edges(G,source=source) if d['dir']=='reverse') # potential modification: chain source to end of post-ordering # return chain(post,[source]) return post def dfs_preorder_nodes(G, source=None): """Produce nodes in a depth-first-search pre-ordering starting from source. Parameters ---------- G : NetworkX graph source : node, optional Specify starting node for depth-first search and return edges in the component reachable from source. Returns ------- nodes: generator A generator of nodes in a depth-first-search pre-ordering. Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(list(nx.dfs_preorder_nodes(G,0))) [0, 1, 2] Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/DFS.py by D. Eppstein, July 2004. If a source is not specified then a source is chosen arbitrarily and repeatedly until all components in the graph are searched. """ pre=(v for u,v,d in nx.dfs_labeled_edges(G,source=source) if d['dir']=='forward') # potential modification: chain source to beginning of pre-ordering # return chain([source],pre) return pre def dfs_labeled_edges(G, source=None): """Produce edges in a depth-first-search (DFS) labeled by type. Parameters ---------- G : NetworkX graph source : node, optional Specify starting node for depth-first search and return edges in the component reachable from source. Returns ------- edges: generator A generator of edges in the depth-first-search labeled with 'forward', 'nontree', and 'reverse'. Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> edges = (list(nx.dfs_labeled_edges(G,0))) Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/DFS.py by D. Eppstein, July 2004. If a source is not specified then a source is chosen arbitrarily and repeatedly until all components in the graph are searched. """ # Based on http://www.ics.uci.edu/~eppstein/PADS/DFS.py # by D. Eppstein, July 2004. 
if source is None: # produce edges for all components nodes = G else: # produce edges for components with source nodes = [source] visited = set() for start in nodes: if start in visited: continue yield start,start,{'dir':'forward'} visited.add(start) stack = [(start,iter(G[start]))] while stack: parent,children = stack[-1] try: child = next(children) if child in visited: yield parent,child,{'dir':'nontree'} else: yield parent,child,{'dir':'forward'} visited.add(child) stack.append((child,iter(G[child]))) except StopIteration: stack.pop() if stack: yield stack[-1][0],parent,{'dir':'reverse'} yield start,start,{'dir':'reverse'} networkx-1.11/networkx/algorithms/traversal/__init__.py0000644000175000017500000000013512637544450023352 0ustar aricaric00000000000000from .depth_first_search import * from .breadth_first_search import * from .edgedfs import * networkx-1.11/networkx/algorithms/traversal/breadth_first_search.py0000644000175000017500000000764412637544500025770 0ustar aricaric00000000000000""" ==================== Breadth-first search ==================== Basic algorithms for breadth-first searching the nodes of a graph. """ import networkx as nx from collections import defaultdict, deque __author__ = """\n""".join(['Aric Hagberg ']) __all__ = ['bfs_edges', 'bfs_tree', 'bfs_predecessors', 'bfs_successors'] def bfs_edges(G, source, reverse=False): """Produce edges in a breadth-first-search starting at source. Parameters ---------- G : NetworkX graph source : node Specify starting node for breadth-first search and return edges in the component reachable from source. reverse : bool, optional If True traverse a directed graph in the reverse direction Returns ------- edges: generator A generator of edges in the breadth-first-search. Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(list(nx.bfs_edges(G,0))) [(0, 1), (1, 2)] Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/BFS.py by D. Eppstein, July 2004. 
""" if reverse and isinstance(G, nx.DiGraph): neighbors = G.predecessors_iter else: neighbors = G.neighbors_iter visited = set([source]) queue = deque([(source, neighbors(source))]) while queue: parent, children = queue[0] try: child = next(children) if child not in visited: yield parent, child visited.add(child) queue.append((child, neighbors(child))) except StopIteration: queue.popleft() def bfs_tree(G, source, reverse=False): """Return an oriented tree constructed from of a breadth-first-search starting at source. Parameters ---------- G : NetworkX graph source : node Specify starting node for breadth-first search and return edges in the component reachable from source. reverse : bool, optional If True traverse a directed graph in the reverse direction Returns ------- T: NetworkX DiGraph An oriented tree Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(list(nx.bfs_edges(G,0))) [(0, 1), (1, 2)] Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/BFS.py by D. Eppstein, July 2004. """ T = nx.DiGraph() T.add_node(source) T.add_edges_from(bfs_edges(G,source,reverse=reverse)) return T def bfs_predecessors(G, source): """Return dictionary of predecessors in breadth-first-search from source. Parameters ---------- G : NetworkX graph source : node Specify starting node for breadth-first search and return edges in the component reachable from source. Returns ------- pred: dict A dictionary with nodes as keys and predecessor nodes as values. Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(nx.bfs_predecessors(G,0)) {1: 0, 2: 1} Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/BFS.py by D. Eppstein, July 2004. """ return dict((t,s) for s,t in bfs_edges(G,source)) def bfs_successors(G, source): """Return dictionary of successors in breadth-first-search from source. 
Parameters ---------- G : NetworkX graph source : node Specify starting node for breadth-first search and return edges in the component reachable from source. Returns ------- succ: dict A dictionary with nodes as keys and list of successor nodes as values. Examples -------- >>> G = nx.Graph() >>> G.add_path([0,1,2]) >>> print(nx.bfs_successors(G,0)) {0: [1], 1: [2]} Notes ----- Based on http://www.ics.uci.edu/~eppstein/PADS/BFS.py by D. Eppstein, July 2004. """ d = defaultdict(list) for s,t in bfs_edges(G,source): d[s].append(t) return dict(d) networkx-1.11/networkx/algorithms/traversal/tests/0000755000175000017500000000000012653231454022377 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/traversal/tests/test_edgedfs.py0000644000175000017500000001012612637544450025416 0ustar aricaric00000000000000from nose.tools import * import networkx as nx edge_dfs = nx.algorithms.edge_dfs FORWARD = nx.algorithms.edgedfs.FORWARD REVERSE = nx.algorithms.edgedfs.REVERSE # These tests can fail with hash randomization. The easiest and clearest way # to write these unit tests is for the edges to be output in an expected total # order, but we cannot guarantee the order amongst outgoing edges from a node, # unless each class uses an ordered data structure for neighbors. This is # painful to do with the current API. The alternative is that the tests are # written (IMO confusingly) so that there is not a total order over the edges, # but only a partial order. Due to the small size of the graphs, hopefully # failures due to hash randomization will not occur. For an example of how # this can fail, see TestEdgeDFS.test_multigraph.
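The deque-based queue discipline used by ``bfs_edges`` above can be demonstrated with a small self-contained sketch over a plain adjacency dict (hypothetical helper name, not part of the package):

```python
from collections import deque

# Yield BFS tree edges from `source` over a plain adjacency dict,
# mirroring the deque-based loop in bfs_edges().
def bfs_edges_sketch(adj, source):
    visited = {source}
    queue = deque([(source, iter(adj[source]))])
    while queue:
        parent, children = queue[0]
        try:
            child = next(children)
            if child not in visited:
                yield parent, child
                visited.add(child)
                queue.append((child, iter(adj[child])))
        except StopIteration:
            # All neighbors of `parent` consumed; drop it from the queue.
            queue.popleft()

# Path graph 0-1-2 as an adjacency dict:
adj = {0: [1], 1: [0, 2], 2: [1]}
print(list(bfs_edges_sketch(adj, 0)))   # [(0, 1), (1, 2)]
```

Keeping an iterator of children on the queue, rather than the full neighbor list, is what lets the traversal run lazily as a generator.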
class TestEdgeDFS(object): def setUp(self): self.nodes = [0, 1, 2, 3] self.edges = [(0, 1), (1, 0), (1, 0), (2, 1), (3, 1)] def test_empty(self): G = nx.Graph() edges = list(edge_dfs(G)) assert_equal(edges, []) def test_graph(self): G = nx.Graph(self.edges) x = list(edge_dfs(G, self.nodes)) x_ = [(0, 1), (1, 2), (1, 3)] assert_equal(x, x_) def test_digraph(self): G = nx.DiGraph(self.edges) x = list(edge_dfs(G, self.nodes)) x_= [(0, 1), (1, 0), (2, 1), (3, 1)] assert_equal(x, x_) def test_digraph2(self): G = nx.DiGraph() nodes = range(4) G.add_path(nodes) x = list(edge_dfs(G, [0])) x_ = [(0, 1), (1, 2), (2, 3)] assert_equal(x, x_) def test_digraph_rev(self): G = nx.DiGraph(self.edges) x = list(edge_dfs(G, self.nodes, orientation='reverse')) x_= [(1, 0, REVERSE), (0, 1, REVERSE), (2, 1, REVERSE), (3, 1, REVERSE)] assert_equal(x, x_) def test_digraph_rev2(self): G = nx.DiGraph() nodes = range(4) G.add_path(nodes) x = list(edge_dfs(G, [3], orientation='reverse')) x_ = [(2, 3, REVERSE), (1, 2, REVERSE), (0, 1, REVERSE)] assert_equal(x, x_) def test_multigraph(self): G = nx.MultiGraph(self.edges) x = list(edge_dfs(G, self.nodes)) x_ = [(0, 1, 0), (1, 0, 1), (0, 1, 2), (1, 2, 0), (1, 3, 0)] # This is an example of where hash randomization can break. # There are 3! * 2 alternative outputs, such as: # [(0, 1, 1), (1, 0, 0), (0, 1, 2), (1, 3, 0), (1, 2, 0)] # But note, the edges (1,2,0) and (1,3,0) always follow the (0,1,k) # edges. So the algorithm only guarantees a partial order. A total # order is guaranteed only if the graph data structures are ordered. 
assert_equal(x, x_) def test_multidigraph(self): G = nx.MultiDiGraph(self.edges) x = list(edge_dfs(G, self.nodes)) x_ = [(0, 1, 0), (1, 0, 0), (1, 0, 1), (2, 1, 0), (3, 1, 0)] assert_equal(x, x_) def test_multidigraph_rev(self): G = nx.MultiDiGraph(self.edges) x = list(edge_dfs(G, self.nodes, orientation='reverse')) x_ = [(1, 0, 0, REVERSE), (0, 1, 0, REVERSE), (1, 0, 1, REVERSE), (2, 1, 0, REVERSE), (3, 1, 0, REVERSE)] assert_equal(x, x_) def test_digraph_ignore(self): G = nx.DiGraph(self.edges) x = list(edge_dfs(G, self.nodes, orientation='ignore')) x_ = [(0, 1, FORWARD), (1, 0, FORWARD), (2, 1, REVERSE), (3, 1, REVERSE)] assert_equal(x, x_) def test_digraph_ignore2(self): G = nx.DiGraph() nodes = range(4) G.add_path(nodes) x = list(edge_dfs(G, [0], orientation='ignore')) x_ = [(0, 1, FORWARD), (1, 2, FORWARD), (2, 3, FORWARD)] assert_equal(x, x_) def test_multidigraph_ignore(self): G = nx.MultiDiGraph(self.edges) x = list(edge_dfs(G, self.nodes, orientation='ignore')) x_ = [(0, 1, 0, FORWARD), (1, 0, 0, FORWARD), (1, 0, 1, REVERSE), (2, 1, 0, REVERSE), (3, 1, 0, REVERSE)] assert_equal(x, x_) networkx-1.11/networkx/algorithms/traversal/tests/test_bfs.py0000644000175000017500000000252212637544500024564 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestBFS: def setUp(self): # simple graph G = nx.Graph() G.add_edges_from([(0, 1), (1, 2), (1, 3), (2, 4), (3, 4)]) self.G = G def test_successor(self): assert_equal(nx.bfs_successors(self.G, source=0), {0: [1], 1: [2, 3], 2: [4]}) def test_predecessor(self): assert_equal(nx.bfs_predecessors(self.G, source=0), {1: 0, 2: 1, 3: 1, 4: 2}) def test_bfs_tree(self): T = nx.bfs_tree(self.G, source=0) assert_equal(sorted(T.nodes()), sorted(self.G.nodes())) assert_equal(sorted(T.edges()), [(0, 1), (1, 2), (1, 3), (2, 4)]) def test_bfs_edges(self): edges = nx.bfs_edges(self.G, source=0) assert_equal(list(edges), [(0, 1), (1, 2), (1, 3), (2, 4)]) def 
test_bfs_edges_reverse(self): D = nx.DiGraph() D.add_edges_from([(0, 1), (1, 2), (1, 3), (2, 4), (3, 4)]) edges = nx.bfs_edges(D, source=4, reverse=True) assert_equal(list(edges), [(4, 2), (4, 3), (2, 1), (1, 0)]) def test_bfs_tree_isolates(self): G = nx.Graph() G.add_node(1) G.add_node(2) T = nx.bfs_tree(G, source=1) assert_equal(sorted(T.nodes()), [1]) assert_equal(sorted(T.edges()), []) networkx-1.11/networkx/algorithms/traversal/tests/test_dfs.py0000644000175000017500000000460312637544450024574 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestDFS: def setUp(self): # simple graph G=nx.Graph() G.add_edges_from([(0,1),(1,2),(1,3),(2,4),(3,4)]) self.G=G # simple graph, disconnected D=nx.Graph() D.add_edges_from([(0,1),(2,3)]) self.D=D def test_preorder_nodes(self): assert_equal(list(nx.dfs_preorder_nodes(self.G,source=0)), [0, 1, 2, 4, 3]) assert_equal(list(nx.dfs_preorder_nodes(self.D)),[0, 1, 2, 3]) def test_postorder_nodes(self): assert_equal(list(nx.dfs_postorder_nodes(self.G,source=0)), [3, 4, 2, 1, 0]) assert_equal(list(nx.dfs_postorder_nodes(self.D)),[1, 0, 3, 2]) def test_successor(self): assert_equal(nx.dfs_successors(self.G,source=0), {0: [1], 1: [2], 2: [4], 4: [3]}) assert_equal(nx.dfs_successors(self.D), {0: [1], 2: [3]}) def test_predecessor(self): assert_equal(nx.dfs_predecessors(self.G,source=0), {1: 0, 2: 1, 3: 4, 4: 2}) assert_equal(nx.dfs_predecessors(self.D), {1: 0, 3: 2}) def test_dfs_tree(self): T=nx.dfs_tree(self.G,source=0) assert_equal(sorted(T.nodes()),sorted(self.G.nodes())) assert_equal(sorted(T.edges()),[(0, 1), (1, 2), (2, 4), (4, 3)]) def test_dfs_edges(self): edges=nx.dfs_edges(self.G,source=0) assert_equal(list(edges),[(0, 1), (1, 2), (2, 4), (4, 3)]) edges=nx.dfs_edges(self.D) assert_equal(list(edges),[(0, 1), (2, 3)]) def test_dfs_labeled_edges(self): edges=list(nx.dfs_labeled_edges(self.G,source=0)) forward=[(u,v) for (u,v,d) in edges if d['dir']=='forward'] 
assert_equal(forward,[(0,0), (0, 1), (1, 2), (2, 4), (4, 3)]) def test_dfs_labeled_disconnected_edges(self): edges=list(nx.dfs_labeled_edges(self.D)) forward=[(u,v) for (u,v,d) in edges if d['dir']=='forward'] assert_equal(forward,[(0, 0), (0, 1), (2, 2), (2, 3)]) def test_dfs_tree_isolates(self): G = nx.Graph() G.add_node(1) G.add_node(2) T=nx.dfs_tree(G,source=1) assert_equal(sorted(T.nodes()),[1]) assert_equal(sorted(T.edges()),[]) T=nx.dfs_tree(G,source=None) assert_equal(sorted(T.nodes()),[1, 2]) assert_equal(sorted(T.edges()),[]) networkx-1.11/networkx/algorithms/traversal/edgedfs.py0000644000175000017500000001531412637544500023215 0ustar aricaric00000000000000""" =========================== Depth First Search on Edges =========================== Algorithms for a depth-first traversal of edges in a graph. """ FORWARD = 'forward' REVERSE = 'reverse' __all__ = ['edge_dfs'] def helper_funcs(G, orientation): """ These are various G-specific functions that help us implement the algorithm for all graph types: graph, multigraph, directed or not. """ ignore_orientation = G.is_directed() and orientation == 'ignore' reverse_orientation = G.is_directed() and orientation == 'reverse' if ignore_orientation: # When we ignore the orientation, we still need to know how the edge # was traversed, so we add an object representing the direction. def out_edges(u, **kwds): for edge in G.out_edges(u, **kwds): yield edge + (FORWARD,) for edge in G.in_edges(u, **kwds): yield edge + (REVERSE,) elif reverse_orientation: def out_edges(u, **kwds): for edge in G.in_edges(u, **kwds): yield edge + (REVERSE,) else: # If "yield from" were an option, we could pass kwds automatically. out_edges = G.edges_iter # If every edge had a unique key, then it would be easier to track which # edges had been visited. Since that is not available, we will form a # unique identifier from the edge and key (if present). If the graph # is undirected, then the head and tail need to be stored as a frozenset. 
if ignore_orientation or reverse_orientation: # edge is a 4-tuple: (u, v, key, direction) # u and v always represent the true tail and head of the edge. def key(edge): # We want everything but the direction. return edge[:-1] else: if G.is_directed(): def key(edge): return edge else: # edge is a 3-tuple: (u, v, key) def key(edge): new_edge = (frozenset(edge[:2]),) + edge[2:] return new_edge def traversed_tailhead(edge): """ Returns the tail and head of an edge, as it was traversed. So in general, this is different from the true tail and head. (Also, undirected edges have no true tail or head.) """ if (ignore_orientation or reverse_orientation) and edge[-1] == REVERSE: tail, head = edge[1], edge[0] else: tail, head = edge[0], edge[1] return tail, head return out_edges, key, traversed_tailhead def edge_dfs(G, source=None, orientation='original'): """ A directed, depth-first traversal of edges in ``G``, beginning at ``source``. Parameters ---------- G : graph A directed/undirected graph/multigraph. source : node, list of nodes The node from which the traversal begins. If ``None``, then a source is chosen arbitrarily and repeatedly until all edges from each node in the graph are searched. orientation : 'original' | 'reverse' | 'ignore' For directed graphs and directed multigraphs, edge traversals need not respect the original orientation of the edges. When set to 'reverse', then every edge will be traversed in the reverse direction. When set to 'ignore', then each directed edge is treated as a single undirected edge that can be traversed in either direction. For undirected graphs and undirected multigraphs, this parameter is meaningless and is not consulted by the algorithm. Yields ------ edge : directed edge A directed edge indicating the path taken by the depth-first traversal. For graphs, ``edge`` is of the form ``(u, v)`` where ``u`` and ``v`` are the tail and head of the edge as determined by the traversal. 
For multigraphs, ``edge`` is of the form ``(u, v, key)``, where `key` is the key of the edge. When the graph is directed, then ``u`` and ``v`` are always in the order of the actual directed edge. If orientation is 'reverse' or 'ignore', then ``edge`` takes the form ``(u, v, key, direction)`` where direction is a string, 'forward' or 'reverse', that indicates if the edge was traversed in the forward (tail to head) or reverse (head to tail) direction, respectively. Examples -------- >>> import networkx as nx >>> nodes = [0, 1, 2, 3] >>> edges = [(0, 1), (1, 0), (1, 0), (2, 1), (3, 1)] >>> list(nx.edge_dfs(nx.Graph(edges), nodes)) [(0, 1), (1, 2), (1, 3)] >>> list(nx.edge_dfs(nx.DiGraph(edges), nodes)) [(0, 1), (1, 0), (2, 1), (3, 1)] >>> list(nx.edge_dfs(nx.MultiGraph(edges), nodes)) [(0, 1, 0), (1, 0, 1), (0, 1, 2), (1, 2, 0), (1, 3, 0)] >>> list(nx.edge_dfs(nx.MultiDiGraph(edges), nodes)) [(0, 1, 0), (1, 0, 0), (1, 0, 1), (2, 1, 0), (3, 1, 0)] >>> list(nx.edge_dfs(nx.DiGraph(edges), nodes, orientation='ignore')) [(0, 1, 'forward'), (1, 0, 'forward'), (2, 1, 'reverse'), (3, 1, 'reverse')] >>> list(nx.edge_dfs(nx.MultiDiGraph(edges), nodes, orientation='ignore')) [(0, 1, 0, 'forward'), (1, 0, 0, 'forward'), (1, 0, 1, 'reverse'), (2, 1, 0, 'reverse'), (3, 1, 0, 'reverse')] Notes ----- The goal of this function is to visit edges. It differs from the more familiar depth-first traversal of nodes, as provided by :func:`networkx.algorithms.traversal.depth_first_search.dfs_edges`, in that it does not stop once every node has been visited. In a directed graph with edges [(0, 1), (1, 2), (2, 1)], the edge (2, 1) would not be visited if not for the functionality provided by this function. 
See Also -------- dfs_edges """ nodes = list(G.nbunch_iter(source)) if not nodes: raise StopIteration kwds = {'data': False} if G.is_multigraph(): kwds['keys'] = True out_edges, key, tailhead = helper_funcs(G, orientation) visited_edges = set() visited_nodes = set() edges = {} for start_node in nodes: stack = [start_node] while stack: current_node = stack[-1] if current_node not in visited_nodes: edges[current_node] = out_edges(current_node, **kwds) visited_nodes.add(current_node) try: edge = next(edges[current_node]) except StopIteration: # No more edges from the current node. stack.pop() else: edge_key = key(edge) if edge_key not in visited_edges: visited_edges.add(edge_key) # Mark the traversed "to" node as to-be-explored. stack.append(tailhead(edge)[1]) yield edge networkx-1.11/networkx/algorithms/distance_measures.py0000644000175000017500000000744412637544500023314 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Graph diameter, radius, eccentricity and other properties. """ __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['eccentricity', 'diameter', 'radius', 'periphery', 'center'] import networkx def eccentricity(G, v=None, sp=None): """Return the eccentricity of nodes in G. The eccentricity of a node v is the maximum distance from v to all other nodes in G. Parameters ---------- G : NetworkX graph A graph v : node, optional Return value of specified node sp : dict of dicts, optional All pairs shortest path lengths as a dictionary of dictionaries Returns ------- ecc : dictionary A dictionary of eccentricity values keyed by node. 
""" # nodes= # nodes=[] # if v is None: # none, use entire graph # nodes=G.nodes() # elif v in G: # is v a single node # nodes=[v] # else: # assume v is a container of nodes # nodes=v order=G.order() e={} for n in G.nbunch_iter(v): if sp is None: length=networkx.single_source_shortest_path_length(G,n) L = len(length) else: try: length=sp[n] L = len(length) except TypeError: raise networkx.NetworkXError('Format of "sp" is invalid.') if L != order: msg = "Graph not connected: infinite path length" raise networkx.NetworkXError(msg) e[n]=max(length.values()) if v in G: return e[v] # return single value else: return e def diameter(G, e=None): """Return the diameter of the graph G. The diameter is the maximum eccentricity. Parameters ---------- G : NetworkX graph A graph e : eccentricity dictionary, optional A precomputed dictionary of eccentricities. Returns ------- d : integer Diameter of graph See Also -------- eccentricity """ if e is None: e=eccentricity(G) return max(e.values()) def periphery(G, e=None): """Return the periphery of the graph G. The periphery is the set of nodes with eccentricity equal to the diameter. Parameters ---------- G : NetworkX graph A graph e : eccentricity dictionary, optional A precomputed dictionary of eccentricities. Returns ------- p : list List of nodes in periphery """ if e is None: e=eccentricity(G) diameter=max(e.values()) p=[v for v in e if e[v]==diameter] return p def radius(G, e=None): """Return the radius of the graph G. The radius is the minimum eccentricity. Parameters ---------- G : NetworkX graph A graph e : eccentricity dictionary, optional A precomputed dictionary of eccentricities. Returns ------- r : integer Radius of graph """ if e is None: e=eccentricity(G) return min(e.values()) def center(G, e=None): """Return the center of the graph G. The center is the set of nodes with eccentricity equal to radius. 
Parameters ---------- G : NetworkX graph A graph e : eccentricity dictionary, optional A precomputed dictionary of eccentricities. Returns ------- c : list List of nodes in center """ if e is None: e=eccentricity(G) # order the nodes by path length radius=min(e.values()) p=[v for v in e if e[v]==radius] return p networkx-1.11/networkx/algorithms/mst.py0000644000175000017500000001644612637544500020423 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Computes minimum spanning tree of a weighted graph. """ # Copyright (C) 2009-2010 by # Aric Hagberg # Dan Schult # Pieter Swart # Loïc Séguin-C. # All rights reserved. # BSD license. __all__ = ['kruskal_mst', 'minimum_spanning_edges', 'minimum_spanning_tree', 'prim_mst_edges', 'prim_mst'] import networkx as nx from heapq import heappop, heappush from itertools import count def minimum_spanning_edges(G, weight='weight', data=True): """Generate edges in a minimum spanning forest of an undirected weighted graph. A minimum spanning tree is a subgraph of the graph (a tree) with the minimum sum of edge weights. A spanning forest is a union of the spanning trees for each connected component of the graph. Parameters ---------- G : NetworkX Graph weight : string Edge data key to use for weight (default 'weight'). data : bool, optional If True yield the edge data along with the edge. Returns ------- edges : iterator A generator that produces edges in the minimum spanning tree. The edges are three-tuples (u,v,w) where w is the weight. Examples -------- >>> G=nx.cycle_graph(4) >>> G.add_edge(0,3,weight=2) # assign weight 2 to edge 0-3 >>> mst=nx.minimum_spanning_edges(G,data=False) # a generator of MST edges >>> edgelist=list(mst) # make a list of the edges >>> print(sorted(edgelist)) [(0, 1), (1, 2), (2, 3)] Notes ----- Uses Kruskal's algorithm. If the graph edges do not have a weight attribute a default weight of 1 will be used. 
Modified code from David Eppstein, April 2006 http://www.ics.uci.edu/~eppstein/PADS/ """ # Modified code from David Eppstein, April 2006 # http://www.ics.uci.edu/~eppstein/PADS/ # Kruskal's algorithm: sort edges by weight, and add them one at a time. # We use Kruskal's algorithm, first because it is very simple to # implement once UnionFind exists, and second, because the only slow # part (the sort) is sped up by being built in to Python. from networkx.utils import UnionFind if G.is_directed(): raise nx.NetworkXError( "Minimum spanning tree not defined for directed graphs.") subtrees = UnionFind() edges = sorted(G.edges(data=True), key=lambda t: t[2].get(weight, 1)) for u, v, d in edges: if subtrees[u] != subtrees[v]: if data: yield (u, v, d) else: yield (u, v) subtrees.union(u, v) def minimum_spanning_tree(G, weight='weight'): """Return a minimum spanning tree or forest of an undirected weighted graph. A minimum spanning tree is a subgraph of the graph (a tree) with the minimum sum of edge weights. If the graph is not connected a spanning forest is constructed. A spanning forest is a union of the spanning trees for each connected component of the graph. Parameters ---------- G : NetworkX Graph weight : string Edge data key to use for weight (default 'weight'). Returns ------- G : NetworkX Graph A minimum spanning tree or forest. Examples -------- >>> G=nx.cycle_graph(4) >>> G.add_edge(0,3,weight=2) # assign weight 2 to edge 0-3 >>> T=nx.minimum_spanning_tree(G) >>> print(sorted(T.edges(data=True))) [(0, 1, {}), (1, 2, {}), (2, 3, {})] Notes ----- Uses Kruskal's algorithm. If the graph edges do not have a weight attribute a default weight of 1 will be used.
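The sort-plus-UnionFind structure described in the comments can be sketched without NetworkX; `kruskal_sketch` below is an illustrative stand-in with a minimal path-compressing union-find, run on the same weighted cycle as the docstring example.

```python
def kruskal_sketch(n, edges):
    """Minimum spanning forest by Kruskal's algorithm.
    edges: iterable of (weight, u, v) with nodes 0..n-1."""
    parent = list(range(n))

    def find(x):                      # union-find root with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):     # consider edges in weight order
        ru, rv = find(u), find(v)
        if ru != rv:                  # edge joins two components: keep it
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

# Cycle 0-1-2-3-0 with edge (0, 3) reweighted to 2, as in the docstring
edges = [(1, 0, 1), (1, 1, 2), (1, 2, 3), (2, 0, 3)]
print(kruskal_sketch(4, edges))       # [(0, 1, 1), (1, 2, 1), (2, 3, 1)]
```

As the comments in the source note, the only super-linear step is the sort, which Python's built-in `sorted` handles.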
""" T = nx.Graph(nx.minimum_spanning_edges(G, weight=weight, data=True)) # Add isolated nodes if len(T) != len(G): T.add_nodes_from([n for n, d in G.degree().items() if d == 0]) # Add node and graph attributes as shallow copy for n in T: T.node[n] = G.node[n].copy() T.graph = G.graph.copy() return T kruskal_mst = minimum_spanning_tree def prim_mst_edges(G, weight='weight', data=True): """Generate edges in a minimum spanning forest of an undirected weighted graph. A minimum spanning tree is a subgraph of the graph (a tree) with the minimum sum of edge weights. A spanning forest is a union of the spanning trees for each connected component of the graph. Parameters ---------- G : NetworkX Graph weight : string Edge data key to use for weight (default 'weight'). data : bool, optional If True yield the edge data along with the edge. Returns ------- edges : iterator A generator that produces edges in the minimum spanning tree. The edges are three-tuples (u,v,w) where w is the weight. Examples -------- >>> G=nx.cycle_graph(4) >>> G.add_edge(0,3,weight=2) # assign weight 2 to edge 0-3 >>> mst=nx.prim_mst_edges(G,data=False) # a generator of MST edges >>> edgelist=list(mst) # make a list of the edges >>> print(sorted(edgelist)) [(0, 1), (1, 2), (2, 3)] Notes ----- Uses Prim's algorithm. If the graph edges do not have a weight attribute a default weight of 1 will be used. 
""" if G.is_directed(): raise nx.NetworkXError( "Mimimum spanning tree not defined for directed graphs.") push = heappush pop = heappop nodes = G.nodes() c = count() while nodes: u = nodes.pop(0) frontier = [] visited = [u] for u, v in G.edges(u): push(frontier, (G[u][v].get(weight, 1), next(c), u, v)) while frontier: W, _, u, v = pop(frontier) if v in visited: continue visited.append(v) nodes.remove(v) for v, w in G.edges(v): if not w in visited: push(frontier, (G[v][w].get(weight, 1), next(c), v, w)) if data: yield u, v, G[u][v] else: yield u, v def prim_mst(G, weight='weight'): """Return a minimum spanning tree or forest of an undirected weighted graph. A minimum spanning tree is a subgraph of the graph (a tree) with the minimum sum of edge weights. If the graph is not connected a spanning forest is constructed. A spanning forest is a union of the spanning trees for each connected component of the graph. Parameters ---------- G : NetworkX Graph weight : string Edge data key to use for weight (default 'weight'). Returns ------- G : NetworkX Graph A minimum spanning tree or forest. Examples -------- >>> G=nx.cycle_graph(4) >>> G.add_edge(0,3,weight=2) # assign weight 2 to edge 0-3 >>> T=nx.prim_mst(G) >>> print(sorted(T.edges(data=True))) [(0, 1, {}), (1, 2, {}), (2, 3, {})] Notes ----- Uses Prim's algorithm. If the graph edges do not have a weight attribute a default weight of 1 will be used. 
""" T = nx.Graph(nx.prim_mst_edges(G, weight=weight, data=True)) # Add isolated nodes if len(T) != len(G): T.add_nodes_from([n for n, d in G.degree().items() if d == 0]) # Add node and graph attributes as shallow copy for n in T: T.node[n] = G.node[n].copy() T.graph = G.graph.copy() return T networkx-1.11/networkx/algorithms/chordal/0000755000175000017500000000000012653231454020646 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/chordal/__init__.py0000644000175000017500000000007012637544450022761 0ustar aricaric00000000000000from networkx.algorithms.chordal.chordal_alg import * networkx-1.11/networkx/algorithms/chordal/chordal_alg.py0000644000175000017500000002502012637544450023463 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Algorithms for chordal graphs. A graph is chordal if every cycle of length at least 4 has a chord (an edge joining two nodes not adjacent in the cycle). http://en.wikipedia.org/wiki/Chordal_graph """ import networkx as nx import random import sys __authors__ = "\n".join(['Jesus Cerquides ']) # Copyright (C) 2010 by # Jesus Cerquides # All rights reserved. # BSD license. __all__ = ['is_chordal', 'find_induced_nodes', 'chordal_graph_cliques', 'chordal_graph_treewidth', 'NetworkXTreewidthBoundExceeded'] class NetworkXTreewidthBoundExceeded(nx.NetworkXException): """Exception raised when a treewidth bound has been provided and it has been exceeded""" def is_chordal(G): """Checks whether G is a chordal graph. A graph is chordal if every cycle of length at least 4 has a chord (an edge joining two nodes not adjacent in the cycle). Parameters ---------- G : graph A NetworkX graph. Returns ------- chordal : bool True if G is a chordal graph and False otherwise. Raises ------ NetworkXError The algorithm does not support DiGraph, MultiGraph and MultiDiGraph. If the input graph is an instance of one of these classes, a NetworkXError is raised. 
Examples -------- >>> import networkx as nx >>> e=[(1,2),(1,3),(2,3),(2,4),(3,4),(3,5),(3,6),(4,5),(4,6),(5,6)] >>> G=nx.Graph(e) >>> nx.is_chordal(G) True Notes ----- The routine tries to go through every node following maximum cardinality search. It returns False when it finds that the separator for any node is not a clique. Based on the algorithms in [1]_. References ---------- .. [1] R. E. Tarjan and M. Yannakakis, Simple linear-time algorithms to test chordality of graphs, test acyclicity of hypergraphs, and selectively reduce acyclic hypergraphs, SIAM J. Comput., 13 (1984), pp. 566–579. """ if G.is_directed(): raise nx.NetworkXError('Directed graphs not supported') if G.is_multigraph(): raise nx.NetworkXError('Multiply connected graphs not supported.') if len(_find_chordality_breaker(G))==0: return True else: return False def find_induced_nodes(G,s,t,treewidth_bound=sys.maxsize): """Returns the set of induced nodes in the path from s to t. Parameters ---------- G : graph A chordal NetworkX graph s : node Source node to look for induced nodes t : node Destination node to look for induced nodes treewith_bound: float Maximum treewidth acceptable for the graph H. The search for induced nodes will end as soon as the treewidth_bound is exceeded. Returns ------- I : Set of nodes The set of induced nodes in the path from s to t in G Raises ------ NetworkXError The algorithm does not support DiGraph, MultiGraph and MultiDiGraph. If the input graph is an instance of one of these classes, a NetworkXError is raised. The algorithm can only be applied to chordal graphs. If the input graph is found to be non-chordal, a NetworkXError is raised. Examples -------- >>> import networkx as nx >>> G=nx.Graph() >>> G = nx.generators.classic.path_graph(10) >>> I = nx.find_induced_nodes(G,1,9,2) >>> list(I) [1, 2, 3, 4, 5, 6, 7, 8, 9] Notes ----- G must be a chordal graph and (s,t) an edge that is not in G. 
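The maximum cardinality search test described in the Notes for `is_chordal` can be sketched in a few lines; `is_chordal_sketch` below is an illustrative standalone version (adjacency dict of sets, arbitrary tie-breaking, and none of the treewidth bookkeeping done by `_find_chordality_breaker`): the graph is chordal iff, at every step, the already-numbered neighbors of the chosen node form a clique.

```python
def is_chordal_sketch(adj):
    """Tarjan-Yannakakis style chordality check via max cardinality search."""
    unnumbered = set(adj)
    numbered = set()
    while unnumbered:
        # pick the unnumbered node with the most already-numbered neighbors
        v = max(unnumbered, key=lambda x: len(adj[x] & numbered))
        unnumbered.remove(v)
        numbered.add(v)
        clique = adj[v] & numbered
        if any(b not in adj[a] for a in clique for b in clique if a != b):
            return False   # separator is not a clique: chordless cycle exists
    return True

# 4-cycle (not chordal) vs. the same cycle with a chord 1-3 (chordal)
c4 = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(is_chordal_sketch(c4))          # False
c4_chord = {1: {2, 3, 4}, 2: {1, 3}, 3: {1, 2, 4}, 4: {1, 3}}
print(is_chordal_sketch(c4_chord))    # True
```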
If a treewidth_bound is provided, the search for induced nodes will end as soon as the treewidth_bound is exceeded. The algorithm is inspired by Algorithm 4 in [1]_. A formal definition of induced node can also be found on that reference. References ---------- .. [1] Learning Bounded Treewidth Bayesian Networks. Gal Elidan, Stephen Gould; JMLR, 9(Dec):2699--2731, 2008. http://jmlr.csail.mit.edu/papers/volume9/elidan08a/elidan08a.pdf """ if not is_chordal(G): raise nx.NetworkXError("Input graph is not chordal.") H = nx.Graph(G) H.add_edge(s,t) I = set() triplet = _find_chordality_breaker(H,s,treewidth_bound) while triplet: (u,v,w) = triplet I.update(triplet) for n in triplet: if n!=s: H.add_edge(s,n) triplet = _find_chordality_breaker(H,s,treewidth_bound) if I: # Add t and the second node in the induced path from s to t. I.add(t) for u in G[s]: if len(I & set(G[u]))==2: I.add(u) break return I def chordal_graph_cliques(G): """Returns the set of maximal cliques of a chordal graph. The algorithm breaks the graph in connected components and performs a maximum cardinality search in each component to get the cliques. Parameters ---------- G : graph A NetworkX graph Returns ------- cliques : A set containing the maximal cliques in G. Raises ------ NetworkXError The algorithm does not support DiGraph, MultiGraph and MultiDiGraph. If the input graph is an instance of one of these classes, a NetworkXError is raised. The algorithm can only be applied to chordal graphs. If the input graph is found to be non-chordal, a NetworkXError is raised. 
Examples -------- >>> import networkx as nx >>> e= [(1,2),(1,3),(2,3),(2,4),(3,4),(3,5),(3,6),(4,5),(4,6),(5,6),(7,8)] >>> G = nx.Graph(e) >>> G.add_node(9) >>> setlist = nx.chordal_graph_cliques(G) """ if not is_chordal(G): raise nx.NetworkXError("Input graph is not chordal.") cliques = set() for C in nx.connected.connected_component_subgraphs(G): cliques |= _connected_chordal_graph_cliques(C) return cliques def chordal_graph_treewidth(G): """Returns the treewidth of the chordal graph G. Parameters ---------- G : graph A NetworkX graph Returns ------- treewidth : int The size of the largest clique in the graph minus one. Raises ------ NetworkXError The algorithm does not support DiGraph, MultiGraph and MultiDiGraph. If the input graph is an instance of one of these classes, a NetworkXError is raised. The algorithm can only be applied to chordal graphs. If the input graph is found to be non-chordal, a NetworkXError is raised. Examples -------- >>> import networkx as nx >>> e = [(1,2),(1,3),(2,3),(2,4),(3,4),(3,5),(3,6),(4,5),(4,6),(5,6),(7,8)] >>> G = nx.Graph(e) >>> G.add_node(9) >>> nx.chordal_graph_treewidth(G) 3 References ---------- .. 
[1] http://en.wikipedia.org/wiki/Tree_decomposition#Treewidth """ if not is_chordal(G): raise nx.NetworkXError("Input graph is not chordal.") max_clique = -1 for clique in nx.chordal_graph_cliques(G): max_clique = max(max_clique,len(clique)) return max_clique - 1 def _is_complete_graph(G): """Returns True if G is a complete graph.""" if G.number_of_selfloops()>0: raise nx.NetworkXError("Self loop found in _is_complete_graph()") n = G.number_of_nodes() if n < 2: return True e = G.number_of_edges() max_edges = ((n * (n-1))/2) return e == max_edges def _find_missing_edge(G): """ Given a non-complete graph G, returns a missing edge.""" nodes=set(G) for u in G: missing=nodes-set(list(G[u].keys())+[u]) if missing: return (u,missing.pop()) def _max_cardinality_node(G,choices,wanna_connect): """Returns a the node in choices that has more connections in G to nodes in wanna_connect. """ # max_number = None max_number = -1 for x in choices: number=len([y for y in G[x] if y in wanna_connect]) if number > max_number: max_number = number max_cardinality_node = x return max_cardinality_node def _find_chordality_breaker(G,s=None,treewidth_bound=sys.maxsize): """ Given a graph G, starts a max cardinality search (starting from s if s is given and from a random node otherwise) trying to find a non-chordal cycle. If it does find one, it returns (u,v,w) where u,v,w are the three nodes that together with s are involved in the cycle. """ unnumbered = set(G) if s is None: s = random.choice(list(unnumbered)) unnumbered.remove(s) numbered = set([s]) # current_treewidth = None current_treewidth = -1 while unnumbered:# and current_treewidth <= treewidth_bound: v = _max_cardinality_node(G,unnumbered,numbered) unnumbered.remove(v) numbered.add(v) clique_wanna_be = set(G[v]) & numbered sg = G.subgraph(clique_wanna_be) if _is_complete_graph(sg): # The graph seems to be chordal by now. 
We update the treewidth current_treewidth = max(current_treewidth,len(clique_wanna_be)) if current_treewidth > treewidth_bound: raise nx.NetworkXTreewidthBoundExceeded(\ "treewidth_bound exceeded: %s"%current_treewidth) else: # sg is not a clique, # look for an edge that is not included in sg (u,w) = _find_missing_edge(sg) return (u,v,w) return () def _connected_chordal_graph_cliques(G): """Return the set of maximal cliques of a connected chordal graph.""" if G.number_of_nodes() == 1: x = frozenset(G.nodes()) return set([x]) else: cliques = set() unnumbered = set(G.nodes()) v = random.choice(list(unnumbered)) unnumbered.remove(v) numbered = set([v]) clique_wanna_be = set([v]) while unnumbered: v = _max_cardinality_node(G,unnumbered,numbered) unnumbered.remove(v) numbered.add(v) new_clique_wanna_be = set(G.neighbors(v)) & numbered sg = G.subgraph(clique_wanna_be) if _is_complete_graph(sg): new_clique_wanna_be.add(v) if not new_clique_wanna_be >= clique_wanna_be: cliques.add(frozenset(clique_wanna_be)) clique_wanna_be = new_clique_wanna_be else: raise nx.NetworkXError("Input graph is not chordal.") cliques.add(frozenset(clique_wanna_be)) return cliques networkx-1.11/networkx/algorithms/chordal/tests/0000755000175000017500000000000012653231454022010 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/chordal/tests/test_chordal.py0000644000175000017500000000451112637544500025037 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestMCS: def setUp(self): # simple graph connected_chordal_G=nx.Graph() connected_chordal_G.add_edges_from([(1,2),(1,3),(2,3),(2,4),(3,4), (3,5),(3,6),(4,5),(4,6),(5,6)]) self.connected_chordal_G=connected_chordal_G chordal_G = nx.Graph() chordal_G.add_edges_from([(1,2),(1,3),(2,3),(2,4),(3,4), (3,5),(3,6),(4,5),(4,6),(5,6),(7,8)]) chordal_G.add_node(9) self.chordal_G=chordal_G non_chordal_G = nx.Graph() non_chordal_G.add_edges_from([(1,2),(1,3),(2,4),(2,5),(3,4),(3,5)]) 
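The identity used by `chordal_graph_treewidth` above (treewidth of a chordal graph = size of its largest clique minus one) can be cross-checked by brute force on the graph from its docstring example; `treewidth_brute_force` is illustrative only and exponential in the number of nodes.

```python
from itertools import combinations

def treewidth_brute_force(adj):
    """Treewidth of a chordal graph = largest clique size minus one.
    Exponential search; only for tiny illustration graphs."""
    nodes = list(adj)
    for k in range(len(nodes), 1, -1):          # try clique sizes, largest first
        for cand in combinations(nodes, k):
            if all(v in adj[u] for u, v in combinations(cand, 2)):
                return k - 1                    # first hit is a maximum clique
    return 0

# Graph from the chordal_graph_treewidth docstring example
edges = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4), (3, 5),
         (3, 6), (4, 5), (4, 6), (5, 6), (7, 8)]
adj = {n: set() for n in range(1, 10)}          # node 9 is isolated
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)
print(treewidth_brute_force(adj))               # 3, from the clique {3, 4, 5, 6}
```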
self.non_chordal_G = non_chordal_G def test_is_chordal(self): assert_false(nx.is_chordal(self.non_chordal_G)) assert_true(nx.is_chordal(self.chordal_G)) assert_true(nx.is_chordal(self.connected_chordal_G)) assert_true(nx.is_chordal(nx.complete_graph(3))) assert_true(nx.is_chordal(nx.cycle_graph(3))) assert_false(nx.is_chordal(nx.cycle_graph(5))) def test_induced_nodes(self): G = nx.generators.classic.path_graph(10) I = nx.find_induced_nodes(G,1,9,2) assert_equal(I,set([1,2,3,4,5,6,7,8,9])) assert_raises(nx.NetworkXTreewidthBoundExceeded, nx.find_induced_nodes,G,1,9,1) I = nx.find_induced_nodes(self.chordal_G,1,6) assert_equal(I,set([1,2,4,6])) assert_raises(nx.NetworkXError, nx.find_induced_nodes,self.non_chordal_G,1,5) def test_chordal_find_cliques(self): cliques = set([frozenset([9]),frozenset([7,8]),frozenset([1,2,3]), frozenset([2,3,4]),frozenset([3,4,5,6])]) assert_equal(nx.chordal_graph_cliques(self.chordal_G),cliques) def test_chordal_find_cliques_path(self): G = nx.path_graph(10) cliqueset = nx.chordal_graph_cliques(G) for (u,v) in G.edges_iter(): assert_true(frozenset([u,v]) in cliqueset or frozenset([v,u]) in cliqueset) def test_chordal_find_cliquesCC(self): cliques = set([frozenset([1,2,3]),frozenset([2,3,4]), frozenset([3,4,5,6])]) assert_equal(nx.chordal_graph_cliques(self.connected_chordal_G),cliques) networkx-1.11/networkx/algorithms/link_prediction.py0000644000175000017500000004021712637544500022766 0ustar aricaric00000000000000""" Link prediction algorithms. """ from __future__ import division import math import networkx as nx from networkx.utils.decorators import * __all__ = ['resource_allocation_index', 'jaccard_coefficient', 'adamic_adar_index', 'preferential_attachment', 'cn_soundarajan_hopcroft', 'ra_index_soundarajan_hopcroft', 'within_inter_cluster'] @not_implemented_for('directed') @not_implemented_for('multigraph') def resource_allocation_index(G, ebunch=None): r"""Compute the resource allocation index of all node pairs in ebunch. 
Resource allocation index of `u` and `v` is defined as .. math:: \sum_{w \in \Gamma(u) \cap \Gamma(v)} \frac{1}{|\Gamma(w)|} where :math:`\Gamma(u)` denotes the set of neighbors of `u`. Parameters ---------- G : graph A NetworkX undirected graph. ebunch : iterable of node pairs, optional (default = None) Resource allocation index will be computed for each pair of nodes given in the iterable. The pairs must be given as 2-tuples (u, v) where u and v are nodes in the graph. If ebunch is None then all non-existent edges in the graph will be used. Default value: None. Returns ------- piter : iterator An iterator of 3-tuples in the form (u, v, p) where (u, v) is a pair of nodes and p is their resource allocation index. Examples -------- >>> import networkx as nx >>> G = nx.complete_graph(5) >>> preds = nx.resource_allocation_index(G, [(0, 1), (2, 3)]) >>> for u, v, p in preds: ... '(%d, %d) -> %.8f' % (u, v, p) ... '(0, 1) -> 0.75000000' '(2, 3) -> 0.75000000' References ---------- .. [1] T. Zhou, L. Lu, Y.-C. Zhang. Predicting missing links via local information. Eur. Phys. J. B 71 (2009) 623. http://arxiv.org/pdf/0901.0553.pdf """ if ebunch is None: ebunch = nx.non_edges(G) def predict(u, v): return sum(1 / G.degree(w) for w in nx.common_neighbors(G, u, v)) return ((u, v, predict(u, v)) for u, v in ebunch) @not_implemented_for('directed') @not_implemented_for('multigraph') def jaccard_coefficient(G, ebunch=None): r"""Compute the Jaccard coefficient of all node pairs in ebunch. Jaccard coefficient of nodes `u` and `v` is defined as .. math:: \frac{|\Gamma(u) \cap \Gamma(v)|}{|\Gamma(u) \cup \Gamma(v)|} where :math:`\Gamma(u)` denotes the set of neighbors of `u`. Parameters ---------- G : graph A NetworkX undirected graph. ebunch : iterable of node pairs, optional (default = None) Jaccard coefficient will be computed for each pair of nodes given in the iterable. The pairs must be given as 2-tuples (u, v) where u and v are nodes in the graph. 
If ebunch is None then all non-existent edges in the graph will be used. Default value: None. Returns ------- piter : iterator An iterator of 3-tuples in the form (u, v, p) where (u, v) is a pair of nodes and p is their Jaccard coefficient. Examples -------- >>> import networkx as nx >>> G = nx.complete_graph(5) >>> preds = nx.jaccard_coefficient(G, [(0, 1), (2, 3)]) >>> for u, v, p in preds: ... '(%d, %d) -> %.8f' % (u, v, p) ... '(0, 1) -> 0.60000000' '(2, 3) -> 0.60000000' References ---------- .. [1] D. Liben-Nowell, J. Kleinberg. The Link Prediction Problem for Social Networks (2004). http://www.cs.cornell.edu/home/kleinber/link-pred.pdf """ if ebunch is None: ebunch = nx.non_edges(G) def predict(u, v): cnbors = list(nx.common_neighbors(G, u, v)) union_size = len(set(G[u]) | set(G[v])) if union_size == 0: return 0 else: return len(cnbors) / union_size return ((u, v, predict(u, v)) for u, v in ebunch) @not_implemented_for('directed') @not_implemented_for('multigraph') def adamic_adar_index(G, ebunch=None): r"""Compute the Adamic-Adar index of all node pairs in ebunch. Adamic-Adar index of `u` and `v` is defined as .. math:: \sum_{w \in \Gamma(u) \cap \Gamma(v)} \frac{1}{\log |\Gamma(w)|} where :math:`\Gamma(u)` denotes the set of neighbors of `u`. Parameters ---------- G : graph NetworkX undirected graph. ebunch : iterable of node pairs, optional (default = None) Adamic-Adar index will be computed for each pair of nodes given in the iterable. The pairs must be given as 2-tuples (u, v) where u and v are nodes in the graph. If ebunch is None then all non-existent edges in the graph will be used. Default value: None. Returns ------- piter : iterator An iterator of 3-tuples in the form (u, v, p) where (u, v) is a pair of nodes and p is their Adamic-Adar index. Examples -------- >>> import networkx as nx >>> G = nx.complete_graph(5) >>> preds = nx.adamic_adar_index(G, [(0, 1), (2, 3)]) >>> for u, v, p in preds: ... '(%d, %d) -> %.8f' % (u, v, p) ... 
'(0, 1) -> 2.16404256' '(2, 3) -> 2.16404256' References ---------- .. [1] D. Liben-Nowell, J. Kleinberg. The Link Prediction Problem for Social Networks (2004). http://www.cs.cornell.edu/home/kleinber/link-pred.pdf """ if ebunch is None: ebunch = nx.non_edges(G) def predict(u, v): return sum(1 / math.log(G.degree(w)) for w in nx.common_neighbors(G, u, v)) return ((u, v, predict(u, v)) for u, v in ebunch) @not_implemented_for('directed') @not_implemented_for('multigraph') def preferential_attachment(G, ebunch=None): r"""Compute the preferential attachment score of all node pairs in ebunch. Preferential attachment score of `u` and `v` is defined as .. math:: |\Gamma(u)| |\Gamma(v)| where :math:`\Gamma(u)` denotes the set of neighbors of `u`. Parameters ---------- G : graph NetworkX undirected graph. ebunch : iterable of node pairs, optional (default = None) Preferential attachment score will be computed for each pair of nodes given in the iterable. The pairs must be given as 2-tuples (u, v) where u and v are nodes in the graph. If ebunch is None then all non-existent edges in the graph will be used. Default value: None. Returns ------- piter : iterator An iterator of 3-tuples in the form (u, v, p) where (u, v) is a pair of nodes and p is their preferential attachment score. Examples -------- >>> import networkx as nx >>> G = nx.complete_graph(5) >>> preds = nx.preferential_attachment(G, [(0, 1), (2, 3)]) >>> for u, v, p in preds: ... '(%d, %d) -> %d' % (u, v, p) ... '(0, 1) -> 16' '(2, 3) -> 16' References ---------- .. [1] D. Liben-Nowell, J. Kleinberg. The Link Prediction Problem for Social Networks (2004). 
http://www.cs.cornell.edu/home/kleinber/link-pred.pdf """ if ebunch is None: ebunch = nx.non_edges(G) return ((u, v, G.degree(u) * G.degree(v)) for u, v in ebunch) @not_implemented_for('directed') @not_implemented_for('multigraph') def cn_soundarajan_hopcroft(G, ebunch=None, community='community'): r"""Count the number of common neighbors of all node pairs in ebunch using community information. For two nodes `u` and `v`, this function computes the number of common neighbors and bonus one for each common neighbor belonging to the same community as `u` and `v`. Mathematically, .. math:: |\Gamma(u) \cap \Gamma(v)| + \sum_{w \in \Gamma(u) \cap \Gamma(v)} f(w) where `f(w)` equals 1 if `w` belongs to the same community as `u` and `v` or 0 otherwise and :math:`\Gamma(u)` denotes the set of neighbors of `u`. Parameters ---------- G : graph A NetworkX undirected graph. ebunch : iterable of node pairs, optional (default = None) The score will be computed for each pair of nodes given in the iterable. The pairs must be given as 2-tuples (u, v) where u and v are nodes in the graph. If ebunch is None then all non-existent edges in the graph will be used. Default value: None. community : string, optional (default = 'community') Nodes attribute name containing the community information. G[u][community] identifies which community u belongs to. Each node belongs to at most one community. Default value: 'community'. Returns ------- piter : iterator An iterator of 3-tuples in the form (u, v, p) where (u, v) is a pair of nodes and p is their score. Examples -------- >>> import networkx as nx >>> G = nx.path_graph(3) >>> G.node[0]['community'] = 0 >>> G.node[1]['community'] = 0 >>> G.node[2]['community'] = 0 >>> preds = nx.cn_soundarajan_hopcroft(G, [(0, 2)]) >>> for u, v, p in preds: ... '(%d, %d) -> %d' % (u, v, p) ... '(0, 2) -> 2' References ---------- .. [1] Sucheta Soundarajan and John Hopcroft. Using community information to improve the precision of link prediction methods. 
In Proceedings of the 21st international conference companion on World Wide Web (WWW '12 Companion). ACM, New York, NY, USA, 607-608. http://doi.acm.org/10.1145/2187980.2188150 """ if ebunch is None: ebunch = nx.non_edges(G) def predict(u, v): Cu = _community(G, u, community) Cv = _community(G, v, community) cnbors = list(nx.common_neighbors(G, u, v)) if Cu == Cv: return len(cnbors) + sum(_community(G, w, community) == Cu for w in cnbors) else: return len(cnbors) return ((u, v, predict(u, v)) for u, v in ebunch) @not_implemented_for('directed') @not_implemented_for('multigraph') def ra_index_soundarajan_hopcroft(G, ebunch=None, community='community'): r"""Compute the resource allocation index of all node pairs in ebunch using community information. For two nodes `u` and `v`, this function computes the resource allocation index considering only common neighbors belonging to the same community as `u` and `v`. Mathematically, .. math:: \sum_{w \in \Gamma(u) \cap \Gamma(v)} \frac{f(w)}{|\Gamma(w)|} where `f(w)` equals 1 if `w` belongs to the same community as `u` and `v` or 0 otherwise and :math:`\Gamma(u)` denotes the set of neighbors of `u`. Parameters ---------- G : graph A NetworkX undirected graph. ebunch : iterable of node pairs, optional (default = None) The score will be computed for each pair of nodes given in the iterable. The pairs must be given as 2-tuples (u, v) where u and v are nodes in the graph. If ebunch is None then all non-existent edges in the graph will be used. Default value: None. community : string, optional (default = 'community') Nodes attribute name containing the community information. G[u][community] identifies which community u belongs to. Each node belongs to at most one community. Default value: 'community'. Returns ------- piter : iterator An iterator of 3-tuples in the form (u, v, p) where (u, v) is a pair of nodes and p is their score. 
Examples -------- >>> import networkx as nx >>> G = nx.Graph() >>> G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)]) >>> G.node[0]['community'] = 0 >>> G.node[1]['community'] = 0 >>> G.node[2]['community'] = 1 >>> G.node[3]['community'] = 0 >>> preds = nx.ra_index_soundarajan_hopcroft(G, [(0, 3)]) >>> for u, v, p in preds: ... '(%d, %d) -> %.8f' % (u, v, p) ... '(0, 3) -> 0.50000000' References ---------- .. [1] Sucheta Soundarajan and John Hopcroft. Using community information to improve the precision of link prediction methods. In Proceedings of the 21st international conference companion on World Wide Web (WWW '12 Companion). ACM, New York, NY, USA, 607-608. http://doi.acm.org/10.1145/2187980.2188150 """ if ebunch is None: ebunch = nx.non_edges(G) def predict(u, v): Cu = _community(G, u, community) Cv = _community(G, v, community) if Cu == Cv: cnbors = nx.common_neighbors(G, u, v) return sum(1 / G.degree(w) for w in cnbors if _community(G, w, community) == Cu) else: return 0 return ((u, v, predict(u, v)) for u, v in ebunch) @not_implemented_for('directed') @not_implemented_for('multigraph') def within_inter_cluster(G, ebunch=None, delta=0.001, community='community'): """Compute the ratio of within- and inter-cluster common neighbors of all node pairs in ebunch. For two nodes `u` and `v`, if a common neighbor `w` belongs to the same community as them, `w` is considered as within-cluster common neighbor of `u` and `v`. Otherwise, it is considered as inter-cluster common neighbor of `u` and `v`. The ratio between the size of the set of within- and inter-cluster common neighbors is defined as the WIC measure. [1]_ Parameters ---------- G : graph A NetworkX undirected graph. ebunch : iterable of node pairs, optional (default = None) The WIC measure will be computed for each pair of nodes given in the iterable. The pairs must be given as 2-tuples (u, v) where u and v are nodes in the graph. If ebunch is None then all non-existent edges in the graph will be used. 
Default value: None. delta : float, optional (default = 0.001) Value to prevent division by zero in case there is no inter-cluster common neighbor between two nodes. See [1]_ for details. Default value: 0.001. community : string, optional (default = 'community') Nodes attribute name containing the community information. G[u][community] identifies which community u belongs to. Each node belongs to at most one community. Default value: 'community'. Returns ------- piter : iterator An iterator of 3-tuples in the form (u, v, p) where (u, v) is a pair of nodes and p is their WIC measure. Examples -------- >>> import networkx as nx >>> G = nx.Graph() >>> G.add_edges_from([(0, 1), (0, 2), (0, 3), (1, 4), (2, 4), (3, 4)]) >>> G.node[0]['community'] = 0 >>> G.node[1]['community'] = 1 >>> G.node[2]['community'] = 0 >>> G.node[3]['community'] = 0 >>> G.node[4]['community'] = 0 >>> preds = nx.within_inter_cluster(G, [(0, 4)]) >>> for u, v, p in preds: ... '(%d, %d) -> %.8f' % (u, v, p) ... '(0, 4) -> 1.99800200' >>> preds = nx.within_inter_cluster(G, [(0, 4)], delta=0.5) >>> for u, v, p in preds: ... '(%d, %d) -> %.8f' % (u, v, p) ... '(0, 4) -> 1.33333333' References ---------- .. [1] Jorge Carlos Valverde-Rebaza and Alneu de Andrade Lopes. Link prediction in complex networks based on cluster information. 
In Proceedings of the 21st Brazilian conference on Advances in Artificial Intelligence (SBIA'12) http://dx.doi.org/10.1007/978-3-642-34459-6_10 """ if delta <= 0: raise nx.NetworkXAlgorithmError('Delta must be greater than zero') if ebunch is None: ebunch = nx.non_edges(G) def predict(u, v): Cu = _community(G, u, community) Cv = _community(G, v, community) if Cu == Cv: cnbors = set(nx.common_neighbors(G, u, v)) within = set(w for w in cnbors if _community(G, w, community) == Cu) inter = cnbors - within return len(within) / (len(inter) + delta) else: return 0 return ((u, v, predict(u, v)) for u, v in ebunch) def _community(G, u, community): """Get the community of the given node.""" node_u = G.node[u] try: return node_u[community] except KeyError: raise nx.NetworkXAlgorithmError('No community information') networkx-1.11/networkx/algorithms/flow/utils.py # -*- coding: utf-8 -*- """ Utility classes and functions for network flow algorithms. """ __author__ = """ysitu """ # Copyright (C) 2014 ysitu # All rights reserved. # BSD license. from collections import deque import networkx as nx __all__ = ['CurrentEdge', 'Level', 'GlobalRelabelThreshold', 'build_residual_network', 'detect_unboundedness', 'build_flow_dict'] class CurrentEdge(object): """Mechanism for iterating over out-edges incident to a node in a circular manner. StopIteration exception is raised when wraparound occurs. """ __slots__ = ('_edges', '_it', '_curr') def __init__(self, edges): self._edges = edges if self._edges: self._rewind() def get(self): return self._curr def move_to_next(self): try: self._curr = next(self._it) except StopIteration: self._rewind() raise def _rewind(self): self._it = iter(self._edges.items()) self._curr = next(self._it) class Level(object): """Active and inactive nodes in a level.
""" __slots__ = ('active', 'inactive') def __init__(self): self.active = set() self.inactive = set() class GlobalRelabelThreshold(object): """Measurement of work before the global relabeling heuristic should be applied. """ def __init__(self, n, m, freq): self._threshold = (n + m) / freq if freq else float('inf') self._work = 0 def add_work(self, work): self._work += work def is_reached(self): return self._work >= self._threshold def clear_work(self): self._work = 0 def build_residual_network(G, capacity): """Build a residual network and initialize a zero flow. The residual network :samp:`R` from an input graph :samp:`G` has the same nodes as :samp:`G`. :samp:`R` is a DiGraph that contains a pair of edges :samp:`(u, v)` and :samp:`(v, u)` iff :samp:`(u, v)` is not a self-loop, and at least one of :samp:`(u, v)` and :samp:`(v, u)` exists in :samp:`G`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['capacity']` is equal to the capacity of :samp:`(u, v)` in :samp:`G` if it exists in :samp:`G` or zero otherwise. If the capacity is infinite, :samp:`R[u][v]['capacity']` will have a high arbitrary finite value that does not affect the solution of the problem. This value is stored in :samp:`R.graph['inf']`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['flow']` represents the flow function of :samp:`(u, v)` and satisfies :samp:`R[u][v]['flow'] == -R[v][u]['flow']`. The flow value, defined as the total flow into :samp:`t`, the sink, is stored in :samp:`R.graph['flow_value']`. If :samp:`cutoff` is not specified, reachability to :samp:`t` using only edges :samp:`(u, v)` such that :samp:`R[u][v]['flow'] < R[u][v]['capacity']` induces a minimum :samp:`s`-:samp:`t` cut. """ if G.is_multigraph(): raise nx.NetworkXError( 'MultiGraph and MultiDiGraph not supported (yet).') R = nx.DiGraph() R.add_nodes_from(G) inf = float('inf') # Extract edges with positive capacities. Self loops excluded. 
edge_list = [(u, v, attr) for u, v, attr in G.edges_iter(data=True) if u != v and attr.get(capacity, inf) > 0] # Simulate infinity with three times the sum of the finite edge capacities # or any positive value if the sum is zero. This allows the # infinite-capacity edges to be distinguished for unboundedness detection # and directly participate in residual capacity calculation. If the maximum # flow is finite, these edges cannot appear in the minimum cut and thus # guarantee correctness. Since the residual capacity of an # infinite-capacity edge is always at least 2/3 of inf, while that of a # finite-capacity edge is at most 1/3 of inf, if an operation moves more # than 1/3 of inf units of flow to t, there must be an infinite-capacity # s-t path in G. inf = 3 * sum(attr[capacity] for u, v, attr in edge_list if capacity in attr and attr[capacity] != inf) or 1 if G.is_directed(): for u, v, attr in edge_list: r = min(attr.get(capacity, inf), inf) if not R.has_edge(u, v): # Both (u, v) and (v, u) must be present in the residual # network. R.add_edge(u, v, capacity=r) R.add_edge(v, u, capacity=0) else: # The edge (u, v) was added when (v, u) was visited. R[u][v]['capacity'] = r else: for u, v, attr in edge_list: # Add a pair of edges with equal residual capacities. r = min(attr.get(capacity, inf), inf) R.add_edge(u, v, capacity=r) R.add_edge(v, u, capacity=r) # Record the value simulating infinity. R.graph['inf'] = inf return R def detect_unboundedness(R, s, t): """Detect an infinite-capacity s-t path in R. """ q = deque([s]) seen = set([s]) inf = R.graph['inf'] while q: u = q.popleft() for v, attr in R[u].items(): if attr['capacity'] == inf and v not in seen: if v == t: raise nx.NetworkXUnbounded( 'Infinite capacity path, flow unbounded above.') seen.add(v) q.append(v) def build_flow_dict(G, R): """Build a flow dictionary from a residual network.
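The comment above sets the simulated infinity to three times the sum of the finite capacities, so any residual capacity above a third of that sentinel must come from an infinite-capacity edge. A minimal sketch of the computation (the edge data here is illustrative):

```python
# Simulated infinity as computed above: 3 * (sum of finite capacities),
# falling back to 1 when that sum is zero.
inf = float('inf')
edge_list = [('a', 'b', {'capacity': 4}),
             ('b', 'c', {}),                 # missing attribute: infinite
             ('c', 'd', {'capacity': inf})]  # explicit infinity

simulated_inf = 3 * sum(attr['capacity'] for u, v, attr in edge_list
                        if 'capacity' in attr and attr['capacity'] != inf) or 1
print(simulated_inf)  # 12: both infinite-capacity edges get capacity 12 in R
```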
""" flow_dict = {} for u in G: flow_dict[u] = dict((v, 0) for v in G[u]) flow_dict[u].update((v, attr['flow']) for v, attr in R[u].items() if attr['flow'] > 0) return flow_dict networkx-1.11/networkx/algorithms/flow/maxflow.py0000644000175000017500000005425412637544500022243 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Maximum flow (and minimum cut) algorithms on capacitated graphs. """ import networkx as nx # Define the default flow function for computing maximum flow. from .edmondskarp import edmonds_karp from .preflowpush import preflow_push from .shortestaugmentingpath import shortest_augmenting_path from .utils import build_flow_dict default_flow_func = preflow_push __all__ = ['maximum_flow', 'maximum_flow_value', 'minimum_cut', 'minimum_cut_value'] def maximum_flow(G, s, t, capacity='capacity', flow_func=None, **kwargs): """Find a maximum single-commodity flow. Parameters ---------- G : NetworkX graph Edges of the graph are expected to have an attribute called 'capacity'. If this attribute is not present, the edge is considered to have infinite capacity. s : node Source node for the flow. t : node Sink node for the flow. capacity : string Edges of the graph G are expected to have an attribute capacity that indicates how much flow the edge can support. If this attribute is not present, the edge is considered to have infinite capacity. Default value: 'capacity'. flow_func : function A function for computing the maximum flow among a pair of nodes in a capacitated graph. The function has to accept at least three parameters: a Graph or Digraph, a source node, and a target node. And return a residual network that follows NetworkX conventions (see Notes). If flow_func is None, the default maximum flow function (:meth:`preflow_push`) is used. See below for alternative algorithms. The choice of the default function may change from version to version and should not be relied on. Default value: None. 
kwargs : Any other keyword parameter is passed to the function that computes the maximum flow. Returns ------- flow_value : integer, float Value of the maximum flow, i.e., net outflow from the source. flow_dict : dict A dictionary containing the value of the flow that went through each edge. Raises ------ NetworkXError The algorithm does not support MultiGraph and MultiDiGraph. If the input graph is an instance of one of these two classes, a NetworkXError is raised. NetworkXUnbounded If the graph has a path of infinite capacity, the value of a feasible flow on the graph is unbounded above and the function raises a NetworkXUnbounded. See also -------- :meth:`maximum_flow_value` :meth:`minimum_cut` :meth:`minimum_cut_value` :meth:`edmonds_karp` :meth:`preflow_push` :meth:`shortest_augmenting_path` Notes ----- The function used in the flow_func parameter has to return a residual network that follows NetworkX conventions: The residual network :samp:`R` from an input graph :samp:`G` has the same nodes as :samp:`G`. :samp:`R` is a DiGraph that contains a pair of edges :samp:`(u, v)` and :samp:`(v, u)` iff :samp:`(u, v)` is not a self-loop, and at least one of :samp:`(u, v)` and :samp:`(v, u)` exists in :samp:`G`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['capacity']` is equal to the capacity of :samp:`(u, v)` in :samp:`G` if it exists in :samp:`G` or zero otherwise. If the capacity is infinite, :samp:`R[u][v]['capacity']` will have a high arbitrary finite value that does not affect the solution of the problem. This value is stored in :samp:`R.graph['inf']`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['flow']` represents the flow function of :samp:`(u, v)` and satisfies :samp:`R[u][v]['flow'] == -R[v][u]['flow']`. The flow value, defined as the total flow into :samp:`t`, the sink, is stored in :samp:`R.graph['flow_value']`.
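The flow convention described in these Notes is skew-symmetric. A small sketch with plain dicts standing in for the residual network (the numbers are illustrative):

```python
# Residual-network convention: each edge (u, v) is paired with (v, u),
# and flows satisfy R[u][v]['flow'] == -R[v][u]['flow'].
R = {'u': {'v': {'capacity': 5, 'flow': 3}},
     'v': {'u': {'capacity': 0, 'flow': -3}}}

assert R['u']['v']['flow'] == -R['v']['u']['flow']
# Residual capacity of the reverse edge, capacity - flow = 0 - (-3) = 3,
# is exactly the flow on (u, v) that an algorithm may still cancel.
assert R['v']['u']['capacity'] - R['v']['u']['flow'] == 3
```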
Reachability to :samp:`t` using only edges :samp:`(u, v)` such that :samp:`R[u][v]['flow'] < R[u][v]['capacity']` induces a minimum :samp:`s`-:samp:`t` cut. Specific algorithms may store extra data in :samp:`R`. The function should support an optional boolean parameter value_only. When True, it can optionally terminate the algorithm as soon as the maximum flow value and the minimum cut can be determined. Examples -------- >>> import networkx as nx >>> G = nx.DiGraph() >>> G.add_edge('x','a', capacity=3.0) >>> G.add_edge('x','b', capacity=1.0) >>> G.add_edge('a','c', capacity=3.0) >>> G.add_edge('b','c', capacity=5.0) >>> G.add_edge('b','d', capacity=4.0) >>> G.add_edge('d','e', capacity=2.0) >>> G.add_edge('c','y', capacity=2.0) >>> G.add_edge('e','y', capacity=3.0) maximum_flow returns both the value of the maximum flow and a dictionary with all flows. >>> flow_value, flow_dict = nx.maximum_flow(G, 'x', 'y') >>> flow_value 3.0 >>> print(flow_dict['x']['b']) 1.0 You can also use alternative algorithms for computing the maximum flow by using the flow_func parameter. >>> from networkx.algorithms.flow import shortest_augmenting_path >>> flow_value == nx.maximum_flow(G, 'x', 'y', ... flow_func=shortest_augmenting_path)[0] True """ if flow_func is None: if kwargs: raise nx.NetworkXError("You have to explicitly set a flow_func if" " you need to pass parameters via kwargs.") flow_func = default_flow_func if not callable(flow_func): raise nx.NetworkXError("flow_func has to be callable.") R = flow_func(G, s, t, capacity=capacity, value_only=False, **kwargs) flow_dict = build_flow_dict(G, R) return (R.graph['flow_value'], flow_dict) def maximum_flow_value(G, s, t, capacity='capacity', flow_func=None, **kwargs): """Find the value of maximum single-commodity flow. Parameters ---------- G : NetworkX graph Edges of the graph are expected to have an attribute called 'capacity'. If this attribute is not present, the edge is considered to have infinite capacity.
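A valid flow_dict satisfies conservation at every node other than the source and the sink. The check below replays that property on a flow_dict shaped like maximum_flow's output for the example graph; the numeric values were worked out by hand and are illustrative:

```python
# Flow conservation check: net outflow is +flow_value at the source,
# -flow_value at the sink, and zero everywhere else.
flow = {'x': {'a': 2.0, 'b': 1.0}, 'a': {'c': 2.0},
        'b': {'c': 0.0, 'd': 1.0}, 'c': {'y': 2.0},
        'd': {'e': 1.0}, 'e': {'y': 1.0}, 'y': {}}

def net_outflow(n):
    out = sum(flow[n].values())
    into = sum(f[n] for f in flow.values() if n in f)
    return out - into

assert net_outflow('x') == 3.0   # the source emits the flow value
assert net_outflow('y') == -3.0  # the sink absorbs it
assert all(net_outflow(n) == 0.0 for n in 'abcde')
```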
s : node Source node for the flow. t : node Sink node for the flow. capacity : string Edges of the graph G are expected to have an attribute capacity that indicates how much flow the edge can support. If this attribute is not present, the edge is considered to have infinite capacity. Default value: 'capacity'. flow_func : function A function for computing the maximum flow among a pair of nodes in a capacitated graph. The function has to accept at least three parameters: a Graph or Digraph, a source node, and a target node. And return a residual network that follows NetworkX conventions (see Notes). If flow_func is None, the default maximum flow function (:meth:`preflow_push`) is used. See below for alternative algorithms. The choice of the default function may change from version to version and should not be relied on. Default value: None. kwargs : Any other keyword parameter is passed to the function that computes the maximum flow. Returns ------- flow_value : integer, float Value of the maximum flow, i.e., net outflow from the source. Raises ------ NetworkXError The algorithm does not support MultiGraph and MultiDiGraph. If the input graph is an instance of one of these two classes, a NetworkXError is raised. NetworkXUnbounded If the graph has a path of infinite capacity, the value of a feasible flow on the graph is unbounded above and the function raises a NetworkXUnbounded. See also -------- :meth:`maximum_flow` :meth:`minimum_cut` :meth:`minimum_cut_value` :meth:`edmonds_karp` :meth:`preflow_push` :meth:`shortest_augmenting_path` Notes ----- The function used in the flow_func parameter has to return a residual network that follows NetworkX conventions: The residual network :samp:`R` from an input graph :samp:`G` has the same nodes as :samp:`G`. :samp:`R` is a DiGraph that contains a pair of edges :samp:`(u, v)` and :samp:`(v, u)` iff :samp:`(u, v)` is not a self-loop, and at least one of :samp:`(u, v)` and :samp:`(v, u)` exists in :samp:`G`.
For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['capacity']` is equal to the capacity of :samp:`(u, v)` in :samp:`G` if it exists in :samp:`G` or zero otherwise. If the capacity is infinite, :samp:`R[u][v]['capacity']` will have a high arbitrary finite value that does not affect the solution of the problem. This value is stored in :samp:`R.graph['inf']`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['flow']` represents the flow function of :samp:`(u, v)` and satisfies :samp:`R[u][v]['flow'] == -R[v][u]['flow']`. The flow value, defined as the total flow into :samp:`t`, the sink, is stored in :samp:`R.graph['flow_value']`. Reachability to :samp:`t` using only edges :samp:`(u, v)` such that :samp:`R[u][v]['flow'] < R[u][v]['capacity']` induces a minimum :samp:`s`-:samp:`t` cut. Specific algorithms may store extra data in :samp:`R`. The function should support an optional boolean parameter value_only. When True, it can optionally terminate the algorithm as soon as the maximum flow value and the minimum cut can be determined. Examples -------- >>> import networkx as nx >>> G = nx.DiGraph() >>> G.add_edge('x','a', capacity=3.0) >>> G.add_edge('x','b', capacity=1.0) >>> G.add_edge('a','c', capacity=3.0) >>> G.add_edge('b','c', capacity=5.0) >>> G.add_edge('b','d', capacity=4.0) >>> G.add_edge('d','e', capacity=2.0) >>> G.add_edge('c','y', capacity=2.0) >>> G.add_edge('e','y', capacity=3.0) maximum_flow_value computes only the value of the maximum flow: >>> flow_value = nx.maximum_flow_value(G, 'x', 'y') >>> flow_value 3.0 You can also use alternative algorithms for computing the maximum flow by using the flow_func parameter. >>> from networkx.algorithms.flow import shortest_augmenting_path >>> flow_value == nx.maximum_flow_value(G, 'x', 'y', ...
flow_func=shortest_augmenting_path) True """ if flow_func is None: if kwargs: raise nx.NetworkXError("You have to explicitly set a flow_func if" " you need to pass parameters via kwargs.") flow_func = default_flow_func if not callable(flow_func): raise nx.NetworkXError("flow_func has to be callable.") R = flow_func(G, s, t, capacity=capacity, value_only=True, **kwargs) return R.graph['flow_value'] def minimum_cut(G, s, t, capacity='capacity', flow_func=None, **kwargs): """Compute the value and the node partition of a minimum (s, t)-cut. Use the max-flow min-cut theorem, i.e., the capacity of a minimum capacity cut is equal to the flow value of a maximum flow. Parameters ---------- G : NetworkX graph Edges of the graph are expected to have an attribute called 'capacity'. If this attribute is not present, the edge is considered to have infinite capacity. s : node Source node for the flow. t : node Sink node for the flow. capacity : string Edges of the graph G are expected to have an attribute capacity that indicates how much flow the edge can support. If this attribute is not present, the edge is considered to have infinite capacity. Default value: 'capacity'. flow_func : function A function for computing the maximum flow among a pair of nodes in a capacitated graph. The function has to accept at least three parameters: a Graph or Digraph, a source node, and a target node. And return a residual network that follows NetworkX conventions (see Notes). If flow_func is None, the default maximum flow function (:meth:`preflow_push`) is used. See below for alternative algorithms. The choice of the default function may change from version to version and should not be relied on. Default value: None. kwargs : Any other keyword parameter is passed to the function that computes the maximum flow. Returns ------- cut_value : integer, float Value of the minimum cut. partition : pair of node sets A partitioning of the nodes that defines a minimum cut. 
Raises ------ NetworkXUnbounded If the graph has a path of infinite capacity, all cuts have infinite capacity and the function raises a NetworkXUnbounded. See also -------- :meth:`maximum_flow` :meth:`maximum_flow_value` :meth:`minimum_cut_value` :meth:`edmonds_karp` :meth:`preflow_push` :meth:`shortest_augmenting_path` Notes ----- The function used in the flow_func parameter has to return a residual network that follows NetworkX conventions: The residual network :samp:`R` from an input graph :samp:`G` has the same nodes as :samp:`G`. :samp:`R` is a DiGraph that contains a pair of edges :samp:`(u, v)` and :samp:`(v, u)` iff :samp:`(u, v)` is not a self-loop, and at least one of :samp:`(u, v)` and :samp:`(v, u)` exists in :samp:`G`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['capacity']` is equal to the capacity of :samp:`(u, v)` in :samp:`G` if it exists in :samp:`G` or zero otherwise. If the capacity is infinite, :samp:`R[u][v]['capacity']` will have a high arbitrary finite value that does not affect the solution of the problem. This value is stored in :samp:`R.graph['inf']`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['flow']` represents the flow function of :samp:`(u, v)` and satisfies :samp:`R[u][v]['flow'] == -R[v][u]['flow']`. The flow value, defined as the total flow into :samp:`t`, the sink, is stored in :samp:`R.graph['flow_value']`. Reachability to :samp:`t` using only edges :samp:`(u, v)` such that :samp:`R[u][v]['flow'] < R[u][v]['capacity']` induces a minimum :samp:`s`-:samp:`t` cut. Specific algorithms may store extra data in :samp:`R`. The function should support an optional boolean parameter value_only. When True, it can optionally terminate the algorithm as soon as the maximum flow value and the minimum cut can be determined.
Examples -------- >>> import networkx as nx >>> G = nx.DiGraph() >>> G.add_edge('x','a', capacity = 3.0) >>> G.add_edge('x','b', capacity = 1.0) >>> G.add_edge('a','c', capacity = 3.0) >>> G.add_edge('b','c', capacity = 5.0) >>> G.add_edge('b','d', capacity = 4.0) >>> G.add_edge('d','e', capacity = 2.0) >>> G.add_edge('c','y', capacity = 2.0) >>> G.add_edge('e','y', capacity = 3.0) minimum_cut computes both the value of the minimum cut and the node partition: >>> cut_value, partition = nx.minimum_cut(G, 'x', 'y') >>> reachable, non_reachable = partition 'partition' here is a tuple with the two sets of nodes that define the minimum cut. You can compute the cut set of edges that induce the minimum cut as follows: >>> cutset = set() >>> for u, nbrs in ((n, G[n]) for n in reachable): ... cutset.update((u, v) for v in nbrs if v in non_reachable) >>> print(sorted(cutset)) [('c', 'y'), ('x', 'b')] >>> cut_value == sum(G.edge[u][v]['capacity'] for (u, v) in cutset) True You can also use alternative algorithms for computing the minimum cut by using the flow_func parameter. >>> from networkx.algorithms.flow import shortest_augmenting_path >>> cut_value == nx.minimum_cut(G, 'x', 'y', ... 
flow_func=shortest_augmenting_path)[0] True """ if flow_func is None: if kwargs: raise nx.NetworkXError("You have to explicitly set a flow_func if" " you need to pass parameters via kwargs.") flow_func = default_flow_func if not callable(flow_func): raise nx.NetworkXError("flow_func has to be callable.") if (kwargs.get('cutoff') is not None and flow_func in (edmonds_karp, preflow_push, shortest_augmenting_path)): raise nx.NetworkXError("cutoff should not be specified.") R = flow_func(G, s, t, capacity=capacity, value_only=True, **kwargs) # Remove saturated edges from the residual network cutset = [(u, v, d) for u, v, d in R.edges(data=True) if d['flow'] == d['capacity']] R.remove_edges_from(cutset) # Then, reachable and non-reachable nodes from source in the # residual network form the node partition that defines # the minimum cut. non_reachable = set(nx.shortest_path_length(R, target=t)) partition = (set(G) - non_reachable, non_reachable) # Finally, add the cutset edges back to the residual network to # make sure that it is reusable. if cutset is not None: R.add_edges_from(cutset) return (R.graph['flow_value'], partition) def minimum_cut_value(G, s, t, capacity='capacity', flow_func=None, **kwargs): """Compute the value of a minimum (s, t)-cut. Use the max-flow min-cut theorem, i.e., the capacity of a minimum capacity cut is equal to the flow value of a maximum flow. Parameters ---------- G : NetworkX graph Edges of the graph are expected to have an attribute called 'capacity'. If this attribute is not present, the edge is considered to have infinite capacity. s : node Source node for the flow. t : node Sink node for the flow. capacity : string Edges of the graph G are expected to have an attribute capacity that indicates how much flow the edge can support. If this attribute is not present, the edge is considered to have infinite capacity. Default value: 'capacity'.
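The cutset computation in the minimum_cut example can be replayed on plain dicts. The partition below is the one residual reachability yields for the docstring's example graph, worked out by hand for illustration:

```python
# Deriving the cut set and its value from a node partition, mirroring
# the minimum_cut example; capacities match the docstring graph.
G = {'x': {'a': 3.0, 'b': 1.0}, 'a': {'c': 3.0},
     'b': {'c': 5.0, 'd': 4.0}, 'c': {'y': 2.0},
     'd': {'e': 2.0}, 'e': {'y': 3.0}, 'y': {}}
reachable = {'x', 'a', 'c'}          # source side of the minimum cut
non_reachable = set(G) - reachable

cutset = {(u, v) for u in reachable for v in G[u] if v in non_reachable}
print(sorted(cutset))                # [('c', 'y'), ('x', 'b')]
print(sum(G[u][v] for u, v in cutset))  # 3.0, the minimum cut value
```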
flow_func : function A function for computing the maximum flow among a pair of nodes in a capacitated graph. The function has to accept at least three parameters: a Graph or Digraph, a source node, and a target node. And return a residual network that follows NetworkX conventions (see Notes). If flow_func is None, the default maximum flow function (:meth:`preflow_push`) is used. See below for alternative algorithms. The choice of the default function may change from version to version and should not be relied on. Default value: None. kwargs : Any other keyword parameter is passed to the function that computes the maximum flow. Returns ------- cut_value : integer, float Value of the minimum cut. Raises ------ NetworkXUnbounded If the graph has a path of infinite capacity, all cuts have infinite capacity and the function raises a NetworkXUnbounded. See also -------- :meth:`maximum_flow` :meth:`maximum_flow_value` :meth:`minimum_cut` :meth:`edmonds_karp` :meth:`preflow_push` :meth:`shortest_augmenting_path` Notes ----- The function used in the flow_func parameter has to return a residual network that follows NetworkX conventions: The residual network :samp:`R` from an input graph :samp:`G` has the same nodes as :samp:`G`. :samp:`R` is a DiGraph that contains a pair of edges :samp:`(u, v)` and :samp:`(v, u)` iff :samp:`(u, v)` is not a self-loop, and at least one of :samp:`(u, v)` and :samp:`(v, u)` exists in :samp:`G`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['capacity']` is equal to the capacity of :samp:`(u, v)` in :samp:`G` if it exists in :samp:`G` or zero otherwise. If the capacity is infinite, :samp:`R[u][v]['capacity']` will have a high arbitrary finite value that does not affect the solution of the problem. This value is stored in :samp:`R.graph['inf']`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['flow']` represents the flow function of :samp:`(u, v)` and satisfies :samp:`R[u][v]['flow'] == -R[v][u]['flow']`.
The flow value, defined as the total flow into :samp:`t`, the sink, is stored in :samp:`R.graph['flow_value']`. Reachability to :samp:`t` using only edges :samp:`(u, v)` such that :samp:`R[u][v]['flow'] < R[u][v]['capacity']` induces a minimum :samp:`s`-:samp:`t` cut. Specific algorithms may store extra data in :samp:`R`. The function should support an optional boolean parameter value_only. When True, it can optionally terminate the algorithm as soon as the maximum flow value and the minimum cut can be determined. Examples -------- >>> import networkx as nx >>> G = nx.DiGraph() >>> G.add_edge('x','a', capacity = 3.0) >>> G.add_edge('x','b', capacity = 1.0) >>> G.add_edge('a','c', capacity = 3.0) >>> G.add_edge('b','c', capacity = 5.0) >>> G.add_edge('b','d', capacity = 4.0) >>> G.add_edge('d','e', capacity = 2.0) >>> G.add_edge('c','y', capacity = 2.0) >>> G.add_edge('e','y', capacity = 3.0) minimum_cut_value computes only the value of the minimum cut: >>> cut_value = nx.minimum_cut_value(G, 'x', 'y') >>> cut_value 3.0 You can also use alternative algorithms for computing the minimum cut by using the flow_func parameter. >>> from networkx.algorithms.flow import shortest_augmenting_path >>> cut_value == nx.minimum_cut_value(G, 'x', 'y', ...
flow_func=shortest_augmenting_path) True """ if flow_func is None: if kwargs: raise nx.NetworkXError("You have to explicitly set a flow_func if" " you need to pass parameters via kwargs.") flow_func = default_flow_func if not callable(flow_func): raise nx.NetworkXError("flow_func has to be callable.") if (kwargs.get('cutoff') is not None and flow_func in (edmonds_karp, preflow_push, shortest_augmenting_path)): raise nx.NetworkXError("cutoff should not be specified.") R = flow_func(G, s, t, capacity=capacity, value_only=True, **kwargs) return R.graph['flow_value'] networkx-1.11/networkx/algorithms/flow/shortestaugmentingpath.py # -*- coding: utf-8 -*- """ Shortest augmenting path algorithm for maximum flow problems. """ __author__ = """ysitu """ # Copyright (C) 2014 ysitu # All rights reserved. # BSD license. from collections import deque import networkx as nx from .utils import * from .edmondskarp import edmonds_karp_core __all__ = ['shortest_augmenting_path'] def shortest_augmenting_path_impl(G, s, t, capacity, residual, two_phase, cutoff): """Implementation of the shortest augmenting path algorithm. """ if s not in G: raise nx.NetworkXError('node %s not in graph' % str(s)) if t not in G: raise nx.NetworkXError('node %s not in graph' % str(t)) if s == t: raise nx.NetworkXError('source and sink are the same node') if residual is None: R = build_residual_network(G, capacity) else: R = residual R_node = R.node R_pred = R.pred R_succ = R.succ # Initialize/reset the residual network. for u in R: for e in R_succ[u].values(): e['flow'] = 0 # Initialize heights of the nodes. heights = {t: 0} q = deque([(t, 0)]) while q: u, height = q.popleft() height += 1 for v, attr in R_pred[u].items(): if v not in heights and attr['flow'] < attr['capacity']: heights[v] = height q.append((v, height)) if s not in heights: # t is not reachable from s in the residual network. The maximum flow # must be zero.
R.graph['flow_value'] = 0 return R n = len(G) m = R.size() / 2 # Initialize heights and 'current edge' data structures of the nodes. for u in R: R_node[u]['height'] = heights[u] if u in heights else n R_node[u]['curr_edge'] = CurrentEdge(R_succ[u]) # Initialize counts of nodes in each level. counts = [0] * (2 * n - 1) for u in R: counts[R_node[u]['height']] += 1 inf = R.graph['inf'] def augment(path): """Augment flow along a path from s to t. """ # Determine the path residual capacity. flow = inf it = iter(path) u = next(it) for v in it: attr = R_succ[u][v] flow = min(flow, attr['capacity'] - attr['flow']) u = v if flow * 2 > inf: raise nx.NetworkXUnbounded( 'Infinite capacity path, flow unbounded above.') # Augment flow along the path. it = iter(path) u = next(it) for v in it: R_succ[u][v]['flow'] += flow R_succ[v][u]['flow'] -= flow u = v return flow def relabel(u): """Relabel a node to create an admissible edge. """ height = n - 1 for v, attr in R_succ[u].items(): if attr['flow'] < attr['capacity']: height = min(height, R_node[v]['height']) return height + 1 if cutoff is None: cutoff = float('inf') # Phase 1: Look for shortest augmenting paths using depth-first search. flow_value = 0 path = [s] u = s d = n if not two_phase else int(min(m ** 0.5, 2 * n ** (2. / 3))) done = R_node[s]['height'] >= d while not done: height = R_node[u]['height'] curr_edge = R_node[u]['curr_edge'] # Depth-first search for the next node on the path to t. while True: v, attr = curr_edge.get() if (height == R_node[v]['height'] + 1 and attr['flow'] < attr['capacity']): # Advance to the next node following an admissible edge. path.append(v) u = v break try: curr_edge.move_to_next() except StopIteration: counts[height] -= 1 if counts[height] == 0: # Gap heuristic: If relabeling causes a level to become # empty, a minimum cut has been identified. The algorithm # can now be terminated. 
R.graph['flow_value'] = flow_value return R height = relabel(u) if u == s and height >= d: if not two_phase: # t is disconnected from s in the residual network. No # more augmenting paths exist. R.graph['flow_value'] = flow_value return R else: # t is at least d steps away from s. End of phase 1. done = True break counts[height] += 1 R_node[u]['height'] = height if u != s: # After relabeling, the last edge on the path is no longer # admissible. Retreat one step to look for an alternative. path.pop() u = path[-1] break if u == t: # t is reached. Augment flow along the path and reset it for a new # depth-first search. flow_value += augment(path) if flow_value >= cutoff: R.graph['flow_value'] = flow_value return R path = [s] u = s # Phase 2: Look for shortest augmenting paths using breadth-first search. flow_value += edmonds_karp_core(R, s, t, cutoff - flow_value) R.graph['flow_value'] = flow_value return R def shortest_augmenting_path(G, s, t, capacity='capacity', residual=None, value_only=False, two_phase=False, cutoff=None): """Find a maximum single-commodity flow using the shortest augmenting path algorithm. This function returns the residual network resulting after computing the maximum flow. See below for details about the conventions NetworkX uses for defining residual networks. This algorithm has a running time of `O(n^2 m)` for `n` nodes and `m` edges. Parameters ---------- G : NetworkX graph Edges of the graph are expected to have an attribute called 'capacity'. If this attribute is not present, the edge is considered to have infinite capacity. s : node Source node for the flow. t : node Sink node for the flow. capacity : string Edges of the graph G are expected to have an attribute capacity that indicates how much flow the edge can support. If this attribute is not present, the edge is considered to have infinite capacity. Default value: 'capacity'. residual : NetworkX graph Residual network on which the algorithm is to be executed. 
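The height initialization in shortest_augmenting_path_impl is a reverse breadth-first search from the sink over non-saturated edges. A standalone sketch of that labeling on a small illustrative predecessor map:

```python
from collections import deque

# Reverse BFS from the sink t: heights[u] is u's distance to t, which
# phase 1 uses as the exact-distance label for admissible edges.
preds = {'t': ['c', 'e'], 'c': ['a', 'b'], 'e': ['d'],
         'd': ['b'], 'a': ['s'], 'b': ['s'], 's': []}

heights = {'t': 0}
q = deque([('t', 0)])
while q:
    u, height = q.popleft()
    height += 1
    for v in preds[u]:
        if v not in heights:
            heights[v] = height
            q.append((v, height))

print(heights['s'])  # 3: the source is three hops from the sink
```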
If None, a new residual network is created. Default value: None. value_only : bool If True compute only the value of the maximum flow. This parameter will be ignored by this algorithm because it is not applicable. two_phase : bool If True, a two-phase variant is used. The two-phase variant improves the running time on unit-capacity networks from `O(nm)` to `O(\min(n^{2/3}, m^{1/2}) m)`. Default value: False. cutoff : integer, float If specified, the algorithm will terminate when the flow value reaches or exceeds the cutoff. In this case, it may be unable to immediately determine a minimum cut. Default value: None. Returns ------- R : NetworkX DiGraph Residual network after computing the maximum flow. Raises ------ NetworkXError The algorithm does not support MultiGraph and MultiDiGraph. If the input graph is an instance of one of these two classes, a NetworkXError is raised. NetworkXUnbounded If the graph has a path of infinite capacity, the value of a feasible flow on the graph is unbounded above and the function raises a NetworkXUnbounded. See also -------- :meth:`maximum_flow` :meth:`minimum_cut` :meth:`edmonds_karp` :meth:`preflow_push` Notes ----- The residual network :samp:`R` from an input graph :samp:`G` has the same nodes as :samp:`G`. :samp:`R` is a DiGraph that contains a pair of edges :samp:`(u, v)` and :samp:`(v, u)` iff :samp:`(u, v)` is not a self-loop, and at least one of :samp:`(u, v)` and :samp:`(v, u)` exists in :samp:`G`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['capacity']` is equal to the capacity of :samp:`(u, v)` in :samp:`G` if it exists in :samp:`G` or zero otherwise. If the capacity is infinite, :samp:`R[u][v]['capacity']` will have a high arbitrary finite value that does not affect the solution of the problem. This value is stored in :samp:`R.graph['inf']`. 
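The residual-network convention spelled out in these notes (antiparallel edge pairs, zero capacity for directions absent from `G`, self-loops excluded) can be sketched in a few lines of plain Python. This is a toy illustration under assumed names (`build_residual` is hypothetical), not the NetworkX implementation:

```python
# Toy sketch of the residual-network convention described above: for every
# non-self-loop edge (u, v) of G, R holds both (u, v) and (v, u); a direction
# absent from G enters R with capacity 0. `build_residual` is a hypothetical
# helper, not part of NetworkX.
def build_residual(capacities):
    """capacities: dict mapping (u, v) -> capacity in the input digraph."""
    R = {}
    for (u, v), c in capacities.items():
        if u == v:
            continue  # self-loops never enter the residual network
        R[(u, v)] = {'capacity': c, 'flow': 0}
        # Reverse edge: capacity 0 unless (v, u) is itself in the graph.
        R.setdefault((v, u), {'capacity': capacities.get((v, u), 0),
                              'flow': 0})
    return R

R = build_residual({('s', 'a'): 3, ('a', 't'): 2})
```

Infinite capacities would additionally be replaced by the large finite value stored in `R.graph['inf']`, as the notes explain.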
For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['flow']` represents the flow function of :samp:`(u, v)` and satisfies :samp:`R[u][v]['flow'] == -R[v][u]['flow']`. The flow value, defined as the total flow into :samp:`t`, the sink, is stored in :samp:`R.graph['flow_value']`. If :samp:`cutoff` is not specified, reachability to :samp:`t` using only edges :samp:`(u, v)` such that :samp:`R[u][v]['flow'] < R[u][v]['capacity']` induces a minimum :samp:`s`-:samp:`t` cut. Examples -------- >>> import networkx as nx >>> from networkx.algorithms.flow import shortest_augmenting_path The functions that implement flow algorithms and output a residual network, such as this one, are not imported to the base NetworkX namespace, so you have to explicitly import them from the flow package. >>> G = nx.DiGraph() >>> G.add_edge('x','a', capacity=3.0) >>> G.add_edge('x','b', capacity=1.0) >>> G.add_edge('a','c', capacity=3.0) >>> G.add_edge('b','c', capacity=5.0) >>> G.add_edge('b','d', capacity=4.0) >>> G.add_edge('d','e', capacity=2.0) >>> G.add_edge('c','y', capacity=2.0) >>> G.add_edge('e','y', capacity=3.0) >>> R = shortest_augmenting_path(G, 'x', 'y') >>> flow_value = nx.maximum_flow_value(G, 'x', 'y') >>> flow_value 3.0 >>> flow_value == R.graph['flow_value'] True """ R = shortest_augmenting_path_impl(G, s, t, capacity, residual, two_phase, cutoff) R.graph['algorithm'] = 'shortest_augmenting_path' return R networkx-1.11/networkx/algorithms/flow/__init__.py0000644000175000017500000000040212637544450022313 0ustar aricaric00000000000000from .maxflow import * from .mincost import * from .edmondskarp import * from .preflowpush import * from .shortestaugmentingpath import * from .capacityscaling import * from .networksimplex import * from .utils import build_flow_dict, build_residual_network networkx-1.11/networkx/algorithms/flow/preflowpush.py0000644000175000017500000003656412637544500023150 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Highest-label preflow-push 
algorithm for maximum flow problems. """ __author__ = """ysitu """ # Copyright (C) 2014 ysitu # All rights reserved. # BSD license. from collections import deque from itertools import islice import networkx as nx from networkx.algorithms.flow.utils import * __all__ = ['preflow_push'] def preflow_push_impl(G, s, t, capacity, residual, global_relabel_freq, value_only): """Implementation of the highest-label preflow-push algorithm. """ if s not in G: raise nx.NetworkXError('node %s not in graph' % str(s)) if t not in G: raise nx.NetworkXError('node %s not in graph' % str(t)) if s == t: raise nx.NetworkXError('source and sink are the same node') if global_relabel_freq is None: global_relabel_freq = 0 if global_relabel_freq < 0: raise nx.NetworkXError('global_relabel_freq must be nonnegative.') if residual is None: R = build_residual_network(G, capacity) else: R = residual detect_unboundedness(R, s, t) R_node = R.node R_pred = R.pred R_succ = R.succ # Initialize/reset the residual network. for u in R: R_node[u]['excess'] = 0 for e in R_succ[u].values(): e['flow'] = 0 def reverse_bfs(src): """Perform a reverse breadth-first search from src in the residual network. """ heights = {src: 0} q = deque([(src, 0)]) while q: u, height = q.popleft() height += 1 for v, attr in R_pred[u].items(): if v not in heights and attr['flow'] < attr['capacity']: heights[v] = height q.append((v, height)) return heights # Initialize heights of the nodes. heights = reverse_bfs(t) if s not in heights: # t is not reachable from s in the residual network. The maximum flow # must be zero. R.graph['flow_value'] = 0 return R n = len(R) # max_height represents the height of the highest level below level n with # at least one active node. max_height = max(heights[u] for u in heights if u != s) heights[s] = n grt = GlobalRelabelThreshold(n, R.size(), global_relabel_freq) # Initialize heights and 'current edge' data structures of the nodes. 
for u in R: R_node[u]['height'] = heights[u] if u in heights else n + 1 R_node[u]['curr_edge'] = CurrentEdge(R_succ[u]) def push(u, v, flow): """Push flow units of flow from u to v. """ R_succ[u][v]['flow'] += flow R_succ[v][u]['flow'] -= flow R_node[u]['excess'] -= flow R_node[v]['excess'] += flow # The maximum flow must be nonzero now. Initialize the preflow by # saturating all edges emanating from s. for u, attr in R_succ[s].items(): flow = attr['capacity'] if flow > 0: push(s, u, flow) # Partition nodes into levels. levels = [Level() for i in range(2 * n)] for u in R: if u != s and u != t: level = levels[R_node[u]['height']] if R_node[u]['excess'] > 0: level.active.add(u) else: level.inactive.add(u) def activate(v): """Move a node from the inactive set to the active set of its level. """ if v != s and v != t: level = levels[R_node[v]['height']] if v in level.inactive: level.inactive.remove(v) level.active.add(v) def relabel(u): """Relabel a node to create an admissible edge. """ grt.add_work(len(R_succ[u])) return min(R_node[v]['height'] for v, attr in R_succ[u].items() if attr['flow'] < attr['capacity']) + 1 def discharge(u, is_phase1): """Discharge a node until it becomes inactive or, during phase 1 (see below), its height reaches at least n. The node is known to have the largest height among active nodes. """ height = R_node[u]['height'] curr_edge = R_node[u]['curr_edge'] # next_height represents the next height to examine after discharging # the current node. During phase 1, it is capped to below n. next_height = height levels[height].active.remove(u) while True: v, attr = curr_edge.get() if (height == R_node[v]['height'] + 1 and attr['flow'] < attr['capacity']): flow = min(R_node[u]['excess'], attr['capacity'] - attr['flow']) push(u, v, flow) activate(v) if R_node[u]['excess'] == 0: # The node has become inactive. 
levels[height].inactive.add(u) break try: curr_edge.move_to_next() except StopIteration: # We have run off the end of the adjacency list, and there can # be no more admissible edges. Relabel the node to create one. height = relabel(u) if is_phase1 and height >= n - 1: # Although the node is still active, with a height at least # n - 1, it is now known to be on the s side of the minimum # s-t cut. Stop processing it until phase 2. levels[height].active.add(u) break # The first relabel operation after global relabeling may not # increase the height of the node since the 'current edge' data # structure is not rewound. Use height instead of (height - 1) # in case other active nodes at the same level are missed. next_height = height R_node[u]['height'] = height return next_height def gap_heuristic(height): """Apply the gap heuristic. """ # Move all nodes at levels (height + 1) to max_height to level n + 1. for level in islice(levels, height + 1, max_height + 1): for u in level.active: R_node[u]['height'] = n + 1 for u in level.inactive: R_node[u]['height'] = n + 1 levels[n + 1].active.update(level.active) level.active.clear() levels[n + 1].inactive.update(level.inactive) level.inactive.clear() def global_relabel(from_sink): """Apply the global relabeling heuristic. """ src = t if from_sink else s heights = reverse_bfs(src) if not from_sink: # s must be reachable from t. Remove t explicitly. del heights[t] max_height = max(heights.values()) if from_sink: # Also mark nodes from which t is unreachable for relabeling. This # serves the same purpose as the gap heuristic. for u in R: if u not in heights and R_node[u]['height'] < n: heights[u] = n + 1 else: # Shift the computed heights because the height of s is n. 
for u in heights: heights[u] += n max_height += n del heights[src] for u, new_height in heights.items(): old_height = R_node[u]['height'] if new_height != old_height: if u in levels[old_height].active: levels[old_height].active.remove(u) levels[new_height].active.add(u) else: levels[old_height].inactive.remove(u) levels[new_height].inactive.add(u) R_node[u]['height'] = new_height return max_height # Phase 1: Find the maximum preflow by pushing as much flow as possible to # t. height = max_height while height > 0: # Discharge active nodes in the current level. while True: level = levels[height] if not level.active: # All active nodes in the current level have been discharged. # Move to the next lower level. height -= 1 break # Record the old height and level for the gap heuristic. old_height = height old_level = level u = next(iter(level.active)) height = discharge(u, True) if grt.is_reached(): # Global relabeling heuristic: Recompute the exact heights of # all nodes. height = global_relabel(True) max_height = height grt.clear_work() elif not old_level.active and not old_level.inactive: # Gap heuristic: If the level at old_height is empty (a 'gap'), # a minimum cut has been identified. All nodes with heights # above old_height can have their heights set to n + 1 and not # be further processed before a maximum preflow is found. gap_heuristic(old_height) height = old_height - 1 max_height = height else: # Update the height of the highest level with at least one # active node. max_height = max(max_height, height) # A maximum preflow has been found. The excess at t is the maximum flow # value. if value_only: R.graph['flow_value'] = R_node[t]['excess'] return R # Phase 2: Convert the maximum preflow into a maximum flow by returning the # excess to s. # Relabel all nodes so that they have accurate heights. height = global_relabel(False) grt.clear_work() # Continue to discharge the active nodes. while height > n: # Discharge active nodes in the current level. 
while True: level = levels[height] if not level.active: # All active nodes in the current level have been discharged. # Move to the next lower level. height -= 1 break u = next(iter(level.active)) height = discharge(u, False) if grt.is_reached(): # Global relabeling heuristic. height = global_relabel(False) grt.clear_work() R.graph['flow_value'] = R_node[t]['excess'] return R def preflow_push(G, s, t, capacity='capacity', residual=None, global_relabel_freq=1, value_only=False): """Find a maximum single-commodity flow using the highest-label preflow-push algorithm. This function returns the residual network resulting after computing the maximum flow. See below for details about the conventions NetworkX uses for defining residual networks. This algorithm has a running time of `O(n^2 \sqrt{m})` for `n` nodes and `m` edges. Parameters ---------- G : NetworkX graph Edges of the graph are expected to have an attribute called 'capacity'. If this attribute is not present, the edge is considered to have infinite capacity. s : node Source node for the flow. t : node Sink node for the flow. capacity : string Edges of the graph G are expected to have an attribute capacity that indicates how much flow the edge can support. If this attribute is not present, the edge is considered to have infinite capacity. Default value: 'capacity'. residual : NetworkX graph Residual network on which the algorithm is to be executed. If None, a new residual network is created. Default value: None. global_relabel_freq : integer, float Relative frequency of applying the global relabeling heuristic to speed up the algorithm. If it is None, the heuristic is disabled. Default value: 1. value_only : bool If False, compute a maximum flow; otherwise, compute a maximum preflow which is enough for computing the maximum flow value. Default value: False. Returns ------- R : NetworkX DiGraph Residual network after computing the maximum flow. 
Raises ------ NetworkXError The algorithm does not support MultiGraph and MultiDiGraph. If the input graph is an instance of one of these two classes, a NetworkXError is raised. NetworkXUnbounded If the graph has a path of infinite capacity, the value of a feasible flow on the graph is unbounded above and the function raises a NetworkXUnbounded. See also -------- :meth:`maximum_flow` :meth:`minimum_cut` :meth:`edmonds_karp` :meth:`shortest_augmenting_path` Notes ----- The residual network :samp:`R` from an input graph :samp:`G` has the same nodes as :samp:`G`. :samp:`R` is a DiGraph that contains a pair of edges :samp:`(u, v)` and :samp:`(v, u)` iff :samp:`(u, v)` is not a self-loop, and at least one of :samp:`(u, v)` and :samp:`(v, u)` exists in :samp:`G`. For each node :samp:`u` in :samp:`R`, :samp:`R.node[u]['excess']` represents the difference between flow into :samp:`u` and flow out of :samp:`u`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['capacity']` is equal to the capacity of :samp:`(u, v)` in :samp:`G` if it exists in :samp:`G` or zero otherwise. If the capacity is infinite, :samp:`R[u][v]['capacity']` will have a high arbitrary finite value that does not affect the solution of the problem. This value is stored in :samp:`R.graph['inf']`. For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['flow']` represents the flow function of :samp:`(u, v)` and satisfies :samp:`R[u][v]['flow'] == -R[v][u]['flow']`. The flow value, defined as the total flow into :samp:`t`, the sink, is stored in :samp:`R.graph['flow_value']`. Reachability to :samp:`t` using only edges :samp:`(u, v)` such that :samp:`R[u][v]['flow'] < R[u][v]['capacity']` induces a minimum :samp:`s`-:samp:`t` cut. 
Examples -------- >>> import networkx as nx >>> from networkx.algorithms.flow import preflow_push The functions that implement flow algorithms and output a residual network, such as this one, are not imported to the base NetworkX namespace, so you have to explicitly import them from the flow package. >>> G = nx.DiGraph() >>> G.add_edge('x','a', capacity=3.0) >>> G.add_edge('x','b', capacity=1.0) >>> G.add_edge('a','c', capacity=3.0) >>> G.add_edge('b','c', capacity=5.0) >>> G.add_edge('b','d', capacity=4.0) >>> G.add_edge('d','e', capacity=2.0) >>> G.add_edge('c','y', capacity=2.0) >>> G.add_edge('e','y', capacity=3.0) >>> R = preflow_push(G, 'x', 'y') >>> flow_value = nx.maximum_flow_value(G, 'x', 'y') >>> flow_value == R.graph['flow_value'] True >>> # preflow_push also stores the maximum flow value >>> # in the excess attribute of the sink node t >>> flow_value == R.node['y']['excess'] True >>> # For some problems, you might only want to compute a >>> # maximum preflow. >>> R = preflow_push(G, 'x', 'y', value_only=True) >>> flow_value == R.graph['flow_value'] True >>> flow_value == R.node['y']['excess'] True """ R = preflow_push_impl(G, s, t, capacity, residual, global_relabel_freq, value_only) R.graph['algorithm'] = 'preflow_push' return R networkx-1.11/networkx/algorithms/flow/mincost.py0000644000175000017500000002575212637544500022243 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Minimum cost flow algorithms on directed connected graphs. """ __author__ = """Loïc Séguin-C. """ # Copyright (C) 2010 Loïc Séguin-C. # All rights reserved. # BSD license. __all__ = ['min_cost_flow_cost', 'min_cost_flow', 'cost_of_flow', 'max_flow_min_cost'] import networkx as nx def min_cost_flow_cost(G, demand = 'demand', capacity = 'capacity', weight = 'weight'): r"""Find the cost of a minimum cost flow satisfying all demands in digraph G. G is a digraph with edge costs and capacities and in which nodes have demand, i.e., they want to send or receive some amount of flow. 
A negative demand means that the node wants to send flow, a positive
    demand means that the node wants to receive flow. A flow on the digraph G
    satisfies all demands if the net flow into each node is equal to the
    demand of that node.

    Parameters
    ----------
    G : NetworkX graph
        DiGraph on which a minimum cost flow satisfying all demands is to be
        found.

    demand : string
        Nodes of the graph G are expected to have an attribute demand that
        indicates how much flow a node wants to send (negative demand) or
        receive (positive demand). Note that the sum of the demands should
        be 0; otherwise the problem is not feasible. If this attribute is
        not present, a node is considered to have 0 demand. Default value:
        'demand'.

    capacity : string
        Edges of the graph G are expected to have an attribute capacity that
        indicates how much flow the edge can support. If this attribute is
        not present, the edge is considered to have infinite capacity.
        Default value: 'capacity'.

    weight : string
        Edges of the graph G are expected to have an attribute weight that
        indicates the cost incurred by sending one unit of flow on that
        edge. If not present, the weight is considered to be 0. Default
        value: 'weight'.

    Returns
    -------
    flowCost : integer, float
        Cost of a minimum cost flow satisfying all demands.

    Raises
    ------
    NetworkXError
        This exception is raised if the input graph is not directed or not
        connected.

    NetworkXUnfeasible
        This exception is raised in the following situations:

            * The sum of the demands is not zero. Then, there is no
              flow satisfying all demands.
            * There is no flow satisfying all demands.

    NetworkXUnbounded
        This exception is raised if the digraph G has a cycle of
        negative cost and infinite capacity. Then, the cost of a flow
        satisfying all demands is unbounded below.

    See also
    --------
    cost_of_flow, max_flow_min_cost, min_cost_flow, network_simplex

    Examples
    --------
    A simple example of a min cost flow problem.
>>> import networkx as nx >>> G = nx.DiGraph() >>> G.add_node('a', demand = -5) >>> G.add_node('d', demand = 5) >>> G.add_edge('a', 'b', weight = 3, capacity = 4) >>> G.add_edge('a', 'c', weight = 6, capacity = 10) >>> G.add_edge('b', 'd', weight = 1, capacity = 9) >>> G.add_edge('c', 'd', weight = 2, capacity = 5) >>> flowCost = nx.min_cost_flow_cost(G) >>> flowCost 24 """ return nx.network_simplex(G, demand = demand, capacity = capacity, weight = weight)[0] def min_cost_flow(G, demand = 'demand', capacity = 'capacity', weight = 'weight'): r"""Return a minimum cost flow satisfying all demands in digraph G. G is a digraph with edge costs and capacities and in which nodes have demand, i.e., they want to send or receive some amount of flow. A negative demand means that the node wants to send flow, a positive demand means that the node want to receive flow. A flow on the digraph G satisfies all demand if the net flow into each node is equal to the demand of that node. Parameters ---------- G : NetworkX graph DiGraph on which a minimum cost flow satisfying all demands is to be found. demand : string Nodes of the graph G are expected to have an attribute demand that indicates how much flow a node wants to send (negative demand) or receive (positive demand). Note that the sum of the demands should be 0 otherwise the problem in not feasible. If this attribute is not present, a node is considered to have 0 demand. Default value: 'demand'. capacity : string Edges of the graph G are expected to have an attribute capacity that indicates how much flow the edge can support. If this attribute is not present, the edge is considered to have infinite capacity. Default value: 'capacity'. weight : string Edges of the graph G are expected to have an attribute weight that indicates the cost incurred by sending one unit of flow on that edge. If not present, the weight is considered to be 0. Default value: 'weight'. 
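The demand convention used throughout this module — net flow into each node equals its demand, with negative demand meaning supply — can be verified for any flow dictionary with a small helper. A sketch under assumed names (`satisfies_demands` is hypothetical, not a NetworkX function), checked against the flow produced by the example above:

```python
def satisfies_demands(flow_dict, demands):
    """Check that the net flow into each node equals its demand
    (negative demand = the node sends flow)."""
    net = dict.fromkeys(demands, 0)
    for u, targets in flow_dict.items():
        for v, f in targets.items():
            net[u] -= f  # flow leaving u
            net[v] += f  # flow entering v
    return all(net[u] == d for u, d in demands.items())

# The optimal flow for the docstring example: a supplies 5, d demands 5.
flow = {'a': {'b': 4, 'c': 1}, 'b': {'d': 4}, 'c': {'d': 1}, 'd': {}}
```

For the example graph, `satisfies_demands(flow, {'a': -5, 'b': 0, 'c': 0, 'd': 5})` holds, while any other demand vector is rejected.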
Returns ------- flowDict : dictionary Dictionary of dictionaries keyed by nodes such that flowDict[u][v] is the flow edge (u, v). Raises ------ NetworkXError This exception is raised if the input graph is not directed or not connected. NetworkXUnfeasible This exception is raised in the following situations: * The sum of the demands is not zero. Then, there is no flow satisfying all demands. * There is no flow satisfying all demand. NetworkXUnbounded This exception is raised if the digraph G has a cycle of negative cost and infinite capacity. Then, the cost of a flow satisfying all demands is unbounded below. See also -------- cost_of_flow, max_flow_min_cost, min_cost_flow_cost, network_simplex Examples -------- A simple example of a min cost flow problem. >>> import networkx as nx >>> G = nx.DiGraph() >>> G.add_node('a', demand = -5) >>> G.add_node('d', demand = 5) >>> G.add_edge('a', 'b', weight = 3, capacity = 4) >>> G.add_edge('a', 'c', weight = 6, capacity = 10) >>> G.add_edge('b', 'd', weight = 1, capacity = 9) >>> G.add_edge('c', 'd', weight = 2, capacity = 5) >>> flowDict = nx.min_cost_flow(G) """ return nx.network_simplex(G, demand = demand, capacity = capacity, weight = weight)[1] def cost_of_flow(G, flowDict, weight = 'weight'): """Compute the cost of the flow given by flowDict on graph G. Note that this function does not check for the validity of the flow flowDict. This function will fail if the graph G and the flow don't have the same edge set. Parameters ---------- G : NetworkX graph DiGraph on which a minimum cost flow satisfying all demands is to be found. weight : string Edges of the graph G are expected to have an attribute weight that indicates the cost incurred by sending one unit of flow on that edge. If not present, the weight is considered to be 0. Default value: 'weight'. flowDict : dictionary Dictionary of dictionaries keyed by nodes such that flowDict[u][v] is the flow edge (u, v). 
Returns ------- cost : Integer, float The total cost of the flow. This is given by the sum over all edges of the product of the edge's flow and the edge's weight. See also -------- max_flow_min_cost, min_cost_flow, min_cost_flow_cost, network_simplex """ return sum((flowDict[u][v] * d.get(weight, 0) for u, v, d in G.edges_iter(data = True))) def max_flow_min_cost(G, s, t, capacity = 'capacity', weight = 'weight'): """Return a maximum (s, t)-flow of minimum cost. G is a digraph with edge costs and capacities. There is a source node s and a sink node t. This function finds a maximum flow from s to t whose total cost is minimized. Parameters ---------- G : NetworkX graph DiGraph on which a minimum cost flow satisfying all demands is to be found. s: node label Source of the flow. t: node label Destination of the flow. capacity: string Edges of the graph G are expected to have an attribute capacity that indicates how much flow the edge can support. If this attribute is not present, the edge is considered to have infinite capacity. Default value: 'capacity'. weight: string Edges of the graph G are expected to have an attribute weight that indicates the cost incurred by sending one unit of flow on that edge. If not present, the weight is considered to be 0. Default value: 'weight'. Returns ------- flowDict: dictionary Dictionary of dictionaries keyed by nodes such that flowDict[u][v] is the flow edge (u, v). Raises ------ NetworkXError This exception is raised if the input graph is not directed or not connected. NetworkXUnbounded This exception is raised if there is an infinite capacity path from s to t in G. In this case there is no maximum flow. This exception is also raised if the digraph G has a cycle of negative cost and infinite capacity. Then, the cost of a flow is unbounded below. See also -------- cost_of_flow, min_cost_flow, min_cost_flow_cost, network_simplex Examples -------- >>> G = nx.DiGraph() >>> G.add_edges_from([(1, 2, {'capacity': 12, 'weight': 4}), ... 
(1, 3, {'capacity': 20, 'weight': 6}), ... (2, 3, {'capacity': 6, 'weight': -3}), ... (2, 6, {'capacity': 14, 'weight': 1}), ... (3, 4, {'weight': 9}), ... (3, 5, {'capacity': 10, 'weight': 5}), ... (4, 2, {'capacity': 19, 'weight': 13}), ... (4, 5, {'capacity': 4, 'weight': 0}), ... (5, 7, {'capacity': 28, 'weight': 2}), ... (6, 5, {'capacity': 11, 'weight': 1}), ... (6, 7, {'weight': 8}), ... (7, 4, {'capacity': 6, 'weight': 6})]) >>> mincostFlow = nx.max_flow_min_cost(G, 1, 7) >>> mincost = nx.cost_of_flow(G, mincostFlow) >>> mincost 373 >>> from networkx.algorithms.flow import maximum_flow >>> maxFlow = maximum_flow(G, 1, 7)[1] >>> nx.cost_of_flow(G, maxFlow) >= mincost True >>> mincostFlowValue = (sum((mincostFlow[u][7] for u in G.predecessors(7))) ... - sum((mincostFlow[7][v] for v in G.successors(7)))) >>> mincostFlowValue == nx.maximum_flow_value(G, 1, 7) True """ maxFlow = nx.maximum_flow_value(G, s, t, capacity = capacity) H = nx.DiGraph(G) H.add_node(s, demand = -maxFlow) H.add_node(t, demand = maxFlow) return min_cost_flow(H, capacity = capacity, weight = weight) networkx-1.11/networkx/algorithms/flow/capacityscaling.py0000644000175000017500000003413712637544500023722 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Capacity scaling minimum cost flow algorithm. """ __author__ = """ysitu """ # Copyright (C) 2014 ysitu # All rights reserved. # BSD license. __all__ = ['capacity_scaling'] from itertools import chain from math import log import networkx as nx from networkx.utils import * def _detect_unboundedness(R): """Detect infinite-capacity negative cycles. """ s = generate_unique_node() G = nx.DiGraph() G.add_nodes_from(R) # Value simulating infinity. inf = R.graph['inf'] # True infinity. f_inf = float('inf') for u in R: for v, e in R[u].items(): # Compute the minimum weight of infinite-capacity (u, v) edges. 
w = f_inf for k, e in e.items(): if e['capacity'] == inf: w = min(w, e['weight']) if w != f_inf: G.add_edge(u, v, weight=w) if nx.negative_edge_cycle(G): raise nx.NetworkXUnbounded( 'Negative cost cycle of infinite capacity found. ' 'Min cost flow may be unbounded below.') @not_implemented_for('undirected') def _build_residual_network(G, demand, capacity, weight): """Build a residual network and initialize a zero flow. """ if sum(G.node[u].get(demand, 0) for u in G) != 0: raise nx.NetworkXUnfeasible("Sum of the demands should be 0.") R = nx.MultiDiGraph() R.add_nodes_from((u, {'excess': -G.node[u].get(demand, 0), 'potential': 0}) for u in G) inf = float('inf') # Detect selfloops with infinite capacities and negative weights. for u, v, e in G.selfloop_edges(data=True): if e.get(weight, 0) < 0 and e.get(capacity, inf) == inf: raise nx.NetworkXUnbounded( 'Negative cost cycle of infinite capacity found. ' 'Min cost flow may be unbounded below.') # Extract edges with positive capacities. Self loops excluded. if G.is_multigraph(): edge_list = [(u, v, k, e) for u, v, k, e in G.edges_iter(data=True, keys=True) if u != v and e.get(capacity, inf) > 0] else: edge_list = [(u, v, 0, e) for u, v, e in G.edges_iter(data=True) if u != v and e.get(capacity, inf) > 0] # Simulate infinity with the larger of the sum of absolute node imbalances # the sum of finite edge capacities or any positive value if both sums are # zero. This allows the infinite-capacity edges to be distinguished for # unboundedness detection and directly participate in residual capacity # calculation. inf = max(sum(abs(R.node[u]['excess']) for u in R), 2 * sum(e[capacity] for u, v, k, e in edge_list if capacity in e and e[capacity] != inf)) or 1 for u, v, k, e in edge_list: r = min(e.get(capacity, inf), inf) w = e.get(weight, 0) # Add both (u, v) and (v, u) into the residual network marked with the # original key. (key[1] == True) indicates the (u, v) is in the # original network. 
R.add_edge(u, v, key=(k, True), capacity=r, weight=w, flow=0) R.add_edge(v, u, key=(k, False), capacity=0, weight=-w, flow=0) # Record the value simulating infinity. R.graph['inf'] = inf _detect_unboundedness(R) return R def _build_flow_dict(G, R, capacity, weight): """Build a flow dictionary from a residual network. """ inf = float('inf') flow_dict = {} if G.is_multigraph(): for u in G: flow_dict[u] = {} for v, es in G[u].items(): flow_dict[u][v] = dict( # Always saturate negative selfloops. (k, (0 if (u != v or e.get(capacity, inf) <= 0 or e.get(weight, 0) >= 0) else e[capacity])) for k, e in es.items()) for v, es in R[u].items(): if v in flow_dict[u]: flow_dict[u][v].update((k[0], e['flow']) for k, e in es.items() if e['flow'] > 0) else: for u in G: flow_dict[u] = dict( # Always saturate negative selfloops. (v, (0 if (u != v or e.get(capacity, inf) <= 0 or e.get(weight, 0) >= 0) else e[capacity])) for v, e in G[u].items()) flow_dict[u].update((v, e['flow']) for v, es in R[u].items() for e in es.values() if e['flow'] > 0) return flow_dict def capacity_scaling(G, demand='demand', capacity='capacity', weight='weight', heap=BinaryHeap): r"""Find a minimum cost flow satisfying all demands in digraph G. This is a capacity scaling successive shortest augmenting path algorithm. G is a digraph with edge costs and capacities and in which nodes have demand, i.e., they want to send or receive some amount of flow. A negative demand means that the node wants to send flow, a positive demand means that the node want to receive flow. A flow on the digraph G satisfies all demand if the net flow into each node is equal to the demand of that node. Parameters ---------- G : NetworkX graph DiGraph or MultiDiGraph on which a minimum cost flow satisfying all demands is to be found. demand : string Nodes of the graph G are expected to have an attribute demand that indicates how much flow a node wants to send (negative demand) or receive (positive demand). 
Note that the sum of the demands should be 0; otherwise the problem is
        not feasible. If this attribute is not present, a node is considered
        to have 0 demand. Default value: 'demand'.

    capacity : string
        Edges of the graph G are expected to have an attribute capacity that
        indicates how much flow the edge can support. If this attribute is
        not present, the edge is considered to have infinite capacity.
        Default value: 'capacity'.

    weight : string
        Edges of the graph G are expected to have an attribute weight that
        indicates the cost incurred by sending one unit of flow on that
        edge. If not present, the weight is considered to be 0. Default
        value: 'weight'.

    heap : class
        Type of heap to be used in the algorithm. It should be a subclass of
        :class:`MinHeap` or implement a compatible interface.

        If a stock heap implementation is to be used, :class:`BinaryHeap` is
        recommended over :class:`PairingHeap` for Python implementations
        without optimized attribute accesses (e.g., CPython) despite a
        slower asymptotic running time. For Python implementations with
        optimized attribute accesses (e.g., PyPy), :class:`PairingHeap`
        provides better performance. Default value: :class:`BinaryHeap`.

    Returns
    -------
    flowCost : integer
        Cost of a minimum cost flow satisfying all demands.

    flowDict : dictionary
        If G is a DiGraph, a dict-of-dicts keyed by nodes such that
        flowDict[u][v] is the flow on edge (u, v).
        If G is a MultiDiGraph, a dict-of-dicts-of-dicts keyed by nodes so
        that flowDict[u][v][key] is the flow on edge (u, v, key).

    Raises
    ------
    NetworkXError
        This exception is raised if the input graph is not directed or not
        connected.

    NetworkXUnfeasible
        This exception is raised in the following situations:

            * The sum of the demands is not zero. Then, there is no
              flow satisfying all demands.
            * There is no flow satisfying all demands.

    NetworkXUnbounded
        This exception is raised if the digraph G has a cycle of
        negative cost and infinite capacity. Then, the cost of a flow
        satisfying all demands is unbounded below.
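The NetworkXUnbounded condition above — a negative-cost cycle of infinite capacity — is exactly what a standard Bellman–Ford check detects once the graph is restricted to infinite-capacity edges (compare `_detect_unboundedness`, which builds such a graph and calls a negative-cycle test). A self-contained sketch with a hypothetical `has_negative_cycle` name:

```python
def has_negative_cycle(nodes, edges):
    """Bellman-Ford negative-cycle check.
    edges: list of (u, v, weight) tuples."""
    # A virtual source connected to every node lets one run started at
    # distance 0 cover all components.
    dist = dict.fromkeys(nodes, 0)
    for _ in range(len(nodes) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # Any edge still relaxable after n - 1 passes lies on a negative cycle.
    return any(dist[u] + w < dist[v] for u, v, w in edges)
```

If the infinite-capacity subgraph passes this test, no feasible flow can have cost unbounded below.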
    Notes
    -----
    This algorithm does not work if edge weights are floating-point numbers.

    See also
    --------
    :meth:`network_simplex`

    Examples
    --------
    A simple example of a min cost flow problem.

    >>> import networkx as nx
    >>> G = nx.DiGraph()
    >>> G.add_node('a', demand = -5)
    >>> G.add_node('d', demand = 5)
    >>> G.add_edge('a', 'b', weight = 3, capacity = 4)
    >>> G.add_edge('a', 'c', weight = 6, capacity = 10)
    >>> G.add_edge('b', 'd', weight = 1, capacity = 9)
    >>> G.add_edge('c', 'd', weight = 2, capacity = 5)
    >>> flowCost, flowDict = nx.capacity_scaling(G)
    >>> flowCost
    24
    >>> flowDict # doctest: +SKIP
    {'a': {'c': 1, 'b': 4}, 'c': {'d': 1}, 'b': {'d': 4}, 'd': {}}

    It is possible to change the name of the attributes used for the
    algorithm.

    >>> G = nx.DiGraph()
    >>> G.add_node('p', spam = -4)
    >>> G.add_node('q', spam = 2)
    >>> G.add_node('a', spam = -2)
    >>> G.add_node('d', spam = -1)
    >>> G.add_node('t', spam = 2)
    >>> G.add_node('w', spam = 3)
    >>> G.add_edge('p', 'q', cost = 7, vacancies = 5)
    >>> G.add_edge('p', 'a', cost = 1, vacancies = 4)
    >>> G.add_edge('q', 'd', cost = 2, vacancies = 3)
    >>> G.add_edge('t', 'q', cost = 1, vacancies = 2)
    >>> G.add_edge('a', 't', cost = 2, vacancies = 4)
    >>> G.add_edge('d', 'w', cost = 3, vacancies = 4)
    >>> G.add_edge('t', 'w', cost = 4, vacancies = 1)
    >>> flowCost, flowDict = nx.capacity_scaling(G, demand = 'spam',
    ...                                          capacity = 'vacancies',
    ...                                          weight = 'cost')
    >>> flowCost
    37
    >>> flowDict # doctest: +SKIP
    {'a': {'t': 4}, 'd': {'w': 2}, 'q': {'d': 1}, 'p': {'q': 2, 'a': 2}, 't': {'q': 1, 'w': 1}, 'w': {}}
    """
    R = _build_residual_network(G, demand, capacity, weight)

    inf = float('inf')
    # Account cost of negative selfloops.
    flow_cost = sum(
        0 if e.get(capacity, inf) <= 0 or e.get(weight, 0) >= 0
        else e[capacity] * e[weight]
        for u, v, e in G.selfloop_edges(data=True))

    # Determine the maximum edge capacity.
    wmax = max(chain([-inf],
                     (e['capacity'] for u, v, e in R.edges_iter(data=True))))
    if wmax == -inf:
        # Residual network has no edges.
        return flow_cost, _build_flow_dict(G, R, capacity, weight)

    R_node = R.node
    R_succ = R.succ

    delta = 2 ** int(log(wmax, 2))
    while delta >= 1:
        # Saturate Δ-residual edges with negative reduced costs to achieve
        # Δ-optimality.
        for u in R:
            p_u = R_node[u]['potential']
            for v, es in R_succ[u].items():
                for k, e in es.items():
                    flow = e['capacity'] - e['flow']
                    if e['weight'] - p_u + R_node[v]['potential'] < 0:
                        if flow >= delta:
                            e['flow'] += flow
                            R_succ[v][u][(k[0], not k[1])]['flow'] -= flow
                            R_node[u]['excess'] -= flow
                            R_node[v]['excess'] += flow
        # Determine the Δ-active nodes.
        S = set()
        T = set()
        S_add = S.add
        S_remove = S.remove
        T_add = T.add
        T_remove = T.remove
        for u in R:
            excess = R_node[u]['excess']
            if excess >= delta:
                S_add(u)
            elif excess <= -delta:
                T_add(u)
        # Repeatedly augment flow from S to T along shortest paths until
        # Δ-feasibility is achieved.
        while S and T:
            s = next(iter(S))
            t = None
            # Search for a shortest path in terms of reduced costs from s
            # to any t in T in the Δ-residual network.
            d = {}
            pred = {s: None}
            h = heap()
            h_insert = h.insert
            h_get = h.get
            h_insert(s, 0)
            while h:
                u, d_u = h.pop()
                d[u] = d_u
                if u in T:
                    # Path found.
                    t = u
                    break
                p_u = R_node[u]['potential']
                for v, es in R_succ[u].items():
                    if v in d:
                        continue
                    wmin = inf
                    # Find the minimum-weighted (u, v) Δ-residual edge.
                    for k, e in es.items():
                        if e['capacity'] - e['flow'] >= delta:
                            w = e['weight']
                            if w < wmin:
                                wmin = w
                                kmin = k
                                emin = e
                    if wmin == inf:
                        continue
                    # Update the distance label of v.
                    d_v = d_u + wmin - p_u + R_node[v]['potential']
                    if h_insert(v, d_v):
                        pred[v] = (u, kmin, emin)
            if t is not None:
                # Augment Δ units of flow from s to t.
                while u != s:
                    v = u
                    u, k, e = pred[v]
                    e['flow'] += delta
                    R_succ[v][u][(k[0], not k[1])]['flow'] -= delta
                # Account node excess and deficit.
                R_node[s]['excess'] -= delta
                R_node[t]['excess'] += delta
                if R_node[s]['excess'] < delta:
                    S_remove(s)
                if R_node[t]['excess'] > -delta:
                    T_remove(t)
                # Update node potentials.
                d_t = d[t]
                for u, d_u in d.items():
                    R_node[u]['potential'] -= d_u - d_t
            else:
                # Path not found.
                S_remove(s)
        delta //= 2

    if any(R.node[u]['excess'] != 0 for u in R):
        raise nx.NetworkXUnfeasible('No flow satisfying all demands.')

    # Calculate the flow cost.
    for u in R:
        for v, es in R_succ[u].items():
            for e in es.values():
                flow = e['flow']
                if flow > 0:
                    flow_cost += flow * e['weight']

    return flow_cost, _build_flow_dict(G, R, capacity, weight)
networkx-1.11/networkx/algorithms/flow/edmondskarp.py0000644000175000017500000002000312637544450023062 0ustar aricaric00000000000000# -*- coding: utf-8 -*-
"""
Edmonds-Karp algorithm for maximum flow problems.
"""

__author__ = """ysitu """
# Copyright (C) 2014 ysitu
# All rights reserved.
# BSD license.

import networkx as nx
from networkx.algorithms.flow.utils import *

__all__ = ['edmonds_karp']


def edmonds_karp_core(R, s, t, cutoff):
    """Implementation of the Edmonds-Karp algorithm.
    """
    R_node = R.node
    R_pred = R.pred
    R_succ = R.succ

    inf = R.graph['inf']

    def augment(path):
        """Augment flow along a path from s to t.
        """
        # Determine the path residual capacity.
        flow = inf
        it = iter(path)
        u = next(it)
        for v in it:
            attr = R_succ[u][v]
            flow = min(flow, attr['capacity'] - attr['flow'])
            u = v
        if flow * 2 > inf:
            raise nx.NetworkXUnbounded(
                'Infinite capacity path, flow unbounded above.')
        # Augment flow along the path.
        it = iter(path)
        u = next(it)
        for v in it:
            R_succ[u][v]['flow'] += flow
            R_succ[v][u]['flow'] -= flow
            u = v
        return flow

    def bidirectional_bfs():
        """Bidirectional breadth-first search for an augmenting path.
""" pred = {s: None} q_s = [s] succ = {t: None} q_t = [t] while True: q = [] if len(q_s) <= len(q_t): for u in q_s: for v, attr in R_succ[u].items(): if v not in pred and attr['flow'] < attr['capacity']: pred[v] = u if v in succ: return v, pred, succ q.append(v) if not q: return None, None, None q_s = q else: for u in q_t: for v, attr in R_pred[u].items(): if v not in succ and attr['flow'] < attr['capacity']: succ[v] = u if v in pred: return v, pred, succ q.append(v) if not q: return None, None, None q_t = q # Look for shortest augmenting paths using breadth-first search. flow_value = 0 while flow_value < cutoff: v, pred, succ = bidirectional_bfs() if pred is None: break path = [v] # Trace a path from s to v. u = v while u != s: u = pred[u] path.append(u) path.reverse() # Trace a path from v to t. u = v while u != t: u = succ[u] path.append(u) flow_value += augment(path) return flow_value def edmonds_karp_impl(G, s, t, capacity, residual, cutoff): """Implementation of the Edmonds-Karp algorithm. """ if s not in G: raise nx.NetworkXError('node %s not in graph' % str(s)) if t not in G: raise nx.NetworkXError('node %s not in graph' % str(t)) if s == t: raise nx.NetworkXError('source and sink are the same node') if residual is None: R = build_residual_network(G, capacity) else: R = residual # Initialize/reset the residual network. for u in R: for e in R[u].values(): e['flow'] = 0 if cutoff is None: cutoff = float('inf') R.graph['flow_value'] = edmonds_karp_core(R, s, t, cutoff) return R def edmonds_karp(G, s, t, capacity='capacity', residual=None, value_only=False, cutoff=None): """Find a maximum single-commodity flow using the Edmonds-Karp algorithm. This function returns the residual network resulting after computing the maximum flow. See below for details about the conventions NetworkX uses for defining residual networks. This algorithm has a running time of `O(n m^2)` for `n` nodes and `m` edges. 
    Parameters
    ----------
    G : NetworkX graph
        Edges of the graph are expected to have an attribute called
        'capacity'. If this attribute is not present, the edge is
        considered to have infinite capacity.

    s : node
        Source node for the flow.

    t : node
        Sink node for the flow.

    capacity : string
        Edges of the graph G are expected to have an attribute capacity
        that indicates how much flow the edge can support. If this
        attribute is not present, the edge is considered to have
        infinite capacity. Default value: 'capacity'.

    residual : NetworkX graph
        Residual network on which the algorithm is to be executed. If None,
        a new residual network is created. Default value: None.

    value_only : bool
        If True compute only the value of the maximum flow. This parameter
        will be ignored by this algorithm because it is not applicable.

    cutoff : integer, float
        If specified, the algorithm will terminate when the flow value
        reaches or exceeds the cutoff. In this case, it may be unable to
        immediately determine a minimum cut. Default value: None.

    Returns
    -------
    R : NetworkX DiGraph
        Residual network after computing the maximum flow.

    Raises
    ------
    NetworkXError
        The algorithm does not support MultiGraph and MultiDiGraph. If
        the input graph is an instance of one of these two classes, a
        NetworkXError is raised.

    NetworkXUnbounded
        If the graph has a path of infinite capacity, the value of a
        feasible flow on the graph is unbounded above and the function
        raises a NetworkXUnbounded.

    See also
    --------
    :meth:`maximum_flow`
    :meth:`minimum_cut`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    Notes
    -----
    The residual network :samp:`R` from an input graph :samp:`G` has the
    same nodes as :samp:`G`. :samp:`R` is a DiGraph that contains a pair
    of edges :samp:`(u, v)` and :samp:`(v, u)` iff :samp:`(u, v)` is not a
    self-loop, and at least one of :samp:`(u, v)` and :samp:`(v, u)`
    exists in :samp:`G`.
    For each edge :samp:`(u, v)` in :samp:`R`, :samp:`R[u][v]['capacity']`
    is equal to the capacity of :samp:`(u, v)` in :samp:`G` if it exists
    in :samp:`G` or zero otherwise. If the capacity is infinite,
    :samp:`R[u][v]['capacity']` will have a high arbitrary finite value
    that does not affect the solution of the problem. This value is stored
    in :samp:`R.graph['inf']`. For each edge :samp:`(u, v)` in :samp:`R`,
    :samp:`R[u][v]['flow']` represents the flow function of :samp:`(u, v)`
    and satisfies :samp:`R[u][v]['flow'] == -R[v][u]['flow']`.

    The flow value, defined as the total flow into :samp:`t`, the sink, is
    stored in :samp:`R.graph['flow_value']`. If :samp:`cutoff` is not
    specified, reachability to :samp:`t` using only edges :samp:`(u, v)`
    such that :samp:`R[u][v]['flow'] < R[u][v]['capacity']` induces a
    minimum :samp:`s`-:samp:`t` cut.

    Examples
    --------
    >>> import networkx as nx
    >>> from networkx.algorithms.flow import edmonds_karp

    The functions that implement flow algorithms and output a residual
    network, such as this one, are not imported to the base NetworkX
    namespace, so you have to explicitly import them from the flow package.
    >>> G = nx.DiGraph()
    >>> G.add_edge('x','a', capacity=3.0)
    >>> G.add_edge('x','b', capacity=1.0)
    >>> G.add_edge('a','c', capacity=3.0)
    >>> G.add_edge('b','c', capacity=5.0)
    >>> G.add_edge('b','d', capacity=4.0)
    >>> G.add_edge('d','e', capacity=2.0)
    >>> G.add_edge('c','y', capacity=2.0)
    >>> G.add_edge('e','y', capacity=3.0)
    >>> R = edmonds_karp(G, 'x', 'y')
    >>> flow_value = nx.maximum_flow_value(G, 'x', 'y')
    >>> flow_value
    3.0
    >>> flow_value == R.graph['flow_value']
    True

    """
    R = edmonds_karp_impl(G, s, t, capacity, residual, cutoff)
    R.graph['algorithm'] = 'edmonds_karp'
    return R
networkx-1.11/networkx/algorithms/flow/tests/0000755000175000017500000000000012653231454021343 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/flow/tests/wlm3.gpickle.bz2
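The bidirectional BFS in `edmonds_karp_core` is an optimization of the classic one-directional search for shortest augmenting paths. The core idea can be sketched without any NetworkX machinery; the sketch below uses plain dict-of-dicts capacity maps and a one-directional BFS, and the function name and data layout are my own, not part of the NetworkX API:

```python
from collections import deque

def max_flow_edmonds_karp(capacity, s, t):
    """Edmonds-Karp on a dict-of-dicts capacity map: capacity[u][v] >= 0.

    Repeatedly finds a shortest augmenting path by BFS in the residual
    network and saturates it; runs in O(n m^2) like the NetworkX version.
    """
    # Residual capacities, with reverse edges initialized to 0.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow_value = 0
    while True:
        # BFS for a shortest s-t path with positive residual capacity.
        pred = {s: None}
        q = deque([s])
        while q and t not in pred:
            u = q.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in pred:
                    pred[v] = u
                    q.append(v)
        if t not in pred:
            return flow_value  # no augmenting path left
        # Collect the path edges and find the bottleneck capacity.
        path = []
        v = t
        while pred[v] is not None:
            path.append((pred[v], v))
            v = pred[v]
        bottleneck = min(residual[u][v] for u, v in path)
        # Augment: push flow forward, add residual capacity backward.
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow_value += bottleneck
```

On the docstring's example graph (`x→a` 3, `x→b` 1, `a→c` 3, `b→c` 5, `b→d` 4, `d→e` 2, `c→y` 2, `e→y` 3) this returns a maximum flow of 3, matching `edmonds_karp`.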
Wx%qp/I$lDBd'7_{XjjeDu"»J?7UT?r>OD~NJ" I$I'frJ䤇8.}M0Q`fiJuwzZ̤yOK訞O|wRuIKqoo t $$ Ld .IIÇ >0wR-#GJ%K\WW_A{?Mn.*m 6H\`wn[lAW*|e2R tSZTG}oOklD0;IGzd Kt}=E>#I*۸}Ӿkjkb x61t8{GǴ4p)w0%rr`OK{}ū<_m{N~ޣD~pIP$I$A Cc'JI?fI6L5Q ,_s'-ŃS;~|F^}],/_^|NY%ޝ>˸ HQIH gf㳙͒]jmM3Jwz6f' 9sEDNJ[)6ߺ֣u6\[@ǵۡJ@d:A^kPD|8ES}oQR|Jw'ֿV@ 9*K9y, \8~V9G@g8c ysV<#\u/); $U-sKͳ2h&"尀DOioR@પ~ Cx^ZHuvD$Ȗjl}ە`Z@bEMOpC4{ʳwdV׬?Z_ ~@! bg-ᚄHj(b`ҖD#&; *})SoLÚ|m?CD H8[-B\/Q`pimYU~B$R*/?3 ڮUa 4yZ~"b6>DŽg;E'r{#;5h@L۳2̚JIˤ$}N8p;_O(sجR~\) $I$K9i#7LfsJjeީLS`%-\X;궷z_ I@g<'O$ň`@`"80v؎*O{Yy;[w}k_i4]l5hIQ;.Is18t{O̓}sś TKWwYܟ;{U:=A2HI%$H$RvgRIY9Çgvd~[\qNR'ݽo3*O4eÁT hd*Ï|ps|_ \N'RA&O:Ra<WO/ݿR]YD[~R4޷hAIh-Kdd Ã#c6r`$fKR"\|x煈hmUGײÎZ)"I Y |Cy@2QF;$.,\U>_{ߕ$ֻ)uIS۹}mܗL( `ql8pf~?m[m8^ܟ~̜}`c=WcoDJI dO$vK?W\`[A''#>/VqSw݇ )~u"Ժ~WrAI$HKI.dfIËwOTa:)x)ozęR')khy+i굯q{Mk}m?轖D"ІKt$v+&')2qq]LZZߓ3qHr#wR syN^,D@ IپzUpo*g"NOA-qg#K o]W~YjiIy aRrA*[@-Ksyw8ť[y2Mq[ķo?-k_ oB ; M(,vKRdI8)RxI48_陀8oܜrYL~*۶C֏K,c3I4Lm&Mk J $H}|9oط{?Er2 gYwN`-`hK hߊ@ Sۇqmy3ΐ&vkwn'G5=Cb>#I!mm @l# &$p{ H8gD䥲Q.9>i?=awH5ĉT gfu]_뿷el!5o=Y@g?WДi޼/S՟j"wR~57$$$I&edII=ἑ'jcON7 sݓ3JW w$}/kZsm;G&VC*3N);ɜ\^}w 7 ٿם9m;+Kkf:~[4%[-G:Imm`]|v1i,# >.@ Jh/"+"'/n`vW5χ>wlG[̺Vrl:wLH`lm nK.Wf"[]'%VU^O/v,MXm iO3Eb@f/`"9`ְ>8U^%5h|{c@[b d;]I}MdqUdkwvRfw^dZzqr=Sҕ$'JQTFP )"QRDJH2fXK֭5۵ ¢H0Sې`%$|g|oսD]cw&Þ] mPa(qG}=S n)KK*{iG⇼i]rB9VhD dwN<}J2^x2%<<(?/7P'_Ec۹*(؏ޟ h$`!m[C'ku-t$fAEMd0wp"1ڎ#+g}|+ק_\U JI$׀OV\,Af Zy%"zn;*0yuFr;uU5q@, 5쩘@C7 o0TИKp;~u{؂k|.s)Ḙ A[l%\ rRX`,U .xsΈ3הcwTnuX>N[!O>1/37)J"J@ l&N$|tI&%w$S >networkx-1.11/networkx/algorithms/flow/tests/netgen-2.gpickle.bz20000644000175000017500000004473212637544450025035 0ustar aricaric00000000000000BZh91AY&SYs 1X Iz 6Q AJ@0+fmCHtEUS G{`s|;@#Rdb6Ҍm)!zSؓz'M=f2hdm 4m M4`L0 4@Aba04zL$bi 14M04i0ڙ0jA1CA3A&L zT*x"d0!@&jF4 @hH4") 4d 4FCaBdѵ 4M F F4=L6J@ 4h4 4 4 @hiAz22hRiM2ɍGz&i0MCM=F4z=&SѦi16FڞL4h4izu.<-S} wuOMe5Ťw;ΩSWMe"Cq?*^I'hW֣mm]fi v!fxZ~v׋U(t͒Dm'T3͙5ٳ|oOow˚G6AʩsV7yfq^}(x_ jy{ÕD[0k;wk{c[yENh D]kTQtK]睱H^6MQm?5L$灺9|Ζv}{|Jqg^ 
ͱKG`']2QoAk[YUȆsYc#5Z[QVT,{H٫kꍒ]RDw+峖jۻgs})QM[A"rMkoxsK"cA4nVmBW(՝zM{_kC#NٻVXutX[y۽$h^R\j飋lyζڌ8$Q;%Q-zJYIccƊ&Xʚ.5nƹ:sx޽gBݳk}݁Grnsڦ-oEWoyT3v|h I*뗝QF.ޜwPWu!T٩=Dםk<]#strڧ5QB+ƭqVfss.9iZp(+vڹVMk|8[ry|4ݵ<_|^TnyڹKBO4u狺fGN>CK::ײr{V,]]˧1W=^/ZszY#M]xNCظF饏 dsNZ177?Wqej{+]1xՉZ n>jvmPzѠAOr 5oGU+ ǖ{|8²\ڑNQzf4A5,*&έo]#[E^i\5uCt/oPzJv N^9d:E#7^ͽt{7}ޯL$gB|?=Vg) L EDXb0 $Y) E P(((A,$YXIPd"J$ ?~o>Cp}/:)k)rTF;mW7r ?v/ ȯ5):k9kz9Ej_F7_F^}1>/$|H$?I$/z2I$H~*Nxu_yuxUN=}߾*~U{2uO^|t~8O>OUT,!gŸII<5gKY9)ȸ,,$j %񗰯sUUijj)驩ii)) _))))))))>ңQyn%yK;1,}cU z:6EQAӋqqX@W'(Hc9.䧥%#j$k'(T )̖I0(B]^+MBƢ4גuIBQ5&ƥDsLgIJB5RcGOAPUL9-8NӬVc;Q'F霱Pͭ,0 r,\KԵ+k&UfŽ2 ;!+$/X\hruoPhyH- BF._=\MΰQN }noX(4Un`dۤe1Lk󵃛 D>/CXڣRIWŖ\N;/Jѣ e|`ga(r/Әw1A(k1pU uSXحIC@P(ܖSsh*Y%[j?Ok.TH64W;ĵTs9jiyt{j)5e*LDc%YeAIwKӽG5?#'+( ^r]vLRbP>D3GnQywQ]tpd~.޴ʟ1*=4_ԇ/쿝>_ptQl<9}b|Jj>3R*]3fzfQ¯4ei9kZ=d[L$ܚvS#L[ Dњ5(#5 MMk`p[]ʇKBu$VCh~զȲf`~ &HB4HrN 2f 4ٳF1%~I6sۓηZr;;`Ζ#zﭷI9jmDG&+iQǵ$v4EGu}.d=of%E5% YxXxk̩YĹ .B2TK18g 9%Mq l3 e>DJ'Xէ JXVpK`> Cǎ<@OOO>Q A8B5qW+b< [z^гU@vV%siy[*9qoܳ4oWbB ĘY{4fkNE:Ů ؤ1.L|4>-ںUׇI]1ƶA'g`Za$ -lpVܙ8 8rPCa(ypAXj׽>~u0d#wj0^_jU95+;YQh-:#{<٘;2l#8Wj6.}{Eg *Z ?#쳥ՓY98  )px̹v^wcNFMZ7"ZS8D}Ⱥ% DY֭Dt?!$d'7Aý {$fDx!gkl'8y Rk"wzn9ޗ \b߅Èqr$ya5>N/u6ЛhZzyݧ&V#Kϙ$h5Rt[ٽSYԾEW}SLK㗇 =[\JIbJߗJiVTt(hNdZQJI򕄍0a,XPK 0N:q%ְ2; T ݙG)Um6>~CYe%^p4Y(VָwX)Iwma3^",@X prC1cΒVnf)c-un5z: ֎6cIۗЂtWR0WDD)RD20&l6hӁxVKPS+Cί5WW:z-kh7nz 䔴Bkbz]f__謹&,ZKS*5 4M^ gfU[egRiwoqj[c,T"ʑ+Dy:t&8tمNDI$TϽs:3 DDDD@n/gqW5mqsQqawy[p>Lܸ&ơVj jWHs98hΓzbp}g%Z^P#x GU~:뎳ߤϯPg?Rbױ.PˢNJ&ʗ@ĀSE5$kR66޳bй갣+6+)u#(^ߞx[M֒7AW+UqQAj Ety%0lq %P[-Yuݳ6J:jx'+[P~uKĠ[k*QW(?y*Aq%H *ӄ - xaG8p :xt @Dn Z׽an [+S"wtU ifSz yQ9%ق#gjd}\_ysb8"E^c,w)iC/-eA}$z%Ifg[q%_ͺ=k|Yhp׊p`e&,y.a@:ABy^ {E,}-V{"KoJ-fZ;;l )r4w}7<~?:g$GȑշقׅՖjx?6~.omxyw#.a7 _LB""$Fb\_M˃Jjzgd6}9|x\g'U:~<^)4Sntʌ]JVj.1eSagb4<Ԇ)EsJ7lgyܝB3g>82be wAf@]ܦu^^{}yt"]}!4y&tۻϤ-۩3Y➘秈 <'\D93%088tz ڱA[s3k;us0*XDDDDHH1[T魵't"v9nQL3#oQJ:*H,R=-Kvw¤eigΞpy}|DH26SlZd{JV@ @B"""eOY-c2n'9IG}[=.}0D"$y'H4OXmoY(wKk%v`"1*|*45wl ~קxE9-镗LDDDLF^<Շ+%c;E8^1^<띛 S e2DDDB"$mPMS;. 
סV;l>5s?_rvO*B`fge09YV>Ů+ne܄69 UecDDDDDy D٭/coXdٖWB}B$""$`0mH>=v̨߽KyMXR ^/URyl#@x18|ptT_?90))*b$DDB""@'UO^U;&-ゎVZFdՌ""0-wK {U&L4gSY@d\0/0 D ""$D$xp09|o6Saa|_5fل cɧgs_^>Dt2\$ϚRc-.QUhЯ 'd-}UVN=Sa/dd;D݁sl:VI; 2TNsԟ:V3Wd @>k Ƅ'Tv5 1I~N10!p=-=԰!@ 5'DYT>']s;^>'n*ld=)!RI:4g`t =2M;9݇FʳB{'.a͠H}>[Ч2NN$Y=H|$d>Of:&s)N'CѠٓdz?zC0'aCuBt|7۲N;>@-^@?"tv~3$Rj}΢ٰH~ҧL-@C{?v 'a:uOQ}oN2%R9aBU_t<<=`BUjmƪ*˦0dsmTy[~HT.3* 9UUVD7CáHb$p|Unaa0a#9H pY"s7pnHH"nsw2U7La** 3 a*"Zat UVD MEaOm$Y)UZ B 18TܐLtEL UURR9-]˚ TU}`LBU**M~+_ #M*-L~ˢIR*Ujk9E.l(UZ7;K92e!TTL֧D7 ]RU"Jѣכxa$(UZ&eqx o+Z J,GZ"#DPգؐ ##_EYmHd@F_ݪ33M@ix!Xi%2D `hD늕ሉ~HEL3&)37qhǞ+M'dU඘kwuzg_bUկ*#ςjaf 1/F"1  !;WhxJUJ3xnw=gjMTUZ W_q VUjb'L1UU*R1+u[ktI{+r&[!THLgs?!.2U*er9<I4$ɗ, UTenOS~}7T&]Tj ܪ҅'jnYVRTo_}4%UR+?<amug; d+4Y.o_V5<Gp1@!;́/>MowI֒U"5s=mFBjZJ;:<{5)VK_0oan)U mh)_ &׵'_;f,`4Xݦgҳbk>uqy4 j\S]uC"n3LMCcx}ֽ?=Tס'43;M v c6(xw/s8;s7:J&S&]xg6nxuYJ)VРYBXٻ ]kto>lڡSb@h6 p ,e+lZmSuf0K8zZ0T8:t{ y8Ku8+ guଦ&b¼ dɕ6.E޺ݙ{ ٯߨJ7& +h5[beHՁ u"9k. o6Hw&V6 7lpk`hXÓKj,%qqW;bq(w8'zGz8nl ʻRCޅMk r8)ƾVxZJGzds݊MN<%8FN) ۓ~xty-Th)JR0.qiԠh*URxvNa6{Z&& {<3QxD l37d6u3Y1N 'RմMacm[5i'T5a4ȭN[`& = q?=ynx^n '9 C,Ny䧝3{R[m@|w}m0.{Ŧ{i%|Dcmܶ?3{[UR}PLK3>~@Ͼ=@{]VrȧzP{𖀏m'm#_s.ZCn[mVOϖ)2~D[>gϜ>r rj}Wwrjmg{﫷-If|]=mjYWmI.{_}3/ISIO)X-I=lS[A{qZˁj^zm_ѿVE'6OmT{0A"jڬ [mT=lz=+jdQ@|mV>L5[mVImuZW[Umj~mZWdmH-|Xz磌cj-ACs* OOf}j#mه[j/'Vڭ{/cYmh{ }eZ }l؇UO_Odh/a'}[mH~ $>ʈg@>n>h?q$nHR?ϣ QUJ_^e.$JU2N/.;/=%~titׅ~ZXd^zWax]ՆݓWN(Tif;p}P?n ,jWUe7Wl0HY$2U6%z[@Hjj;LsD@$I Ё BM,-(b@ Q$XI5lV)Dk;MTAw~f[ã~s37لPt8rlxR\uPA 'x,n}nV%Kk3p00=(c8eB$B""lyݮǾCSFHTa6ƹ-8*"0I]+7M6j&oa{B8v#1ـ:Q Rg0c27RD)c"ng`L"Zi"к},10ukBC6FZB j K*gMיy9Lxz9܅EP0-,oyמQt7ʃz*p`t:hP7r7;aD5;Ćl$[a"AG:-E,LCt%HFvUq硵aE(v {|P %F/j"#EFuܓ K^L}w_RZ$$P ΝfO?'Q=Q4׫?|{KKD~~v J |i_˕&D$H56nBF AO5Gn]5u f9g͛h?{ %NNǏ8'}~;!1 gyD*6_mqs3{T uQ۠׏ |B FA]&^7HQak4xL BR, CZ\?aiH˯;~?"?#+\!4 Έ7~IrFL ShpgIف ]Ux.!UH Y/htOj*ĊD4ti b*,o!.ޑ 6mELA'՝z*@4b,M UO-K &릂2ǐnC;k~;Ca{A;;KB?vgzBΉ_(FFAgI{#q;!Uf2BXvc^ BrdY-b3VvI9W;$aoH;b7*b1U5 &p =PEb([2Mz$!~9/@/pW`/׽?8菚 aox+luND\V+踵/Fj:dk 
IN935Aٓ ÎP49l喉EtZY (4sA\%9$QģS)'0ZLBy$ڝk022x%n Ԛٳ0P5q&k3 5fi6և6CĢ3jORXr\zwNv=GwS6Y~7ӭ˻];-:yoyӥfy>|'Mxq&mlvZ߿,6Ӷٮq}G>[tΜo|wmvϗ=׉/mwmΙNovWu<}z)׎Ვ]vZ{z>ڼ>:v}geW^'\pߎܾ]ݕK[g.w댯 ~|:ww/>>sχtwWr~w}ݾ}=zlwsvy]wx./ <q'3/G8pwxÀ@X ,Z:We]}|s5wWzs-hpqv$k(ÅR!xضHV :)o8s757mNt*`Ovb%hۢG*%";|fyc~gFWW6CP;_zvR,U,0jq"u'OW^_ϟCk?{`,dxwe%wha.x X%& )$AVdr ^Ԑu[5hufi&X:nxTܹx >|z]3lUUC~N{પ~R}|㪡,Pߒzy/UU|3|eUUMMLLKHX `&&nj)v*w4rYz;+ 43$:Y@LD, tsf^F6&y;]ס\e_]x=gy_|\~:Hq,UN3˓' Y0k+S}滺&k]bԧJ 1%nHof 玜7lգ"X:D 5I$P M6ٖ.g5]pB|ӧN:tӧN:ts#x).Bj enۜgl IW*x-'T#wdJK#jN[j4os`#e׌+gE+1T3 W"D!Jw͂2P3_0gİh`Cp FP a8Bj>mImOKUB&уs&ꄑg{%r$Ig_[=P]ܡϽߗ/?{Pwwwwrp{c/Pu|=nw|{C>NM󻻻 ~OwwwwrG_!z 𻻻Oy~:~gwww)/S>I_7www(ycwwwwww(y@L10 Iu6g:bh:ade}b}oh.B:* waokb0nw}~Ay`QH=6y1~.W?W=\ oYU)Qʑ -0Sd^yn|^o@ͫ* fJ܏&(޼t"`"kM$s78 fe^UjS# L*NƊ$0F @ZN(9{1-!PS;m[<4>ݺ#/|s5fsafۧԉZxdbs[Cg&ڟ~'EJ߀7+?#_Vs_ 3ajJb4t+&  eT% hoAbVi"A].[Y':ffx/*g};'h$Z' :d }ḍ!M#(6;핌T2(Nw@NnV/>`8nRYBf50LØ֋֭Z-aa*GсA$Pz5"dXXI*ׄR\0aV, a }}s y\ (mKpTxDnT򽢬D^6/ձ]llnZZ3ضM^)@#AΆ\8Ϝx yKhPT0hՑ8:= 3l\ :@/3(/4//@n0B9=#櫄֞}Y}+#7>Ƒx"ۊ3u v˃| L:l٤nl*lEf u "&a9̭̇ 瓕)M/i:e,)2ªn,R(r!"xXD}f.4 gb  eEE/rzQ437nduIF$ݰ Z,uy3w1$[S2Rp[+{<7 - <.W_/oVcfl 5&zIj )/ӏnU%d\ ȳUΚWW[7Xu8xE c1j)y`rǏbw&]]=̓7̇+͝ިoKI#X"gTX0A3 bu90 y;"Q<6)̵,af)H~q \XL*{}GPQwhSmA"4._jA_h*eD#ԩm *:a2O]ۅZbfNާ /(>:L_ ٸo;K(wkdj|}9ߚG+ z9ssʂ_â\9D1һ%+(4܄..\x+:O|%}n}@ىlwQc/kl{LlqwHzP\Fܹk}| O\lrTa‡R-sf%fO7?6K7[#@NA}Dk8jꉪ]Z[j!YYp\)mF"~ e6ΏwQE=njlh,*˼6b*.S.`4. 1 ͆:.`@Sh/E4E 2ndCtȆ-G ܼ^q_,xK|Ɉ!2<[k۞MZCsPѿA`wO``% Yv\mm02}ws??PwwwwrZ~Оe_o|CkP^wQKC'wwwwwwwr}?8wwww)g/ߥCӯ?d~C֯wwwr(yK]~/Jwwwwwrޟ`F}z?wwwwwq~w(}Zۻt/~9u]ܡ_S?Yp~> ߼߷R}"+~CfT7GC&?bmL=2]o:ow7Ҫ~7UC}0V7Q٪ze߿gתUUT}PUC|p=r>{UP=>zUP=>7TUU E7|UUP߲ߑk=C|J{Jov/oUUC}-;⪪>1` ~|P>"|m_,*2|s~`|T7-vǙUUMLLK-"X `&&YLNNjG3'=¯B^וaZQKEA8~e!c!`oM GjNb0.*G x[s90:ݛ-JT'KA~Vd0_WSOC;733+#˃_Y.Zt3vVE(0zۥ۶jћ"SI6ٖY]zՊ((QEQEQEQEUVZ C=mӓ''4ʿmv:U,q=Whާd:rc~{R)lu*x;Y  I߆k㳵ٮym #A&nS:t6OmQbO[-y}O*9ƐE?>(OYš? 
svثgʼwmNNgNy0ybce8.;8Vz Ʃil/ꎪ>EEb&Q-LcYCnN3ı1N=X˻.n^S/` FHQCH^ ּfjnUVx``+ E $BB [ aJ4-3'V`bfeW֙FJ2` A~XUUI> -ܤPJ E)\" 9$@t7'e`հp@ I&怏)|@z~b>tIӓ/E"jE{? lY h#RFgRG;OUoo8P.fv0ǝX\[IT_gRuWkBzɣwV V{!": Ř+WtTWԬjgPfR]JNv6qd;wc`.^-<68v$%^c7(%e~Ǖ2}7_DnAPTZy> [ہ8MqRv .<'D'syח|*,;L%BIg@S,^oU\0KL}d}m-?%z{E|^S7|v\ˉf#+Ep j^4R 0ySmX!Ds24`Q*Oijd [j(o K5²~ı]ӑ R ټ6_KUԶKMӥEEzvInygbQnT @3> J%P$eޥKmK`uڰr ry2N[wZ$ާaL<&YwB_Ifmt|)̺OA}zEq 1&f&R$rlqe]ς6X&Xcrh^0[|D~3&&v̅1`_qT`ahj:rx#@6hsU#~&\NA\UJnkFS |#1&GgS8|4n@Yvr+UpUpSfڨbgJ)+znʷ5 Y00+1p0X B=rN cȦ*~0sY@e7Ფ&qO)MH9E! cK"17q|]7 IN8ߦEu &v| N3VA!.b L_30Л1Ik{فofz 17=J~7{gFUh}Sv}Ϗs@B3G)UƎMN{|E(|hLʲ@68:==^U0\pApJSnt4!T[#n 3ն[+㐆xobDdX:5T*Ol?cX#çBnKpmFMFe"Hf7ܻ23f,cryXݺFu/X02>>+ zU <?e>d}' OŁ/ \L>TO~TOړgПO??u?r϶~D~H@~ 1S}9oW{+\'dzȔ.E.DDCBa `8>|޵--*(PBE8)SPB@(PB (PBpEAƊ*PPQÇD|'5 ڛf=qI;1ME]szrxUu97tŴc߆QN_|_&mWw)7I5v^Sİ dn'%/.竦c޳Rͫ[ɦldº|9+5hJu58Z{\ fC)3 tliʯ5[mo̷7Ogid@u-kzS&|.㴮kR^}ЕB ldq4)-"K"Ik=+Y/$VzV'Bma 3 &}m m5 LM<AlC`-1ԥQV"\ie{ڂSx`;Qƻ'Ž[tUWy0)H@#G5Mnx_7G aV^/Mʥ0*˹qHY2|wǐ'Zq{ҙܖ5g._e)e!n%|mVzD]| [O<;| 6lJĚ&+s?G fE$Q/ ĉI_öxGk~Vvy\[X~ }'wKP{5;Zx>K܅spo2|q1sRӼίR͍`\hK^Y v q؈BH|k6iA fL3h:_= }]޳}^޷WcMϰg/!|NXis RZX0@͵m@\]~Gǯֆ .l|wmE "P \3Ȥ~nGGĄiu:̧[J\Gi88E_&)U{nG&F_To-uWvɓ}ʣ*GLV% Yb6V_nۈ{Ҍ}gwo[_S=LIYL葞S|re՗T%ޓ*X2s(˛Ps ]8Gŷg~ Aʮ#`o2:`?\F!5~](/o >wmКFա;UZN.<"L啗& 22Buy?gijBu<,6n=P47޽ vyXoapwL󟌸?=MnGPɏ J{3myt+E'[%FJ2#f=V>^cِtZ80jj|*7}>|-/i!\A!t՜-zH9qI ~%&PƘΊA_;i/Y&h!sȠ'g}̢k.wp.sB_bWK5-buK S,dS3lv BZ+mNT$ݪG(.StӇ@׮޽z Y}n_|S13;}7nA#i)!JRBm"RlnRcl6iϣ-]YooS{ߧKj+@UW^/F*K\>=_ArxIwAuTƟ {x.q~~7_~W٘Wr[sfjsѣfmFQ{WG{ҷ?̍H.TOf6X$YRv7nƔuў3 xkަ"gԫbB.*"KD%SBo#F9m4L|$'⧏T}JbTd&Jk]90`:>]ŵ4@9*ڰ*3-&w;50S+SIVܣmSҶZ$pzAk! 
j;P.\TŮ E&Lۤwǯ_~툞ug0d"留3Ⱦx|[6]mlm)6m)IK|jexX;/7~w{\smH<vl\l8<tnG쌭I_n3~إԿ+oy`>#f|W o<]T)cд4bW~C9 4{)9Q~:YnZ1Ӳ'\(d,gpFdIvuUru"и-7x>l.8<>={|*Ur͛Sֲq.ԍ3$n\],]8\8IQ0PQ@ A$V,VP A(VOf"w?~g>gvI ~O){k }ot~'?} r6?sY|5}}sB%.4@ޟkf?=!y'^WH]d N -,80Y+0-3;SOHJzusi&sp=#?YJx|X~#Atag0ȋ?oЅv2~| u0Trd9ܱ>Ī/yEI g8{ʦB?7~'*:̮UmyC_az"fBt٫d'jv$ke³xQw;fփZVujcߧj&0@ôB % س Fg͓ Հg*׵^)#ūxF;x7o@w=[ku8}1>ז_\Zُ=6AҮyh7~4`% l $ȟW?~g~#Kgo‰* RЬ o4W3vaj.&ŴJS_RdZ~[ؽ<4̷}W3u`ڌoU;WrhR_h\{'JbO,U&а ̧p#Mk4sHdo 20DQ=Z-qM!D3@O5+R6="&mG5^:?6?(]o*eyaʐ]dtyE)n퀺"GQcb\5d#r|>?+f/'cʏ./G[^nO.fQǬzqI"I&+ %Fihmn̖8}fJ#`]]`V\Z4n'jٳI/;p@7w^qٯ/7I/ﻧбy{9C4#C!p$#<ʧ|G睙;ЀM6ۅz{/9ǡ.TI?,ڙw,L74D/7&ķ Wpҗ30lV aicwKvviN,Q:!t4M\pۘ хkjN?gȢRnyæGE76yL%hZ("HF]g"HU\tXr5rڻ3l+o篩5)6T\,|m\蕇\ma k^8c )k.k,1!V y`StOIf$Zjf dGpٳf=znG/wPP|r=:7 6xC=Jb/oP{0`]h M !(ZRP ^`a0-|M{p`eDW@e$Nkw8D~nJ\/!쎜}h^x/j^1~\\>wTW7/sʖzhzk>UTo nV$`|SqHlqZypZd\+6[9tDBGܢz{O+p&YInaUjH:w'kkOV|#RgB2@(Rf $q"Ѻ51xH$>v*zLQ#1%dѠ.L ԝ VOZڼïc @c(ƬFNVFFESLdF^ÃTupIHlK&m]7o^5~ +7Մ>wo6]mdm$޳죟\uOǦW s{p9'tR Njhיnz՗J浆&fRB~o>69Y ïo*([dkZl,uPC|klM`R^Egba#sF}Z~77}>g$pelv"Y&úIx$i{i8 sDeB[}>L/V%]BJEK0gasr z*qu*),N6"re(b5ɧ9iIU|/ȯ_^ |λ%VԪT`a]\l>q1!ur}u`Bw< n3܎қ3SrJMʵgZ;vLzBr:n47Gq@Orr롈r!w A*L0iI@KKK ^٠@O|;L콙[H5~3v^|=K8kOKkR3yPմ;{ώCh?(뷔8oWT=Ew5~s& j/V֜.k_/ˇȐvD·.l{l~(a $qŅMJKZZ# R&X]x;u5}(p ВV-[1NPjT9)v* _rTp|<߹̹ƾg>h#*[aHux6L=zJ<2s~h8.~ag{ A5Y+3ԣÉQׯ^;!UHW 2 !K-0T jVb2HACV-ܞdt>zsv||c q} w=O+rK?9Ǫ޼/…e$&6))JIm7& km_< ^>{_Yd_݌f<܄\<(T:,]hZ GDhro9>'Ҙ5v' nhb3'4cy<>)OF[O W[ _o {;`do^5.pqSˇN P(b d-;}m FnVqK&{av!5=lxx)eˬ-G8MIΌRw1MnBcի׳=ȋ?őASȫw`b565u[ vG*RzaYfNk!% df,I6ظG|]'O'jS*td7nd[`{v>&v|^w;^w^_#_ )$JRA)6ri?UK_?K߷WCVsv\@`3o08-G Epp啷_{\N|ǶMOss 9h;RE. 
2z֧_wwJb$ǼI+_TQ<,^gx 86BTTovMY?x,>0s3s=2ߜK $;v9?lz9o/3}d?`Ouu[Z, ?[8}_'|^!HR%) 6` |6ףYԆя<^Cvu]>Z0tM/OqPGnb/ؿ7E+ȃ$W*Jjh\H(Ӵl&P"vU,aT5qL}t?㫆iiNl jOˣ}=WzGh^]YnUweWΉF1$b82Delcڴd!(̚2A n09v\s_Xz'k% w.DBO3' qqwZZX0`]` `ĉz 0^LxM 4Qd-\;(y _m1p Zñ8˜'~n]Ż]LC+!ޤ23v!#m萾85fx|6}2<{wy߉x.9ُ9 tRdԸf%5)ZDVl:=&4dn210p22.jtr>&ݺh%)$9Hm}W?iggݿ)*_0_ԌݝniŌSbŋDXۋ"̊mHçN7on76E˚z~Oio%$JHH3  B~I~~K߿' _  ƻֶVf?3~v'GGt$6׮_#vcKb"\SU"JziEZ(dXڟBDEof7ePʁjnkFֱ[Q׬ͭ=,ˮ{E[bM^Y,\Ysm[ɭZz+V{ t"ڰl'6lؗZVO2$-$gys^Vqn4ϱ%)$JRHRmlYNjuM{xTc&I0V{uW[X^.fY{OO F#.׷a߆͋?o۷okҴ_HLbXD x,,] n`Zbe9GI>;;:9gQ{~si#޶(%)$JRH)6mY{>>_+>y;ut%,ԉJB%))6``#0&ygoO{Qx~'7\OG;6)I i2M6MeB2w/{xq?5OGqޭkck#k.`"lma&8rgé_[|}.2nc~B_[gmlYK+D؛JNMԆ6Om;W)vss>3ݱŕRhR%!JRrcm6ܥ$I*}_ߥgjyϵwMH)I RD)IJMcmHˀ3l%:Q RD혂"" {%]oIt;ԑq\ #h>fݣ|R¹<[;i ^'=`l8v^M9FYZ^Ɉ Sm oo_9e$TLQQ;9owY}eh @ {uK˶屴dDB"n-k+kcA 5/͵AD$memed B7=r[ٶVw]mlYZ1 A.gwY[YrٶͶ@ 2y^[ n˶0tmemem,3m"%ӻe[6]`"e;/mlYKfa /wYvݵml %guM3 e4X/ylKv͵`ow[6]m$2GwY{˶66lDm6嗶m [<)lY6 @6ę%.Y6͠@%-yml[6M6yMmlK6 buk&ٶ bdId[l,[ +)JReOook/w[6͵m%3"%gum%-i. 
@6,'yd[6]`#0.߫wm-K66` oyɶ @bw[=umb I;OOookfI[IZ4MO<ٶ6iA"%y}uic* M%DMf6n'N;We%KI>çѨ8jM5KTju'b3_ǵQcZҔn^^4Z,8Oіl;VaڴZcҺZ[^_juu}6crK_EFn8||pjIq{W3a2[Rn)'sS{{j.ID$d+βgY~| ݝig}Ӭ7v."ؚ7>%Ru236[{^g[}|9ζ  l}W66⊯aI e#|:Bpٹζ,,-1-z?SGu -kF g'b>< jOk{R˶6!d#Mc> 'wYvݵm%b#llA`_{_vݯkvivi?H@лx>G}uε>Ѯv K~[Bs;kbLIv?+ؑ4?ܴ;vgir[v.0m%I~}~mo7نwMEݣ#F!a#Ue&j?Xz #Gv*C=z~C->Z7ɺ7yI"[YX@ -]>~}H=3=x``5̂9qqqqpkZÍkുs>ZPhq֣Qѭk^88#Z`?YQ!֣Qљj A4qqAq`d``{Pj 4k'ZPh5kZPhyy`Mk1 dd`a盆 ֠#FkZPhfffAawi!?XcdF }_FdFpvO 0J _~ոp32 2 #=Y0~BV5(P̃ 3332 m ׯPRp8ۇ VB dF6 Avp23e|VAaAfffdņ//poddFfffAa}-?8 mÆ fddFch __Vq(PٙAf6Bd{iBQP/2 2 #J p8˲|3 3332 2 #J 5ņ/H I}d8 mCd8 (26(@d8 mCƒ %aaq~/d8 m P/H(Bmp1+ : / _!cm%aaq~{t $[d8 i PZ,./߾2 FA`@K[!eEW|z&Cd8 mC% #Cfd6!cm*(W|!@cI^ŶCd8 J5(@H(Bm%aaqTϲ6(@H(_:,.,ԾYp1H P2袨=5̇p8m6(@,,.,{ %-6!cm*(=!@cI_mp6d8 Jϧ_6B6CH(GV/bp8l _ }:%\ mp1 X\1|nCXX\Y @d8m4 PM*(}ظnCd8PBJ wAAPI/p2 m!p2 e}{p2 J^86ہm%IBnp2 m(@%!@ƒ ŞzJ I{6d8m4 Pp2 TQE{p6p2 m!ƒ P{j {Tˁcm6ہn.bVzi{!p2 PB6d8Ş}z.P" m!ƒ şbB~( 6ہcIBŞ^p6҄iB^nCd8Vzi{!we~ m!p2 J]~B^(@֥!@p2 f m]@ @._7=-͵m%"6{Y=ٰm uOeKv̀ 6Am˽4X/?{r!/VXrGb q<<+$]x&l4l1mR-C݁uuXE_3&Hay8rP;ByRA-ggH^6 =C0XZymJ;EXǖ4j]d3NRBʒӑi,id0fJF+dAˍUރFĵ9rZIo)lhɵ!aʼƋI01a[hߚ34%v fffSf@Ye^1hH5yl 5FxF2Yw$h4-D^cE,|IkZmJ߻r^4_AcVh(a6Zf;F3aiͰ`cX# :Y FYkX̶0aZHe&}ԫW-48!ex eYI~b ^1Xv``4voua-cZƁe5L tѪžLƑj .o)-k4iCA Xn2K%2񤭺g5| -,v ֌YX.e00d1Q'{if1jj޵~fsƼ3d~Z` bif4^Yh=ZF3Kxd,Yn@cRBk.̶ ;9lYv]6 tωI{me,mM ~rw?{Gk~}ʾeW|3߽KD_?_N+L_ZG׿Xwܿs?}`Z}0@@@D@D@D@D@DAAAAAAAAA@D@D@D@D@D@D@DAAAAAAAAAAADADADADDADADADADADDADADADADADADADADAD@AD@D@@DADgli'|Ųk.DA |:RYS)J],?U] }{ook+k." 
%'˶F" m6XefYv!ONMUY bt~͖V./%ڊ􁖁eԕV.t(P=dwTaHp… (qgl6cY{r$YeYeYeYe]"(8iO=C= ݶmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmSdOqKn"b{GmNV,87%OO10naKgr--E\7Ĺۖ_Kvݼ+aܯ[5qp+.|LJ瑉\I\鉈knmnays<'0p]"5EPN_v͒cd+~]I:2^rSv¢RFl QS{/أb' VKTEAZ9g|DPTEF{|V(+G9}JDG/o3DQF1T#]\"-~SDUGn_( f"=}_DPTEG޾wQ(i|Dl?OKDPTEDFtWKDQEF b&*qJDZXsvt( On}∢(+F"7= 6FǾ""06E"#?WTEAZ5uيhdDE8~ѠPT"= mJ"*{( (K^˜~U( b#?Ƣ(Jѯk_EhȈ6EAZ>Dq1LSEN"/|(KGܸ~gs( ߈[u8^DQ8wꢨ`b;x*;8o/EEWbĬq6EQQTbŏ8m6*"#8>|TUDF7}o ^"#y*lDGvQTTU"4-굖E^"#=-@Tۂy>**DFoDk7.UEQQWr27+uEQQW;2(DGw/kTU DGϲ*e{l?mTU{8ߦoex{uQTTU"8/("Drֵ₠PT;(9_*DF5UTU{(7TAS5EQQW{[*e_wꊢe9KETTb"0>{:U*(=.cO9_*ES4qª*DG?ASDPb";5 䈈-rj**H_9T""9PTE7; (~X[q9N**_[j**ouTDPWqtE@-AP(*LDEw?5]ʨ*DDGܭDPT*"E"";[tjb"7mݨ*"Azc(%OudTECmkAQE DG# "(ztJ QzH9?AQE ؈tDP< DDg(\DGt*""/xTECDr>."dDG DwwJ*gDDxOoPTE""hTECgTECiB(*"{>BQ*zh*"zUEQSB"#vPT 񈈽`TEq\DQ1"4_O( ("#WMDQ?DEp5wtDP1\/PTEC^&}AP(OiPTEO; (~X" (k"<ߑDPTE DF~ APQyl*"D`uDQ?lDGMAP(߱PTECqOנy#aAQE"1tlYh*{n.b&ύ4TQW4MEVDp{NØq~t4T#\YPb# 4M>D]4T̈oSDFOdSDSGyb*|X?2VQe:؈λڈUD4TqΘ YEf"9?uO hD(0Ug^HqbuUnٳD0b#CO]SDSG1YE DSDDas>41M'߉4LSEM"3ǘgGYEILSDSDF}fh"ȬȨaDF}>b&1}|4M.#}z٢bUoo "+(W]sD4T1{lf)hwf1M5q봽44LU2"#4Tߢ#FȬMiG/oeDSN"5ټb&Ddo?"h`M«q8,,DFufhr"6:1MEMDsz4LSEO2"0~ 1M.DGY_hDp&)o% 4LSEN">-[1MEM"0tqfh#:hu1QdVBm4LSEN"?.VBȨqQfh"#ߴȬˈi;(+(#ۊM)tS4TQyD4UrW&)͘LSOe>xӚ&)'g@9^hhZS/Yb*oksxhzXv[4LSEM":J"+(F鴷SDF>ӥȘM)"8MoMDUGYE DDkw=o4LSEMD|̋wD4Th>"h&*ql1M/#cL1M1#^q3D4TLeEdT=D|9hz𦉊h#w&hDFhMDS*"8:a4YYE ؈7&GȬ(tqmGؚ&3?k&숏xҚ*sl/&hDG}'#4LSEN"3WD4T#릉bG ipD0"#~c4LSEO+0*lDg"+("#]MSF9^a4TDp4MO hu1ynwD4T^z"+(GS4LSEOS}h*z[Mhj"~Br%!%Yk@=Žememd b D,loe["#1A t˖LBB}IumlIvlNd.Y[6@imbNI<6mDF32NN[Zɴf "#mH$ylkc dyg$d[ f6Bזɶm0f2Eefɶlh"Iٶ&m@Lwuml[6fmfd&Yv M66Ёry^Y cik@!u[;k&mbmM,׻lIv]" Hlf˶m6 |^ ['y,Yv] Y{ij6)JnwumlYv6mIkck.mK<.˶lm'wYvRٱ6 B/̮rK2m\Dh\Z\nX—)Ö7 dmͽUmN)$D}xupdD)r471&k8i9'Yq5-jYՊbNimؼ#jVvEUKg!MMeC3mǏl|^~/}mI$GaS&hP̼$`9#,f=XX0:f61c7O9tE&F\9@-jw#1q jϼ{;Z5*2=wޣCcHqʫUJ9yȐAdf}t=~5Zְ$w<,T^.0K*V]/DbCX^e]m/{Vi(1S иZ3&URc$raf$ZܒiB7kZ]D`s8y ~Ӌ:ΐIrA8bK:ľFOȿmP'C7v3ԧ5TMI* !"ߢñC@5j2 
ܖ_0ϵgUk$noy}f5}͉ӯky4ۅBg/\`UYdL\31鶛'êUd'T9}o |Xֳ Pd/lŒ[If5(v#9v01.fb 4'3ak0$Z}Mgg*Om^zM6%"⫌P}s=*6NNgޭ"stI=&^Z3EgH A$*O\f=Ū3Tm^,3~r?Y=`yg gAx&gQVsU\f%[y5axϥ^1;m=cQB MZ[uNn#^y{`1֨rj֛)dQ0sRZ%G- b"~8DHߓ[DݑyQGhOG6AޏJM'WbRZ0Z;`*C 6kO6izƯ _1Ƭy#{X0$ù$.I3 ZFߒGtNv>vxu4Aj~LL"#5a'y=zOq)|uz5'<ޯDcnquHHqg|ߋUM=#ժy!'R&"dTc>:[z)1ۄ Y<"hdgG>({ݟ6n<yмs"+Am5/s Ȼmbs>0}e6LM$ Кm]'vz+2xb}mcVB8@58ͬ^kBW+s%ԄEx{4+kXxubi 烹^c66)8v{dX^i7`<ЭYtZˏf=Ӯuӌdz6cy=\C(lvA EcGdޤvmg#X`LzPvaLŒ g#{~uam  #/ǦXNzG\bj0:v,nAMd2 /UWB[kG 5yho\zk czq}]hl;.=;,r,a5лW Hxvu(d8k 3YCU~6`{LRf܌8*c6GSr+QG>/Z㝔<>=> DwsִRƤ+> ֵ3t8u3@oBҗG V̡r.(q%Hϊ!詏9xZ98~5Ɯ{=)>,P ER1K(#񊵥[wֳ>޼xq&.^5k2lbigG[Ҭ`ܪXA)tk>"M -U4& xǯ7Ƕ,(\C^Z'ױ/v(mwMO'SE =w4兊n匼xwU1ޏi3>jōx1b<,OJ5d뾐77׭e vY쇡]쏲C!loLuʏ_,gD93"XR) $,QJB-RHBB B!R) @$y}CB9hsgD{cuZC݂99WVw-{Y5xb<0HEphTgLT ˲pI{Cl d;p1 Z!.'1ڊQ|„db=$lk(ЩՏ dB#IJ0. bVrhi-\w-eg9*CBJZpϋˢ i'zu^fnf |B́9PuUȨ5ĊMF f!Ln@Lb^&`ddwdG["c}'0/q*(-o}MbHCwm&;=fFR(wfE%3wpseY%|MAZaZ\H挞PġY v-Q ڍ]hd!d;ݍrѬ9OC#RK6Q\ hYP$sfOCO*yRx=Qy$OĒՐĒx=f+3X]Uc8>1)c '=_i$Ɓ"jFWep!%>kǩ8 mg8/H//|Qq3Udm+|njc8XUb|V6N1yQdVx<Ncak+9=+dL1qd7'kWI$W=dy+߳o။ԃև@pRMD>ć~~~"1-VR&W&+/$nc͞d݋9Z(6]"i!Fq,(*դW]5)ReHX/ Ib+,S")،E$[ɲ1^ هQw.u7[Wu=7/3o#Wa>G &=?w;K'i---a` ShE! \'ﻶ=-b[>_~gߡ>mo-l[|$~km,1;eş9~-'}g?mogobmoobşT{OşQ?`bsv,?m~vض6km[~I[l[}_@;mg_?ض~şk\}&ߦ?M=pvض#omo}^{~lc0[_6-i-A8mg?babu_\س>Hy`oqd~@[>>`ߵ}@Y=|' m;' '_j[oyd/-{k}:pϿsmY>`Y> -׀O6 , ml6p=ŷ|}6bO:?zP6缿= >%ŷg}6b瞆 ~||[}l`|[ſ.``/=?|Ͷ-m|鞖ߘzXmˀ B^l'+U#2X\[0V5PMI6A! yX72dԴ1.^ņO?|vֶ㖬WPtsLn^VN6& ׽|\o`jSI S /ZQBftgedcHVNʼn Zfu8nݳV1`aSVI۵f͊UR4'6lד_M6lٳf͛6lٳB!i!l! 
BrU.GW:4t%vb=Ϳ~WqVz?u\R_U**-5-4>,jo("IS5qeen5ͤd-DxRQr0X+N[dh o\d/N.x GN;*!o`p)ʜMIt+&^MMlxCP艑V2 |[Tnћ$R.kB<|yCdثŻ?7I}uy :^ڕQd@gOxbmak'+';X#>s{Gy #h :v3%U QS;#̥39au fN&œ܌ w*}K?Ď:cI> } T7U·OuӵoUb${K<]͖a~]^g1<0(kVdz³28"th 93#$h2¿b卽m_^tk'vs.sðfMLwq }k "O 10\0CD!fDp"S_=Nj> O+tMg< uoR`Pyn Iﬓ;=1Cư^x?UG#'ag Ȗȋ#o e`BS_ ۾{ (IvR8n>|nH٘ c(nLViYm?^u Mj2XeI\y>ryUu t!iyYAP'Ϊ>N/Ht[/ּ[r:?΋=.[3$RT;Ϗ"_ht_DG@VAP"VHYYxy:r1R314Բ222;+WjMpT8x)k)d4d&A +T MQ8֒2ce=$2T^U/lCњ [lѳw MG^4~U?]vE#E|hdbYY-ZZZ -`DY~/'ߟŷm~жŷ>s{ س bm, ݶ-n~~1ms]-Fmmm1|{=_`[`ه6_́6-?WߪCşu}&ŷ珫vسXgg~]?y>l}Eo[o6'o g-~;Y>~@Y?-쟪 '-Y=,-I=,vȶzY=Osm~Y?m,]Oo gߧ?P{{}ؓzݰ{`]=R_j}^ߊ{M? mŷg`݇ml{=[ط/l}>7=߹?#m; oޟŷ=0/`{,wmŷ=ş"3_j_D=o|-6=l[ؓߕ@3m,Om3#m={ŷf>ŞO/m-0[釛o>l[}3CH{mg˾zX'ş2g`ŷ= t. @.h]k򿲾7o&U;[dDA>zۭG.lՠJ$m^ 0 0 0 0G$aa 0B!Bs9Je4+6>E}cSK}ƫN<*'@)y}'=3*ZE!MNkv%V;=9>ki@<"Ҭ|^@3Uab o|Ѐ'luK}U㳰+7OɜrƷ_k4qDfT*׽.~B۸*'Ri!i<ɻE:0j o+;um@ͼQ CI+>A(2)̯\_׮i^S7W3OP wW[pjB]1qivbϧ~Ϊ9y %^}} &b^%Ss<]uE^䜣򯽱 <(^g%>NO}gU$ , ~%h WUS4@/og~LR IiZfCm%ò|~QnACnLX bo-b?6!|XF'day^;E!헡Ќ쐢S ȓ؍QkpsdjCOWN2J8YPd@YHRʦ #Ν44ZZS[O+RG08{K'=Cl緅y}Wb<e-DѶ/m/lI] ذ' qruV~顖蟷AOʎC|PRK1 Ԑr^7&"~N9VAoIT\6J_Oisš\A+Z_sĦvD/ gc@+HlsgL]TOLIukc)}]G`evCqt Wh5s1!s0pbq,6lO_!e@7Ca~^D:40 !ݨn)u(ZNpq tKo nt:#NΆ噤?bj d1SE=dKϊ}'\hP}{ U}Tz_mR4 o5L^NVƫզ1:RHNc_96 E&+x x{ъ;!k~Iv?"~~R{o?D^d]'W~`rj:e::2D6D#fTa]dUqٶ 'XȒ)4(bK*yϳnTUSDMn_ҷ}cVAs،CþijaeMdWCn$;+iW)o5"nɯ)\7?Uk3|+>s5K+&v|p%L|g$2Э9Z?ifM&*ȻvچtiIjp:x[6:0fMaU56"c[^^{:= N;k;Wsrs+%gij.5ޑ-eό3 3 -/Mb.\qr1b@+"n8aݭ = 9Yi~:[e5|idizϠ) v$HDQ]J`'Nu6F-UђT1!ѤBZ4R* ȦV9ruר{ou~7p nqޘȿK2qTwm-JXDPm <.o汧L]3ə&=f<33080?7٘y1TFfED|yvfՏĒd?I$l1dO[QdK|aFb#Ccb#iGI~?O:'dϝ? gO绞mgo>@ g@@|;~]ö`-?[o@lY[o$kK~ gƓ/m_@}5< >il~$Ym? 
m dϭ-߷,6߭_DYoʶ~~6%-[~-ylh߉g[=Ϳf>0->o|&߅m -2[|-ѷζyw-,ξ gÿ'[o@y{W=;m[=}ߗm_,tY4Gzw|y@[} <~Mҷ/N7t; all|5X}xO_+P²ގd̬|tT4wWCP/ ׇSG;3!vy24Mihfc{x8םw/\ rHՊiҟ6\w߾8j͊R*47G/{OL{vر^jSF ͙.TE׈~hx֣ h!QT ((((o뢱 #}E*w3ΫO;+tL.69(*ڛwMGdURG8@RGg}JžCƳ9`*1OGdT9V8ҧ]y]W x Ec:VwбQiU΀:@f.i#tt Nn_P.Q4(MП?`2g H9򫘢A{{p,o E")| ':&m*-vA KmdTLGݠgjҍkn쩿ٟQڍzu ?|Lw2{ siOuǘHL|@nht oދ^]/)C2#Q@$H"u/ =A'] 0܇$^Hh680?Hk (ȥ/J2!~tPpPے3gf <֓u؅F3GBK}|TC)N/b`+ x#Wa>wF"i b!KKУu~ @ 4H /:o]NlfkNL{/ cՏeۏ0>oOd6cF;rWQ̦-#sj$E{ j}[^"k^:\$}d %&p:e9'uqա3[uw茈kBjP~_pڛh)7k`iM /G05Dn-TȞ.r@5b gyȔ\dX4myҁTlZ<'(:n#TL{&;Β)^S ~ֽӝYc>)D{;Ys`soeT n~WV̘_LwHqMy`ٗuM^FDvbR!nW1!CI I+Td?qKxgLonBtU 08rӌ#ǁeLʬ~rB0Du^s{ֻ]ZA@lRI(zz|o;j4D<FHÛYϩ6$-O9 Мэ @=IUbF"0Vqe'RuMI)vKzr}ǓW{@(}@[TczLTJ$lB!E9CX|7wM'-B|;ݺ6!w$xwN. ̈́W}<"9~+ìba?2hc&e&?}e^5#ox&[y_OPm8ZU$L xFU| O(xPZf8V a}I8 O^gG-/7uJ [K[N 𒉴 AxP3zNALM<*G u@[F<5+ւ{0w8 ^@6*lP_BWkk0ZsK]z/[6dev#Nɦa! v\c?yc?[s&S_! uϳ['Nd5LN83ZIfr%ƐhJz$0Dʇ6A' M&V _9jq>OW~,#C$ طBE P>ddtX!А#FfCNM평B"dH889|utt6U:sF='?Ip~MuT*駶em .dŕl! U('ή^/dѡLVBF\7rQ׮Pؾ`XM6///II5z [$Y} K+Nϣ Ci輐w|:-\6j6gu3WW F8OtgbFe\/⻢\ w We:=U//[0=@G4 M%k4f%V#>|Ȝ._=`Y,o_eï NOR;}X|g"r:N[-YLnd;3hY?ƊcMVDR[umU,hsCo!%_prz7Ʉ*%'E7[RZR[~V#R$](h(J-hX.h+OX?e t6RӾʚ5_;B|eT8rrF +Q+,Vɴ*OR9q%>x)"v kE)h[5b1/?LWaGSSSQ_ 1>?8\Z:@Pʷsw??_ٖʹb"$bE-,,|Au>ռpoZ{)ʃ"PS/l/g#l唖ܗ¸.V]+^qwKF'SM"3konsmé䗅ٽʰ@NS5uS+3(Qi8˗ i#Ix_4Fzm"`CX0Ϻ 0۳kI-Ă[ h3ɎgN$O#lxɢ]xUgSpO9T:zcB<)\9$=D 8q#\W)T 7cδhD`r Ke (Mdt*@2JjQS,X0XrhPV(ք %.)Y0m4j`accx}댵^_s10"b:*T?'%+=J>JNⲆ!QJyٹ:QqOVZd(,!#dce*u) * !{1T)f ي͡!jՓR@k__yހy??o[{mjf Cmd=OIt[|~/h9xl5__ECs@o=gTSأ4Nصy1?>IEZɍ+[q!򎫿uCE3ԭfwrč޿%zgއaӂEi859 1y lOULHg xt9#)I]^%VjD"9P%'s9ٹ+x2j' )EV Źr =}6 ^IERЍ-[ᥘ´\źA:K$(>PD3+8<Gzp<Ø<ʷ!\2IpsG?3-"0(QANPYq Hu%U&ѡfrFI 2 ). ۷G;_C(Ϻ=so?~/뾓/~o[fYm86m>~ه/y]O<#Eba$:? \;^+_}uXbsvژ6* J<+,ex|fy wq_0~`ǹP oQ|*c%|ٻ҈s"Fc4caNPBc ;LZK`FD7Wz#WcS{:|DxFn*(ITxwpSBQz'51V1ٮmzn{pʕǁ[8H8Qf[M^]. 
> 7A) G7X4t//)Zzy'47HVvq XSKMX6/< VjZţ&(r.u;mN@o+<<_+~Os:?GzL!}?;]%Ruۥ]/䥷 +SihެuŰɡ˷jM&qi{[߇@fUbo: 78bp`6ֱw+BGmo^=ƼFE]bFJYyN=Ԇmi:2JPj2h%5cdRW+sjq1f)cIDmj7 9WNx!5V||eezseYdxdcbtyR1Ժ2:/sgwnov]&C 'z OǽxV3WxW%r#y-B:LΡ*)^99YiE0Vvq%1$W)fKthکf'hXf.Yl\:ޢ=0E[}5-G 1}7_꾫7Gԗm%m,؄@6"5͏~iz?8]l1vy$Ym6fM݋v=w;!w˚ٯ|{X؊9| ҡyKxؼ}=\RtڤygɦGZ'!2ypT-3EVQC]w7eib7+oSɱQ6:Ϝ #5afuQ%$`I&GYA!Z^Ghhxw(xfH.e}( "=/9q3rusjQ; NvX- j&ɚʊ)m>͗!66@jlֵx䇤o۶La z|ay!8oU/ 􇐋ĩU2t}K)hSdQ:,)nXܾXM#9Iѫr4TԞT TVrD[}~ʕ]x-+*32puviem֖g~a~?}尒FllPbI!{6hqE=iqleک 32硭L{o%#EfbV7a?SysmYYUk'Yy!v56m>Ev}m# ی ײWW='ŷ1Q30j. mI~ xb23l f#9`UL @9L/a qO6dB\}1 Ht$z+0ىAKOUӦY5w8pږN̞ ]O"0m.BѪ}/4Zh䇷ASڐʬl!O{q@N&I=iYі̋]3AWJ*[sBD_C5^aX$1OJ QE:d4:5tlI(H6dݲ%ZՠR+ԊH+Ap[5nu=~N]Z8̵+-D]X:먱˫Jz 0` KAZZZ ͠ R|!և 4.G5 ~cFEšTD|=>E퍇fa^#%ޱr_(y:eV>K^nE7i7N\-u"rѰ^/֓p/)8Y`K Vo{#(aiqfǰ R0< 0v1^sL؁2+DWBKۓ礴{{PQZ', Z:qޏgE$h$@uuSQ'>E\:vئT(Xf n\so ~@̈́F#Tۗf?{[mrSOViU, U쿀,-- a` sh ,Oy'ϜJ( 7P@셾a!2 %\O Pɬ95%G֎e6k+7}qsc}}gĽT[Iw~ 2x^:'Bگɛy[1E0 |5pЍ:s600ͬ7=\pgWT'S?\N3ysGs-N9u7V${ 1jά0 Ƈhå6nHOݽaTWޟugc4:uZN@L$|ؚw;%"jDBRFm#$A9s[^S+0zZ& |b,^Yii.dҝ2IϒRd&^I S63bԦL3 w-۹rӲvzj'yNǃޏ~{3}nzaH1 ۆDCmpa Z?q7x/6n;*^#kKgwkTeɾ+rvg,N!s8tUfѵW~yit%?G{w<9W#o   " YRI$ow`A?j!ǂ1L_X,bEOݖg&|K#N7ӓeUUʼn`q ʉlgҒնvp7*Ls{^`Z Te_diiI~d%GbQ+IuP bcu [OyH2kiBdrӽ<[+fU: ̒ZGMG؝9كklb+S'+ BKWL=>$E5U+Ve$Y1ԶdJ8Ġ_d*hX(hSsvOS|` #x}}ַo;WyV0.3 6ټf͓QJT J* H!Z(O'B:RAEZ;|LeRw~?jN&T֛OFT0E,-- f")lԄAg_$>7 ps$"}8 Og*G(;=GQ|0o׶wvf=-DWJISOVvݛ(;8N,W}2TU.Ik5rPVIg#%*D4lk |"+V LQQ)JJX뗱b+ xLw x\۟Ny{߫G~Go?_#࿗/?}mem%b mmI<4un6^W~w4tS { g/ў7w8y+V şOSRG!==ؒ1Gld(+_A 1_[P3u!|r2#EWYTr;3j56F~xք Z!_Lźy~G7ys-3cOmx;Ίxwt֏LPZ$M8jJ 7Iz(VOCЇ݂U䅡qem#E$"PGTM}<,QbK6CL(2f,L2btJVJb^z71`_}VOu|_e2~kߘ/AO/_vRKA `K ' I?e#F 6q(N2ۨ Yns}NEݯGUVim) IUUЖV6/5k0&oK$?tc7x_Ӷ4 F4qteqa܎Z+|ڸXiT{;cI\'1ӟe \K)4  ÍYae@- ts !/-G+I4LtS/3>ט׈zGwxig4|XH9HZg5X9N'*]wjhYuZ:\T-zBWⲞM%A.6F64Ft숑"#EX #f*GzFff6ɹ01ش !-I됦ʛh"1 X1+7w@0 CzAtV@jfS=== ]_+i+ Z5j,#J1E([\R?0sטWwio;߆u}ed_y[|>߮ Ilx#2^Nb4|})oykٝEWWZ^?~-,/3B}gH&m6\ pD˾ T+ǀXEeQRX hR}Dα.K@,Jp8n;Z|Dy }o^?ϼݥe<<G~|qzy/7vϙtxL@X{Uo@:&p>'ɴ Op 
oKRXPRu-j߼r0ܦwwEr<<4ϳ #YK:LWVN -Nb2Q:X>*Yu 93wsdPTU:'G`+VAjū5ɶUA?p=a_u]\\Iv f,ߓ]~aK=(E@`H^ƃo%k `!l!B`B BP5@I$$!B |= ݡvۺ]ߞDۄ BoOI{o__?_׫IR'0$H/ HK8)Xo'PGT/?EDӈ"1FyY wTP׭\sײ^DV Xe)&t7SvM'tٲo7n[\Əu]srzrM;?WWŢKP M֢Ür8rbD\hj&=6X$2JjLdsf6dt8PLTJZb8ۺ(Q$D CmmpZZZZZXXXXXX"Ϙ0Q g\roCr ;lLQ.ʼn۪ݦiuKTS#%#t]ƀQїѣ ?z6 sHӿeӝK˗&Dws̏҉*/ xJGX8br%ƌ74qٍ4mtZӡƍb1qƊM}}z4Ə(y-ɳ*ljeK2.TrK3Oz=(FP[J5*S&-X#vٳnբmԷlI$Q_ߍYI.̨Ϝ}A8?m|:^xV mG0`#8N=H=eKv] "h#N~,oM^9o ]g|Ϲ,*D!DA+mi1/}_sVӄn+,'8cclmp #"yzq߆u[\n-3?]lKb<#.\pщ8'   2 u%?q{υ?>> iih0sm,M16BOޓUrΟz]v8B(m8mm,{^3co|$1,R]jT߽YuZOtI͵9n"XN! m6m#ozO< \~{? ,LE/ZIvD |-3icd!8'`A2:'qߑpXxQO/yem-m%h"6 I['ۧ|7ovQ!x,:7j`^dc?{&Fqm8p_Đ%1/! \kI>AЀKi!%H_]%b06BB|ךG x^ ( @dY~˶@"=Vye h;9owYv]mmDy啵awt:Yv]d{$a{=.˶vDAf=}dw[[6͈D%g[;ka,mR <啵``Ɂf%[V]DB{K]m%؈4 7wI{˶d @;Kumem%h ,y%$Y_;k+i.DDRMr] MDKtβD 'wI};i.@)Fس$+k.K),/Y{k)%`6&Ir<.Zhfm;YK.Jm/umlm,b#l"&".GK A+L Wg dM~G$"i `?ڔm`Ɔ//zͿgͽ<,>vo?7?ߏ=߅w< 6~7ظٶ~3_xom谳o a^^?'LflG|QFڼ{6jÑ6mӑFڣ⍵yY[hlB [`(:ri`Jr"n23l4ei:D4+km[4i#+mQFڼ!DHyY[h 9̣m6# ܳYLColL\9TÑFͳF69nmnXYbTV6 @mß,٦T6\T6Hml,|:7yYZn2&r%$E|QZldbi(wM6 i@Dmṃv$mV**P8*Pۼ 4FȊ,,9Xo !4 b X'ojfE[t|T6+mhmlhۼP3LtngV;ţV9`:ȵMVV8c@1VY(Mg n4eihhUCNVs*g,Tf|l9b-ʴT T6 0&"3un!z!B|mÁti $A 3XZYvEClgVÙJ9` pZ@;g:T;Ѧr` pZhu'eZZleA!Ȫ CNV45FTbai8JڛceD(@'DDC7R|N]$KML{/{}=4 N+hj&=IGaFƭR'o3wq3SU1CIp!aT5US,d۰q&&:X^<%$Qmd^;DŽ0C@LEUL!? 
UPlLjJ :R!fe27eQZL1Y*spiDLKM{O"ǒGH$*f2s4&S`2^( Fqjd۽~&HFq+lyoǓ,,W J@\i АI-Mmh9AEޗlU+jM,fr435r4ګ4Q1"OvKeoT6ЕȻDZew(Jf1#PHPLn,|bR6mE"ۖw++oXΨ(Y-hh-Q[1pHM#Lx$jm\^HܶnhHBmnneLk-\wJʚBKfkhlm5mCL "rf$"XGW*6$E[ܶ;Y[|-ʹ 7tx;2XmaeZ$1QudmbBrES7-\ړu+mlhbsG"gl- 'וJ[hP^VVTٴ++oVV11q kC6ЫE}P4ɗnZ6D 6&QX^loW^u l-וУMT6bFfڼeCM߈7,"1BVB ^<-vm+oV*ylmmq%j  HnܶYXoUZZ*j@nZ++MgnG9 u1Rbkt٧2m%sJm$EZo$rfYHm7ͭ6Z(b8ۖ4*jɀHi֐Cm Y_6-641Q̚ǏHBhEnZg"umԬ6xWD8BDnSDxbpqnܞV*U++o[%!_*&\jVY[|vɬzp"o:<<6fQmZnem xr16@FOm^V*~*\,"lH>͐JƜ;lO-mgO>H>׶7M[U|,FYxm۞3NÑH<ݛ(v- 7ȃM[y}K 6!"3gۖݛw(T46nNc%+Ñ(lD[d9dU r7Ȍm$Qrmr* ضJln6Qw6eiPHmWn 6劆eCmn[5hc'{e 4[7"6Uvmԁ6ĉշEyYռ$6hf9VDnnܵٶ[T6KafڽZ 6D۩R nj iO4ct{,lDCLh8mԂ##6[^bVumr/ $J7#8i%7yXm^ o%9_b^6ۛK30CmT|=!#AF3xqUE HY~+^-clH=pm^xmԬܲg!nD1l{b(Z/ lG2ܱq$UIKr|5d%ixY Q&R1"܈>nQ܊6򲵤%d-͵yY_7RV5ha$7hփyNN LJlm2 ۖ6̣ĬU,vlB@2H/{r % 4Oqɇ 82mXx jb7#6Fm40844VV4ڨe@ݒ\9VYs%l,Cou[Z͕۳ynO%k6%Cؕ[y/ ÑRUVumlYD&:ekAfثE RFuErRF$E[k$r1 l,ݖl䦹f6  ɗٟ$E[f[;6+;o%nb66von@#{-V[ZvJljU$'gA"uBKr>nliyr/VJ -F۩Yն-ƱƄ Co8F6v YDn 7Dpٷ-MkbDU,, xm,,wZwpݹkIXӇo-csg"VZo-挜luF:[fZ(NE2ܰhlP(ܵCy[ZNmi1 5ϝmcR-4 [ZL[%|Fp m4+UiQwY;g3Cq c$ɬFn!8LUë(je"n;6 VWk 4 D('Kw UVW͋dٱ  C$ݹ=R9ZMԬUŻgU26*m^X[EpQI,JRW g 7nZ6$ XY!VV[[զ-|jF[PEP+knj8$JK>u+kzё{^Vp^gslޙ o3vS&J^~u+k}FmdjeZ&HudGT)<%s6 PP ,nnmo369[`u;_-l#mk@h #ZnkB4#@;k,wִ#B4[pB4Z`\]m m֮mkZֵl/[b,kWm#B5֮..ִ#B5v 5X~wmX֮.;" +b.i@@(I~9 CPP 6GhGb $ C*_F1 "H [b P$؇H@0~/ױ C@ ؆!m  B 0 _\ۀ8h1mP_`8 Iʯ61&`~ C@(@$ʸI1؆!D0 /nm %C@0 }Rͱ C@?>i P$j( 6ۀ8hW7pF&!`$0 ~UQ` pI (b@ 0D &!`$B-8$8 р`O؇H ?\B  0Fp }3p>:X` /m @h_<8D`%8 @(@/ٶ؇ sq(P_L؇H >( /b _lCi%  P`#]؋pJ%@@"~B !`"LӶ0`>p $_(pWZ _P@@#(_60 F(?B0${pmPh`|!D`/nm  %p& |! ب $mb"L_:ۀ0( pۆP(@ =ڀ ?_IPnI0`4L/n6B/@~B1& z8 @'8 C$ ~!h _`!נ!I(@(( & R_ C { C"@0@R"H!  )@P {~(DPD@1_$A`}ZFI!@"L I@P@$! FIJD IB@ B!   T P=@P B@ IB HB$D@ $ T |#Ҁ E}BI%I! 
j B5+OPI  IB*@O!@TWҟBI BP*?'#ĉYDQ~~-wY[Y[IX6_]!B4B%.˶Dm4m޾YK+i+m)B3&Nn}{ٶ61>*S˸31F,i﬌OMmzzzm _H҆$1 QI &D)3"@32!8 "'^0|fLQADB26DBhhhh@"XX 9h.1%| @ BL@o˶ h #3xy8tj\3Zl^])\y l߳%:5 3شaVim+ְ Ƴ4\x~̋jHI`G22!ﱠ \6T ̄ `4kCyL ץa³XX%c h3~-A֎܉yh3`5jXXF[y=f f#5 =ްuza4ύ.fdJ^Dh0V Xٰ{Ga͆\iZ01t[)!3.2f./.1jXX,by̼/Լfbg{Y-+A4^y ƲXC@, & X,a3]-N 01p ``Uaұ蘊01h2+1c6Bs`hZia k?a1w`ZE"  Ĭ2X+KAiiX sB`k<6X6Psa~1C@KTh5LƂƵ /3V4|lXb +/FS & ceGY˙1" 9Fi6azvid1  Aqb iplӰ1cq 1Ixf!hz;>LF^5c4m!jra#! +ic0ZFaՠְkV64 ^By 114K5_c% GXҵ̐/Y6fijVE5h4t”abhkPҿh e3ZZLjKA3L &b1ZZFmVq%~|CF#v^ 摅Vc4υ„3=6+SsX!Y&eZiKRev6h/ZKcX-jZZFv%|2p!#VŬh1 ^Ҽb0KK5& aYX `XDеFs1c;#1K%<=--% ^ADBK4@J尲ڸ4hսvT,i4-k6 ӱg3`^4u%qؼWh\V9w#aZF"5O [-c02Zˀ4m x5ac3bK\ђ#b4f 1Z4LUs {V+4f6yg[ Y,KLk3O i5Pv5M_00WF#EYkPK3sa!={ SM䴌FXW|A`+;4m 2̄,<hѴa0eSJbao/v]`"mDok{=R˶61COGǞk5^{=;ܾ_p'~OtoοVCo}~+fTǠ+   " " " " " " " " 'lDAADDADv " FADAC;ADADA"*h" " " " " "" " KLD@DADADDK@ykS6]hAfOI|YI. #,[eYv]DA~߽___[.˶mA! ;b[+k+i+f "3DF=}zKu,i+D!ɛhDYY[YvV,yeh"?~|o~'_|_|FV?Jgf~%7_SA/#{y>``hw8A\dwEOHϺɻ-}tNGJ'//, (vkRK&]Ue]˓muu둮}ǯ݅wa{iw6;+eݝl]l:]v|nn͕.͞˖]׃mٳnʗvt{:]{*l~G@! Do}qKPLZ--`≧^b&f}UTu!4z*,ۑ =e Qfn jAP4^,[9+U{iUQQ{y|}~_X1T=|zTl {33?e/ت* Tc`Ɨؔh Z7cla3>_T1T>}_!kL?fv?'ܨgu=- &F·eρQGʙ;m> Cu@s33_ԫnM^+ ދ*%3:osPPfg{U-af/Z3<٪X0gE趔Ti3l{9Pb})NQGљea5NTUCx.QPc ӜXC]wCV33:Tԙ __n#Ú~{-vm-37v?ZeZaf\xwQbS3: ](m3?.CBA; AѦ>Vh3<&*"ękWu5UR&fa;X1ZfxehHQ@L-{ ETUٙy~^h;9iRLyvLWf:Q ]h*m}1QG131YXe Z'rcmGșp~-*ڬG3ꊊS3-(  }T M]sr 37>MuUQ&g΋[ll^fwJ ҃m4\~;U*:[E*EAbWgQLڝ mmmTUU --Q?CEETu3IũZfr8|J0b&f=5}iW6 UH3;gqQQ}fZ3ާuUV©KvҊQ1뻏5Z­γ =%I31oib(*#&g[j̙r$"kM9yRuifweЪb1&gUsv5E*h#aeboyQQǸJ0hf_gXik30u=dbIǹm;^BQhF1s33Om**URLp]_WXQ4| dt(jh&gγP(=MKEDzə߷Ҡʌhgu.B`3{횱V#2fwZ/;cU?$m(4ts3VԹ;lE h3]J&gw:14LiˠQЀhf7FsKSEGř~uuQQT7,ǬУLTG+-b&g?6-m>6Dl&f iщ3;ԭLk҅1Qx2Eh3;m[hgwK/ZحGG3=$V >ƒhg'PVѴg\5m(6˴t(EL_=Cch3v""" # Copyright (C) 2010 Loïc Séguin-C. # All rights reserved. # BSD license. 
import os

from nose.tools import *

import networkx as nx
from networkx.algorithms.flow import build_flow_dict, build_residual_network
from networkx.algorithms.flow import (edmonds_karp, preflow_push,
                                      shortest_augmenting_path)

flow_funcs = [edmonds_karp, preflow_push, shortest_augmenting_path]

msg = "Assertion failed in function: {0}"


def gen_pyramid(N):
    # This graph admits a flow of value 1 for which every arc is at
    # capacity (except the arcs incident to the sink which have
    # infinite capacity).
    G = nx.DiGraph()

    for i in range(N - 1):
        cap = 1. / (i + 2)
        for j in range(i + 1):
            G.add_edge((i, j), (i + 1, j), capacity = cap)
            cap = 1. / (i + 1) - cap
            G.add_edge((i, j), (i + 1, j + 1), capacity = cap)
            cap = 1. / (i + 2) - cap

    for j in range(N):
        G.add_edge((N - 1, j), 't')

    return G


def read_graph(name):
    dirname = os.path.dirname(__file__)
    path = os.path.join(dirname, name + '.gpickle.bz2')
    return nx.read_gpickle(path)


def validate_flows(G, s, t, soln_value, R, flow_func):
    flow_value = R.graph['flow_value']
    flow_dict = build_flow_dict(G, R)
    assert_equal(soln_value, flow_value, msg=msg.format(flow_func.__name__))
    assert_equal(set(G), set(flow_dict), msg=msg.format(flow_func.__name__))
    for u in G:
        assert_equal(set(G[u]), set(flow_dict[u]),
                     msg=msg.format(flow_func.__name__))
    excess = dict((u, 0) for u in flow_dict)
    for u in flow_dict:
        for v, flow in flow_dict[u].items():
            ok_(flow <= G[u][v].get('capacity', float('inf')),
                msg=msg.format(flow_func.__name__))
            ok_(flow >= 0, msg=msg.format(flow_func.__name__))
            excess[u] -= flow
            excess[v] += flow
    for u, exc in excess.items():
        if u == s:
            assert_equal(exc, -soln_value, msg=msg.format(flow_func.__name__))
        elif u == t:
            assert_equal(exc, soln_value, msg=msg.format(flow_func.__name__))
        else:
            assert_equal(exc, 0, msg=msg.format(flow_func.__name__))


class TestMaxflowLargeGraph:

    def test_complete_graph(self):
        N = 50
        G = nx.complete_graph(N)
        nx.set_edge_attributes(G, 'capacity', 5)
        R = build_residual_network(G, 'capacity')
        kwargs = dict(residual=R)

        for flow_func in flow_funcs:
            kwargs['flow_func'] = flow_func
            flow_value = nx.maximum_flow_value(G, 1, 2, **kwargs)
            assert_equal(flow_value, 5 * (N - 1),
                         msg=msg.format(flow_func.__name__))

    def test_pyramid(self):
        N = 10
        #N = 100 # this gives a graph with 5051 nodes
        G = gen_pyramid(N)
        R = build_residual_network(G, 'capacity')
        kwargs = dict(residual=R)

        for flow_func in flow_funcs:
            kwargs['flow_func'] = flow_func
            flow_value = nx.maximum_flow_value(G, (0, 0), 't', **kwargs)
            assert_almost_equal(flow_value, 1.,
                                msg=msg.format(flow_func.__name__))

    def test_gl1(self):
        G = read_graph('gl1')
        s = 1
        t = len(G)
        R = build_residual_network(G, 'capacity')
        kwargs = dict(residual=R)

        for flow_func in flow_funcs:
            validate_flows(G, s, t, 156545, flow_func(G, s, t, **kwargs),
                           flow_func)

    def test_gw1(self):
        G = read_graph('gw1')
        s = 1
        t = len(G)
        R = build_residual_network(G, 'capacity')
        kwargs = dict(residual=R)

        for flow_func in flow_funcs:
            validate_flows(G, s, t, 1202018, flow_func(G, s, t, **kwargs),
                           flow_func)

    def test_wlm3(self):
        G = read_graph('wlm3')
        s = 1
        t = len(G)
        R = build_residual_network(G, 'capacity')
        kwargs = dict(residual=R)

        for flow_func in flow_funcs:
            validate_flows(G, s, t, 11875108, flow_func(G, s, t, **kwargs),
                           flow_func)

    def test_preflow_push_global_relabel(self):
        G = read_graph('gw1')
        R = preflow_push(G, 1, len(G), global_relabel_freq=50)
        assert_equal(R.graph['flow_value'], 1202018)
networkx-1.11/networkx/algorithms/flow/tests/test_maxflow.py0000644000175000017500000004275112637544450024447 0ustar aricaric00000000000000# -*- coding: utf-8 -*-
"""Maximum flow algorithms test suite.
""" from nose.tools import * import networkx as nx from networkx.algorithms.flow import build_flow_dict, build_residual_network from networkx.algorithms.flow import edmonds_karp, preflow_push, shortest_augmenting_path flow_funcs = [edmonds_karp, preflow_push, shortest_augmenting_path] max_min_funcs = [nx.maximum_flow, nx.minimum_cut] flow_value_funcs = [nx.maximum_flow_value, nx.minimum_cut_value] interface_funcs = sum([max_min_funcs, flow_value_funcs], []) all_funcs = sum([flow_funcs, interface_funcs], []) msg = "Assertion failed in function: {0}" msgi = "Assertion failed in function: {0} in interface {1}" def compute_cutset(G, partition): reachable, non_reachable = partition cutset = set() for u, nbrs in ((n, G[n]) for n in reachable): cutset.update((u, v) for v in nbrs if v in non_reachable) return cutset def validate_flows(G, s, t, flowDict, solnValue, capacity, flow_func): assert_equal(set(G), set(flowDict), msg=msg.format(flow_func.__name__)) for u in G: assert_equal(set(G[u]), set(flowDict[u]), msg=msg.format(flow_func.__name__)) excess = dict((u, 0) for u in flowDict) for u in flowDict: for v, flow in flowDict[u].items(): if capacity in G[u][v]: ok_(flow <= G[u][v][capacity]) ok_(flow >= 0, msg=msg.format(flow_func.__name__)) excess[u] -= flow excess[v] += flow for u, exc in excess.items(): if u == s: assert_equal(exc, -solnValue, msg=msg.format(flow_func.__name__)) elif u == t: assert_equal(exc, solnValue, msg=msg.format(flow_func.__name__)) else: assert_equal(exc, 0, msg=msg.format(flow_func.__name__)) def validate_cuts(G, s, t, solnValue, partition, capacity, flow_func): assert_true(all(n in G for n in partition[0]), msg=msg.format(flow_func.__name__)) assert_true(all(n in G for n in partition[1]), msg=msg.format(flow_func.__name__)) cutset = compute_cutset(G, partition) assert_true(all(G.has_edge(u, v) for (u, v) in cutset), msg=msg.format(flow_func.__name__)) assert_equal(solnValue, sum(G[u][v][capacity] for (u, v) in cutset), 
msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_edges_from(cutset) if not G.is_directed(): assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) else: assert_false(nx.is_strongly_connected(H), msg=msg.format(flow_func.__name__)) def compare_flows_and_cuts(G, s, t, solnFlows, solnValue, capacity='capacity'): for flow_func in flow_funcs: R = flow_func(G, s, t, capacity) # Test both legacy and new implementations. flow_value = R.graph['flow_value'] flow_dict = build_flow_dict(G, R) assert_equal(flow_value, solnValue, msg=msg.format(flow_func.__name__)) validate_flows(G, s, t, flow_dict, solnValue, capacity, flow_func) # Minimum cut cut_value, partition = nx.minimum_cut(G, s, t, capacity=capacity, flow_func=flow_func) validate_cuts(G, s, t, solnValue, partition, capacity, flow_func) class TestMaxflowMinCutCommon: def test_graph1(self): # Trivial undirected graph G = nx.Graph() G.add_edge(1,2, capacity = 1.0) solnFlows = {1: {2: 1.0}, 2: {1: 1.0}} compare_flows_and_cuts(G, 1, 2, solnFlows, 1.0) def test_graph2(self): # A more complex undirected graph # adapted from www.topcoder.com/tc?module=Statc&d1=tutorials&d2=maxFlow G = nx.Graph() G.add_edge('x','a', capacity = 3.0) G.add_edge('x','b', capacity = 1.0) G.add_edge('a','c', capacity = 3.0) G.add_edge('b','c', capacity = 5.0) G.add_edge('b','d', capacity = 4.0) G.add_edge('d','e', capacity = 2.0) G.add_edge('c','y', capacity = 2.0) G.add_edge('e','y', capacity = 3.0) H = {'x': {'a': 3, 'b': 1}, 'a': {'c': 3, 'x': 3}, 'b': {'c': 1, 'd': 2, 'x': 1}, 'c': {'a': 3, 'b': 1, 'y': 2}, 'd': {'b': 2, 'e': 2}, 'e': {'d': 2, 'y': 2}, 'y': {'c': 2, 'e': 2}} compare_flows_and_cuts(G, 'x', 'y', H, 4.0) def test_digraph1(self): # The classic directed graph example G = nx.DiGraph() G.add_edge('a','b', capacity = 1000.0) G.add_edge('a','c', capacity = 1000.0) G.add_edge('b','c', capacity = 1.0) G.add_edge('b','d', capacity = 1000.0) G.add_edge('c','d', capacity = 1000.0) H = {'a': {'b': 1000.0, 'c': 1000.0}, 
'b': {'c': 0, 'd': 1000.0}, 'c': {'d': 1000.0}, 'd': {}} compare_flows_and_cuts(G, 'a', 'd', H, 2000.0) def test_digraph2(self): # An example in which some edges end up with zero flow. G = nx.DiGraph() G.add_edge('s', 'b', capacity = 2) G.add_edge('s', 'c', capacity = 1) G.add_edge('c', 'd', capacity = 1) G.add_edge('d', 'a', capacity = 1) G.add_edge('b', 'a', capacity = 2) G.add_edge('a', 't', capacity = 2) H = {'s': {'b': 2, 'c': 0}, 'c': {'d': 0}, 'd': {'a': 0}, 'b': {'a': 2}, 'a': {'t': 2}, 't': {}} compare_flows_and_cuts(G, 's', 't', H, 2) def test_digraph3(self): # A directed graph example from Cormen et al. G = nx.DiGraph() G.add_edge('s','v1', capacity = 16.0) G.add_edge('s','v2', capacity = 13.0) G.add_edge('v1','v2', capacity = 10.0) G.add_edge('v2','v1', capacity = 4.0) G.add_edge('v1','v3', capacity = 12.0) G.add_edge('v3','v2', capacity = 9.0) G.add_edge('v2','v4', capacity = 14.0) G.add_edge('v4','v3', capacity = 7.0) G.add_edge('v3','t', capacity = 20.0) G.add_edge('v4','t', capacity = 4.0) H = {'s': {'v1': 12.0, 'v2': 11.0}, 'v2': {'v1': 0, 'v4': 11.0}, 'v1': {'v2': 0, 'v3': 12.0}, 'v3': {'v2': 0, 't': 19.0}, 'v4': {'v3': 7.0, 't': 4.0}, 't': {}} compare_flows_and_cuts(G, 's', 't', H, 23.0) def test_digraph4(self): # A more complex directed graph # from www.topcoder.com/tc?module=Statc&d1=tutorials&d2=maxFlow G = nx.DiGraph() G.add_edge('x','a', capacity = 3.0) G.add_edge('x','b', capacity = 1.0) G.add_edge('a','c', capacity = 3.0) G.add_edge('b','c', capacity = 5.0) G.add_edge('b','d', capacity = 4.0) G.add_edge('d','e', capacity = 2.0) G.add_edge('c','y', capacity = 2.0) G.add_edge('e','y', capacity = 3.0) H = {'x': {'a': 2.0, 'b': 1.0}, 'a': {'c': 2.0}, 'b': {'c': 0, 'd': 1.0}, 'c': {'y': 2.0}, 'd': {'e': 1.0}, 'e': {'y': 1.0}, 'y': {}} compare_flows_and_cuts(G, 'x', 'y', H, 3.0) def test_optional_capacity(self): # Test optional capacity parameter. 
G = nx.DiGraph() G.add_edge('x','a', spam = 3.0) G.add_edge('x','b', spam = 1.0) G.add_edge('a','c', spam = 3.0) G.add_edge('b','c', spam = 5.0) G.add_edge('b','d', spam = 4.0) G.add_edge('d','e', spam = 2.0) G.add_edge('c','y', spam = 2.0) G.add_edge('e','y', spam = 3.0) solnFlows = {'x': {'a': 2.0, 'b': 1.0}, 'a': {'c': 2.0}, 'b': {'c': 0, 'd': 1.0}, 'c': {'y': 2.0}, 'd': {'e': 1.0}, 'e': {'y': 1.0}, 'y': {}} solnValue = 3.0 s = 'x' t = 'y' compare_flows_and_cuts(G, s, t, solnFlows, solnValue, capacity = 'spam') def test_digraph_infcap_edges(self): # DiGraph with infinite capacity edges G = nx.DiGraph() G.add_edge('s', 'a') G.add_edge('s', 'b', capacity = 30) G.add_edge('a', 'c', capacity = 25) G.add_edge('b', 'c', capacity = 12) G.add_edge('a', 't', capacity = 60) G.add_edge('c', 't') H = {'s': {'a': 85, 'b': 12}, 'a': {'c': 25, 't': 60}, 'b': {'c': 12}, 'c': {'t': 37}, 't': {}} compare_flows_and_cuts(G, 's', 't', H, 97) # DiGraph with infinite capacity digon G = nx.DiGraph() G.add_edge('s', 'a', capacity = 85) G.add_edge('s', 'b', capacity = 30) G.add_edge('a', 'c') G.add_edge('c', 'a') G.add_edge('b', 'c', capacity = 12) G.add_edge('a', 't', capacity = 60) G.add_edge('c', 't', capacity = 37) H = {'s': {'a': 85, 'b': 12}, 'a': {'c': 25, 't': 60}, 'c': {'a': 0, 't': 37}, 'b': {'c': 12}, 't': {}} compare_flows_and_cuts(G, 's', 't', H, 97) def test_digraph_infcap_path(self): # Graph with infinite capacity (s, t)-path G = nx.DiGraph() G.add_edge('s', 'a') G.add_edge('s', 'b', capacity = 30) G.add_edge('a', 'c') G.add_edge('b', 'c', capacity = 12) G.add_edge('a', 't', capacity = 60) G.add_edge('c', 't') for flow_func in all_funcs: assert_raises(nx.NetworkXUnbounded, flow_func, G, 's', 't') def test_graph_infcap_edges(self): # Undirected graph with infinite capacity edges G = nx.Graph() G.add_edge('s', 'a') G.add_edge('s', 'b', capacity = 30) G.add_edge('a', 'c', capacity = 25) G.add_edge('b', 'c', capacity = 12) G.add_edge('a', 't', capacity = 60) G.add_edge('c', 
't') H = {'s': {'a': 85, 'b': 12}, 'a': {'c': 25, 's': 85, 't': 60}, 'b': {'c': 12, 's': 12}, 'c': {'a': 25, 'b': 12, 't': 37}, 't': {'a': 60, 'c': 37}} compare_flows_and_cuts(G, 's', 't', H, 97) def test_digraph4(self): # From ticket #429 by mfrasca. G = nx.DiGraph() G.add_edge('s', 'a', capacity = 2) G.add_edge('s', 'b', capacity = 2) G.add_edge('a', 'b', capacity = 5) G.add_edge('a', 't', capacity = 1) G.add_edge('b', 'a', capacity = 1) G.add_edge('b', 't', capacity = 3) flowSoln = {'a': {'b': 1, 't': 1}, 'b': {'a': 0, 't': 3}, 's': {'a': 2, 'b': 2}, 't': {}} compare_flows_and_cuts(G, 's', 't', flowSoln, 4) def test_disconnected(self): G = nx.Graph() G.add_weighted_edges_from([(0,1,1),(1,2,1),(2,3,1)],weight='capacity') G.remove_node(1) assert_equal(nx.maximum_flow_value(G,0,3), 0) flowSoln = {0: {}, 2: {3: 0}, 3: {2: 0}} compare_flows_and_cuts(G, 0, 3, flowSoln, 0) def test_source_target_not_in_graph(self): G = nx.Graph() G.add_weighted_edges_from([(0,1,1),(1,2,1),(2,3,1)],weight='capacity') G.remove_node(0) for flow_func in all_funcs: assert_raises(nx.NetworkXError, flow_func, G, 0, 3) G.add_weighted_edges_from([(0,1,1),(1,2,1),(2,3,1)],weight='capacity') G.remove_node(3) for flow_func in all_funcs: assert_raises(nx.NetworkXError, flow_func, G, 0, 3) def test_source_target_coincide(self): G = nx.Graph() G.add_node(0) for flow_func in all_funcs: assert_raises(nx.NetworkXError, flow_func, G, 0, 0) def test_multigraphs_raise(self): G = nx.MultiGraph() M = nx.MultiDiGraph() G.add_edges_from([(0, 1), (1, 0)], capacity=True) for flow_func in all_funcs: assert_raises(nx.NetworkXError, flow_func, G, 0, 0) class TestMaxFlowMinCutInterface: def setup(self): G = nx.DiGraph() G.add_edge('x','a', capacity = 3.0) G.add_edge('x','b', capacity = 1.0) G.add_edge('a','c', capacity = 3.0) G.add_edge('b','c', capacity = 5.0) G.add_edge('b','d', capacity = 4.0) G.add_edge('d','e', capacity = 2.0) G.add_edge('c','y', capacity = 2.0) G.add_edge('e','y', capacity = 3.0) self.G = G H 
= nx.DiGraph() H.add_edge(0, 1, capacity = 1.0) H.add_edge(1, 2, capacity = 1.0) self.H = H def test_flow_func_not_callable(self): elements = ['this_should_be_callable', 10, set([1,2,3])] G = nx.Graph() G.add_weighted_edges_from([(0,1,1),(1,2,1),(2,3,1)], weight='capacity') for flow_func in interface_funcs: for element in elements: assert_raises(nx.NetworkXError, flow_func, G, 0, 1, flow_func=element) assert_raises(nx.NetworkXError, flow_func, G, 0, 1, flow_func=element) def test_flow_func_parameters(self): G = self.G fv = 3.0 for interface_func in interface_funcs: for flow_func in flow_funcs: result = interface_func(G, 'x', 'y', flow_func=flow_func) if interface_func in max_min_funcs: result = result[0] assert_equal(fv, result, msg=msgi.format(flow_func.__name__, interface_func.__name__)) def test_minimum_cut_no_cutoff(self): G = self.G for flow_func in flow_funcs: assert_raises(nx.NetworkXError, nx.minimum_cut, G, 'x', 'y', flow_func=flow_func, cutoff=1.0) assert_raises(nx.NetworkXError, nx.minimum_cut_value, G, 'x', 'y', flow_func=flow_func, cutoff=1.0) def test_kwargs(self): G = self.H fv = 1.0 to_test = ( (shortest_augmenting_path, dict(two_phase=True)), (preflow_push, dict(global_relabel_freq=5)), ) for interface_func in interface_funcs: for flow_func, kwargs in to_test: result = interface_func(G, 0, 2, flow_func=flow_func, **kwargs) if interface_func in max_min_funcs: result = result[0] assert_equal(fv, result, msg=msgi.format(flow_func.__name__, interface_func.__name__)) def test_kwargs_default_flow_func(self): G = self.H for interface_func in interface_funcs: assert_raises(nx.NetworkXError, interface_func, G, 0, 1, global_relabel_freq=2) def test_reusing_residual(self): G = self.G fv = 3.0 s, t = 'x', 'y' R = build_residual_network(G, 'capacity') for interface_func in interface_funcs: for flow_func in flow_funcs: for i in range(3): result = interface_func(G, 'x', 'y', flow_func=flow_func, residual=R) if interface_func in max_min_funcs: result = result[0] 
assert_equal(fv, result, msg=msgi.format(flow_func.__name__, interface_func.__name__)) # Tests specific to one algorithm def test_preflow_push_global_relabel_freq(): G = nx.DiGraph() G.add_edge(1, 2, capacity=1) R = preflow_push(G, 1, 2, global_relabel_freq=None) assert_equal(R.graph['flow_value'], 1) assert_raises(nx.NetworkXError, preflow_push, G, 1, 2, global_relabel_freq=-1) def test_preflow_push_makes_enough_space(): #From ticket #1542 G = nx.DiGraph() G.add_path([0, 1, 3], capacity=1) G.add_path([1, 2, 3], capacity=1) R = preflow_push(G, 0, 3, value_only=False) assert_equal(R.graph['flow_value'], 1) def test_shortest_augmenting_path_two_phase(): k = 5 p = 1000 G = nx.DiGraph() for i in range(k): G.add_edge('s', (i, 0), capacity=1) G.add_path(((i, j) for j in range(p)), capacity=1) G.add_edge((i, p - 1), 't', capacity=1) R = shortest_augmenting_path(G, 's', 't', two_phase=True) assert_equal(R.graph['flow_value'], k) R = shortest_augmenting_path(G, 's', 't', two_phase=False) assert_equal(R.graph['flow_value'], k) class TestCutoff: def test_cutoff(self): k = 5 p = 1000 G = nx.DiGraph() for i in range(k): G.add_edge('s', (i, 0), capacity=2) G.add_path(((i, j) for j in range(p)), capacity=2) G.add_edge((i, p - 1), 't', capacity=2) R = shortest_augmenting_path(G, 's', 't', two_phase=True, cutoff=k) ok_(k <= R.graph['flow_value'] <= 2 * k) R = shortest_augmenting_path(G, 's', 't', two_phase=False, cutoff=k) ok_(k <= R.graph['flow_value'] <= 2 * k) R = edmonds_karp(G, 's', 't', cutoff=k) ok_(k <= R.graph['flow_value'] <= 2 * k) def test_complete_graph_cutoff(self): G = nx.complete_graph(5) nx.set_edge_attributes(G, 'capacity', dict(((u, v), 1) for u, v in G.edges())) for flow_func in [shortest_augmenting_path, edmonds_karp]: for cutoff in [3, 2, 1]: result = nx.maximum_flow_value(G, 0, 4, flow_func=flow_func, cutoff=cutoff) assert_equal(cutoff, result, msg="cutoff error in {0}".format(flow_func.__name__)) 
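The interface these tests exercise — `nx.maximum_flow`, `nx.minimum_cut`, and the pluggable `flow_func` argument — can be sketched in a few lines. This is an illustrative standalone snippet, not part of the archive; the small graph and its capacities are made up for the example.

```python
# Minimal sketch of the NetworkX flow interface used by the tests above.
import networkx as nx
from networkx.algorithms.flow import shortest_augmenting_path

G = nx.DiGraph()
G.add_edge('s', 'a', capacity=3.0)
G.add_edge('s', 'b', capacity=1.0)
G.add_edge('a', 't', capacity=2.0)
G.add_edge('b', 't', capacity=3.0)

# Default algorithm; flow_dict[u][v] gives the flow on edge (u, v).
flow_value, flow_dict = nx.maximum_flow(G, 's', 't')

# Any flow function from networkx.algorithms.flow can be plugged in.
cut_value, (reachable, non_reachable) = nx.minimum_cut(
    G, 's', 't', flow_func=shortest_augmenting_path)

# By the max-flow min-cut theorem both values agree (3.0 here).
assert flow_value == cut_value == 3.0
```

Passing a prebuilt residual network via `residual=R`, as the large-graph tests do, avoids rebuilding it on every call when the same graph is solved repeatedly.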
networkx-1.11/networkx/algorithms/flow/networksimplex.py0000644000175000017500000005100112637544500023644 0ustar aricaric00000000000000# -*- coding: utf-8 -*-
"""
Minimum cost flow algorithms on directed connected graphs.
"""

__author__ = """Loïc Séguin-C. """
# Copyright (C) 2010 Loïc Séguin-C.
# All rights reserved.
# BSD license.

__all__ = ['network_simplex']

from itertools import chain, islice, repeat
from math import ceil, sqrt
import networkx as nx
from networkx.utils import not_implemented_for

try:
    from itertools import izip as zip
except ImportError:
    pass
try:
    range = xrange
except NameError:
    pass


@not_implemented_for('undirected')
def network_simplex(G, demand='demand', capacity='capacity', weight='weight'):
    r"""Find a minimum cost flow satisfying all demands in digraph G.

    This is a primal network simplex algorithm that uses the leaving
    arc rule to prevent cycling.

    G is a digraph with edge costs and capacities and in which nodes
    have demand, i.e., they want to send or receive some amount of
    flow. A negative demand means that the node wants to send flow, a
    positive demand means that the node wants to receive flow. A flow
    on the digraph G satisfies all demand if the net flow into each
    node is equal to the demand of that node.

    Parameters
    ----------
    G : NetworkX graph
        DiGraph on which a minimum cost flow satisfying all demands is
        to be found.

    demand : string
        Nodes of the graph G are expected to have an attribute demand
        that indicates how much flow a node wants to send (negative
        demand) or receive (positive demand). Note that the sum of the
        demands should be 0, otherwise the problem is not feasible. If
        this attribute is not present, a node is considered to have 0
        demand. Default value: 'demand'.

    capacity : string
        Edges of the graph G are expected to have an attribute capacity
        that indicates how much flow the edge can support. If this
        attribute is not present, the edge is considered to have
        infinite capacity. Default value: 'capacity'.

    weight : string
        Edges of the graph G are expected to have an attribute weight
        that indicates the cost incurred by sending one unit of flow on
        that edge. If not present, the weight is considered to be 0.
        Default value: 'weight'.

    Returns
    -------
    flowCost : integer, float
        Cost of a minimum cost flow satisfying all demands.

    flowDict : dictionary
        Dictionary of dictionaries keyed by nodes such that
        flowDict[u][v] is the flow on edge (u, v).

    Raises
    ------
    NetworkXError
        This exception is raised if the input graph is not directed,
        not connected, or is a multigraph.

    NetworkXUnfeasible
        This exception is raised in the following situations:

            * The sum of the demands is not zero. Then, there is no
              flow satisfying all demands.
            * There is no flow satisfying all demands.

    NetworkXUnbounded
        This exception is raised if the digraph G has a cycle of
        negative cost and infinite capacity. Then, the cost of a flow
        satisfying all demands is unbounded below.

    Notes
    -----
    This algorithm is not guaranteed to work if edge weights are
    floating point numbers (overflows and roundoff errors can cause
    problems).

    See also
    --------
    cost_of_flow, max_flow_min_cost, min_cost_flow, min_cost_flow_cost

    Examples
    --------
    A simple example of a min cost flow problem.

    >>> import networkx as nx
    >>> G = nx.DiGraph()
    >>> G.add_node('a', demand=-5)
    >>> G.add_node('d', demand=5)
    >>> G.add_edge('a', 'b', weight=3, capacity=4)
    >>> G.add_edge('a', 'c', weight=6, capacity=10)
    >>> G.add_edge('b', 'd', weight=1, capacity=9)
    >>> G.add_edge('c', 'd', weight=2, capacity=5)
    >>> flowCost, flowDict = nx.network_simplex(G)
    >>> flowCost
    24
    >>> flowDict # doctest: +SKIP
    {'a': {'c': 1, 'b': 4}, 'c': {'d': 1}, 'b': {'d': 4}, 'd': {}}

    The mincost flow algorithm can also be used to solve shortest path
    problems. To find the shortest path between two nodes u and v,
    give all edges an infinite capacity, give node u a demand of -1 and
    node v a demand of 1. Then run the network simplex. The value of a
    min cost flow will be the distance between u and v and edges
    carrying positive flow will indicate the path.

    >>> G=nx.DiGraph()
    >>> G.add_weighted_edges_from([('s', 'u' ,10), ('s' ,'x' ,5),
    ...                            ('u', 'v' ,1), ('u' ,'x' ,2),
    ...                            ('v', 'y' ,1), ('x' ,'u' ,3),
    ...                            ('x', 'v' ,5), ('x' ,'y' ,2),
    ...                            ('y', 's' ,7), ('y' ,'v' ,6)])
    >>> G.add_node('s', demand = -1)
    >>> G.add_node('v', demand = 1)
    >>> flowCost, flowDict = nx.network_simplex(G)
    >>> flowCost == nx.shortest_path_length(G, 's', 'v', weight='weight')
    True
    >>> sorted([(u, v) for u in flowDict for v in flowDict[u]
    ...         if flowDict[u][v] > 0])
    [('s', 'x'), ('u', 'v'), ('x', 'u')]
    >>> nx.shortest_path(G, 's', 'v', weight = 'weight')
    ['s', 'x', 'u', 'v']

    It is possible to change the name of the attributes used for the
    algorithm.

    >>> G = nx.DiGraph()
    >>> G.add_node('p', spam=-4)
    >>> G.add_node('q', spam=2)
    >>> G.add_node('a', spam=-2)
    >>> G.add_node('d', spam=-1)
    >>> G.add_node('t', spam=2)
    >>> G.add_node('w', spam=3)
    >>> G.add_edge('p', 'q', cost=7, vacancies=5)
    >>> G.add_edge('p', 'a', cost=1, vacancies=4)
    >>> G.add_edge('q', 'd', cost=2, vacancies=3)
    >>> G.add_edge('t', 'q', cost=1, vacancies=2)
    >>> G.add_edge('a', 't', cost=2, vacancies=4)
    >>> G.add_edge('d', 'w', cost=3, vacancies=4)
    >>> G.add_edge('t', 'w', cost=4, vacancies=1)
    >>> flowCost, flowDict = nx.network_simplex(G, demand='spam',
    ...                                         capacity='vacancies',
    ...                                         weight='cost')
    >>> flowCost
    37
    >>> flowDict # doctest: +SKIP
    {'a': {'t': 4}, 'd': {'w': 2}, 'q': {'d': 1}, 'p': {'q': 2, 'a': 2}, 't': {'q': 1, 'w': 1}, 'w': {}}

    References
    ----------
    .. [1] Z. Kiraly, P. Kovacs.
           Efficient implementation of minimum-cost flow algorithms.
           Acta Universitatis Sapientiae, Informatica 4(1):67--118. 2012.
    .. [2] R. Barr, F. Glover, D. Klingman.
           Enhancement of spanning tree labeling procedures for network
           optimization.
           INFOR 17(1):16--34. 1979.
""" ########################################################################### # Problem essentials extraction and sanity check ########################################################################### if len(G) == 0: raise nx.NetworkXError('graph has no nodes') # Number all nodes and edges and hereafter reference them using ONLY their # numbers N = list(G) # nodes I = {u: i for i, u in enumerate(N)} # node indices D = [G.node[u].get(demand, 0) for u in N] # node demands inf = float('inf') for p, b in zip(N, D): if abs(b) == inf: raise nx.NetworkXError('node %r has infinite demand' % (p,)) multigraph = G.is_multigraph() S = [] # edge sources T = [] # edge targets if multigraph: K = [] # edge keys E = {} # edge indices U = [] # edge capacities C = [] # edge weights if not multigraph: edges = G.edges_iter(data=True) else: edges = G.edges_iter(data=True, keys=True) edges = (e for e in edges if e[0] != e[1] and e[-1].get(capacity, inf) != 0) for i, e in enumerate(edges): S.append(I[e[0]]) T.append(I[e[1]]) if multigraph: K.append(e[2]) E[e[:-1]] = i U.append(e[-1].get(capacity, inf)) C.append(e[-1].get(weight, 0)) for e, c in zip(E, C): if abs(c) == inf: raise nx.NetworkXError('edge %r has infinite weight' % (e,)) if not multigraph: edges = G.selfloop_edges(data=True) else: edges = G.selfloop_edges(data=True, keys=True) for e in edges: if abs(e[-1].get(weight, 0)) == inf: raise nx.NetworkXError('edge %r has infinite weight' % (e[:-1],)) ########################################################################### # Quick infeasibility detection ########################################################################### if sum(D) != 0: raise nx.NetworkXUnfeasible('total node demand is not zero') for e, u in zip(E, U): if u < 0: raise nx.NetworkXUnfeasible('edge %r has negative capacity' % (e,)) if not multigraph: edges = G.selfloop_edges(data=True) else: edges = G.selfloop_edges(data=True, keys=True) for e in edges: if e[-1].get(capacity, inf) < 0: raise 
nx.NetworkXUnfeasible( 'edge %r has negative capacity' % (e[:-1],)) ########################################################################### # Initialization ########################################################################### # Add a dummy node -1 and connect all existing nodes to it with infinite- # capacity dummy edges. Node -1 will serve as the root of the # spanning tree of the network simplex method. The new edges will used to # trivially satisfy the node demands and create an initial strongly # feasible spanning tree. n = len(N) # number of nodes for p, d in enumerate(D): if d > 0: # Must be greater-than here. Zero-demand nodes must have # edges pointing towards the root to ensure strong # feasibility. S.append(-1) T.append(p) else: S.append(p) T.append(-1) faux_inf = 3 * max(chain([sum(u for u in U if u < inf), sum(abs(c) for c in C)], (abs(d) for d in D))) or 1 C.extend(repeat(faux_inf, n)) U.extend(repeat(faux_inf, n)) # Construct the initial spanning tree. e = len(E) # number of edges x = list(chain(repeat(0, e), (abs(d) for d in D))) # edge flows pi = [faux_inf if d <= 0 else -faux_inf for d in D] # node potentials parent = list(chain(repeat(-1, n), [None])) # parent nodes edge = list(range(e, e + n)) # edges to parents size = list(chain(repeat(1, n), [n + 1])) # subtree sizes next = list(chain(range(1, n), [-1, 0])) # next nodes in depth-first thread prev = list(range(-1, n)) # previous nodes in depth-first thread last = list(chain(range(n), [n - 1])) # last descendants in depth-first thread ########################################################################### # Pivot loop ########################################################################### def reduced_cost(i): """Return the reduced cost of an edge i. """ c = C[i] - pi[S[i]] + pi[T[i]] return c if x[i] == 0 else -c def find_entering_edges(): """Yield entering edges until none can be found. 
""" if e == 0: return # Entering edges are found by combining Dantzig's rule and Bland's # rule. The edges are cyclically grouped into blocks of size B. Within # each block, Dantzig's rule is applied to find an entering edge. The # blocks to search is determined following Bland's rule. B = int(ceil(sqrt(e))) # pivot block size M = (e + B - 1) // B # number of blocks needed to cover all edges m = 0 # number of consecutive blocks without eligible # entering edges f = 0 # first edge in block while m < M: # Determine the next block of edges. l = f + B if l <= e: edges = range(f, l) else: l -= e edges = chain(range(f, e), range(l)) f = l # Find the first edge with the lowest reduced cost. i = min(edges, key=reduced_cost) c = reduced_cost(i) if c >= 0: # No entering edge found in the current block. m += 1 else: # Entering edge found. if x[i] == 0: p = S[i] q = T[i] else: p = T[i] q = S[i] yield i, p, q m = 0 # All edges have nonnegative reduced costs. The current flow is # optimal. def find_apex(p, q): """Find the lowest common ancestor of nodes p and q in the spanning tree. """ size_p = size[p] size_q = size[q] while True: while size_p < size_q: p = parent[p] size_p = size[p] while size_p > size_q: q = parent[q] size_q = size[q] if size_p == size_q: if p != q: p = parent[p] size_p = size[p] q = parent[q] size_q = size[q] else: return p def trace_path(p, w): """Return the nodes and edges on the path from node p to its ancestor w. """ Wn = [p] We = [] while p != w: We.append(edge[p]) p = parent[p] Wn.append(p) return Wn, We def find_cycle(i, p, q): """Return the nodes and edges on the cycle containing edge i == (p, q) when the latter is added to the spanning tree. The cycle is oriented in the direction from p to q. 
""" w = find_apex(p, q) Wn, We = trace_path(p, w) Wn.reverse() We.reverse() We.append(i) WnR, WeR = trace_path(q, w) del WnR[-1] Wn += WnR We += WeR return Wn, We def residual_capacity(i, p): """Return the residual capacity of an edge i in the direction away from its endpoint p. """ return U[i] - x[i] if S[i] == p else x[i] def find_leaving_edge(Wn, We): """Return the leaving edge in a cycle represented by Wn and We. """ j, s = min(zip(reversed(We), reversed(Wn)), key=lambda i_p: residual_capacity(*i_p)) t = T[j] if S[j] == s else S[j] return j, s, t def augment_flow(Wn, We, f): """Augment f units of flow along a cycle represented by Wn and We. """ for i, p in zip(We, Wn): if S[i] == p: x[i] += f else: x[i] -= f def trace_subtree(p): """Yield the nodes in the subtree rooted at a node p. """ yield p l = last[p] while p != l: p = next[p] yield p def remove_edge(s, t): """Remove an edge (s, t) where parent[t] == s from the spanning tree. """ size_t = size[t] prev_t = prev[t] last_t = last[t] next_last_t = next[last_t] # Remove (s, t). parent[t] = None edge[t] = None # Remove the subtree rooted at t from the depth-first thread. next[prev_t] = next_last_t prev[next_last_t] = prev_t next[last_t] = t prev[t] = last_t # Update the subtree sizes and last descendants of the (old) acenstors # of t. while s is not None: size[s] -= size_t if last[s] == last_t: last[s] = prev_t s = parent[s] def make_root(q): """Make a node q the root of its containing subtree. """ ancestors = [] while q is not None: ancestors.append(q) q = parent[q] ancestors.reverse() for p, q in zip(ancestors, islice(ancestors, 1, None)): size_p = size[p] last_p = last[p] prev_q = prev[q] last_q = last[q] next_last_q = next[last_q] # Make p a child of q. parent[p] = q parent[q] = None edge[p] = edge[q] edge[q] = None size[p] = size_p - size[q] size[q] = size_p # Remove the subtree rooted at q from the depth-first thread. 
next[prev_q] = next_last_q prev[next_last_q] = prev_q next[last_q] = q prev[q] = last_q if last_p == last_q: last[p] = prev_q last_p = prev_q # Add the remaining parts of the subtree rooted at p as a subtree # of q in the depth-first thread. prev[p] = last_q next[last_q] = p next[last_p] = q prev[q] = last_p last[q] = last_p def add_edge(i, p, q): """Add an edge (p, q) to the spanning tree where q is the root of a subtree. """ last_p = last[p] next_last_p = next[last_p] size_q = size[q] last_q = last[q] # Make q a child of p. parent[q] = p edge[q] = i # Insert the subtree rooted at q into the depth-first thread. next[last_p] = q prev[q] = last_p prev[next_last_p] = last_q next[last_q] = next_last_p # Update the subtree sizes and last descendants of the (new) ancestors # of q. while p is not None: size[p] += size_q if last[p] == last_p: last[p] = last_q p = parent[p] def update_potentials(i, p, q): """Update the potentials of the nodes in the subtree rooted at a node q connected to its parent p by an edge i. """ if q == T[i]: d = pi[p] - C[i] - pi[q] else: d = pi[p] + C[i] - pi[q] for q in trace_subtree(q): pi[q] += d # Pivot loop for i, p, q in find_entering_edges(): Wn, We = find_cycle(i, p, q) j, s, t = find_leaving_edge(Wn, We) augment_flow(Wn, We, residual_capacity(j, s)) if i != j: # Do nothing more if the entering edge is the same as the # the leaving edge. if parent[t] != s: # Ensure that s is the parent of t. s, t = t, s if We.index(i) > We.index(j): # Ensure that q is in the subtree rooted at t. 
                p, q = q, p
            remove_edge(s, t)
            make_root(q)
            add_edge(i, p, q)
            update_potentials(i, p, q)

    ###########################################################################
    # Infeasibility and unboundedness detection
    ###########################################################################

    if any(x[i] != 0 for i in range(-n, 0)):
        raise nx.NetworkXUnfeasible('no flow satisfies all node demands')

    if (any(x[i] * 2 >= faux_inf for i in range(e)) or
        any(e[-1].get(capacity, inf) == inf and e[-1].get(weight, 0) < 0
            for e in G.selfloop_edges(data=True))):
        raise nx.NetworkXUnbounded(
            'negative cycle with infinite capacity found')

    ###########################################################################
    # Flow cost calculation and flow dict construction
    ###########################################################################

    del x[e:]
    flow_cost = sum(c * x for c, x in zip(C, x))
    flow_dict = {n: {} for n in N}

    def add_entry(e):
        """Add a flow dict entry.
        """
        d = flow_dict[e[0]]
        for k in e[1:-2]:
            try:
                d = d[k]
            except KeyError:
                t = {}
                d[k] = t
                d = t
        d[e[-2]] = e[-1]

    S = (N[s] for s in S)  # Use original nodes.
    T = (N[t] for t in T)  # Use original nodes.
    if not multigraph:
        for e in zip(S, T, x):
            add_entry(e)
        edges = G.edges_iter(data=True)
    else:
        for e in zip(S, T, K, x):
            add_entry(e)
        edges = G.edges_iter(data=True, keys=True)
    for e in edges:
        if e[0] != e[1]:
            if e[-1].get(capacity, inf) == 0:
                add_entry(e[:-1] + (0,))
            else:
                c = e[-1].get(weight, 0)
                if c >= 0:
                    add_entry(e[:-1] + (0,))
                else:
                    u = e[-1][capacity]
                    flow_cost += c * u
                    add_entry(e[:-1] + (u,))

    return flow_cost, flow_dict
networkx-1.11/networkx/algorithms/isomorphism/0000755000175000017500000000000012653231454021603 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/isomorphism/matchhelpers.py0000644000175000017500000003167012637544500024644 0ustar aricaric00000000000000"""Functions which help end users define customized node_match and
edge_match functions to use during isomorphism checks.
""" from itertools import permutations import types import networkx as nx __all__ = ['categorical_node_match', 'categorical_edge_match', 'categorical_multiedge_match', 'numerical_node_match', 'numerical_edge_match', 'numerical_multiedge_match', 'generic_node_match', 'generic_edge_match', 'generic_multiedge_match', ] def copyfunc(f, name=None): """Returns a deepcopy of a function.""" try: # Python <3 return types.FunctionType(f.func_code, f.func_globals, name or f.__name__, f.func_defaults, f.func_closure) except AttributeError: # Python >=3 return types.FunctionType(f.__code__, f.__globals__, name or f.__name__, f.__defaults__, f.__closure__) def allclose(x, y, rtol=1.0000000000000001e-05, atol=1e-08): """Returns True if x and y are sufficiently close, elementwise. Parameters ---------- rtol : float The relative error tolerance. atol : float The absolute error tolerance. """ # assume finite weights, see numpy.allclose() for reference for xi, yi in zip(x,y): if not ( abs(xi-yi) <= atol + rtol * abs(yi) ): return False return True def close(x, y, rtol=1.0000000000000001e-05, atol=1e-08): """Returns True if x and y are sufficiently close. Parameters ---------- rtol : float The relative error tolerance. atol : float The absolute error tolerance. """ # assume finite weights, see numpy.allclose() for reference return abs(x-y) <= atol + rtol * abs(y) categorical_doc = """ Returns a comparison function for a categorical node attribute. The value(s) of the attr(s) must be hashable and comparable via the == operator since they are placed into a set([]) object. If the sets from G1 and G2 are the same, then the constructed function returns True. Parameters ---------- attr : string | list The categorical node attribute to compare, or a list of categorical node attributes to compare. default : value | list The default value for the categorical node attribute, or a list of default values for the categorical node attributes. 
Returns ------- match : function The customized, categorical `node_match` function. Examples -------- >>> import networkx.algorithms.isomorphism as iso >>> nm = iso.categorical_node_match('size', 1) >>> nm = iso.categorical_node_match(['color', 'size'], ['red', 2]) """ def categorical_node_match(attr, default): if nx.utils.is_string_like(attr): def match(data1, data2): return data1.get(attr, default) == data2.get(attr, default) else: attrs = list(zip(attr, default)) # Python 3 def match(data1, data2): values1 = set([data1.get(attr, d) for attr, d in attrs]) values2 = set([data2.get(attr, d) for attr, d in attrs]) return values1 == values2 return match try: categorical_edge_match = copyfunc(categorical_node_match, 'categorical_edge_match') except NotImplementedError: # IronPython lacks support for types.FunctionType. # https://github.com/networkx/networkx/issues/949 # https://github.com/networkx/networkx/issues/1127 def categorical_edge_match(*args, **kwargs): return categorical_node_match(*args, **kwargs) def categorical_multiedge_match(attr, default): if nx.utils.is_string_like(attr): def match(datasets1, datasets2): values1 = set([data.get(attr, default) for data in datasets1.values()]) values2 = set([data.get(attr, default) for data in datasets2.values()]) return values1 == values2 else: attrs = list(zip(attr, default)) # Python 3 def match(datasets1, datasets2): values1 = set([]) for data1 in datasets1.values(): x = tuple( data1.get(attr, d) for attr, d in attrs ) values1.add(x) values2 = set([]) for data2 in datasets2.values(): x = tuple( data2.get(attr, d) for attr, d in attrs ) values2.add(x) return values1 == values2 return match # Docstrings for categorical functions. 
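The categorical factories above are consumed via the `node_match`/`edge_match` hooks of the matchers. The following is a minimal, self-contained sketch of typical usage; the example graphs and the `'black'` fallback color are illustrative assumptions, not part of this module.

```python
# A small demonstration of the categorical matchers defined above.
# Two one-edge graphs are isomorphic as bare graphs, but once node
# colors must agree, only the consistently colored pair matches.
import networkx as nx
import networkx.algorithms.isomorphism as iso

G1 = nx.Graph()
G1.add_nodes_from([(0, {'color': 'red'}), (1, {'color': 'blue'})])
G1.add_edge(0, 1)

G2 = nx.Graph()
G2.add_nodes_from([(0, {'color': 'blue'}), (1, {'color': 'red'})])
G2.add_edge(0, 1)

G3 = nx.Graph()
G3.add_nodes_from([(0, {'color': 'red'}), (1, {'color': 'green'})])
G3.add_edge(0, 1)

# Nodes lacking a 'color' attribute fall back to the default 'black'.
nm = iso.categorical_node_match('color', 'black')
print(nx.is_isomorphic(G1, G2, node_match=nm))  # True: {red, blue} pair up
print(nx.is_isomorphic(G1, G3, node_match=nm))  # False: green is unmatched
```

Note that without `node_match`, all three graphs are pairwise isomorphic; the attribute comparison is what distinguishes them.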
categorical_node_match.__doc__ = categorical_doc categorical_edge_match.__doc__ = categorical_doc.replace('node', 'edge') tmpdoc = categorical_doc.replace('node', 'edge') tmpdoc = tmpdoc.replace('categorical_edge_match', 'categorical_multiedge_match') categorical_multiedge_match.__doc__ = tmpdoc numerical_doc = """ Returns a comparison function for a numerical node attribute. The value(s) of the attr(s) must be numerical and sortable. If the sorted list of values from G1 and G2 are the same within some tolerance, then the constructed function returns True. Parameters ---------- attr : string | list The numerical node attribute to compare, or a list of numerical node attributes to compare. default : value | list The default value for the numerical node attribute, or a list of default values for the numerical node attributes. rtol : float The relative error tolerance. atol : float The absolute error tolerance. Returns ------- match : function The customized, numerical `node_match` function. Examples -------- >>> import networkx.algorithms.isomorphism as iso >>> nm = iso.numerical_node_match('weight', 1.0) >>> nm = iso.numerical_node_match(['weight', 'linewidth'], [.25, .5]) """ def numerical_node_match(attr, default, rtol=1.0000000000000001e-05, atol=1e-08): if nx.utils.is_string_like(attr): def match(data1, data2): return close(data1.get(attr, default), data2.get(attr, default), rtol=rtol, atol=atol) else: attrs = list(zip(attr, default)) # Python 3 def match(data1, data2): values1 = [data1.get(attr, d) for attr, d in attrs] values2 = [data2.get(attr, d) for attr, d in attrs] return allclose(values1, values2, rtol=rtol, atol=atol) return match try: numerical_edge_match = copyfunc(numerical_node_match, 'numerical_edge_match') except NotImplementedError: # IronPython lacks support for types.FunctionType. 
# https://github.com/networkx/networkx/issues/949 # https://github.com/networkx/networkx/issues/1127 def numerical_edge_match(*args, **kwargs): return numerical_node_match(*args, **kwargs) def numerical_multiedge_match(attr, default, rtol=1.0000000000000001e-05, atol=1e-08): if nx.utils.is_string_like(attr): def match(datasets1, datasets2): values1 = sorted([data.get(attr, default) for data in datasets1.values()]) values2 = sorted([data.get(attr, default) for data in datasets2.values()]) return allclose(values1, values2, rtol=rtol, atol=atol) else: attrs = list(zip(attr, default)) # Python 3 def match(datasets1, datasets2): values1 = [] for data1 in datasets1.values(): x = tuple( data1.get(attr, d) for attr, d in attrs ) values1.append(x) values2 = [] for data2 in datasets2.values(): x = tuple( data2.get(attr, d) for attr, d in attrs ) values2.append(x) values1.sort() values2.sort() for xi, yi in zip(values1, values2): if not allclose(xi, yi, rtol=rtol, atol=atol): return False else: return True return match # Docstrings for numerical functions. numerical_node_match.__doc__ = numerical_doc numerical_edge_match.__doc__ = numerical_doc.replace('node', 'edge') tmpdoc = numerical_doc.replace('node', 'edge') tmpdoc = tmpdoc.replace('numerical_edge_match', 'numerical_multiedge_match') numerical_multiedge_match.__doc__ = tmpdoc generic_doc = """ Returns a comparison function for a generic attribute. The value(s) of the attr(s) are compared using the specified operators. If all the attributes are equal, then the constructed function returns True. Parameters ---------- attr : string | list The node attribute to compare, or a list of node attributes to compare. default : value | list The default value for the node attribute, or a list of default values for the node attributes. op : callable | list The operator to use when comparing attribute values, or a list of operators to use when comparing values for each attribute. 
Returns
-------
match : function
    The customized, generic `node_match` function.

Examples
--------
>>> from operator import eq
>>> from networkx.algorithms.isomorphism.matchhelpers import close
>>> from networkx.algorithms.isomorphism import generic_node_match
>>> nm = generic_node_match('weight', 1.0, close)
>>> nm = generic_node_match('color', 'red', eq)
>>> nm = generic_node_match(['weight', 'color'], [1.0, 'red'], [close, eq])

"""

def generic_node_match(attr, default, op):
    if nx.utils.is_string_like(attr):
        def match(data1, data2):
            return op(data1.get(attr, default), data2.get(attr, default))
    else:
        attrs = list(zip(attr, default, op))  # Python 3

        def match(data1, data2):
            for attr, d, operator in attrs:
                if not operator(data1.get(attr, d), data2.get(attr, d)):
                    return False
            else:
                return True
    return match

try:
    generic_edge_match = copyfunc(generic_node_match, 'generic_edge_match')
except NotImplementedError:
    # IronPython lacks support for types.FunctionType.
    # https://github.com/networkx/networkx/issues/949
    # https://github.com/networkx/networkx/issues/1127
    def generic_edge_match(*args, **kwargs):
        return generic_node_match(*args, **kwargs)

def generic_multiedge_match(attr, default, op):
    """Returns a comparison function for a generic attribute.

    The value(s) of the attr(s) are compared using the specified operators.
    If all the attributes are equal, then the constructed function returns
    True.

    Potentially, the constructed edge_match function can be slow since it
    must verify that no isomorphism exists between the multiedges before
    it returns False.

    Parameters
    ----------
    attr : string | list
        The edge attribute to compare, or a list of edge attributes to
        compare.
    default : value | list
        The default value for the edge attribute, or a list of default
        values for the edge attributes.
    op : callable | list
        The operator to use when comparing attribute values, or a list of
        operators to use when comparing values for each attribute.
    Returns
    -------
    match : function
        The customized, generic `edge_match` function.

    Examples
    --------
    >>> from operator import eq
    >>> from networkx.algorithms.isomorphism.matchhelpers import close
    >>> from networkx.algorithms.isomorphism import generic_multiedge_match
    >>> nm = generic_multiedge_match('weight', 1.0, close)
    >>> nm = generic_multiedge_match('color', 'red', eq)
    >>> nm = generic_multiedge_match(['weight', 'color'],
    ...                              [1.0, 'red'],
    ...                              [close, eq])
    ...
    """
    # This is slow, but generic.
    # We must test every possible isomorphism between the edges.
    if nx.utils.is_string_like(attr):
        def match(datasets1, datasets2):
            values1 = [data.get(attr, default) for data in datasets1.values()]
            values2 = [data.get(attr, default) for data in datasets2.values()]
            for vals2 in permutations(values2):
                for xi, yi in zip(values1, vals2):
                    if not op(xi, yi):
                        # This is not an isomorphism, go to next permutation.
                        break
                else:
                    # Then we found an isomorphism.
                    return True
            else:
                # Then there are no isomorphisms between the multiedges.
                return False
    else:
        attrs = list(zip(attr, default))  # Python 3

        def match(datasets1, datasets2):
            values1 = []
            for data1 in datasets1.values():
                x = tuple(data1.get(attr, d) for attr, d in attrs)
                values1.append(x)
            values2 = []
            for data2 in datasets2.values():
                x = tuple(data2.get(attr, d) for attr, d in attrs)
                values2.append(x)
            for vals2 in permutations(values2):
                for xi, yi in zip(values1, vals2):
                    # Compare each attribute of this edge pair with its own
                    # operator; on a mismatch, try the next permutation
                    # rather than rejecting the multiedges outright.
                    if not all(operator(x, y)
                               for operator, x, y in zip(op, xi, yi)):
                        break
                else:
                    # Then we found an isomorphism.
                    return True
            else:
                # Then there are no isomorphisms between the multiedges.
                return False
    return match

# Docstrings for generic functions.
generic_node_match.__doc__ = generic_doc
generic_edge_match.__doc__ = generic_doc.replace('node', 'edge')
networkx-1.11/networkx/algorithms/isomorphism/isomorphvf2.py0000644000175000017500000010767212637544500024441 0ustar aricaric00000000000000# -*- coding: utf-8 -*-
"""
*************
VF2 Algorithm
*************

An implementation of the VF2 algorithm for graph isomorphism testing.

The simplest interface to use this module is to call networkx.is_isomorphic().
Introduction
------------

The GraphMatcher and DiGraphMatcher are responsible for matching
graphs or directed graphs in a predetermined manner.  This
usually means a check for an isomorphism, though other checks
are also possible.  For example, a subgraph of one graph
can be checked for isomorphism to a second graph.

Matching is done via syntactic feasibility. It is also possible
to check for semantic feasibility. Feasibility, then, is defined
as the logical AND of the two functions.

To include a semantic check, the (Di)GraphMatcher class should be
subclassed, and the semantic_feasibility() function should be
redefined.  By default, the semantic feasibility function always
returns True.  The effect of this is that semantics are not
considered in the matching of G1 and G2.

Examples
--------

Suppose G1 and G2 are isomorphic graphs. Verification is as follows:

>>> from networkx.algorithms import isomorphism
>>> G1 = nx.path_graph(4)
>>> G2 = nx.path_graph(4)
>>> GM = isomorphism.GraphMatcher(G1,G2)
>>> GM.is_isomorphic()
True

GM.mapping stores the isomorphism mapping from G1 to G2.

>>> GM.mapping
{0: 0, 1: 1, 2: 2, 3: 3}

Suppose G1 and G2 are isomorphic directed graphs.
Verification is as follows:

>>> G1 = nx.path_graph(4, create_using=nx.DiGraph())
>>> G2 = nx.path_graph(4, create_using=nx.DiGraph())
>>> DiGM = isomorphism.DiGraphMatcher(G1,G2)
>>> DiGM.is_isomorphic()
True

DiGM.mapping stores the isomorphism mapping from G1 to G2.

>>> DiGM.mapping
{0: 0, 1: 1, 2: 2, 3: 3}

Subgraph Isomorphism
--------------------
Graph theory literature can be ambiguous about the meaning of the
above statement, and we seek to clarify it now.

In the VF2 literature, a mapping M is said to be a graph-subgraph
isomorphism iff M is an isomorphism between G2 and a subgraph of G1.
Thus, to say that G1 and G2 are graph-subgraph isomorphic is to say
that a subgraph of G1 is isomorphic to G2.
Other literature uses the phrase 'subgraph isomorphic' as in 'G1 does
not have a subgraph isomorphic to G2'.  Another use is as an adverb
for isomorphic.  Thus, to say that G1 and G2 are subgraph isomorphic
is to say that a subgraph of G1 is isomorphic to G2.

Finally, the term 'subgraph' can have multiple meanings. In this
context, 'subgraph' always means a 'node-induced subgraph'. Edge-induced
subgraph isomorphisms are not directly supported, but one should be
able to perform the check by making use of nx.line_graph(). For
subgraphs which are not induced, the term 'monomorphism' is preferred
over 'isomorphism'. Currently, it is not possible to check for
monomorphisms.

Let G=(N,E) be a graph with a set of nodes N and set of edges E.

If G'=(N',E') is a subgraph, then:
    N' is a subset of N
    E' is a subset of E

If G'=(N',E') is a node-induced subgraph, then:
    N' is a subset of N
    E' is the subset of edges in E relating nodes in N'

If G'=(N',E') is an edge-induced subgraph, then:
    N' is the subset of nodes in N related by edges in E'
    E' is a subset of E

References
----------
[1]  Luigi P. Cordella, Pasquale Foggia, Carlo Sansone, Mario Vento,
     "A (Sub)Graph Isomorphism Algorithm for Matching Large Graphs",
     IEEE Transactions on Pattern Analysis and Machine Intelligence,
     vol. 26, no. 10, pp. 1367-1372, Oct., 2004.
     http://ieeexplore.ieee.org/iel5/34/29305/01323804.pdf

[2]  L. P. Cordella, P. Foggia, C. Sansone, M. Vento, "An Improved
     Algorithm for Matching Large Graphs", 3rd IAPR-TC15 Workshop
     on Graph-based Representations in Pattern Recognition, Cuen,
     pp. 149-159, 2001.
     http://amalfi.dis.unina.it/graph/db/papers/vf-algorithm.pdf

See Also
--------
syntactic_feasibility(), semantic_feasibility()

Notes
-----
Modified to handle undirected graphs.
Modified to handle multiple edges.

In general, this problem is NP-Complete.

"""

#    Copyright (C) 2007-2009 by the NetworkX maintainers
#    All rights reserved.
#    BSD license.
# This work was originally coded by Christopher Ellison # as part of the Computational Mechanics Python (CMPy) project. # James P. Crutchfield, principal investigator. # Complexity Sciences Center and Physics Department, UC Davis. import sys import networkx as nx __all__ = ['GraphMatcher', 'DiGraphMatcher'] class GraphMatcher(object): """Implementation of VF2 algorithm for matching undirected graphs. Suitable for Graph and MultiGraph instances. """ def __init__(self, G1, G2): """Initialize GraphMatcher. Parameters ---------- G1,G2: NetworkX Graph or MultiGraph instances. The two graphs to check for isomorphism. Examples -------- To create a GraphMatcher which checks for syntactic feasibility: >>> from networkx.algorithms import isomorphism >>> G1 = nx.path_graph(4) >>> G2 = nx.path_graph(4) >>> GM = isomorphism.GraphMatcher(G1,G2) """ self.G1 = G1 self.G2 = G2 self.G1_nodes = set(G1.nodes()) self.G2_nodes = set(G2.nodes()) # Set recursion limit. self.old_recursion_limit = sys.getrecursionlimit() expected_max_recursion_level = len(self.G2) if self.old_recursion_limit < 1.5 * expected_max_recursion_level: # Give some breathing room. sys.setrecursionlimit(int(1.5 * expected_max_recursion_level)) # Declare that we will be searching for a graph-graph isomorphism. self.test = 'graph' # Initialize state self.initialize() def reset_recursion_limit(self): """Restores the recursion limit.""" ### TODO: ### Currently, we use recursion and set the recursion level higher. ### It would be nice to restore the level, but because the ### (Di)GraphMatcher classes make use of cyclic references, garbage ### collection will never happen when we define __del__() to ### restore the recursion level. The result is a memory leak. ### So for now, we do not automatically restore the recursion level, ### and instead provide a method to do this manually. Eventually, ### we should turn this into a non-recursive implementation. 
sys.setrecursionlimit(self.old_recursion_limit) def candidate_pairs_iter(self): """Iterator over candidate pairs of nodes in G1 and G2.""" # All computations are done using the current state! G1_nodes = self.G1_nodes G2_nodes = self.G2_nodes # First we compute the inout-terminal sets. T1_inout = [node for node in G1_nodes if (node in self.inout_1) and (node not in self.core_1)] T2_inout = [node for node in G2_nodes if (node in self.inout_2) and (node not in self.core_2)] # If T1_inout and T2_inout are both nonempty. # P(s) = T1_inout x {min T2_inout} if T1_inout and T2_inout: for node in T1_inout: yield node, min(T2_inout) else: # If T1_inout and T2_inout were both empty.... # P(s) = (N_1 - M_1) x {min (N_2 - M_2)} ##if not (T1_inout or T2_inout): # as suggested by [2], incorrect if 1: # as inferred from [1], correct # First we determine the candidate node for G2 other_node = min(G2_nodes - set(self.core_2)) for node in self.G1: if node not in self.core_1: yield node, other_node # For all other cases, we don't have any candidate pairs. def initialize(self): """Reinitializes the state of the algorithm. This method should be redefined if using something other than GMState. If only subclassing GraphMatcher, a redefinition is not necessary. """ # core_1[n] contains the index of the node paired with n, which is m, # provided n is in the mapping. # core_2[m] contains the index of the node paired with m, which is n, # provided m is in the mapping. self.core_1 = {} self.core_2 = {} # See the paper for definitions of M_x and T_x^{y} # inout_1[n] is non-zero if n is in M_1 or in T_1^{inout} # inout_2[m] is non-zero if m is in M_2 or in T_2^{inout} # # The value stored is the depth of the SSR tree when the node became # part of the corresponding set. self.inout_1 = {} self.inout_2 = {} # Practically, these sets simply store the nodes in the subgraph. self.state = GMState(self) # Provide a convienient way to access the isomorphism mapping. 
        self.mapping = self.core_1.copy()

    def is_isomorphic(self):
        """Returns True if G1 and G2 are isomorphic graphs."""

        # Let's do two very quick checks!
        # QUESTION: Should we call faster_graph_could_be_isomorphic(G1,G2)?
        # For now, I just copy the code.

        # Check global properties
        if self.G1.order() != self.G2.order():
            return False

        # Check local properties
        d1 = sorted(self.G1.degree().values())
        d2 = sorted(self.G2.degree().values())
        if d1 != d2:
            return False

        try:
            x = next(self.isomorphisms_iter())
            return True
        except StopIteration:
            return False

    def isomorphisms_iter(self):
        """Generator over isomorphisms between G1 and G2."""
        # Declare that we are looking for a graph-graph isomorphism.
        self.test = 'graph'
        self.initialize()
        for mapping in self.match():
            yield mapping

    def match(self):
        """Extends the isomorphism mapping.

        This function is called recursively to determine if a complete
        isomorphism can be found between G1 and G2.  It cleans up the class
        variables after each recursive call. If an isomorphism is found,
        we yield the mapping.

        """
        if len(self.core_1) == len(self.G2):
            # Save the final mapping, otherwise garbage collection deletes it.
            self.mapping = self.core_1.copy()
            # The mapping is complete.
            yield self.mapping
        else:
            for G1_node, G2_node in self.candidate_pairs_iter():
                if self.syntactic_feasibility(G1_node, G2_node):
                    if self.semantic_feasibility(G1_node, G2_node):
                        # Recursive call, adding the feasible state.
                        newstate = self.state.__class__(self, G1_node, G2_node)
                        for mapping in self.match():
                            yield mapping
                        # restore data structures
                        newstate.restore()

    def semantic_feasibility(self, G1_node, G2_node):
        """Returns True if adding (G1_node, G2_node) is semantically feasible.

        The semantic feasibility function should return True if it is
        acceptable to add the candidate pair (G1_node, G2_node) to the current
        partial isomorphism mapping.  The logic should focus on semantic
        information contained in the edge data or a formalized node class.
        By acceptable, we mean that the subsequent mapping can still become a
        complete isomorphism mapping.  Thus, if adding the candidate pair
        definitely makes it so that the subsequent mapping cannot become a
        complete isomorphism mapping, then this function must return False.

        The default semantic feasibility function always returns True. The
        effect is that semantics are not considered in the matching of G1
        and G2.

        The semantic checks might differ based on what type of test is
        being performed.  A keyword description of the test is stored in
        self.test.  Here is a quick description of the currently implemented
        tests::

          test='graph'
            Indicates that the graph matcher is looking for a graph-graph
            isomorphism.

          test='subgraph'
            Indicates that the graph matcher is looking for a subgraph-graph
            isomorphism such that a subgraph of G1 is isomorphic to G2.

        Any subclass which redefines semantic_feasibility() must maintain
        the above form to keep the match() method functional. Implementations
        should consider multigraphs.
        """
        return True

    def subgraph_is_isomorphic(self):
        """Returns True if a subgraph of G1 is isomorphic to G2."""
        try:
            x = next(self.subgraph_isomorphisms_iter())
            return True
        except StopIteration:
            return False

    # subgraph_is_isomorphic.__doc__ += "\n" + subgraph.replace('\n','\n'+indent)

    def subgraph_isomorphisms_iter(self):
        """Generator over isomorphisms between a subgraph of G1 and G2."""
        # Declare that we are looking for graph-subgraph isomorphism.
        self.test = 'subgraph'
        self.initialize()
        for mapping in self.match():
            yield mapping

    # subgraph_isomorphisms_iter.__doc__ += "\n" + subgraph.replace('\n','\n'+indent)

    def syntactic_feasibility(self, G1_node, G2_node):
        """Returns True if adding (G1_node, G2_node) is syntactically feasible.

        This function returns True if adding the candidate pair to the current
        partial isomorphism mapping is allowable.  The addition is allowable
        if the inclusion of the candidate pair does not make it impossible
        for an isomorphism to be found.
""" # The VF2 algorithm was designed to work with graphs having, at most, # one edge connecting any two nodes. This is not the case when # dealing with an MultiGraphs. # # Basically, when we test the look-ahead rules R_neighbor, we will # make sure that the number of edges are checked. We also add # a R_self check to verify that the number of selfloops is acceptable. # # Users might be comparing Graph instances with MultiGraph instances. # So the generic GraphMatcher class must work with MultiGraphs. # Care must be taken since the value in the innermost dictionary is a # singlet for Graph instances. For MultiGraphs, the value in the # innermost dictionary is a list. ### ### Test at each step to get a return value as soon as possible. ### ### Look ahead 0 # R_self # The number of selfloops for G1_node must equal the number of # self-loops for G2_node. Without this check, we would fail on # R_neighbor at the next recursion level. But it is good to prune the # search tree now. if self.G1.number_of_edges(G1_node,G1_node) != self.G2.number_of_edges(G2_node,G2_node): return False # R_neighbor # For each neighbor n' of n in the partial mapping, the corresponding # node m' is a neighbor of m, and vice versa. Also, the number of # edges must be equal. for neighbor in self.G1[G1_node]: if neighbor in self.core_1: if not (self.core_1[neighbor] in self.G2[G2_node]): return False elif self.G1.number_of_edges(neighbor, G1_node) != self.G2.number_of_edges(self.core_1[neighbor], G2_node): return False for neighbor in self.G2[G2_node]: if neighbor in self.core_2: if not (self.core_2[neighbor] in self.G1[G1_node]): return False elif self.G1.number_of_edges(self.core_2[neighbor], G1_node) != self.G2.number_of_edges(neighbor, G2_node): return False ### Look ahead 1 # R_terminout # The number of neighbors of n that are in T_1^{inout} is equal to the # number of neighbors of m that are in T_2^{inout}, and vice versa. 
num1 = 0 for neighbor in self.G1[G1_node]: if (neighbor in self.inout_1) and (neighbor not in self.core_1): num1 += 1 num2 = 0 for neighbor in self.G2[G2_node]: if (neighbor in self.inout_2) and (neighbor not in self.core_2): num2 += 1 if self.test == 'graph': if not (num1 == num2): return False else: # self.test == 'subgraph' if not (num1 >= num2): return False ### Look ahead 2 # R_new # The number of neighbors of n that are neither in the core_1 nor # T_1^{inout} is equal to the number of neighbors of m # that are neither in core_2 nor T_2^{inout}. num1 = 0 for neighbor in self.G1[G1_node]: if neighbor not in self.inout_1: num1 += 1 num2 = 0 for neighbor in self.G2[G2_node]: if neighbor not in self.inout_2: num2 += 1 if self.test == 'graph': if not (num1 == num2): return False else: # self.test == 'subgraph' if not (num1 >= num2): return False # Otherwise, this node pair is syntactically feasible! return True class DiGraphMatcher(GraphMatcher): """Implementation of VF2 algorithm for matching directed graphs. Suitable for DiGraph and MultiDiGraph instances. """ # __doc__ += "Notes\n%s-----" % (indent,) + sources.replace('\n','\n'+indent) def __init__(self, G1, G2): """Initialize DiGraphMatcher. G1 and G2 should be nx.Graph or nx.MultiGraph instances. Examples -------- To create a GraphMatcher which checks for syntactic feasibility: >>> from networkx.algorithms import isomorphism >>> G1 = nx.DiGraph(nx.path_graph(4, create_using=nx.DiGraph())) >>> G2 = nx.DiGraph(nx.path_graph(4, create_using=nx.DiGraph())) >>> DiGM = isomorphism.DiGraphMatcher(G1,G2) """ super(DiGraphMatcher, self).__init__(G1, G2) def candidate_pairs_iter(self): """Iterator over candidate pairs of nodes in G1 and G2.""" # All computations are done using the current state! G1_nodes = self.G1_nodes G2_nodes = self.G2_nodes # First we compute the out-terminal sets. 
T1_out = [node for node in G1_nodes if (node in self.out_1) and (node not in self.core_1)] T2_out = [node for node in G2_nodes if (node in self.out_2) and (node not in self.core_2)] # If T1_out and T2_out are both nonempty. # P(s) = T1_out x {min T2_out} if T1_out and T2_out: node_2 = min(T2_out) for node_1 in T1_out: yield node_1, node_2 # If T1_out and T2_out were both empty.... # We compute the in-terminal sets. ##elif not (T1_out or T2_out): # as suggested by [2], incorrect else: # as suggested by [1], correct T1_in = [node for node in G1_nodes if (node in self.in_1) and (node not in self.core_1)] T2_in = [node for node in G2_nodes if (node in self.in_2) and (node not in self.core_2)] # If T1_in and T2_in are both nonempty. # P(s) = T1_out x {min T2_out} if T1_in and T2_in: node_2 = min(T2_in) for node_1 in T1_in: yield node_1, node_2 # If all terminal sets are empty... # P(s) = (N_1 - M_1) x {min (N_2 - M_2)} ##elif not (T1_in or T2_in): # as suggested by [2], incorrect else: # as inferred from [1], correct node_2 = min(G2_nodes - set(self.core_2)) for node_1 in G1_nodes: if node_1 not in self.core_1: yield node_1, node_2 # For all other cases, we don't have any candidate pairs. def initialize(self): """Reinitializes the state of the algorithm. This method should be redefined if using something other than DiGMState. If only subclassing GraphMatcher, a redefinition is not necessary. """ # core_1[n] contains the index of the node paired with n, which is m, # provided n is in the mapping. # core_2[m] contains the index of the node paired with m, which is n, # provided m is in the mapping. 
        self.core_1 = {}
        self.core_2 = {}

        # See the paper for definitions of M_x and T_x^{y}
        # in_1[n] is non-zero if n is in M_1 or in T_1^{in}
        # out_1[n] is non-zero if n is in M_1 or in T_1^{out}
        #
        # in_2[m] is non-zero if m is in M_2 or in T_2^{in}
        # out_2[m] is non-zero if m is in M_2 or in T_2^{out}
        #
        # The value stored is the depth of the search tree when the node became
        # part of the corresponding set.
        self.in_1 = {}
        self.in_2 = {}
        self.out_1 = {}
        self.out_2 = {}

        self.state = DiGMState(self)

        # Provide a convenient way to access the isomorphism mapping.
        self.mapping = self.core_1.copy()

    def syntactic_feasibility(self, G1_node, G2_node):
        """Returns True if adding (G1_node, G2_node) is syntactically feasible.

        This function returns True if adding the candidate pair to the current
        partial isomorphism mapping is allowable.  The addition is allowable
        if the inclusion of the candidate pair does not make it impossible
        for an isomorphism to be found.
        """

        # The VF2 algorithm was designed to work with graphs having, at most,
        # one edge connecting any two nodes.  This is not the case when
        # dealing with MultiDiGraphs.
        #
        # Basically, when we test the look-ahead rules R_pred and R_succ, we
        # will make sure that the number of edges is checked.  We also add
        # an R_self check to verify that the number of selfloops is acceptable.

        # Users might be comparing DiGraph instances with MultiDiGraph
        # instances.  So the generic DiGraphMatcher class must work with
        # MultiDiGraphs.  Care must be taken since the value in the innermost
        # dictionary is a singlet for DiGraph instances.  For MultiDiGraphs,
        # the value in the innermost dictionary is a list.

        ###
        ### Test at each step to get a return value as soon as possible.
        ###

        ### Look ahead 0

        # R_self
        # The number of selfloops for G1_node must equal the number of
        # self-loops for G2_node. Without this check, we would fail on R_pred
        # at the next recursion level. This should prune the tree even further.
if self.G1.number_of_edges(G1_node,G1_node) != self.G2.number_of_edges(G2_node,G2_node): return False # R_pred # For each predecessor n' of n in the partial mapping, the # corresponding node m' is a predecessor of m, and vice versa. Also, # the number of edges must be equal for predecessor in self.G1.pred[G1_node]: if predecessor in self.core_1: if not (self.core_1[predecessor] in self.G2.pred[G2_node]): return False elif self.G1.number_of_edges(predecessor, G1_node) != self.G2.number_of_edges(self.core_1[predecessor], G2_node): return False for predecessor in self.G2.pred[G2_node]: if predecessor in self.core_2: if not (self.core_2[predecessor] in self.G1.pred[G1_node]): return False elif self.G1.number_of_edges(self.core_2[predecessor], G1_node) != self.G2.number_of_edges(predecessor, G2_node): return False # R_succ # For each successor n' of n in the partial mapping, the corresponding # node m' is a successor of m, and vice versa. Also, the number of # edges must be equal. for successor in self.G1[G1_node]: if successor in self.core_1: if not (self.core_1[successor] in self.G2[G2_node]): return False elif self.G1.number_of_edges(G1_node, successor) != self.G2.number_of_edges(G2_node, self.core_1[successor]): return False for successor in self.G2[G2_node]: if successor in self.core_2: if not (self.core_2[successor] in self.G1[G1_node]): return False elif self.G1.number_of_edges(G1_node, self.core_2[successor]) != self.G2.number_of_edges(G2_node, successor): return False ### Look ahead 1 # R_termin # The number of predecessors of n that are in T_1^{in} is equal to the # number of predecessors of m that are in T_2^{in}. 
num1 = 0 for predecessor in self.G1.pred[G1_node]: if (predecessor in self.in_1) and (predecessor not in self.core_1): num1 += 1 num2 = 0 for predecessor in self.G2.pred[G2_node]: if (predecessor in self.in_2) and (predecessor not in self.core_2): num2 += 1 if self.test == 'graph': if not (num1 == num2): return False else: # self.test == 'subgraph' if not (num1 >= num2): return False # The number of successors of n that are in T_1^{in} is equal to the # number of successors of m that are in T_2^{in}. num1 = 0 for successor in self.G1[G1_node]: if (successor in self.in_1) and (successor not in self.core_1): num1 += 1 num2 = 0 for successor in self.G2[G2_node]: if (successor in self.in_2) and (successor not in self.core_2): num2 += 1 if self.test == 'graph': if not (num1 == num2): return False else: # self.test == 'subgraph' if not (num1 >= num2): return False # R_termout # The number of predecessors of n that are in T_1^{out} is equal to the # number of predecessors of m that are in T_2^{out}. num1 = 0 for predecessor in self.G1.pred[G1_node]: if (predecessor in self.out_1) and (predecessor not in self.core_1): num1 += 1 num2 = 0 for predecessor in self.G2.pred[G2_node]: if (predecessor in self.out_2) and (predecessor not in self.core_2): num2 += 1 if self.test == 'graph': if not (num1 == num2): return False else: # self.test == 'subgraph' if not (num1 >= num2): return False # The number of successors of n that are in T_1^{out} is equal to the # number of successors of m that are in T_2^{out}. 
num1 = 0 for successor in self.G1[G1_node]: if (successor in self.out_1) and (successor not in self.core_1): num1 += 1 num2 = 0 for successor in self.G2[G2_node]: if (successor in self.out_2) and (successor not in self.core_2): num2 += 1 if self.test == 'graph': if not (num1 == num2): return False else: # self.test == 'subgraph' if not (num1 >= num2): return False ### Look ahead 2 # R_new # The number of predecessors of n that are neither in the core_1 nor # T_1^{in} nor T_1^{out} is equal to the number of predecessors of m # that are neither in core_2 nor T_2^{in} nor T_2^{out}. num1 = 0 for predecessor in self.G1.pred[G1_node]: if (predecessor not in self.in_1) and (predecessor not in self.out_1): num1 += 1 num2 = 0 for predecessor in self.G2.pred[G2_node]: if (predecessor not in self.in_2) and (predecessor not in self.out_2): num2 += 1 if self.test == 'graph': if not (num1 == num2): return False else: # self.test == 'subgraph' if not (num1 >= num2): return False # The number of successors of n that are neither in the core_1 nor # T_1^{in} nor T_1^{out} is equal to the number of successors of m # that are neither in core_2 nor T_2^{in} nor T_2^{out}. num1 = 0 for successor in self.G1[G1_node]: if (successor not in self.in_1) and (successor not in self.out_1): num1 += 1 num2 = 0 for successor in self.G2[G2_node]: if (successor not in self.in_2) and (successor not in self.out_2): num2 += 1 if self.test == 'graph': if not (num1 == num2): return False else: # self.test == 'subgraph' if not (num1 >= num2): return False # Otherwise, this node pair is syntactically feasible! return True class GMState(object): """Internal representation of state for the GraphMatcher class. This class is used internally by the GraphMatcher class. It is used only to store state specific data. There will be at most G2.order() of these objects in memory at a time, due to the depth-first search strategy employed by the VF2 algorithm. 
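Each look-ahead rule above reduces to counting a node's neighbors that lie in a given region (T_1^{in}, T_1^{out}, or neither) but are not yet in the core mapping. A minimal standalone sketch of that counting step (the helper name is hypothetical, not part of NetworkX):

```python
def count_in(neighbors, region, core):
    # Count neighbors that lie in `region` (e.g. T_1^{in}) but are
    # not yet part of the core mapping -- the quantity compared by
    # the R_termin/R_termout/R_new look-ahead rules.
    return sum(1 for n in neighbors if n in region and n not in core)

# Node with neighbors 1, 2, 3; region {2, 3, 4}; node 3 already mapped.
print(count_in([1, 2, 3], {2, 3, 4}, {3}))  # -> 1
```

For the graph-isomorphism test the two counts must be equal; for the subgraph test the count in G1 only needs to be at least the count in G2.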
""" def __init__(self, GM, G1_node=None, G2_node=None): """Initializes GMState object. Pass in the GraphMatcher to which this GMState belongs and the new node pair that will be added to the GraphMatcher's current isomorphism mapping. """ self.GM = GM # Initialize the last stored node pair. self.G1_node = None self.G2_node = None self.depth = len(GM.core_1) if G1_node is None or G2_node is None: # Then we reset the class variables GM.core_1 = {} GM.core_2 = {} GM.inout_1 = {} GM.inout_2 = {} # Watch out! G1_node == 0 should evaluate to True. if G1_node is not None and G2_node is not None: # Add the node pair to the isomorphism mapping. GM.core_1[G1_node] = G2_node GM.core_2[G2_node] = G1_node # Store the node that was added last. self.G1_node = G1_node self.G2_node = G2_node # Now we must update the other two vectors. # We will add only if it is not in there already! self.depth = len(GM.core_1) # First we add the new nodes... if G1_node not in GM.inout_1: GM.inout_1[G1_node] = self.depth if G2_node not in GM.inout_2: GM.inout_2[G2_node] = self.depth # Now we add every other node... # Updates for T_1^{inout} new_nodes = set([]) for node in GM.core_1: new_nodes.update([neighbor for neighbor in GM.G1[node] if neighbor not in GM.core_1]) for node in new_nodes: if node not in GM.inout_1: GM.inout_1[node] = self.depth # Updates for T_2^{inout} new_nodes = set([]) for node in GM.core_2: new_nodes.update([neighbor for neighbor in GM.G2[node] if neighbor not in GM.core_2]) for node in new_nodes: if node not in GM.inout_2: GM.inout_2[node] = self.depth def restore(self): """Deletes the GMState object and restores the class variables.""" # First we remove the node that was added from the core vectors. # Watch out! G1_node == 0 should evaluate to True. if self.G1_node is not None and self.G2_node is not None: del self.GM.core_1[self.G1_node] del self.GM.core_2[self.G2_node] # Now we revert the other two vectors. # Thus, we delete all entries which have this depth level. 
for vector in (self.GM.inout_1, self.GM.inout_2): for node in list(vector.keys()): if vector[node] == self.depth: del vector[node] class DiGMState(object): """Internal representation of state for the DiGraphMatcher class. This class is used internally by the DiGraphMatcher class. It is used only to store state specific data. There will be at most G2.order() of these objects in memory at a time, due to the depth-first search strategy employed by the VF2 algorithm. """ def __init__(self, GM, G1_node=None, G2_node=None): """Initializes DiGMState object. Pass in the DiGraphMatcher to which this DiGMState belongs and the new node pair that will be added to the GraphMatcher's current isomorphism mapping. """ self.GM = GM # Initialize the last stored node pair. self.G1_node = None self.G2_node = None self.depth = len(GM.core_1) if G1_node is None or G2_node is None: # Then we reset the class variables GM.core_1 = {} GM.core_2 = {} GM.in_1 = {} GM.in_2 = {} GM.out_1 = {} GM.out_2 = {} # Watch out! G1_node == 0 should evaluate to True. if G1_node is not None and G2_node is not None: # Add the node pair to the isomorphism mapping. GM.core_1[G1_node] = G2_node GM.core_2[G2_node] = G1_node # Store the node that was added last. self.G1_node = G1_node self.G2_node = G2_node # Now we must update the other four vectors. # We will add only if it is not in there already! self.depth = len(GM.core_1) # First we add the new nodes... for vector in (GM.in_1, GM.out_1): if G1_node not in vector: vector[G1_node] = self.depth for vector in (GM.in_2, GM.out_2): if G2_node not in vector: vector[G2_node] = self.depth # Now we add every other node... 
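The GMState/DiGMState bookkeeping above stamps each entry in the in/out vectors with the search depth at which it first appeared, so that restore() can undo one search step by deleting every entry carrying the current depth. A minimal sketch of that pattern in isolation (the class name is hypothetical, not part of NetworkX):

```python
class DepthStampedSet:
    # Minimal sketch of the depth-stamp bookkeeping used by
    # GMState/DiGMState: entries record the search depth at which
    # they joined, and rollback removes exactly one depth level.
    def __init__(self):
        self.stamp = {}

    def add(self, node, depth):
        # Only stamp a node the first time it appears.
        if node not in self.stamp:
            self.stamp[node] = depth

    def rollback(self, depth):
        # Delete all entries added at this depth level.
        for node in list(self.stamp):
            if self.stamp[node] == depth:
                del self.stamp[node]

s = DepthStampedSet()
s.add('a', 1)
s.add('b', 2)
s.add('a', 2)   # already stamped at depth 1; stamp unchanged
s.rollback(2)   # removes only 'b'
```

Because a node keeps its earliest stamp, undoing depth 2 never disturbs membership established at shallower depths, which is exactly what the restore() methods rely on.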
# Updates for T_1^{in} new_nodes = set([]) for node in GM.core_1: new_nodes.update([predecessor for predecessor in GM.G1.predecessors(node) if predecessor not in GM.core_1]) for node in new_nodes: if node not in GM.in_1: GM.in_1[node] = self.depth # Updates for T_2^{in} new_nodes = set([]) for node in GM.core_2: new_nodes.update([predecessor for predecessor in GM.G2.predecessors(node) if predecessor not in GM.core_2]) for node in new_nodes: if node not in GM.in_2: GM.in_2[node] = self.depth # Updates for T_1^{out} new_nodes = set([]) for node in GM.core_1: new_nodes.update([successor for successor in GM.G1.successors(node) if successor not in GM.core_1]) for node in new_nodes: if node not in GM.out_1: GM.out_1[node] = self.depth # Updates for T_2^{out} new_nodes = set([]) for node in GM.core_2: new_nodes.update([successor for successor in GM.G2.successors(node) if successor not in GM.core_2]) for node in new_nodes: if node not in GM.out_2: GM.out_2[node] = self.depth def restore(self): """Deletes the DiGMState object and restores the class variables.""" # First we remove the node that was added from the core vectors. # Watch out! G1_node == 0 should evaluate to True. if self.G1_node is not None and self.G2_node is not None: del self.GM.core_1[self.G1_node] del self.GM.core_2[self.G2_node] # Now we revert the other four vectors. # Thus, we delete all entries which have this depth level. for vector in (self.GM.in_1, self.GM.in_2, self.GM.out_1, self.GM.out_2): for node in list(vector.keys()): if vector[node] == self.depth: del vector[node] networkx-1.11/networkx/algorithms/isomorphism/vf2userfunc.py0000644000175000017500000001653512637544450024444 0ustar aricaric00000000000000""" Module to simplify the specification of user-defined equality functions for node and edge attributes during isomorphism checks. During the construction of an isomorphism, the algorithm considers two candidate nodes n1 in G1 and n2 in G2. 
The graphs G1 and G2 are then compared with respect to properties involving n1 and n2, and if the outcome is good, then the candidate nodes are considered isomorphic. NetworkX provides a simple mechanism for users to extend the comparisons to include node and edge attributes. Node attributes are handled by the node_match keyword. When considering n1 and n2, the algorithm passes their node attribute dictionaries to node_match, and if it returns False, then n1 and n2 cannot be considered to be isomorphic. Edge attributes are handled by the edge_match keyword. When considering n1 and n2, the algorithm must verify that outgoing edges from n1 are commensurate with the outgoing edges for n2. If the graph is directed, then a similar check is also performed for incoming edges. Focusing only on outgoing edges, we consider pairs of nodes (n1, v1) from G1 and (n2, v2) from G2. For graphs and digraphs, there is only one edge between (n1, v1) and only one edge between (n2, v2). Those edge attribute dictionaries are passed to edge_match, and if it returns False, then n1 and n2 cannot be considered isomorphic. For multigraphs and multidigraphs, there can be multiple edges between (n1, v1) and also multiple edges between (n2, v2). Now, there must exist an isomorphism from "all the edges between (n1, v1)" to "all the edges between (n2, v2)". So, all of the edge attribute dictionaries are passed to edge_match, and it must determine if there is an isomorphism between the two sets of edges. """ import networkx as nx from . import isomorphvf2 as vf2 __all__ = ['GraphMatcher', 'DiGraphMatcher', 'MultiGraphMatcher', 'MultiDiGraphMatcher', ] def _semantic_feasibility(self, G1_node, G2_node): """Returns True if mapping G1_node to G2_node is semantically feasible. 
""" # Make sure the nodes match if self.node_match is not None: nm = self.node_match(self.G1.node[G1_node], self.G2.node[G2_node]) if not nm: return False # Make sure the edges match if self.edge_match is not None: # Cached lookups G1_adj = self.G1_adj G2_adj = self.G2_adj core_1 = self.core_1 edge_match = self.edge_match for neighbor in G1_adj[G1_node]: # G1_node is not in core_1, so we must handle R_self separately if neighbor == G1_node: if not edge_match(G1_adj[G1_node][G1_node], G2_adj[G2_node][G2_node]): return False elif neighbor in core_1: if not edge_match(G1_adj[G1_node][neighbor], G2_adj[G2_node][core_1[neighbor]]): return False # syntactic check has already verified that neighbors are symmetric return True class GraphMatcher(vf2.GraphMatcher): """VF2 isomorphism checker for undirected graphs. """ def __init__(self, G1, G2, node_match=None, edge_match=None): """Initialize graph matcher. Parameters ---------- G1, G2: graph The graphs to be tested. node_match: callable A function that returns True iff node n1 in G1 and n2 in G2 should be considered equal during the isomorphism test. The function will be called like:: node_match(G1.node[n1], G2.node[n2]) That is, the function will receive the node attribute dictionaries of the nodes under consideration. If None, then no attributes are considered when testing for an isomorphism. edge_match: callable A function that returns True iff the edge attribute dictionary for the pair of nodes (u1, v1) in G1 and (u2, v2) in G2 should be considered equal during the isomorphism test. The function will be called like:: edge_match(G1[u1][v1], G2[u2][v2]) That is, the function will receive the edge attribute dictionaries of the edges under consideration. If None, then no attributes are considered when testing for an isomorphism. """ vf2.GraphMatcher.__init__(self, G1, G2) self.node_match = node_match self.edge_match = edge_match # These will be modified during checks to minimize code repeat. 
self.G1_adj = self.G1.adj self.G2_adj = self.G2.adj semantic_feasibility = _semantic_feasibility class DiGraphMatcher(vf2.DiGraphMatcher): """VF2 isomorphism checker for directed graphs. """ def __init__(self, G1, G2, node_match=None, edge_match=None): """Initialize graph matcher. Parameters ---------- G1, G2 : graph The graphs to be tested. node_match : callable A function that returns True iff node n1 in G1 and n2 in G2 should be considered equal during the isomorphism test. The function will be called like:: node_match(G1.node[n1], G2.node[n2]) That is, the function will receive the node attribute dictionaries of the nodes under consideration. If None, then no attributes are considered when testing for an isomorphism. edge_match : callable A function that returns True iff the edge attribute dictionary for the pair of nodes (u1, v1) in G1 and (u2, v2) in G2 should be considered equal during the isomorphism test. The function will be called like:: edge_match(G1[u1][v1], G2[u2][v2]) That is, the function will receive the edge attribute dictionaries of the edges under consideration. If None, then no attributes are considered when testing for an isomorphism. """ vf2.DiGraphMatcher.__init__(self, G1, G2) self.node_match = node_match self.edge_match = edge_match # These will be modified during checks to minimize code repeat. self.G1_adj = self.G1.adj self.G2_adj = self.G2.adj def semantic_feasibility(self, G1_node, G2_node): """Returns True if mapping G1_node to G2_node is semantically feasible.""" # Test node_match and also test edge_match on successors feasible = _semantic_feasibility(self, G1_node, G2_node) if not feasible: return False # Test edge_match on predecessors self.G1_adj = self.G1.pred self.G2_adj = self.G2.pred feasible = _semantic_feasibility(self, G1_node, G2_node) self.G1_adj = self.G1.adj self.G2_adj = self.G2.adj return feasible ## The "semantics" of edge_match are different for multi(di)graphs, but ## the implementation is the same. 
So, technically we do not need to ## provide "multi" versions, but we do so to match NetworkX's base classes. class MultiGraphMatcher(GraphMatcher): """VF2 isomorphism checker for undirected multigraphs. """ pass class MultiDiGraphMatcher(DiGraphMatcher): """VF2 isomorphism checker for directed multigraphs. """ pass networkx-1.11/networkx/algorithms/isomorphism/__init__.py0000644000175000017500000000025512637544450023723 0ustar aricaric00000000000000from networkx.algorithms.isomorphism.isomorph import * from networkx.algorithms.isomorphism.vf2userfunc import * from networkx.algorithms.isomorphism.matchhelpers import * networkx-1.11/networkx/algorithms/isomorphism/isomorph.py0000644000175000017500000001500212637544500024014 0ustar aricaric00000000000000""" Graph isomorphism functions. """ import networkx as nx from networkx.exception import NetworkXError __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Christopher Ellison cellison@cse.ucdavis.edu)']) # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['could_be_isomorphic', 'fast_could_be_isomorphic', 'faster_could_be_isomorphic', 'is_isomorphic'] def could_be_isomorphic(G1,G2): """Returns False if graphs are definitely not isomorphic. True does NOT guarantee isomorphism. Parameters ---------- G1, G2 : graphs The two graphs G1 and G2 must be the same type. Notes ----- Checks for matching degree, triangle, and number of cliques sequences. """ # Check global properties if G1.order() != G2.order(): return False # Check local properties d1=G1.degree() t1=nx.triangles(G1) c1=nx.number_of_cliques(G1) props1=[ [d1[v], t1[v], c1[v]] for v in d1 ] props1.sort() d2=G2.degree() t2=nx.triangles(G2) c2=nx.number_of_cliques(G2) props2=[ [d2[v], t2[v], c2[v]] for v in d2 ] props2.sort() if props1 != props2: return False # OK... 
return True graph_could_be_isomorphic=could_be_isomorphic def fast_could_be_isomorphic(G1,G2): """Returns False if graphs are definitely not isomorphic. True does NOT guarantee isomorphism. Parameters ---------- G1, G2 : graphs The two graphs G1 and G2 must be the same type. Notes ----- Checks for matching degree and triangle sequences. """ # Check global properties if G1.order() != G2.order(): return False # Check local properties d1=G1.degree() t1=nx.triangles(G1) props1=[ [d1[v], t1[v]] for v in d1 ] props1.sort() d2=G2.degree() t2=nx.triangles(G2) props2=[ [d2[v], t2[v]] for v in d2 ] props2.sort() if props1 != props2: return False # OK... return True fast_graph_could_be_isomorphic=fast_could_be_isomorphic def faster_could_be_isomorphic(G1,G2): """Returns False if graphs are definitely not isomorphic. True does NOT guarantee isomorphism. Parameters ---------- G1, G2 : graphs The two graphs G1 and G2 must be the same type. Notes ----- Checks for matching degree sequences. """ # Check global properties if G1.order() != G2.order(): return False # Check local properties d1=list(G1.degree().values()) d1.sort() d2=list(G2.degree().values()) d2.sort() if d1 != d2: return False # OK... return True faster_graph_could_be_isomorphic=faster_could_be_isomorphic def is_isomorphic(G1, G2, node_match=None, edge_match=None): """Returns True if the graphs G1 and G2 are isomorphic and False otherwise. Parameters ---------- G1, G2: graphs The two graphs G1 and G2 must be the same type. node_match : callable A function that returns True if node n1 in G1 and n2 in G2 should be considered equal during the isomorphism test. If node_match is not specified then node attributes are not considered. The function will be called like node_match(G1.node[n1], G2.node[n2]). That is, the function will receive the node attribute dictionaries for n1 and n2 as inputs.
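The faster_could_be_isomorphic check above boils down to comparing sorted degree sequences. The same pre-check can be written standalone from plain edge lists (the helper names are hypothetical; unlike the function above, this sketch ignores isolated nodes and the node-count check):

```python
def degree_sequences_match(edges1, edges2):
    # Two graphs can be isomorphic only if their sorted degree
    # sequences agree; disagreement proves non-isomorphism, while
    # agreement does NOT guarantee isomorphism.
    def degree_sequence(edges):
        deg = {}
        for u, v in edges:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        return sorted(deg.values())
    return degree_sequence(edges1) == degree_sequence(edges2)

# A 3-node path matches another 3-node path but not a triangle.
print(degree_sequences_match([(1, 2), (2, 3)], [('a', 'b'), ('b', 'c')]))  # -> True
print(degree_sequences_match([(1, 2), (2, 3)], [(1, 2), (1, 3), (2, 3)]))  # -> False
```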
edge_match : callable A function that returns True if the edge attribute dictionary for the pair of nodes (u1, v1) in G1 and (u2, v2) in G2 should be considered equal during the isomorphism test. If edge_match is not specified then edge attributes are not considered. The function will be called like edge_match(G1[u1][v1], G2[u2][v2]). That is, the function will receive the edge attribute dictionaries of the edges under consideration. Notes ----- Uses the vf2 algorithm [1]_. Examples -------- >>> import networkx.algorithms.isomorphism as iso For digraphs G1 and G2, using 'weight' edge attribute (default: 1) >>> G1 = nx.DiGraph() >>> G2 = nx.DiGraph() >>> G1.add_path([1,2,3,4],weight=1) >>> G2.add_path([10,20,30,40],weight=2) >>> em = iso.numerical_edge_match('weight', 1) >>> nx.is_isomorphic(G1, G2) # no weights considered True >>> nx.is_isomorphic(G1, G2, edge_match=em) # match weights False For multidigraphs G1 and G2, using 'fill' node attribute (default: '') >>> G1 = nx.MultiDiGraph() >>> G2 = nx.MultiDiGraph() >>> G1.add_nodes_from([1,2,3],fill='red') >>> G2.add_nodes_from([10,20,30,40],fill='red') >>> G1.add_path([1,2,3,4],weight=3, linewidth=2.5) >>> G2.add_path([10,20,30,40],weight=3) >>> nm = iso.categorical_node_match('fill', 'red') >>> nx.is_isomorphic(G1, G2, node_match=nm) True For multidigraphs G1 and G2, using 'weight' edge attribute (default: 7) >>> G1.add_edge(1,2, weight=7) >>> G2.add_edge(10,20) >>> em = iso.numerical_multiedge_match('weight', 7, rtol=1e-6) >>> nx.is_isomorphic(G1, G2, edge_match=em) True For multigraphs G1 and G2, using 'weight' and 'linewidth' edge attributes with default values 7 and 2.5. Also using 'fill' node attribute with default value 'red'. 
>>> em = iso.numerical_multiedge_match(['weight', 'linewidth'], [7, 2.5]) >>> nm = iso.categorical_node_match('fill', 'red') >>> nx.is_isomorphic(G1, G2, edge_match=em, node_match=nm) True See Also -------- numerical_node_match, numerical_edge_match, numerical_multiedge_match categorical_node_match, categorical_edge_match, categorical_multiedge_match References ---------- .. [1] L. P. Cordella, P. Foggia, C. Sansone, M. Vento, "An Improved Algorithm for Matching Large Graphs", 3rd IAPR-TC15 Workshop on Graph-based Representations in Pattern Recognition, Cuen, pp. 149-159, 2001. http://amalfi.dis.unina.it/graph/db/papers/vf-algorithm.pdf """ if G1.is_directed() and G2.is_directed(): GM = nx.algorithms.isomorphism.DiGraphMatcher elif (not G1.is_directed()) and (not G2.is_directed()): GM = nx.algorithms.isomorphism.GraphMatcher else: raise NetworkXError("Graphs G1 and G2 are not of the same type.") gm = GM(G1, G2, node_match=node_match, edge_match=edge_match) return gm.is_isomorphic() networkx-1.11/networkx/algorithms/isomorphism/tests/0000755000175000017500000000000012653231454022745 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/isomorphism/tests/test_isomorphism.py0000644000175000017500000000223712637544450026740 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from networkx.algorithms import isomorphism as iso class TestIsomorph: def setUp(self): self.G1=nx.Graph() self.G2=nx.Graph() self.G3=nx.Graph() self.G4=nx.Graph() self.G1.add_edges_from([ [1,2],[1,3],[1,5],[2,3] ]) self.G2.add_edges_from([ [10,20],[20,30],[10,30],[10,50] ]) self.G3.add_edges_from([ [1,2],[1,3],[1,5],[2,5] ]) self.G4.add_edges_from([ [1,2],[1,3],[1,5],[2,4] ]) def test_could_be_isomorphic(self): assert_true(iso.could_be_isomorphic(self.G1,self.G2)) assert_true(iso.could_be_isomorphic(self.G1,self.G3)) assert_false(iso.could_be_isomorphic(self.G1,self.G4)) assert_true(iso.could_be_isomorphic(self.G3,self.G2)) def 
test_fast_could_be_isomorphic(self): assert_true(iso.fast_could_be_isomorphic(self.G3,self.G2)) def test_faster_could_be_isomorphic(self): assert_true(iso.faster_could_be_isomorphic(self.G3,self.G2)) def test_is_isomorphic(self): assert_true(iso.is_isomorphic(self.G1,self.G2)) assert_false(iso.is_isomorphic(self.G1,self.G4)) networkx-1.11/networkx/algorithms/isomorphism/tests/iso_r01_s80.A990000644000175000017500000000264212637544450025210 0ustar aricaric00000000000000networkx-1.11/networkx/algorithms/isomorphism/tests/si2_b06_m200.B990000644000175000017500000000310212637544450025135 0ustar aricaric00000000000000networkx-1.11/networkx/algorithms/isomorphism/tests/test_isomorphvf2.py0000644000175000017500000001762212637544500026645 0ustar aricaric00000000000000""" Tests for VF2 isomorphism algorithm. """ import os import struct import random from nose.tools import assert_true, assert_equal from nose import SkipTest import networkx as nx from networkx.algorithms import isomorphism as iso class TestWikipediaExample(object): # Source: http://en.wikipedia.org/wiki/Graph_isomorphism # Nodes 'a', 'b', 'c' and 'd' form a column. # Nodes 'g', 'h', 'i' and 'j' form a column.
g1edges = [['a','g'], ['a','h'], ['a','i'], ['b','g'], ['b','h'], ['b','j'], ['c','g'], ['c','i'], ['c','j'], ['d','h'], ['d','i'], ['d','j']] # Nodes 1,2,3,4 form the clockwise corners of a large square. # Nodes 5,6,7,8 form the clockwise corners of a small square. g2edges = [[1,2], [2,3], [3,4], [4,1], [5,6], [6,7], [7,8], [8,5], [1,5], [2,6], [3,7], [4,8]] def test_graph(self): g1 = nx.Graph() g2 = nx.Graph() g1.add_edges_from(self.g1edges) g2.add_edges_from(self.g2edges) gm = iso.GraphMatcher(g1,g2) assert_true(gm.is_isomorphic()) mapping = sorted(gm.mapping.items()) # This mapping is only one of the possibilities, # so this test needs to be reconsidered. # isomap = [('a', 1), ('b', 6), ('c', 3), ('d', 8), # ('g', 2), ('h', 5), ('i', 4), ('j', 7)] # assert_equal(mapping, isomap) def test_subgraph(self): g1 = nx.Graph() g2 = nx.Graph() g1.add_edges_from(self.g1edges) g2.add_edges_from(self.g2edges) g3 = g2.subgraph([1,2,3,4]) gm = iso.GraphMatcher(g1,g3) assert_true(gm.subgraph_is_isomorphic()) class TestVF2GraphDB(object): # http://amalfi.dis.unina.it/graph/db/ @staticmethod def create_graph(filename): """Creates a Graph instance from the filename.""" # The file is assumed to be in the format from the VF2 graph database. # Each file is composed of 16-bit numbers (unsigned short int). # So we will want to read 2 bytes at a time.
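The comments in create_graph describe the on-disk format: a stream of 16-bit little-endian unsigned shorts. A self-contained sketch of decoding such a stream with the struct module (toy data and a simplified layout for illustration; real database files also carry node counts and attributes):

```python
import struct

def read_word(data, offset):
    # Each number in a VF2 graph-database file is a 16-bit
    # little-endian unsigned short ('<H').
    return struct.unpack('<H', data[offset:offset + 2])[0]

# Toy byte string: an edge count followed by that many
# (source, target) pairs, each as a 16-bit word.
blob = struct.pack('<5H', 2, 0, 1, 1, 0)
n_edges = read_word(blob, 0)
edges = [(read_word(blob, 2 + 4 * i), read_word(blob, 4 + 4 * i))
         for i in range(n_edges)]
print(edges)  # -> [(0, 1), (1, 0)]
```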
        # We can read the number as follows:
        #   number = struct.unpack('<H', f.read(2))
networkx-1.11/networkx/algorithms/matching.py0000644000175000017500000010041112637544500021374 0ustar aricaric00000000000000"""
********
Matching
********
"""
#    Copyright (C) 2004-2015 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
#    Copyright (C) 2011 by
#    Nicholas Mancuso
#    All rights reserved.
#    BSD license.
from itertools import repeat

__author__ = """\n""".join(['Joris van Rantwijk',
                            'Nicholas Mancuso (nick.mancuso@gmail.com)'])

__all__ = ['max_weight_matching', 'maximal_matching']

def maximal_matching(G):
    r"""Find a maximal cardinality matching in the graph.

    A matching is a subset of edges in which no node occurs more than once.
    The cardinality of a matching is the number of matched edges.

    Parameters
    ----------
    G : NetworkX graph
        Undirected graph

    Returns
    -------
    matching : set
        A maximal matching of the graph.

    Notes
    -----
    The algorithm greedily selects a maximal matching M of the graph G
    (i.e. no superset of M exists).  It runs in `O(|E|)` time.
    """
    matching = set([])
    edges = set([])
    for u, v in G.edges_iter():
        # If the edge isn't covered, add it to the matching,
        # then remove the neighborhoods of u and v from consideration.
        if (u, v) not in edges and (v, u) not in edges:
            matching.add((u, v))
            edges |= set(G.edges(u))
            edges |= set(G.edges(v))
    return matching

def max_weight_matching(G, maxcardinality=False):
    """Compute a maximum-weighted matching of G.

    A matching is a subset of edges in which no node occurs more than once.
The cardinality of a matching is the number of matched edges. The weight of a matching is the sum of the weights of its edges. Parameters ---------- G : NetworkX graph Undirected graph maxcardinality: bool, optional If maxcardinality is True, compute the maximum-cardinality matching with maximum weight among all maximum-cardinality matchings. Returns ------- mate : dictionary The matching is returned as a dictionary, mate, such that mate[v] == w if node v is matched to node w. Unmatched nodes do not occur as a key in mate. Notes ----- If G has edges with 'weight' attribute the edge data are used as weight values else the weights are assumed to be 1. This function takes time O(number_of_nodes ** 3). If all edge weights are integers, the algorithm uses only integer computations. If floating point weights are used, the algorithm could return a slightly suboptimal matching due to numeric precision errors. This method is based on the "blossom" method for finding augmenting paths and the "primal-dual" method for finding a matching of maximum weight, both methods invented by Jack Edmonds [1]_. Bipartite graphs can also be matched using the functions present in :mod:`networkx.algorithms.bipartite.matching`. References ---------- .. [1] "Efficient Algorithms for Finding Maximum Matching in Graphs", Zvi Galil, ACM Computing Surveys, 1986. """ # # The algorithm is taken from "Efficient Algorithms for Finding Maximum # Matching in Graphs" by Zvi Galil, ACM Computing Surveys, 1986. # It is based on the "blossom" method for finding augmenting paths and # the "primal-dual" method for finding a matching of maximum weight, both # methods invented by Jack Edmonds. # # A C program for maximum weight matching by Ed Rothberg was used # extensively to validate this new code. # # Many terms used in the code comments are explained in the paper # by Galil. You will probably need the paper to make sense of this code. 
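Before diving into the blossom machinery, the simpler greedy strategy used by maximal_matching can be sketched standalone: scan the edges once and keep an edge whenever neither endpoint is matched yet (the function name is hypothetical; unlike the NetworkX method, this sketch also skips self-loops explicitly):

```python
def greedy_maximal_matching(edges):
    # Greedy maximal matching: one pass over the edge list, keeping
    # an edge whenever neither endpoint is already matched.  The
    # result is maximal (no edge can be added) but not necessarily
    # of maximum cardinality or weight.
    matched = set()
    matching = set()
    for u, v in edges:
        if u != v and u not in matched and v not in matched:
            matching.add((u, v))
            matched.update((u, v))
    return matching

# On the path 1-2-3-4, the greedy pass picks (1, 2) and (3, 4).
print(greedy_maximal_matching([(1, 2), (2, 3), (3, 4)]))
```

max_weight_matching below is far more involved because maximizing total weight requires augmenting paths and blossom shrinking rather than a single greedy pass.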
# class NoNode: """Dummy value which is different from any node.""" pass class Blossom: """Representation of a non-trivial blossom or sub-blossom.""" __slots__ = [ 'childs', 'edges', 'mybestedges' ] # b.childs is an ordered list of b's sub-blossoms, starting with # the base and going round the blossom. # b.edges is the list of b's connecting edges, such that # b.edges[i] = (v, w) where v is a vertex in b.childs[i] # and w is a vertex in b.childs[wrap(i+1)]. # If b is a top-level S-blossom, # b.mybestedges is a list of least-slack edges to neighbouring # S-blossoms, or None if no such list has been computed yet. # This is used for efficient computation of delta3. # Generate the blossom's leaf vertices. def leaves(self): for t in self.childs: if isinstance(t, Blossom): for v in t.leaves(): yield v else: yield t # Get a list of vertices. gnodes = G.nodes() if not gnodes: return { } # don't bother with empty graphs # Find the maximum edge weight. maxweight = 0 allinteger = True for i,j,d in G.edges_iter(data=True): wt=d.get('weight',1) if i != j and wt > maxweight: maxweight = wt allinteger = allinteger and (str(type(wt)).split("'")[1] in ('int', 'long')) # If v is a matched vertex, mate[v] is its partner vertex. # If v is a single vertex, v does not occur as a key in mate. # Initially all vertices are single; updated during augmentation. mate = { } # If b is a top-level blossom, # label.get(b) is None if b is unlabeled (free), # 1 if b is an S-blossom, # 2 if b is a T-blossom. # The label of a vertex is found by looking at the label of its top-level # containing blossom. # If v is a vertex inside a T-blossom, label[v] is 2 iff v is reachable # from an S-vertex outside the blossom. # Labels are assigned during a stage and reset after each augmentation. label = { } # If b is a labeled top-level blossom, # labeledge[b] = (v, w) is the edge through which b obtained its label # such that w is a vertex in b, or None if b's base vertex is single. 
# If w is a vertex inside a T-blossom and label[w] == 2, # labeledge[w] = (v, w) is an edge through which w is reachable from # outside the blossom. labeledge = { } # If v is a vertex, inblossom[v] is the top-level blossom to which v # belongs. # If v is a top-level vertex, inblossom[v] == v since v is itself # a (trivial) top-level blossom. # Initially all vertices are top-level trivial blossoms. inblossom = dict(zip(gnodes, gnodes)) # If b is a sub-blossom, # blossomparent[b] is its immediate parent (sub-)blossom. # If b is a top-level blossom, blossomparent[b] is None. blossomparent = dict(zip(gnodes, repeat(None))) # If b is a (sub-)blossom, # blossombase[b] is its base VERTEX (i.e. recursive sub-blossom). blossombase = dict(zip(gnodes, gnodes)) # If w is a free vertex (or an unreached vertex inside a T-blossom), # bestedge[w] = (v, w) is the least-slack edge from an S-vertex, # or None if there is no such edge. # If b is a (possibly trivial) top-level S-blossom, # bestedge[b] = (v, w) is the least-slack edge to a different S-blossom # (v inside b), or None if there is no such edge. # This is used for efficient computation of delta2 and delta3. bestedge = { } # If v is a vertex, # dualvar[v] = 2 * u(v) where u(v) is v's variable in the dual # optimization problem (if all edge weights are integers, multiplication # by two ensures that all values remain integers throughout the algorithm). # Initially, u(v) = maxweight / 2. dualvar = dict(zip(gnodes, repeat(maxweight))) # If b is a non-trivial blossom, # blossomdual[b] = z(b) where z(b) is b's variable in the dual # optimization problem. blossomdual = { } # If (v, w) in allowedge or (w, v) in allowedge, then the edge # (v, w) is known to have zero slack in the optimization problem; # otherwise the edge may or may not have zero slack. allowedge = { } # Queue of newly discovered S-vertices. queue = [ ] # Return 2 * slack of edge (v, w) (does not work inside blossoms).
def slack(v, w): return dualvar[v] + dualvar[w] - 2 * G[v][w].get('weight',1) # Assign label t to the top-level blossom containing vertex w, # coming through an edge from vertex v. def assignLabel(w, t, v): b = inblossom[w] assert label.get(w) is None and label.get(b) is None label[w] = label[b] = t if v is not None: labeledge[w] = labeledge[b] = (v, w) else: labeledge[w] = labeledge[b] = None bestedge[w] = bestedge[b] = None if t == 1: # b became an S-vertex/blossom; add it(s vertices) to the queue. if isinstance(b, Blossom): queue.extend(b.leaves()) else: queue.append(b) elif t == 2: # b became a T-vertex/blossom; assign label S to its mate. # (If b is a non-trivial blossom, its base is the only vertex # with an external mate.) base = blossombase[b] assignLabel(mate[base], 1, base) # Trace back from vertices v and w to discover either a new blossom # or an augmenting path. Return the base vertex of the new blossom, # or NoNode if an augmenting path was found. def scanBlossom(v, w): # Trace back from v and w, placing breadcrumbs as we go. path = [ ] base = NoNode while v is not NoNode: # Look for a breadcrumb in v's blossom or put a new breadcrumb. b = inblossom[v] if label[b] & 4: base = blossombase[b] break assert label[b] == 1 path.append(b) label[b] = 5 # Trace one step back. if labeledge[b] is None: # The base of blossom b is single; stop tracing this path. assert blossombase[b] not in mate v = NoNode else: assert labeledge[b][0] == mate[blossombase[b]] v = labeledge[b][0] b = inblossom[v] assert label[b] == 2 # b is a T-blossom; trace one more step back. v = labeledge[b][0] # Swap v and w so that we alternate between both paths. if w is not NoNode: v, w = w, v # Remove breadcrumbs. for b in path: label[b] = 1 # Return base vertex, if we found one. return base # Construct a new blossom with given base, through S-vertices v and w. # Label the new blossom as S; set its dual variable to zero; # relabel its T-vertices to S and add them to the queue. 
def addBlossom(base, v, w): bb = inblossom[base] bv = inblossom[v] bw = inblossom[w] # Create blossom. b = Blossom() blossombase[b] = base blossomparent[b] = None blossomparent[bb] = b # Make list of sub-blossoms and their interconnecting edge endpoints. b.childs = path = [ ] b.edges = edgs = [ (v, w) ] # Trace back from v to base. while bv != bb: # Add bv to the new blossom. blossomparent[bv] = b path.append(bv) edgs.append(labeledge[bv]) assert label[bv] == 2 or (label[bv] == 1 and labeledge[bv][0] == mate[blossombase[bv]]) # Trace one step back. v = labeledge[bv][0] bv = inblossom[v] # Add base sub-blossom; reverse lists. path.append(bb) path.reverse() edgs.reverse() # Trace back from w to base. while bw != bb: # Add bw to the new blossom. blossomparent[bw] = b path.append(bw) edgs.append((labeledge[bw][1], labeledge[bw][0])) assert label[bw] == 2 or (label[bw] == 1 and labeledge[bw][0] == mate[blossombase[bw]]) # Trace one step back. w = labeledge[bw][0] bw = inblossom[w] # Set label to S. assert label[bb] == 1 label[b] = 1 labeledge[b] = labeledge[bb] # Set dual variable to zero. blossomdual[b] = 0 # Relabel vertices. for v in b.leaves(): if label[inblossom[v]] == 2: # This T-vertex now turns into an S-vertex because it becomes # part of an S-blossom; add it to the queue. queue.append(v) inblossom[v] = b # Compute b.mybestedges. bestedgeto = { } for bv in path: if isinstance(bv, Blossom): if bv.mybestedges is not None: # Walk this subblossom's least-slack edges. nblist = bv.mybestedges # The sub-blossom won't need this data again. bv.mybestedges = None else: # This subblossom does not have a list of least-slack # edges; get the information from the vertices. 
                    nblist = [ (v, w) for v in bv.leaves()
                               for w in G.neighbors_iter(v)
                               if v != w ]
            else:
                nblist = [ (bv, w) for w in G.neighbors_iter(bv)
                           if bv != w ]
            for k in nblist:
                (i, j) = k
                if inblossom[j] == b:
                    i, j = j, i
                bj = inblossom[j]
                if (bj != b and label.get(bj) == 1 and
                    ((bj not in bestedgeto) or
                     slack(i, j) < slack(*bestedgeto[bj]))):
                    bestedgeto[bj] = k
            # Forget about least-slack edge of the subblossom.
            bestedge[bv] = None
        b.mybestedges = list(bestedgeto.values())
        # Select bestedge[b].
        mybestedge = None
        bestedge[b] = None
        for k in b.mybestedges:
            kslack = slack(*k)
            if mybestedge is None or kslack < mybestslack:
                mybestedge = k
                mybestslack = kslack
        bestedge[b] = mybestedge

    # Expand the given top-level blossom.
    def expandBlossom(b, endstage):
        # Convert sub-blossoms into top-level blossoms.
        for s in b.childs:
            blossomparent[s] = None
            if isinstance(s, Blossom):
                if endstage and blossomdual[s] == 0:
                    # Recursively expand this sub-blossom.
                    expandBlossom(s, endstage)
                else:
                    for v in s.leaves():
                        inblossom[v] = s
            else:
                inblossom[s] = s
        # If we expand a T-blossom during a stage, its sub-blossoms must be
        # relabeled.
        if (not endstage) and label.get(b) == 2:
            # Start at the sub-blossom through which the expanding
            # blossom obtained its label, and relabel sub-blossoms until
            # we reach the base.
            # Figure out through which sub-blossom the expanding blossom
            # obtained its label initially.
            entrychild = inblossom[labeledge[b][1]]
            # Decide in which direction we will go round the blossom.
            j = b.childs.index(entrychild)
            if j & 1:
                # Start index is odd; go forward and wrap.
                j -= len(b.childs)
                jstep = 1
            else:
                # Start index is even; go backward.
                jstep = -1
            # Move along the blossom until we get to the base.
            v, w = labeledge[b]
            while j != 0:
                # Relabel the T-sub-blossom.
                if jstep == 1:
                    p, q = b.edges[j]
                else:
                    q, p = b.edges[j-1]
                label[w] = None
                label[q] = None
                assignLabel(w, 2, v)
                # Step to the next S-sub-blossom and note its forward edge.
allowedge[(p, q)] = allowedge[(q, p)] = True j += jstep if jstep == 1: v, w = b.edges[j] else: w, v = b.edges[j-1] # Step to the next T-sub-blossom. allowedge[(v, w)] = allowedge[(w, v)] = True j += jstep # Relabel the base T-sub-blossom WITHOUT stepping through to # its mate (so don't call assignLabel). bw = b.childs[j] label[w] = label[bw] = 2 labeledge[w] = labeledge[bw] = (v, w) bestedge[bw] = None # Continue along the blossom until we get back to entrychild. j += jstep while b.childs[j] != entrychild: # Examine the vertices of the sub-blossom to see whether # it is reachable from a neighbouring S-vertex outside the # expanding blossom. bv = b.childs[j] if label.get(bv) == 1: # This sub-blossom just got label S through one of its # neighbours; leave it be. j += jstep continue if isinstance(bv, Blossom): for v in bv.leaves(): if label.get(v): break else: v = bv # If the sub-blossom contains a reachable vertex, assign # label T to the sub-blossom. if label.get(v): assert label[v] == 2 assert inblossom[v] == bv label[v] = None label[mate[blossombase[bv]]] = None assignLabel(v, 2, labeledge[v][0]) j += jstep # Remove the expanded blossom entirely. label.pop(b, None) labeledge.pop(b, None) bestedge.pop(b, None) del blossomparent[b] del blossombase[b] del blossomdual[b] # Swap matched/unmatched edges over an alternating path through blossom b # between vertex v and the base vertex. Keep blossom bookkeeping consistent. def augmentBlossom(b, v): # Bubble up through the blossom tree from vertex v to an immediate # sub-blossom of b. t = v while blossomparent[t] != b: t = blossomparent[t] # Recursively deal with the first sub-blossom. if isinstance(t, Blossom): augmentBlossom(t, v) # Decide in which direction we will go round the blossom. i = j = b.childs.index(t) if i & 1: # Start index is odd; go forward and wrap. j -= len(b.childs) jstep = 1 else: # Start index is even; go backward. jstep = -1 # Move along the blossom until we get to the base. 
while j != 0: # Step to the next sub-blossom and augment it recursively. j += jstep t = b.childs[j] if jstep == 1: w, x = b.edges[j] else: x, w = b.edges[j-1] if isinstance(t, Blossom): augmentBlossom(t, w) # Step to the next sub-blossom and augment it recursively. j += jstep t = b.childs[j] if isinstance(t, Blossom): augmentBlossom(t, x) # Match the edge connecting those sub-blossoms. mate[w] = x mate[x] = w # Rotate the list of sub-blossoms to put the new base at the front. b.childs = b.childs[i:] + b.childs[:i] b.edges = b.edges[i:] + b.edges[:i] blossombase[b] = blossombase[b.childs[0]] assert blossombase[b] == v # Swap matched/unmatched edges over an alternating path between two # single vertices. The augmenting path runs through S-vertices v and w. def augmentMatching(v, w): for (s, j) in ((v, w), (w, v)): # Match vertex s to vertex j. Then trace back from s # until we find a single vertex, swapping matched and unmatched # edges as we go. while 1: bs = inblossom[s] assert label[bs] == 1 assert (labeledge[bs] is None and blossombase[bs] not in mate) or (labeledge[bs][0] == mate[blossombase[bs]]) # Augment through the S-blossom from s to base. if isinstance(bs, Blossom): augmentBlossom(bs, s) # Update mate[s] mate[s] = j # Trace one step back. if labeledge[bs] is None: # Reached single vertex; stop. break t = labeledge[bs][0] bt = inblossom[t] assert label[bt] == 2 # Trace one more step back. s, j = labeledge[bt] # Augment through the T-blossom from j to base. assert blossombase[bt] == t if isinstance(bt, Blossom): augmentBlossom(bt, j) # Update mate[j] mate[j] = s # Verify that the optimum solution has been reached. def verifyOptimum(): if maxcardinality: # Vertices may have negative dual; # find a constant non-negative number to add to all vertex duals. vdualoffset = max(0, -min(dualvar.values())) else: vdualoffset = 0 # 0. 
        # all dual variables are non-negative
        assert min(dualvar.values()) + vdualoffset >= 0
        assert len(blossomdual) == 0 or min(blossomdual.values()) >= 0
        # 0. all edges have non-negative slack and
        # 1. all matched edges have zero slack;
        for i,j,d in G.edges_iter(data=True):
            wt=d.get('weight',1)
            if i == j:
                continue # ignore self-loops
            s = dualvar[i] + dualvar[j] - 2 * wt
            iblossoms = [ i ]
            jblossoms = [ j ]
            while blossomparent[iblossoms[-1]] is not None:
                iblossoms.append(blossomparent[iblossoms[-1]])
            while blossomparent[jblossoms[-1]] is not None:
                jblossoms.append(blossomparent[jblossoms[-1]])
            iblossoms.reverse()
            jblossoms.reverse()
            for (bi, bj) in zip(iblossoms, jblossoms):
                if bi != bj:
                    break
                s += 2 * blossomdual[bi]
            assert s >= 0
            if mate.get(i) == j or mate.get(j) == i:
                assert mate[i] == j and mate[j] == i
                assert s == 0
        # 2. all single vertices have zero dual value;
        for v in gnodes:
            assert (v in mate) or dualvar[v] + vdualoffset == 0
        # 3. all blossoms with positive dual value are full.
        for b in blossomdual:
            if blossomdual[b] > 0:
                assert len(b.edges) % 2 == 1
                for (i, j) in b.edges[1::2]:
                    assert mate[i] == j and mate[j] == i
        # Ok.

    # Main loop: continue until no further improvement is possible.
    while 1:

        # Each iteration of this loop is a "stage".
        # A stage finds an augmenting path and uses that to improve
        # the matching.

        # Remove labels from top-level blossoms/vertices.
        label.clear()
        labeledge.clear()

        # Forget all about least-slack edges.
        bestedge.clear()
        for b in blossomdual:
            b.mybestedges = None

        # Loss of labeling means that we cannot be sure that currently
        # allowable edges remain allowable throughout this stage.
        allowedge.clear()

        # Make queue empty.
        queue[:] = [ ]

        # Label single blossoms/vertices with S and put them in the queue.
        for v in gnodes:
            if (v not in mate) and label.get(inblossom[v]) is None:
                assignLabel(v, 1, None)

        # Loop until we succeed in augmenting the matching.
        augmented = 0
        while 1:

            # Each iteration of this loop is a "substage".
# A substage tries to find an augmenting path; # if found, the path is used to improve the matching and # the stage ends. If there is no augmenting path, the # primal-dual method is used to pump some slack out of # the dual variables. # Continue labeling until all vertices which are reachable # through an alternating path have got a label. while queue and not augmented: # Take an S vertex from the queue. v = queue.pop() assert label[inblossom[v]] == 1 # Scan its neighbours: for w in G.neighbors_iter(v): if w == v: continue # ignore self-loops # w is a neighbour to v bv = inblossom[v] bw = inblossom[w] if bv == bw: # this edge is internal to a blossom; ignore it continue if (v, w) not in allowedge: kslack = slack(v, w) if kslack <= 0: # edge k has zero slack => it is allowable allowedge[(v, w)] = allowedge[(w, v)] = True if (v, w) in allowedge: if label.get(bw) is None: # (C1) w is a free vertex; # label w with T and label its mate with S (R12). assignLabel(w, 2, v) elif label.get(bw) == 1: # (C2) w is an S-vertex (not in the same blossom); # follow back-links to discover either an # augmenting path or a new blossom. base = scanBlossom(v, w) if base is not NoNode: # Found a new blossom; add it to the blossom # bookkeeping and turn it into an S-blossom. addBlossom(base, v, w) else: # Found an augmenting path; augment the # matching and end this stage. augmentMatching(v, w) augmented = 1 break elif label.get(w) is None: # w is inside a T-blossom, but w itself has not # yet been reached from outside the blossom; # mark it as reached (we need this to relabel # during T-blossom expansion). assert label[bw] == 2 label[w] = 2 labeledge[w] = (v, w) elif label.get(bw) == 1: # keep track of the least-slack non-allowable edge to # a different S-blossom. 
                        if bestedge.get(bv) is None or kslack < slack(*bestedge[bv]):
                            bestedge[bv] = (v, w)
                    elif label.get(w) is None:
                        # w is a free vertex (or an unreached vertex inside
                        # a T-blossom) but we can not reach it yet;
                        # keep track of the least-slack edge that reaches w.
                        if bestedge.get(w) is None or kslack < slack(*bestedge[w]):
                            bestedge[w] = (v, w)

            if augmented:
                break

            # There is no augmenting path under these constraints;
            # compute delta and reduce slack in the optimization problem.
            # (Note that our vertex dual variables, edge slacks and deltas
            # are pre-multiplied by two.)
            deltatype = -1
            delta = deltaedge = deltablossom = None

            # Compute delta1: the minimum value of any vertex dual.
            if not maxcardinality:
                deltatype = 1
                delta = min(dualvar.values())

            # Compute delta2: the minimum slack on any edge between
            # an S-vertex and a free vertex.
            for v in G.nodes_iter():
                if label.get(inblossom[v]) is None and bestedge.get(v) is not None:
                    d = slack(*bestedge[v])
                    if deltatype == -1 or d < delta:
                        delta = d
                        deltatype = 2
                        deltaedge = bestedge[v]

            # Compute delta3: half the minimum slack on any edge between
            # a pair of S-blossoms.
            for b in blossomparent:
                if ( blossomparent[b] is None and label.get(b) == 1 and
                     bestedge.get(b) is not None ):
                    kslack = slack(*bestedge[b])
                    if allinteger:
                        assert (kslack % 2) == 0
                        d = kslack // 2
                    else:
                        d = kslack / 2.0
                    if deltatype == -1 or d < delta:
                        delta = d
                        deltatype = 3
                        deltaedge = bestedge[b]

            # Compute delta4: minimum z variable of any T-blossom.
            for b in blossomdual:
                if ( blossomparent[b] is None and label.get(b) == 2 and
                     (deltatype == -1 or blossomdual[b] < delta) ):
                    delta = blossomdual[b]
                    deltatype = 4
                    deltablossom = b

            if deltatype == -1:
                # No further improvement possible; max-cardinality optimum
                # reached. Do a final delta update to make the optimum
                # verifiable.
                assert maxcardinality
                deltatype = 1
                delta = max(0, min(dualvar.values()))

            # Update dual variables according to delta.
            for v in gnodes:
                if label.get(inblossom[v]) == 1:
                    # S-vertex: 2*u = 2*u - 2*delta
                    dualvar[v] -= delta
                elif label.get(inblossom[v]) == 2:
                    # T-vertex: 2*u = 2*u + 2*delta
                    dualvar[v] += delta
            for b in blossomdual:
                if blossomparent[b] is None:
                    if label.get(b) == 1:
                        # top-level S-blossom: z = z + 2*delta
                        blossomdual[b] += delta
                    elif label.get(b) == 2:
                        # top-level T-blossom: z = z - 2*delta
                        blossomdual[b] -= delta

            # Take action at the point where minimum delta occurred.
            if deltatype == 1:
                # No further improvement possible; optimum reached.
                break
            elif deltatype == 2:
                # Use the least-slack edge to continue the search.
                (v, w) = deltaedge
                assert label[inblossom[v]] == 1
                allowedge[(v, w)] = allowedge[(w, v)] = True
                queue.append(v)
            elif deltatype == 3:
                # Use the least-slack edge to continue the search.
                (v, w) = deltaedge
                allowedge[(v, w)] = allowedge[(w, v)] = True
                assert label[inblossom[v]] == 1
                queue.append(v)
            elif deltatype == 4:
                # Expand the least-z blossom.
                expandBlossom(deltablossom, False)

            # End of this substage.

        # Paranoia check that the matching is symmetric.
        for v in mate:
            assert mate[mate[v]] == v

        # Stop when no more augmenting path can be found.
        if not augmented:
            break

        # End of a stage; expand all S-blossoms which have zero dual.
        for b in list(blossomdual.keys()):
            if b not in blossomdual:
                continue # already expanded
            if ( blossomparent[b] is None and label.get(b) == 1 and
                 blossomdual[b] == 0 ):
                expandBlossom(b, True)

    # Verify that we reached the optimum solution (only for integer weights).
    if allinteger:
        verifyOptimum()

    return mate
networkx-1.11/networkx/algorithms/cluster.py
# -*- coding: utf-8 -*-
"""Algorithms to characterize the number of triangles in a graph."""
from itertools import combinations
import networkx as nx
from networkx import NetworkXError
__author__ = """\n""".join(['Aric Hagberg ',
                            'Dan Schult (dschult@colgate.edu)',
                            'Pieter Swart (swart@lanl.gov)',
                            'Jordi Torrents '])
#    Copyright (C) 2004-2015 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
__all__ = ['triangles', 'average_clustering', 'clustering', 'transitivity',
           'square_clustering']

def triangles(G, nodes=None):
    """Compute the number of triangles.

    Finds the number of triangles that include a node as one vertex.

    Parameters
    ----------
    G : graph
       A networkx graph
    nodes : container of nodes, optional (default= all nodes in G)
       Compute triangles for nodes in this container.

    Returns
    -------
    out : dictionary
       Number of triangles keyed by node label.

    Examples
    --------
    >>> G=nx.complete_graph(5)
    >>> print(nx.triangles(G,0))
    6
    >>> print(nx.triangles(G))
    {0: 6, 1: 6, 2: 6, 3: 6, 4: 6}
    >>> print(list(nx.triangles(G,(0,1)).values()))
    [6, 6]

    Notes
    -----
    When computing triangles for the entire graph each triangle is counted
    three times, once at each node.  Self loops are ignored.

    """
    if G.is_directed():
        raise NetworkXError("triangles() is not defined for directed graphs.")
    if nodes in G:
        # return single value
        return next(_triangles_and_degree_iter(G,nodes))[2] // 2
    return dict( (v,t // 2) for v,d,t in _triangles_and_degree_iter(G,nodes))

def _triangles_and_degree_iter(G,nodes=None):
    """ Return an iterator of (node, degree, triangles).

    This double counts triangles so you may want to divide by 2.
    See degree() and triangles() for definitions and details.
""" if G.is_multigraph(): raise NetworkXError("Not defined for multigraphs.") if nodes is None: nodes_nbrs = G.adj.items() else: nodes_nbrs= ( (n,G[n]) for n in G.nbunch_iter(nodes) ) for v,v_nbrs in nodes_nbrs: vs=set(v_nbrs)-set([v]) ntriangles=0 for w in vs: ws=set(G[w])-set([w]) ntriangles+=len(vs.intersection(ws)) yield (v,len(vs),ntriangles) def _weighted_triangles_and_degree_iter(G, nodes=None, weight='weight'): """ Return an iterator of (node, degree, weighted_triangles). Used for weighted clustering. """ if G.is_multigraph(): raise NetworkXError("Not defined for multigraphs.") if weight is None or G.edges()==[]: max_weight=1.0 else: max_weight=float(max(d.get(weight,1.0) for u,v,d in G.edges(data=True))) if nodes is None: nodes_nbrs = G.adj.items() else: nodes_nbrs= ( (n,G[n]) for n in G.nbunch_iter(nodes) ) for i,nbrs in nodes_nbrs: inbrs=set(nbrs)-set([i]) weighted_triangles=0.0 seen=set() for j in inbrs: wij=G[i][j].get(weight,1.0)/max_weight seen.add(j) jnbrs=set(G[j])-seen # this keeps from double counting for k in inbrs&jnbrs: wjk=G[j][k].get(weight,1.0)/max_weight wki=G[i][k].get(weight,1.0)/max_weight weighted_triangles+=(wij*wjk*wki)**(1.0/3.0) yield (i,len(inbrs),weighted_triangles*2) def average_clustering(G, nodes=None, weight=None, count_zeros=True): r"""Compute the average clustering coefficient for the graph G. The clustering coefficient for the graph is the average, .. math:: C = \frac{1}{n}\sum_{v \in G} c_v, where `n` is the number of nodes in `G`. Parameters ---------- G : graph nodes : container of nodes, optional (default=all nodes in G) Compute average clustering for nodes in this container. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. count_zeros : bool If False include only the nodes with nonzero clustering in the average. 
Returns ------- avg : float Average clustering Examples -------- >>> G=nx.complete_graph(5) >>> print(nx.average_clustering(G)) 1.0 Notes ----- This is a space saving routine; it might be faster to use the clustering function to get a list and then take the average. Self loops are ignored. References ---------- .. [1] Generalizations of the clustering coefficient to weighted complex networks by J. Saramäki, M. Kivelä, J.-P. Onnela, K. Kaski, and J. Kertész, Physical Review E, 75 027105 (2007). http://jponnela.com/web_documents/a9.pdf .. [2] Marcus Kaiser, Mean clustering coefficients: the role of isolated nodes and leafs on clustering measures for small-world networks. http://arxiv.org/abs/0802.2512 """ c=clustering(G,nodes,weight=weight).values() if not count_zeros: c = [v for v in c if v > 0] return sum(c)/float(len(c)) def clustering(G, nodes=None, weight=None): r"""Compute the clustering coefficient for nodes. For unweighted graphs, the clustering of a node `u` is the fraction of possible triangles through that node that exist, .. math:: c_u = \frac{2 T(u)}{deg(u)(deg(u)-1)}, where `T(u)` is the number of triangles through node `u` and `deg(u)` is the degree of `u`. For weighted graphs, the clustering is defined as the geometric average of the subgraph edge weights [1]_, .. math:: c_u = \frac{1}{deg(u)(deg(u)-1))} \sum_{uv} (\hat{w}_{uv} \hat{w}_{uw} \hat{w}_{vw})^{1/3}. The edge weights `\hat{w}_{uv}` are normalized by the maximum weight in the network `\hat{w}_{uv} = w_{uv}/\max(w)`. The value of `c_u` is assigned to 0 if `deg(u) < 2`. Parameters ---------- G : graph nodes : container of nodes, optional (default=all nodes in G) Compute clustering for nodes in this container. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. 
Returns ------- out : float, or dictionary Clustering coefficient at specified nodes Examples -------- >>> G=nx.complete_graph(5) >>> print(nx.clustering(G,0)) 1.0 >>> print(nx.clustering(G)) {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0, 4: 1.0} Notes ----- Self loops are ignored. References ---------- .. [1] Generalizations of the clustering coefficient to weighted complex networks by J. Saramäki, M. Kivelä, J.-P. Onnela, K. Kaski, and J. Kertész, Physical Review E, 75 027105 (2007). http://jponnela.com/web_documents/a9.pdf """ if G.is_directed(): raise NetworkXError('Clustering algorithms are not defined ', 'for directed graphs.') if weight is not None: td_iter=_weighted_triangles_and_degree_iter(G,nodes,weight) else: td_iter=_triangles_and_degree_iter(G,nodes) clusterc={} for v,d,t in td_iter: if t==0: clusterc[v]=0.0 else: clusterc[v]=t/float(d*(d-1)) if nodes in G: return list(clusterc.values())[0] # return single value return clusterc def transitivity(G): r"""Compute graph transitivity, the fraction of all possible triangles present in G. Possible triangles are identified by the number of "triads" (two edges with a shared vertex). The transitivity is .. math:: T = 3\frac{\#triangles}{\#triads}. Parameters ---------- G : graph Returns ------- out : float Transitivity Examples -------- >>> G = nx.complete_graph(5) >>> print(nx.transitivity(G)) 1.0 """ triangles=0 # 6 times number of triangles contri=0 # 2 times number of connected triples for v,d,t in _triangles_and_degree_iter(G): contri += d*(d-1) triangles += t if triangles==0: # we had no triangles or possible triangles return 0.0 else: return triangles/float(contri) def square_clustering(G, nodes=None): r""" Compute the squares clustering coefficient for nodes. For each node return the fraction of possible squares that exist at the node [1]_ .. 
math:: C_4(v) = \frac{ \sum_{u=1}^{k_v} \sum_{w=u+1}^{k_v} q_v(u,w) }{ \sum_{u=1}^{k_v} \sum_{w=u+1}^{k_v} [a_v(u,w) + q_v(u,w)]}, where `q_v(u,w)` are the number of common neighbors of `u` and `w` other than `v` (ie squares), and `a_v(u,w) = (k_u - (1+q_v(u,w)+\theta_{uv}))(k_w - (1+q_v(u,w)+\theta_{uw}))`, where `\theta_{uw} = 1` if `u` and `w` are connected and 0 otherwise. Parameters ---------- G : graph nodes : container of nodes, optional (default=all nodes in G) Compute clustering for nodes in this container. Returns ------- c4 : dictionary A dictionary keyed by node with the square clustering coefficient value. Examples -------- >>> G=nx.complete_graph(5) >>> print(nx.square_clustering(G,0)) 1.0 >>> print(nx.square_clustering(G)) {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0, 4: 1.0} Notes ----- While `C_3(v)` (triangle clustering) gives the probability that two neighbors of node v are connected with each other, `C_4(v)` is the probability that two neighbors of node v share a common neighbor different from v. This algorithm can be applied to both bipartite and unipartite networks. References ---------- .. [1] Pedro G. Lind, Marta C. González, and Hans J. Herrmann. 2005 Cycles and clustering in bipartite networks. Physical Review E (72) 056127. """ if nodes is None: node_iter = G else: node_iter = G.nbunch_iter(nodes) clustering = {} for v in node_iter: clustering[v] = 0.0 potential=0 for u,w in combinations(G[v], 2): squares = len((set(G[u]) & set(G[w])) - set([v])) clustering[v] += squares degm = squares + 1.0 if w in G[u]: degm += 1 potential += (len(G[u]) - degm) * (len(G[w]) - degm) + squares if potential > 0: clustering[v] /= potential if nodes in G: return list(clustering.values())[0] # return single value return clustering networkx-1.11/networkx/algorithms/hierarchy.py0000644000175000017500000000336112637544450021572 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Flow Hierarchy. 
""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx __authors__ = "\n".join(['Ben Edwards (bedwards@cs.unm.edu)']) __all__ = ['flow_hierarchy'] def flow_hierarchy(G, weight=None): """Returns the flow hierarchy of a directed network. Flow hierarchy is defined as the fraction of edges not participating in cycles in a directed graph [1]_. Parameters ---------- G : DiGraph or MultiDiGraph A directed graph weight : key,optional (default=None) Attribute to use for node weights. If None the weight defaults to 1. Returns ------- h : float Flow heirarchy value Notes ----- The algorithm described in [1]_ computes the flow hierarchy through exponentiation of the adjacency matrix. This function implements an alternative approach that finds strongly connected components. An edge is in a cycle if and only if it is in a strongly connected component, which can be found in `O(m)` time using Tarjan's algorithm. References ---------- .. [1] Luo, J.; Magee, C.L. (2011), Detecting evolving patterns of self-organizing networks by flow hierarchy measurement, Complexity, Volume 16 Issue 6 53-61. DOI: 10.1002/cplx.20368 http://web.mit.edu/~cmagee/www/documents/28-DetectingEvolvingPatterns_FlowHierarchy.pdf """ if not G.is_directed(): raise nx.NetworkXError("G must be a digraph in flow_heirarchy") scc = nx.strongly_connected_components(G) return 1.-sum(G.subgraph(c).size(weight) for c in scc)/float(G.size(weight)) networkx-1.11/networkx/algorithms/isolate.py0000644000175000017500000000320512637544500021245 0ustar aricaric00000000000000# encoding: utf-8 """ Functions for identifying isolate (degree zero) nodes. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
import networkx as nx
__author__ = """\n""".join(['Drew Conway ',
                            'Aric Hagberg '])
__all__=['is_isolate','isolates']

def is_isolate(G,n):
    """Determine whether node n is an isolate (degree zero).

    Parameters
    ----------
    G : graph
        A networkx graph
    n : node
        A node in G

    Returns
    -------
    isolate : bool
       True if n has no neighbors, False otherwise.

    Examples
    --------
    >>> G=nx.Graph()
    >>> G.add_edge(1,2)
    >>> G.add_node(3)
    >>> nx.is_isolate(G,2)
    False
    >>> nx.is_isolate(G,3)
    True
    """
    return G.degree(n)==0

def isolates(G):
    """Return list of isolates in the graph.

    Isolates are nodes with no neighbors (degree zero).

    Parameters
    ----------
    G : graph
        A networkx graph

    Returns
    -------
    isolates : list
       List of isolate nodes.

    Examples
    --------
    >>> G = nx.Graph()
    >>> G.add_edge(1,2)
    >>> G.add_node(3)
    >>> nx.isolates(G)
    [3]

    To remove all isolates in the graph use
    >>> G.remove_nodes_from(nx.isolates(G))
    >>> G.nodes()
    [1, 2]

    For digraphs isolates have zero in-degree and zero out-degree
    >>> G = nx.DiGraph([(0,1),(1,2)])
    >>> G.add_node(3)
    >>> nx.isolates(G)
    [3]
    """
    return [n for (n,d) in G.degree_iter() if d==0]
networkx-1.11/networkx/algorithms/__init__.py
from networkx.algorithms.assortativity import *
from networkx.algorithms.block import *
from networkx.algorithms.boundary import *
from networkx.algorithms.centrality import *
from networkx.algorithms.cluster import *
from networkx.algorithms.clique import *
from networkx.algorithms.community import *
from networkx.algorithms.components import *
from networkx.algorithms.coloring import *
from networkx.algorithms.core import *
from networkx.algorithms.cycles import *
from networkx.algorithms.dag import *
from networkx.algorithms.distance_measures import *
from networkx.algorithms.dominance import *
from networkx.algorithms.dominating import *
from networkx.algorithms.hierarchy import *
from networkx.algorithms.hybrid import *
from networkx.algorithms.matching import *
from networkx.algorithms.minors import * from networkx.algorithms.mis import * from networkx.algorithms.mst import * from networkx.algorithms.link_analysis import * from networkx.algorithms.link_prediction import * from networkx.algorithms.operators import * from networkx.algorithms.shortest_paths import * from networkx.algorithms.smetric import * from networkx.algorithms.triads import * from networkx.algorithms.traversal import * from networkx.algorithms.isolate import * from networkx.algorithms.euler import * from networkx.algorithms.vitality import * from networkx.algorithms.chordal import * from networkx.algorithms.richclub import * from networkx.algorithms.distance_regular import * from networkx.algorithms.swap import * from networkx.algorithms.graphical import * from networkx.algorithms.simple_paths import * import networkx.algorithms.assortativity import networkx.algorithms.bipartite import networkx.algorithms.centrality import networkx.algorithms.cluster import networkx.algorithms.clique import networkx.algorithms.components import networkx.algorithms.connectivity import networkx.algorithms.coloring import networkx.algorithms.flow import networkx.algorithms.isomorphism import networkx.algorithms.link_analysis import networkx.algorithms.shortest_paths import networkx.algorithms.traversal import networkx.algorithms.chordal import networkx.algorithms.operators import networkx.algorithms.tree # bipartite from networkx.algorithms.bipartite import (projected_graph, project, is_bipartite, complete_bipartite_graph) # connectivity from networkx.algorithms.connectivity import (minimum_edge_cut, minimum_node_cut, average_node_connectivity, edge_connectivity, node_connectivity, stoer_wagner, all_pairs_node_connectivity, all_node_cuts, k_components) # isomorphism from networkx.algorithms.isomorphism import (is_isomorphic, could_be_isomorphic, fast_could_be_isomorphic, faster_could_be_isomorphic) # flow from networkx.algorithms.flow import (maximum_flow, 
maximum_flow_value, minimum_cut, minimum_cut_value, capacity_scaling, network_simplex, min_cost_flow_cost, max_flow_min_cost, min_cost_flow, cost_of_flow) from .tree.recognition import * from .tree.branchings import ( maximum_branching, minimum_branching, maximum_spanning_arborescence, minimum_spanning_arborescence ) networkx-1.11/networkx/algorithms/assortativity/0000755000175000017500000000000012653231454022157 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/assortativity/__init__.py0000644000175000017500000000044612637544450024301 0ustar aricaric00000000000000from networkx.algorithms.assortativity.connectivity import * from networkx.algorithms.assortativity.correlation import * from networkx.algorithms.assortativity.mixing import * from networkx.algorithms.assortativity.neighbor_degree import * from networkx.algorithms.assortativity.pairs import * networkx-1.11/networkx/algorithms/assortativity/pairs.py0000644000175000017500000000725612637544500023662 0ustar aricaric00000000000000#-*- coding: utf-8 -*- """Generators of x-y pairs of node data.""" import networkx as nx from networkx.utils import dict_to_numpy_array __author__ = ' '.join(['Aric Hagberg ']) __all__ = ['node_attribute_xy', 'node_degree_xy'] def node_attribute_xy(G, attribute, nodes=None): """Return iterator of node-attribute pairs for all edges in G. Parameters ---------- G: NetworkX graph attribute: key The node attribute key. nodes: list or iterable (optional) Use only edges that are adjacency to specified nodes. The default is all nodes. Returns ------- (x,y): 2-tuple Generates 2-tuple of (attribute,attribute) values. Examples -------- >>> G = nx.DiGraph() >>> G.add_node(1,color='red') >>> G.add_node(2,color='blue') >>> G.add_edge(1,2) >>> list(nx.node_attribute_xy(G,'color')) [('red', 'blue')] Notes ----- For undirected graphs each edge is produced twice, once for each edge representation (u,v) and (v,u), with the exception of self-loop edges which only appear once. 
""" if nodes is None: nodes = set(G) else: nodes = set(nodes) node = G.node for u,nbrsdict in G.adjacency_iter(): if u not in nodes: continue uattr = node[u].get(attribute,None) if G.is_multigraph(): for v,keys in nbrsdict.items(): vattr = node[v].get(attribute,None) for k,d in keys.items(): yield (uattr,vattr) else: for v,eattr in nbrsdict.items(): vattr = node[v].get(attribute,None) yield (uattr,vattr) def node_degree_xy(G, x='out', y='in', weight=None, nodes=None): """Generate node degree-degree pairs for edges in G. Parameters ---------- G: NetworkX graph x: string ('in','out') The degree type for source node (directed graphs only). y: string ('in','out') The degree type for target node (directed graphs only). weight: string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. nodes: list or iterable (optional) Use only edges that are adjacency to specified nodes. The default is all nodes. Returns ------- (x,y): 2-tuple Generates 2-tuple of (degree,degree) values. Examples -------- >>> G = nx.DiGraph() >>> G.add_edge(1,2) >>> list(nx.node_degree_xy(G,x='out',y='in')) [(1, 1)] >>> list(nx.node_degree_xy(G,x='in',y='out')) [(0, 0)] Notes ----- For undirected graphs each edge is produced twice, once for each edge representation (u,v) and (v,u), with the exception of self-loop edges which only appear once. 
""" if nodes is None: nodes = set(G) else: nodes = set(nodes) xdeg = G.degree_iter ydeg = G.degree_iter if G.is_directed(): direction = {'out':G.out_degree_iter, 'in':G.in_degree_iter} xdeg = direction[x] ydeg = direction[y] for u,degu in xdeg(nodes, weight=weight): neighbors = (nbr for _,nbr in G.edges_iter(u) if nbr in nodes) for v,degv in ydeg(neighbors, weight=weight): yield degu,degv # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except: raise SkipTest("NumPy not available") try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/assortativity/correlation.py0000644000175000017500000002062512637544450025064 0ustar aricaric00000000000000#-*- coding: utf-8 -*- """Node assortativity coefficients and correlation measures. """ import networkx as nx from networkx.algorithms.assortativity.mixing import degree_mixing_matrix, \ attribute_mixing_matrix, numeric_mixing_matrix from networkx.algorithms.assortativity.pairs import node_degree_xy, \ node_attribute_xy __author__ = ' '.join(['Aric Hagberg ', 'Oleguer Sagarra ']) __all__ = ['degree_pearson_correlation_coefficient', 'degree_assortativity_coefficient', 'attribute_assortativity_coefficient', 'numeric_assortativity_coefficient'] def degree_assortativity_coefficient(G, x='out', y='in', weight=None, nodes=None): """Compute degree assortativity of graph. Assortativity measures the similarity of connections in the graph with respect to the node degree. Parameters ---------- G : NetworkX graph x: string ('in','out') The degree type for source node (directed graphs only). y: string ('in','out') The degree type for target node (directed graphs only). weight: string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. 
    nodes: list or iterable (optional)
        Compute degree assortativity only for nodes in container.
        The default is all nodes.

    Returns
    -------
    r : float
       Assortativity of graph by degree.

    Examples
    --------
    >>> G=nx.path_graph(4)
    >>> r=nx.degree_assortativity_coefficient(G)
    >>> print("%3.1f"%r)
    -0.5

    See Also
    --------
    attribute_assortativity_coefficient
    numeric_assortativity_coefficient
    neighbor_connectivity
    degree_mixing_dict
    degree_mixing_matrix

    Notes
    -----
    This computes Eq. (21) in Ref. [1]_ , where e is the joint
    probability distribution (mixing matrix) of the degrees. If G is
    directed then the matrix e is the joint probability of the
    user-specified degree type for the source and target.

    References
    ----------
    .. [1] M. E. J. Newman, Mixing patterns in networks,
           Physical Review E, 67 026126, 2003
    .. [2] Foster, J.G., Foster, D.V., Grassberger, P. & Paczuski, M.
       Edge direction and the structure of networks, PNAS 107, 10815-20 (2010).
    """
    M = degree_mixing_matrix(G, x=x, y=y, nodes=nodes, weight=weight)
    return numeric_ac(M)


def degree_pearson_correlation_coefficient(G, x='out', y='in', weight=None,
                                           nodes=None):
    """Compute degree assortativity of graph.

    Assortativity measures the similarity of connections
    in the graph with respect to the node degree.

    This is the same as degree_assortativity_coefficient but uses the
    potentially faster scipy.stats.pearsonr function.

    Parameters
    ----------
    G : NetworkX graph

    x: string ('in','out')
       The degree type for the source node (directed graphs only).

    y: string ('in','out')
       The degree type for the target node (directed graphs only).

    weight: string or None, optional (default=None)
       The edge attribute that holds the numerical value used as a weight.
       If None, then each edge has weight 1.
       The degree is the sum of the edge weights adjacent to the node.

    nodes: list or iterable (optional)
        Compute Pearson correlation of degrees only for specified nodes.
        The default is all nodes.

    Returns
    -------
    r : float
       Assortativity of graph by degree.
    Examples
    --------
    >>> G=nx.path_graph(4)
    >>> r=nx.degree_pearson_correlation_coefficient(G)
    >>> print("%3.1f"%r)
    -0.5

    Notes
    -----
    This calls scipy.stats.pearsonr.

    References
    ----------
    .. [1] M. E. J. Newman, Mixing patterns in networks
           Physical Review E, 67 026126, 2003
    .. [2] Foster, J.G., Foster, D.V., Grassberger, P. & Paczuski, M.
       Edge direction and the structure of networks, PNAS 107, 10815-20 (2010).
    """
    try:
        import scipy.stats as stats
    except ImportError:
        raise ImportError(
            "Assortativity requires SciPy: http://scipy.org/ ")
    xy = node_degree_xy(G, x=x, y=y, nodes=nodes, weight=weight)
    x, y = zip(*xy)
    return stats.pearsonr(x, y)[0]


def attribute_assortativity_coefficient(G, attribute, nodes=None):
    """Compute assortativity for node attributes.

    Assortativity measures the similarity of connections
    in the graph with respect to the given attribute.

    Parameters
    ----------
    G : NetworkX graph

    attribute : string
        Node attribute key.

    nodes: list or iterable (optional)
        Compute attribute assortativity for nodes in container.
        The default is all nodes.

    Returns
    -------
    r: float
       Assortativity of graph for given attribute

    Examples
    --------
    >>> G=nx.Graph()
    >>> G.add_nodes_from([0,1],color='red')
    >>> G.add_nodes_from([2,3],color='blue')
    >>> G.add_edges_from([(0,1),(2,3)])
    >>> print(nx.attribute_assortativity_coefficient(G,'color'))
    1.0

    Notes
    -----
    This computes Eq. (2) in Ref. [1]_ , (trace(M)-sum(M))/(1-sum(M)),
    where M is the joint probability distribution (mixing matrix)
    of the specified attribute.

    References
    ----------
    .. [1] M. E. J. Newman, Mixing patterns in networks,
           Physical Review E, 67 026126, 2003
    """
    M = attribute_mixing_matrix(G, attribute, nodes)
    return attribute_ac(M)


def numeric_assortativity_coefficient(G, attribute, nodes=None):
    """Compute assortativity for numerical node attributes.

    Assortativity measures the similarity of connections
    in the graph with respect to the given numeric attribute.
    Parameters
    ----------
    G : NetworkX graph

    attribute : string
        Node attribute key.

    nodes: list or iterable (optional)
        Compute numeric assortativity only for attributes of nodes in
        container. The default is all nodes.

    Returns
    -------
    r: float
       Assortativity of graph for given attribute

    Examples
    --------
    >>> G=nx.Graph()
    >>> G.add_nodes_from([0,1],size=2)
    >>> G.add_nodes_from([2,3],size=3)
    >>> G.add_edges_from([(0,1),(2,3)])
    >>> print(nx.numeric_assortativity_coefficient(G,'size'))
    1.0

    Notes
    -----
    This computes Eq. (21) in Ref. [1]_ , for the mixing matrix
    of the specified attribute.

    References
    ----------
    .. [1] M. E. J. Newman, Mixing patterns in networks
           Physical Review E, 67 026126, 2003
    """
    a = numeric_mixing_matrix(G, attribute, nodes)
    return numeric_ac(a)


def attribute_ac(M):
    """Compute assortativity for attribute matrix M.

    Parameters
    ----------
    M : numpy array or matrix
        Attribute mixing matrix.

    Notes
    -----
    This computes Eq. (2) in Ref. [1]_ , (trace(e)-sum(e))/(1-sum(e)),
    where e is the joint probability distribution (mixing matrix)
    of the specified attribute.

    References
    ----------
    .. [1] M. E. J.
Newman, Mixing patterns in networks, Physical Review E, 67 026126, 2003 """ try: import numpy except ImportError: raise ImportError( "attribute_assortativity requires NumPy: http://scipy.org/ ") if M.sum() != 1.0: M=M/float(M.sum()) M=numpy.asmatrix(M) s=(M*M).sum() t=M.trace() r=(t-s)/(1-s) return float(r) def numeric_ac(M): # M is a numpy matrix or array # numeric assortativity coefficient, pearsonr try: import numpy except ImportError: raise ImportError('numeric_assortativity requires ', 'NumPy: http://scipy.org/') if M.sum() != 1.0: M=M/float(M.sum()) nx,ny=M.shape # nx=ny x=numpy.arange(nx) y=numpy.arange(ny) a=M.sum(axis=0) b=M.sum(axis=1) vara=(a*x**2).sum()-((a*x).sum())**2 varb=(b*x**2).sum()-((b*x).sum())**2 xy=numpy.outer(x,y) ab=numpy.outer(a,b) return (xy*(M-ab)).sum()/numpy.sqrt(vara*varb) # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except: raise SkipTest("NumPy not available") try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/assortativity/tests/0000755000175000017500000000000012653231454023321 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/assortativity/tests/test_mixing.py0000644000175000017500000001414712637544450026241 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx as nx from base_test import BaseTestAttributeMixing,BaseTestDegreeMixing class TestDegreeMixingDict(BaseTestDegreeMixing): def test_degree_mixing_dict_undirected(self): d=nx.degree_mixing_dict(self.P4) d_result={1:{2:2}, 2:{1:2,2:2}, } assert_equal(d,d_result) def test_degree_mixing_dict_undirected_normalized(self): d=nx.degree_mixing_dict(self.P4, normalized=True) d_result={1:{2:1.0/3}, 2:{1:1.0/3,2:1.0/3}, } assert_equal(d,d_result) def test_degree_mixing_dict_directed(self): d=nx.degree_mixing_dict(self.D) print(d) d_result={1:{3:2}, 2:{1:1,3:1}, 3:{} } assert_equal(d,d_result) def 
test_degree_mixing_dict_multigraph(self): d=nx.degree_mixing_dict(self.M) d_result={1:{2:1}, 2:{1:1,3:3}, 3:{2:3} } assert_equal(d,d_result) class TestDegreeMixingMatrix(BaseTestDegreeMixing): @classmethod def setupClass(cls): global np global npt try: import numpy as np import numpy.testing as npt except ImportError: raise SkipTest('NumPy not available.') def test_degree_mixing_matrix_undirected(self): a_result=np.array([[0,0,0], [0,0,2], [0,2,2]] ) a=nx.degree_mixing_matrix(self.P4,normalized=False) npt.assert_equal(a,a_result) a=nx.degree_mixing_matrix(self.P4) npt.assert_equal(a,a_result/float(a_result.sum())) def test_degree_mixing_matrix_directed(self): a_result=np.array([[0,0,0,0], [0,0,0,2], [0,1,0,1], [0,0,0,0]] ) a=nx.degree_mixing_matrix(self.D,normalized=False) npt.assert_equal(a,a_result) a=nx.degree_mixing_matrix(self.D) npt.assert_equal(a,a_result/float(a_result.sum())) def test_degree_mixing_matrix_multigraph(self): a_result=np.array([[0,0,0,0], [0,0,1,0], [0,1,0,3], [0,0,3,0]] ) a=nx.degree_mixing_matrix(self.M,normalized=False) npt.assert_equal(a,a_result) a=nx.degree_mixing_matrix(self.M) npt.assert_equal(a,a_result/float(a_result.sum())) def test_degree_mixing_matrix_selfloop(self): a_result=np.array([[0,0,0], [0,0,0], [0,0,2]] ) a=nx.degree_mixing_matrix(self.S,normalized=False) npt.assert_equal(a,a_result) a=nx.degree_mixing_matrix(self.S) npt.assert_equal(a,a_result/float(a_result.sum())) class TestAttributeMixingDict(BaseTestAttributeMixing): def test_attribute_mixing_dict_undirected(self): d=nx.attribute_mixing_dict(self.G,'fish') d_result={'one':{'one':2,'red':1}, 'two':{'two':2,'blue':1}, 'red':{'one':1}, 'blue':{'two':1} } assert_equal(d,d_result) def test_attribute_mixing_dict_directed(self): d=nx.attribute_mixing_dict(self.D,'fish') d_result={'one':{'one':1,'red':1}, 'two':{'two':1,'blue':1}, 'red':{}, 'blue':{} } assert_equal(d,d_result) def test_attribute_mixing_dict_multigraph(self): d=nx.attribute_mixing_dict(self.M,'fish') 
d_result={'one':{'one':4}, 'two':{'two':2}, } assert_equal(d,d_result) class TestAttributeMixingMatrix(BaseTestAttributeMixing): @classmethod def setupClass(cls): global np global npt try: import numpy as np import numpy.testing as npt except ImportError: raise SkipTest('NumPy not available.') def test_attribute_mixing_matrix_undirected(self): mapping={'one':0,'two':1,'red':2,'blue':3} a_result=np.array([[2,0,1,0], [0,2,0,1], [1,0,0,0], [0,1,0,0]] ) a=nx.attribute_mixing_matrix(self.G,'fish', mapping=mapping, normalized=False) npt.assert_equal(a,a_result) a=nx.attribute_mixing_matrix(self.G,'fish', mapping=mapping) npt.assert_equal(a,a_result/float(a_result.sum())) def test_attribute_mixing_matrix_directed(self): mapping={'one':0,'two':1,'red':2,'blue':3} a_result=np.array([[1,0,1,0], [0,1,0,1], [0,0,0,0], [0,0,0,0]] ) a=nx.attribute_mixing_matrix(self.D,'fish', mapping=mapping, normalized=False) npt.assert_equal(a,a_result) a=nx.attribute_mixing_matrix(self.D,'fish', mapping=mapping) npt.assert_equal(a,a_result/float(a_result.sum())) def test_attribute_mixing_matrix_multigraph(self): mapping={'one':0,'two':1,'red':2,'blue':3} a_result=np.array([[4,0,0,0], [0,2,0,0], [0,0,0,0], [0,0,0,0]] ) a=nx.attribute_mixing_matrix(self.M,'fish', mapping=mapping, normalized=False) npt.assert_equal(a,a_result) a=nx.attribute_mixing_matrix(self.M,'fish', mapping=mapping) npt.assert_equal(a,a_result/float(a_result.sum())) networkx-1.11/networkx/algorithms/assortativity/tests/base_test.py0000644000175000017500000000272112637544450025653 0ustar aricaric00000000000000import networkx as nx class BaseTestAttributeMixing(object): def setUp(self): G=nx.Graph() G.add_nodes_from([0,1],fish='one') G.add_nodes_from([2,3],fish='two') G.add_nodes_from([4],fish='red') G.add_nodes_from([5],fish='blue') G.add_edges_from([(0,1),(2,3),(0,4),(2,5)]) self.G=G D=nx.DiGraph() D.add_nodes_from([0,1],fish='one') D.add_nodes_from([2,3],fish='two') D.add_nodes_from([4],fish='red') 
D.add_nodes_from([5],fish='blue') D.add_edges_from([(0,1),(2,3),(0,4),(2,5)]) self.D=D M=nx.MultiGraph() M.add_nodes_from([0,1],fish='one') M.add_nodes_from([2,3],fish='two') M.add_nodes_from([4],fish='red') M.add_nodes_from([5],fish='blue') M.add_edges_from([(0,1),(0,1),(2,3)]) self.M=M S=nx.Graph() S.add_nodes_from([0,1],fish='one') S.add_nodes_from([2,3],fish='two') S.add_nodes_from([4],fish='red') S.add_nodes_from([5],fish='blue') S.add_edge(0,0) S.add_edge(2,2) self.S=S class BaseTestDegreeMixing(object): def setUp(self): self.P4=nx.path_graph(4) self.D=nx.DiGraph() self.D.add_edges_from([(0, 2), (0, 3), (1, 3), (2, 3)]) self.M=nx.MultiGraph() self.M.add_path(list(range(4))) self.M.add_edge(0,1) self.S=nx.Graph() self.S.add_edges_from([(0,0),(1,1)]) networkx-1.11/networkx/algorithms/assortativity/tests/test_neighbor_degree.py0000644000175000017500000000462612637544450030057 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestAverageNeighbor(object): def test_degree_p4(self): G=nx.path_graph(4) answer={0:2,1:1.5,2:1.5,3:2} nd = nx.average_neighbor_degree(G) assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D) assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D) assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D, source='in', target='in') assert_equal(nd,answer) def test_degree_p4_weighted(self): G=nx.path_graph(4) G[1][2]['weight']=4 answer={0:2,1:1.8,2:1.8,3:2} nd = nx.average_neighbor_degree(G,weight='weight') assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D,weight='weight') assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D,weight='weight') assert_equal(nd,answer) nd = nx.average_neighbor_degree(D,source='out',target='out', weight='weight') assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D,source='in',target='in', weight='weight') assert_equal(nd,answer) 
def test_degree_k4(self): G=nx.complete_graph(4) answer={0:3,1:3,2:3,3:3} nd = nx.average_neighbor_degree(G) assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D) assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D) assert_equal(nd,answer) D=G.to_directed() nd = nx.average_neighbor_degree(D,source='in',target='in') assert_equal(nd,answer) def test_degree_k4_nodes(self): G=nx.complete_graph(4) answer={1:3.0,2:3.0} nd = nx.average_neighbor_degree(G,nodes=[1,2]) assert_equal(nd,answer) def test_degree_barrat(self): G=nx.star_graph(5) G.add_edges_from([(5,6),(5,7),(5,8),(5,9)]) G[0][5]['weight']=5 nd = nx.average_neighbor_degree(G)[5] assert_equal(nd,1.8) nd = nx.average_neighbor_degree(G,weight='weight')[5] assert_almost_equal(nd,3.222222,places=5) networkx-1.11/networkx/algorithms/assortativity/tests/test_pairs.py0000644000175000017500000000741112637544450026060 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from base_test import BaseTestAttributeMixing,BaseTestDegreeMixing class TestAttributeMixingXY(BaseTestAttributeMixing): def test_node_attribute_xy_undirected(self): attrxy=sorted(nx.node_attribute_xy(self.G,'fish')) attrxy_result=sorted([('one','one'), ('one','one'), ('two','two'), ('two','two'), ('one','red'), ('red','one'), ('blue','two'), ('two','blue') ]) assert_equal(attrxy,attrxy_result) def test_node_attribute_xy_undirected_nodes(self): attrxy=sorted(nx.node_attribute_xy(self.G,'fish', nodes=['one','yellow'])) attrxy_result=sorted( [ ]) assert_equal(attrxy,attrxy_result) def test_node_attribute_xy_directed(self): attrxy=sorted(nx.node_attribute_xy(self.D,'fish')) attrxy_result=sorted([('one','one'), ('two','two'), ('one','red'), ('two','blue') ]) assert_equal(attrxy,attrxy_result) def test_node_attribute_xy_multigraph(self): attrxy=sorted(nx.node_attribute_xy(self.M,'fish')) attrxy_result=[('one','one'), ('one','one'), ('one','one'), ('one','one'), 
('two','two'), ('two','two') ] assert_equal(attrxy,attrxy_result) def test_node_attribute_xy_selfloop(self): attrxy=sorted(nx.node_attribute_xy(self.S,'fish')) attrxy_result=[('one','one'), ('two','two') ] assert_equal(attrxy,attrxy_result) class TestDegreeMixingXY(BaseTestDegreeMixing): def test_node_degree_xy_undirected(self): xy=sorted(nx.node_degree_xy(self.P4)) xy_result=sorted([(1,2), (2,1), (2,2), (2,2), (1,2), (2,1)]) assert_equal(xy,xy_result) def test_node_degree_xy_undirected_nodes(self): xy=sorted(nx.node_degree_xy(self.P4,nodes=[0,1,-1])) xy_result=sorted([(1,2), (2,1),]) assert_equal(xy,xy_result) def test_node_degree_xy_directed(self): xy=sorted(nx.node_degree_xy(self.D)) xy_result=sorted([(2,1), (2,3), (1,3), (1,3)]) assert_equal(xy,xy_result) def test_node_degree_xy_multigraph(self): xy=sorted(nx.node_degree_xy(self.M)) xy_result=sorted([(2,3), (2,3), (3,2), (3,2), (2,3), (3,2), (1,2), (2,1)]) assert_equal(xy,xy_result) def test_node_degree_xy_selfloop(self): xy=sorted(nx.node_degree_xy(self.S)) xy_result=sorted([(2,2), (2,2)]) assert_equal(xy,xy_result) def test_node_degree_xy_weighted(self): G = nx.Graph() G.add_edge(1,2,weight=7) G.add_edge(2,3,weight=10) xy=sorted(nx.node_degree_xy(G,weight='weight')) xy_result=sorted([(7,17), (17,10), (17,7), (10,17)]) assert_equal(xy,xy_result) networkx-1.11/networkx/algorithms/assortativity/tests/test_correlation.py0000644000175000017500000000631412637544450027264 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx as nx from base_test import BaseTestAttributeMixing,BaseTestDegreeMixing from networkx.algorithms.assortativity.correlation import attribute_ac class TestDegreeMixingCorrelation(BaseTestDegreeMixing): @classmethod def setupClass(cls): global np global npt try: import numpy as np import numpy.testing as npt except ImportError: raise SkipTest('NumPy not available.') try: import scipy import scipy.stats except ImportError: raise 
SkipTest('SciPy not available.') def test_degree_assortativity_undirected(self): r=nx.degree_assortativity_coefficient(self.P4) npt.assert_almost_equal(r,-1.0/2,decimal=4) def test_degree_assortativity_directed(self): r=nx.degree_assortativity_coefficient(self.D) npt.assert_almost_equal(r,-0.57735,decimal=4) def test_degree_assortativity_multigraph(self): r=nx.degree_assortativity_coefficient(self.M) npt.assert_almost_equal(r,-1.0/7.0,decimal=4) def test_degree_assortativity_undirected(self): r=nx.degree_pearson_correlation_coefficient(self.P4) npt.assert_almost_equal(r,-1.0/2,decimal=4) def test_degree_assortativity_directed(self): r=nx.degree_pearson_correlation_coefficient(self.D) npt.assert_almost_equal(r,-0.57735,decimal=4) def test_degree_assortativity_multigraph(self): r=nx.degree_pearson_correlation_coefficient(self.M) npt.assert_almost_equal(r,-1.0/7.0,decimal=4) class TestAttributeMixingCorrelation(BaseTestAttributeMixing): @classmethod def setupClass(cls): global np global npt try: import numpy as np import numpy.testing as npt except ImportError: raise SkipTest('NumPy not available.') def test_attribute_assortativity_undirected(self): r=nx.attribute_assortativity_coefficient(self.G,'fish') assert_equal(r,6.0/22.0) def test_attribute_assortativity_directed(self): r=nx.attribute_assortativity_coefficient(self.D,'fish') assert_equal(r,1.0/3.0) def test_attribute_assortativity_multigraph(self): r=nx.attribute_assortativity_coefficient(self.M,'fish') assert_equal(r,1.0) def test_attribute_assortativity_coefficient(self): # from "Mixing patterns in networks" a=np.array([[0.258,0.016,0.035,0.013], [0.012,0.157,0.058,0.019], [0.013,0.023,0.306,0.035], [0.005,0.007,0.024,0.016]]) r=attribute_ac(a) npt.assert_almost_equal(r,0.623,decimal=3) def test_attribute_assortativity_coefficient2(self): a=np.array([[0.18,0.02,0.01,0.03], [0.02,0.20,0.03,0.02], [0.01,0.03,0.16,0.01], [0.03,0.02,0.01,0.22]]) r=attribute_ac(a) npt.assert_almost_equal(r,0.68,decimal=2) def 
test_attribute_assortativity(self): a=np.array([[50,50,0],[50,50,0],[0,0,2]]) r=attribute_ac(a) npt.assert_almost_equal(r,0.029,decimal=3) networkx-1.11/networkx/algorithms/assortativity/tests/test_connectivity.py0000644000175000017500000001032312637544450027454 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestNeighborConnectivity(object): def test_degree_p4(self): G=nx.path_graph(4) answer={1:2.0,2:1.5} nd = nx.average_degree_connectivity(G) assert_equal(nd,answer) D=G.to_directed() answer={2:2.0,4:1.5} nd = nx.average_degree_connectivity(D) assert_equal(nd,answer) answer={1:2.0,2:1.5} D=G.to_directed() nd = nx.average_degree_connectivity(D, source='in', target='in') assert_equal(nd,answer) D=G.to_directed() nd = nx.average_degree_connectivity(D, source='in', target='in') assert_equal(nd,answer) def test_degree_p4_weighted(self): G=nx.path_graph(4) G[1][2]['weight']=4 answer={1:2.0,2:1.8} nd = nx.average_degree_connectivity(G,weight='weight') assert_equal(nd,answer) answer={1:2.0,2:1.5} nd = nx.average_degree_connectivity(G) assert_equal(nd,answer) D=G.to_directed() answer={2:2.0,4:1.8} nd = nx.average_degree_connectivity(D,weight='weight') assert_equal(nd,answer) answer={1:2.0,2:1.8} D=G.to_directed() nd = nx.average_degree_connectivity(D,weight='weight', source='in', target='in') assert_equal(nd,answer) D=G.to_directed() nd = nx.average_degree_connectivity(D,source='in',target='out', weight='weight') assert_equal(nd,answer) def test_weight_keyword(self): G=nx.path_graph(4) G[1][2]['other']=4 answer={1:2.0,2:1.8} nd = nx.average_degree_connectivity(G,weight='other') assert_equal(nd,answer) answer={1:2.0,2:1.5} nd = nx.average_degree_connectivity(G,weight=None) assert_equal(nd,answer) D=G.to_directed() answer={2:2.0,4:1.8} nd = nx.average_degree_connectivity(D,weight='other') assert_equal(nd,answer) answer={1:2.0,2:1.8} D=G.to_directed() nd = nx.average_degree_connectivity(D,weight='other', source='in', 
target='in') assert_equal(nd,answer) D=G.to_directed() nd = nx.average_degree_connectivity(D,weight='other',source='in', target='in') assert_equal(nd,answer) def test_degree_barrat(self): G=nx.star_graph(5) G.add_edges_from([(5,6),(5,7),(5,8),(5,9)]) G[0][5]['weight']=5 nd = nx.average_degree_connectivity(G)[5] assert_equal(nd,1.8) nd = nx.average_degree_connectivity(G,weight='weight')[5] assert_almost_equal(nd,3.222222,places=5) nd = nx.k_nearest_neighbors(G,weight='weight')[5] assert_almost_equal(nd,3.222222,places=5) def test_zero_deg(self): G=nx.DiGraph() G.add_edge(1,2) G.add_edge(1,3) G.add_edge(1,4) c = nx.average_degree_connectivity(G) assert_equal(c,{1:0,3:1}) c = nx.average_degree_connectivity(G, source='in', target='in') assert_equal(c,{0:0,1:0}) c = nx.average_degree_connectivity(G, source='in', target='out') assert_equal(c,{0:0,1:3}) c = nx.average_degree_connectivity(G, source='in', target='in+out') assert_equal(c,{0:0,1:3}) c = nx.average_degree_connectivity(G, source='out', target='out') assert_equal(c,{0:0,3:0}) c = nx.average_degree_connectivity(G, source='out', target='in') assert_equal(c,{0:0,3:1}) c = nx.average_degree_connectivity(G, source='out', target='in+out') assert_equal(c,{0:0,3:1}) def test_in_out_weight(self): from itertools import permutations G=nx.DiGraph() G.add_edge(1,2,weight=1) G.add_edge(1,3,weight=1) G.add_edge(3,1,weight=1) for s,t in permutations(['in','out','in+out'],2): c = nx.average_degree_connectivity(G, source=s, target=t) cw = nx.average_degree_connectivity(G,source=s, target=t, weight='weight') assert_equal(c,cw) networkx-1.11/networkx/algorithms/assortativity/connectivity.py0000644000175000017500000001025312637544500025251 0ustar aricaric00000000000000#-*- coding: utf-8 -*- # Copyright (C) 2011 by # Jordi Torrents # Aric Hagberg # All rights reserved. # BSD license. 
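The tests above exercise `average_degree_connectivity`, which buckets nodes by degree k and averages their neighbors' degrees. A minimal, dependency-free sketch of the unweighted case follows; the helper name `avg_degree_connectivity` is illustrative (not part of the package API), and `adj` is a plain adjacency dict standing in for a NetworkX graph:

```python
from collections import defaultdict

def avg_degree_connectivity(adj):
    # Degree of each node in a simple undirected adjacency dict.
    deg = {n: len(nbrs) for n, nbrs in adj.items()}
    dsum = defaultdict(float)   # sum of neighbor degrees, keyed by degree k
    dnorm = defaultdict(float)  # total degree of source nodes with degree k
    for n, nbrs in adj.items():
        k = deg[n]
        dsum[k] += sum(deg[m] for m in nbrs)
        dnorm[k] += k
    # Average nearest-neighbor degree for each degree class k.
    return {k: dsum[k] / dnorm[k] for k in dsum if dnorm[k] > 0}

# Path graph 0-1-2-3, as in test_degree_p4 above.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(avg_degree_connectivity(adj))  # {1: 2.0, 2: 1.5}
```

Normalizing by the total degree of each class (rather than the node count) mirrors how `_avg_deg_conn` below accumulates `dnorm`, and generalizes to the weighted case.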
from collections import defaultdict import networkx as nx __author__ = """\n""".join(['Jordi Torrents ', 'Aric Hagberg (hagberg@lanl.gov)']) __all__ = ['average_degree_connectivity', 'k_nearest_neighbors'] def _avg_deg_conn(G, neighbors, source_degree, target_degree, nodes=None, weight=None): # "k nearest neighbors, or neighbor_connectivity dsum = defaultdict(float) dnorm = defaultdict(float) for n,k in source_degree(nodes).items(): nbrdeg = target_degree(neighbors(n)) if weight is None: s = float(sum(nbrdeg.values())) else: # weight nbr degree by weight of (n,nbr) edge if neighbors == G.neighbors: s = float(sum((G[n][nbr].get(weight,1)*d for nbr,d in nbrdeg.items()))) elif neighbors == G.successors: s = float(sum((G[n][nbr].get(weight,1)*d for nbr,d in nbrdeg.items()))) elif neighbors == G.predecessors: s = float(sum((G[nbr][n].get(weight,1)*d for nbr,d in nbrdeg.items()))) dnorm[k] += source_degree(n, weight=weight) dsum[k] += s # normalize dc = {} for k,avg in dsum.items(): dc[k]=avg norm = dnorm[k] if avg > 0 and norm > 0: dc[k]/=norm return dc def average_degree_connectivity(G, source="in+out", target="in+out", nodes=None, weight=None): r"""Compute the average degree connectivity of graph. The average degree connectivity is the average nearest neighbor degree of nodes with degree k. For weighted graphs, an analogous measure can be computed using the weighted average neighbors degree defined in [1]_, for a node `i`, as .. math:: k_{nn,i}^{w} = \frac{1}{s_i} \sum_{j \in N(i)} w_{ij} k_j where `s_i` is the weighted degree of node `i`, `w_{ij}` is the weight of the edge that links `i` and `j`, and `N(i)` are the neighbors of node `i`. Parameters ---------- G : NetworkX graph source : "in"|"out"|"in+out" (default:"in+out") Directed graphs only. Use "in"- or "out"-degree for source node. target : "in"|"out"|"in+out" (default:"in+out" Directed graphs only. Use "in"- or "out"-degree for target node. 
nodes : list or iterable (optional) Compute neighbor connectivity for these nodes. The default is all nodes. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. Returns ------- d : dict A dictionary keyed by degree k with the value of average connectivity. Examples -------- >>> G=nx.path_graph(4) >>> G.edge[1][2]['weight'] = 3 >>> nx.k_nearest_neighbors(G) {1: 2.0, 2: 1.5} >>> nx.k_nearest_neighbors(G, weight='weight') {1: 2.0, 2: 1.75} See also -------- neighbors_average_degree Notes ----- This algorithm is sometimes called "k nearest neighbors" and is also available as ``k_nearest_neighbors``. References ---------- .. [1] A. Barrat, M. Barthélemy, R. Pastor-Satorras, and A. Vespignani, "The architecture of complex weighted networks". PNAS 101 (11): 3747–3752 (2004). """ source_degree = G.degree target_degree = G.degree neighbors = G.neighbors if G.is_directed(): direction = {'out': G.out_degree, 'in': G.in_degree, 'in+out': G.degree} source_degree = direction[source] target_degree = direction[target] if source == 'in': neighbors = G.predecessors elif source == 'out': neighbors = G.successors return _avg_deg_conn(G, neighbors, source_degree, target_degree, nodes=nodes, weight=weight) k_nearest_neighbors = average_degree_connectivity networkx-1.11/networkx/algorithms/assortativity/mixing.py0000644000175000017500000001513212637544450024033 0ustar aricaric00000000000000#-*- coding: utf-8 -*- """ Mixing matrices for node attributes and degree. 
""" import networkx as nx from networkx.utils import dict_to_numpy_array from networkx.algorithms.assortativity.pairs import node_degree_xy, \ node_attribute_xy __author__ = ' '.join(['Aric Hagberg ']) __all__ = ['attribute_mixing_matrix', 'attribute_mixing_dict', 'degree_mixing_matrix', 'degree_mixing_dict', 'numeric_mixing_matrix', 'mixing_dict'] def attribute_mixing_dict(G,attribute,nodes=None,normalized=False): """Return dictionary representation of mixing matrix for attribute. Parameters ---------- G : graph NetworkX graph object. attribute : string Node attribute key. nodes: list or iterable (optional) Unse nodes in container to build the dict. The default is all nodes. normalized : bool (default=False) Return counts if False or probabilities if True. Examples -------- >>> G=nx.Graph() >>> G.add_nodes_from([0,1],color='red') >>> G.add_nodes_from([2,3],color='blue') >>> G.add_edge(1,3) >>> d=nx.attribute_mixing_dict(G,'color') >>> print(d['red']['blue']) 1 >>> print(d['blue']['red']) # d symmetric for undirected graphs 1 Returns ------- d : dictionary Counts or joint probability of occurrence of attribute pairs. """ xy_iter=node_attribute_xy(G,attribute,nodes) return mixing_dict(xy_iter,normalized=normalized) def attribute_mixing_matrix(G,attribute,nodes=None,mapping=None, normalized=True): """Return mixing matrix for attribute. Parameters ---------- G : graph NetworkX graph object. attribute : string Node attribute key. nodes: list or iterable (optional) Use only nodes in container to build the matrix. The default is all nodes. mapping : dictionary, optional Mapping from node attribute to integer index in matrix. If not specified, an arbitrary ordering will be used. normalized : bool (default=False) Return counts if False or probabilities if True. Returns ------- m: numpy array Counts or joint probability of occurrence of attribute pairs. 
""" d=attribute_mixing_dict(G,attribute,nodes) a=dict_to_numpy_array(d,mapping=mapping) if normalized: a=a/a.sum() return a def degree_mixing_dict(G, x='out', y='in', weight=None, nodes=None, normalized=False): """Return dictionary representation of mixing matrix for degree. Parameters ---------- G : graph NetworkX graph object. x: string ('in','out') The degree type for source node (directed graphs only). y: string ('in','out') The degree type for target node (directed graphs only). weight: string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. normalized : bool (default=False) Return counts if False or probabilities if True. Returns ------- d: dictionary Counts or joint probability of occurrence of degree pairs. """ xy_iter=node_degree_xy(G, x=x, y=y, nodes=nodes, weight=weight) return mixing_dict(xy_iter,normalized=normalized) def degree_mixing_matrix(G, x='out', y='in', weight=None, nodes=None, normalized=True): """Return mixing matrix for attribute. Parameters ---------- G : graph NetworkX graph object. x: string ('in','out') The degree type for source node (directed graphs only). y: string ('in','out') The degree type for target node (directed graphs only). nodes: list or iterable (optional) Build the matrix using only nodes in container. The default is all nodes. weight: string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. normalized : bool (default=False) Return counts if False or probabilities if True. Returns ------- m: numpy array Counts, or joint probability, of occurrence of node degree. 
""" d=degree_mixing_dict(G, x=x, y=y, nodes=nodes, weight=weight) s=set(d.keys()) for k,v in d.items(): s.update(v.keys()) m=max(s) mapping=dict(zip(range(m+1),range(m+1))) a=dict_to_numpy_array(d,mapping=mapping) if normalized: a=a/a.sum() return a def numeric_mixing_matrix(G,attribute,nodes=None,normalized=True): """Return numeric mixing matrix for attribute. Parameters ---------- G : graph NetworkX graph object. attribute : string Node attribute key. nodes: list or iterable (optional) Build the matrix only with nodes in container. The default is all nodes. normalized : bool (default=False) Return counts if False or probabilities if True. Returns ------- m: numpy array Counts, or joint, probability of occurrence of node attribute pairs. """ d=attribute_mixing_dict(G,attribute,nodes) s=set(d.keys()) for k,v in d.items(): s.update(v.keys()) m=max(s) mapping=dict(zip(range(m+1),range(m+1))) a=dict_to_numpy_array(d,mapping=mapping) if normalized: a=a/a.sum() return a def mixing_dict(xy,normalized=False): """Return a dictionary representation of mixing matrix. Parameters ---------- xy : list or container of two-tuples Pairs of (x,y) items. attribute : string Node attribute key normalized : bool (default=False) Return counts if False or probabilities if True. Returns ------- d: dictionary Counts or Joint probability of occurrence of values in xy. 
""" d={} psum=0.0 for x,y in xy: if x not in d: d[x]={} if y not in d: d[y]={} v = d[x].get(y,0) d[x][y] = v+1 psum+=1 if normalized: for k,jdict in d.items(): for j in jdict: jdict[j]/=psum return d # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except: raise SkipTest("NumPy not available") try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/assortativity/neighbor_degree.py0000644000175000017500000001032512637544500025643 0ustar aricaric00000000000000#-*- coding: utf-8 -*- # Copyright (C) 2011 by # Jordi Torrents # Aric Hagberg # All rights reserved. # BSD license. import networkx as nx __author__ = """\n""".join(['Jordi Torrents ', 'Aric Hagberg (hagberg@lanl.gov)']) __all__ = ["average_neighbor_degree"] def _average_nbr_deg(G, source_degree, target_degree, nodes=None, weight=None): # average degree of neighbors avg = {} for n,deg in source_degree(nodes,weight=weight).items(): # normalize but not by zero degree if deg == 0: deg = 1 nbrdeg = target_degree(G[n]) if weight is None: avg[n] = sum(nbrdeg.values())/float(deg) else: avg[n] = sum((G[n][nbr].get(weight,1)*d for nbr,d in nbrdeg.items()))/float(deg) return avg def average_neighbor_degree(G, source='out', target='out', nodes=None, weight=None): r"""Returns the average degree of the neighborhood of each node. The average degree of a node `i` is .. math:: k_{nn,i} = \frac{1}{|N(i)|} \sum_{j \in N(i)} k_j where `N(i)` are the neighbors of node `i` and `k_j` is the degree of node `j` which belongs to `N(i)`. For weighted graphs, an analogous measure can be defined [1]_, .. math:: k_{nn,i}^{w} = \frac{1}{s_i} \sum_{j \in N(i)} w_{ij} k_j where `s_i` is the weighted degree of node `i`, `w_{ij}` is the weight of the edge that links `i` and `j` and `N(i)` are the neighbors of node `i`. Parameters ---------- G : NetworkX graph source : string ("in"|"out") Directed graphs only. Use "in"- or "out"-degree for source node. 
target : string ("in"|"out") Directed graphs only. Use "in"- or "out"-degree for target node. nodes : list or iterable, optional Compute neighbor degree for specified nodes. The default is all nodes in the graph. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. Returns ------- d: dict A dictionary keyed by node with average neighbors degree value. Examples -------- >>> G=nx.path_graph(4) >>> G.edge[0][1]['weight'] = 5 >>> G.edge[2][3]['weight'] = 3 >>> nx.average_neighbor_degree(G) {0: 2.0, 1: 1.5, 2: 1.5, 3: 2.0} >>> nx.average_neighbor_degree(G, weight='weight') {0: 2.0, 1: 1.1666666666666667, 2: 1.25, 3: 2.0} >>> G=nx.DiGraph() >>> G.add_path([0,1,2,3]) >>> nx.average_neighbor_degree(G, source='in', target='in') {0: 1.0, 1: 1.0, 2: 1.0, 3: 0.0} >>> nx.average_neighbor_degree(G, source='out', target='out') {0: 1.0, 1: 1.0, 2: 0.0, 3: 0.0} Notes ----- For directed graphs you can also specify in-degree or out-degree by passing keyword arguments. See Also -------- average_degree_connectivity References ---------- .. [1] A. Barrat, M. Barthélemy, R. Pastor-Satorras, and A. Vespignani, "The architecture of complex weighted networks". PNAS 101 (11): 3747–3752 (2004). 
""" source_degree = G.degree target_degree = G.degree if G.is_directed(): direction = {'out':G.out_degree, 'in':G.in_degree} source_degree = direction[source] target_degree = direction[target] return _average_nbr_deg(G, source_degree, target_degree, nodes=nodes, weight=weight) # obsolete # def average_neighbor_in_degree(G, nodes=None, weight=None): # if not G.is_directed(): # raise nx.NetworkXError("Not defined for undirected graphs.") # return _average_nbr_deg(G, G.in_degree, G.in_degree, nodes, weight) # average_neighbor_in_degree.__doc__=average_neighbor_degree.__doc__ # def average_neighbor_out_degree(G, nodes=None, weight=None): # if not G.is_directed(): # raise nx.NetworkXError("Not defined for undirected graphs.") # return _average_nbr_deg(G, G.out_degree, G.out_degree, nodes, weight) # average_neighbor_out_degree.__doc__=average_neighbor_degree.__doc__ networkx-1.11/networkx/algorithms/dominating.py0000644000175000017500000000424712637544450021751 0ustar aricaric00000000000000# -*- coding: utf-8 -*- import networkx as nx __author__ = '\n'.join(['Jordi Torrents ']) __all__ = [ 'dominating_set', 'is_dominating_set'] def dominating_set(G, start_with=None): r"""Finds a dominating set for the graph G. A dominating set for a graph `G = (V, E)` is a node subset `D` of `V` such that every node not in `D` is adjacent to at least one member of `D` [1]_. Parameters ---------- G : NetworkX graph start_with : Node (default=None) Node to use as a starting point for the algorithm. Returns ------- D : set A dominating set for G. Notes ----- This function is an implementation of algorithm 7 in [2]_ which finds some dominating set, not necessarily the smallest one. See also -------- is_dominating_set References ---------- .. [1] http://en.wikipedia.org/wiki/Dominating_set .. [2] Abdol-Hossein Esfahanian. Connectivity Algorithms. 
http://www.cse.msu.edu/~cse835/Papers/Graph_connectivity_revised.pdf """ all_nodes = set(G) if start_with is None: v = set(G).pop() # pick a node else: if start_with not in G: raise nx.NetworkXError('node %s not in G' % start_with) v = start_with D = set([v]) ND = set(G[v]) other = all_nodes - ND - D while other: w = other.pop() D.add(w) ND.update([nbr for nbr in G[w] if nbr not in D]) other = all_nodes - ND - D return D def is_dominating_set(G, nbunch): r"""Checks if nodes in nbunch are a dominating set for G. A dominating set for a graph `G = (V, E)` is a node subset `D` of `V` such that every node not in `D` is adjacent to at least one member of `D` [1]_. Parameters ---------- G : NetworkX graph nbunch : Node container See also -------- dominating_set References ---------- .. [1] http://en.wikipedia.org/wiki/Dominating_set """ testset = set(n for n in nbunch if n in G) nbrs = set() for n in testset: nbrs.update(G[n]) if len(set(G) - testset - nbrs) > 0: return False else: return True networkx-1.11/networkx/algorithms/graphical.py0000644000175000017500000003127612637544500021550 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """Test sequences for graphiness. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from collections import defaultdict import heapq import networkx as nx __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult (dschult@colgate.edu)' 'Joel Miller (joel.c.miller.research@gmail.com)' 'Ben Edwards' 'Brian Cloteaux ']) __all__ = ['is_graphical', 'is_multigraphical', 'is_pseudographical', 'is_digraphical', 'is_valid_degree_sequence_erdos_gallai', 'is_valid_degree_sequence_havel_hakimi', 'is_valid_degree_sequence', # deprecated ] def is_graphical(sequence, method='eg'): """Returns True if sequence is a valid degree sequence. A degree sequence is valid if some graph can realize it. 
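For intuition, the criterion behind the 'eg' method can be written naively in a few lines. This is an O(n^2) sketch for illustration only, not the optimized O(n) reformulation the library uses:

```python
def erdos_gallai_graphical(degrees):
    """Naive Erdos-Gallai test: the sum must be even, and for every k the
    sum of the k largest degrees must not exceed k(k-1) + sum(min(d, k))
    over the remaining degrees."""
    d = sorted(degrees, reverse=True)
    n = len(d)
    if sum(d) % 2:
        return False
    for k in range(1, n + 1):
        lhs = sum(d[:k])
        rhs = k * (k - 1) + sum(min(x, k) for x in d[k:])
        if lhs > rhs:
            return False
    return True

print(erdos_gallai_graphical([2, 1, 1, 2]))  # path graph degrees -> True
print(erdos_gallai_graphical([3, 3, 3, 1]))  # even sum but fails at k=2 -> False
```

The second sequence shows why the even-sum check alone is not enough: three nodes cannot all have degree 3 when the fourth has degree 1.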
Parameters ---------- sequence : list or iterable container A sequence of integer node degrees method : "eg" | "hh" The method used to validate the degree sequence. "eg" corresponds to the Erdős-Gallai algorithm, and "hh" to the Havel-Hakimi algorithm. Returns ------- valid : bool True if the sequence is a valid degree sequence and False if not. Examples -------- >>> G = nx.path_graph(4) >>> sequence = G.degree().values() >>> nx.is_valid_degree_sequence(sequence) True References ---------- Erdős-Gallai [EG1960]_, [choudum1986]_ Havel-Hakimi [havel1955]_, [hakimi1962]_, [CL1996]_ """ if method == 'eg': valid = is_valid_degree_sequence_erdos_gallai(list(sequence)) elif method == 'hh': valid = is_valid_degree_sequence_havel_hakimi(list(sequence)) else: msg = "`method` must be 'eg' or 'hh'" raise nx.NetworkXException(msg) return valid is_valid_degree_sequence = is_graphical def _basic_graphical_tests(deg_sequence): # Sort and perform some simple tests on the sequence if not nx.utils.is_list_of_ints(deg_sequence): raise nx.NetworkXUnfeasible p = len(deg_sequence) num_degs = [0]*p dmax, dmin, dsum, n = 0, p, 0, 0 for d in deg_sequence: # Reject if degree is negative or larger than the sequence length if d<0 or d>=p: raise nx.NetworkXUnfeasible # Process only the non-zero integers elif d>0: dmax, dmin, dsum, n = max(dmax,d), min(dmin,d), dsum+d, n+1 num_degs[d] += 1 # Reject sequence if it has odd sum or is oversaturated if dsum%2 or dsum>n*(n-1): raise nx.NetworkXUnfeasible return dmax,dmin,dsum,n,num_degs def is_valid_degree_sequence_havel_hakimi(deg_sequence): r"""Returns True if deg_sequence can be realized by a simple graph. The validation proceeds using the Havel-Hakimi theorem. Worst-case run time is: O(s) where s is the sum of the sequence. Parameters ---------- deg_sequence : list A list of integers where each element specifies the degree of a node in a graph. Returns ------- valid : bool True if deg_sequence is graphical and False if not. 
Notes ----- The ZZ condition says that for the sequence d if .. math:: |d| >= \frac{(\max(d) + \min(d) + 1)^2}{4*\min(d)} then d is graphical. This was shown in Theorem 6 in [1]_. References ---------- .. [1] I.E. Zverovich and V.E. Zverovich. "Contributions to the theory of graphic sequences", Discrete Mathematics, 105, pp. 292-303 (1992). [havel1955]_, [hakimi1962]_, [CL1996]_ """ try: dmax,dmin,dsum,n,num_degs = _basic_graphical_tests(deg_sequence) except nx.NetworkXUnfeasible: return False # Accept if sequence has no non-zero degrees or passes the ZZ condition if n==0 or 4*dmin*n >= (dmax+dmin+1) * (dmax+dmin+1): return True modstubs = [0]*(dmax+1) # Successively reduce degree sequence by removing the maximum degree while n > 0: # Retrieve the maximum degree in the sequence while num_degs[dmax] == 0: dmax -= 1; # If there are not enough stubs to connect to, then the sequence is # not graphical if dmax > n-1: return False # Remove largest stub in list num_degs[dmax], n = num_degs[dmax]-1, n-1 # Reduce the next dmax largest stubs mslen = 0 k = dmax for i in range(dmax): while num_degs[k] == 0: k -= 1 num_degs[k], n = num_degs[k]-1, n-1 if k > 1: modstubs[mslen] = k-1 mslen += 1 # Add back to the list any non-zero stubs that were removed for i in range(mslen): stub = modstubs[i] num_degs[stub], n = num_degs[stub]+1, n+1 return True def is_valid_degree_sequence_erdos_gallai(deg_sequence): r"""Returns True if deg_sequence can be realized by a simple graph. The validation is done using the Erdős-Gallai theorem [EG1960]_. Parameters ---------- deg_sequence : list A list of integers Returns ------- valid : bool True if deg_sequence is graphical and False if not. Notes ----- This implementation uses an equivalent form of the Erdős-Gallai criterion. Worst-case run time is: O(n) where n is the length of the sequence. Specifically, a sequence d is graphical if and only if the sum of the sequence is even and for all strong indices k in the sequence, .. 
math:: \sum_{i=1}^{k} d_i \leq k(k-1) + \sum_{j=k+1}^{n} \min(d_i,k) = k(n-1) - ( k \sum_{j=0}^{k-1} n_j - \sum_{j=0}^{k-1} j n_j ) A strong index k is any index where `d_k \geq k` and the value `n_j` is the number of occurrences of j in d. The maximal strong index is called the Durfee index. This particular rearrangement comes from the proof of Theorem 3 in [2]_. The ZZ condition says that for the sequence d if .. math:: |d| >= \frac{(\max(d) + \min(d) + 1)^2}{4*\min(d)} then d is graphical. This was shown in Theorem 6 in [2]_. References ---------- .. [1] A. Tripathi and S. Vijay. "A note on a theorem of Erdős & Gallai", Discrete Mathematics, 265, pp. 417-420 (2003). .. [2] I.E. Zverovich and V.E. Zverovich. "Contributions to the theory of graphic sequences", Discrete Mathematics, 105, pp. 292-303 (1992). [EG1960]_, [choudum1986]_ """ try: dmax,dmin,dsum,n,num_degs = _basic_graphical_tests(deg_sequence) except nx.NetworkXUnfeasible: return False # Accept if sequence has no non-zero degrees or passes the ZZ condition if n==0 or 4*dmin*n >= (dmax+dmin+1) * (dmax+dmin+1): return True # Perform the EG checks using the reformulation of Zverovich and Zverovich k, sum_deg, sum_nj, sum_jnj = 0, 0, 0, 0 for dk in range(dmax, dmin-1, -1): if dk < k+1: # Check if already past Durfee index return True if num_degs[dk] > 0: run_size = num_degs[dk] # Process a run of identical-valued degrees if dk < k+run_size: # Check if end of run is past Durfee index run_size = dk-k # Adjust back to Durfee index sum_deg += run_size * dk for v in range(run_size): sum_nj += num_degs[k+v] sum_jnj += (k+v) * num_degs[k+v] k += run_size if sum_deg > k*(n-1) - k*sum_nj + sum_jnj: return False return True def is_multigraphical(sequence): """Returns True if some multigraph can realize the sequence. Parameters ---------- deg_sequence : list A list of integers Returns ------- valid : bool True if deg_sequence is a multigraphic degree sequence and False if not. 
Notes ----- The worst-case run time is O(n) where n is the length of the sequence. References ---------- .. [1] S. L. Hakimi. "On the realizability of a set of integers as degrees of the vertices of a linear graph", J. SIAM, 10, pp. 496-506 (1962). """ deg_sequence = list(sequence) if not nx.utils.is_list_of_ints(deg_sequence): return False dsum, dmax = 0, 0 for d in deg_sequence: if d<0: return False dsum, dmax = dsum+d, max(dmax,d) if dsum%2 or dsum<2*dmax: return False return True def is_pseudographical(sequence): """Returns True if some pseudograph can realize the sequence. Every nonnegative integer sequence with an even sum is pseudographical (see [1]_). Parameters ---------- sequence : list or iterable container A sequence of integer node degrees Returns ------- valid : bool True if the sequence is a pseudographic degree sequence and False if not. Notes ----- The worst-case run time is O(n) where n is the length of the sequence. References ---------- .. [1] F. Boesch and F. Harary. "Line removal algorithms for graphs and their degree lists", IEEE Trans. Circuits and Systems, CAS-23(12), pp. 778-782 (1976). """ s = list(sequence) if not nx.utils.is_list_of_ints(s): return False return sum(s)%2 == 0 and min(s) >= 0 def is_digraphical(in_sequence, out_sequence): r"""Returns True if some directed graph can realize the in- and out-degree sequences. Parameters ---------- in_sequence : list or iterable container A sequence of integer node in-degrees out_sequence : list or iterable container A sequence of integer node out-degrees Returns ------- valid : bool True if in and out-sequences are digraphic False if not. Notes ----- This algorithm is from Kleitman and Wang [1]_. The worst case runtime is O(s * log n) where s and n are the sum and length of the sequences respectively. References ---------- .. [1] D.J. Kleitman and D.L. Wang Algorithms for Constructing Graphs and Digraphs with Given Valences and Factors, Discrete Mathematics, 6(1), pp. 
79-88 (1973) """ in_deg_sequence = list(in_sequence) out_deg_sequence = list(out_sequence) if not nx.utils.is_list_of_ints(in_deg_sequence): return False if not nx.utils.is_list_of_ints(out_deg_sequence): return False # Process the sequences and form two heaps to store degree pairs with # either zero or non-zero out degrees sumin, sumout, nin, nout = 0, 0, len(in_deg_sequence), len(out_deg_sequence) maxn = max(nin, nout) maxin = 0 if maxn==0: return True stubheap, zeroheap = [ ], [ ] for n in range(maxn): in_deg, out_deg = 0, 0 if n < nout: out_deg = out_deg_sequence[n] if n < nin: in_deg = in_deg_sequence[n] if in_deg < 0 or out_deg < 0: return False sumin, sumout, maxin = sumin+in_deg, sumout+out_deg, max(maxin, in_deg) if in_deg > 0: stubheap.append((-1*out_deg, -1*in_deg)) elif out_deg > 0: zeroheap.append(-1*out_deg) if sumin != sumout: return False heapq.heapify(stubheap) heapq.heapify(zeroheap) modstubs = [(0,0)]*(maxin+1) # Successively reduce degree sequence by removing the maximum out degree while stubheap: # Take the first value in the sequence with non-zero in degree (freeout, freein) = heapq.heappop(stubheap) freein *= -1 if freein > len(stubheap)+len(zeroheap): return False # Attach out stubs to the nodes with the most in stubs mslen = 0 for i in range(freein): if zeroheap and (not stubheap or stubheap[0][0] > zeroheap[0]): stubout = heapq.heappop(zeroheap) stubin = 0 else: (stubout, stubin) = heapq.heappop(stubheap) if stubout == 0: return False # Check if target is now totally connected if stubout+1<0 or stubin<0: modstubs[mslen] = (stubout+1, stubin) mslen += 1 # Add back the nodes to the heap that still have available stubs for i in range(mslen): stub = modstubs[i] if stub[1] < 0: heapq.heappush(stubheap, stub) else: heapq.heappush(zeroheap, stub[0]) if freeout<0: heapq.heappush(zeroheap, freeout) return True networkx-1.11/networkx/algorithms/approximation/clustering_coefficient.py0000644000175000017500000000371012637544500027215 0ustar aricaric00000000000000# -*- coding: utf-8 -*- # Copyright (C) 2013 by # Fred
Morstatter # Jordi Torrents # All rights reserved. # BSD license. import random from networkx.utils import not_implemented_for __all__ = ['average_clustering'] __author__ = """\n""".join(['Fred Morstatter ', 'Jordi Torrents ']) @not_implemented_for('directed') def average_clustering(G, trials=1000): r"""Estimates the average clustering coefficient of G. The local clustering of each node in `G` is the fraction of triangles that actually exist over all possible triangles in its neighborhood. The average clustering coefficient of a graph `G` is the mean of local clusterings. This function finds an approximate average clustering coefficient for G by repeating `n` times (defined in `trials`) the following experiment: choose a node at random, choose two of its neighbors at random, and check if they are connected. The approximate coefficient is the fraction of triangles found over the number of trials [1]_. Parameters ---------- G : NetworkX graph trials : integer Number of trials to perform (default 1000). Returns ------- c : float Approximated average clustering coefficient. References ---------- .. [1] Schank, Thomas, and Dorothea Wagner. Approximating clustering coefficient and transitivity. Universität Karlsruhe, Fakultät für Informatik, 2004. http://www.emis.ams.org/journals/JGAA/accepted/2005/SchankWagner2005.9.2.pdf """ n = len(G) triangles = 0 nodes = G.nodes() for i in [int(random.random() * n) for i in range(trials)]: nbrs = list(G[nodes[i]]) if len(nbrs) < 2: continue u, v = random.sample(nbrs, 2) if u in G[v]: triangles += 1 return triangles / float(trials) networkx-1.11/networkx/algorithms/approximation/independent_set.py0000644000175000017500000000371512637544450025661 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Independent Set Independent set or stable set is a set of vertices in a graph, no two of which are adjacent. That is, it is a set I of vertices such that for every two vertices in I, there is no edge connecting the two. 
Equivalently, each edge in the graph has at most one endpoint in I. The size of an independent set is the number of vertices it contains. A maximum independent set is a largest independent set for a given graph G and its size is denoted α(G). The problem of finding such a set is called the maximum independent set problem and is an NP-hard optimization problem. As such, it is unlikely that there exists an efficient algorithm for finding a maximum independent set of a graph. http://en.wikipedia.org/wiki/Independent_set_(graph_theory) Independent set algorithm is based on the following paper: `O(|V|/(log|V|)^2)` apx of maximum clique/independent set. Boppana, R., & Halldórsson, M. M. (1992). Approximating maximum independent sets by excluding subgraphs. BIT Numerical Mathematics, 32(2), 180–196. Springer. doi:10.1007/BF01994876 """ # Copyright (C) 2011-2012 by # Nicholas Mancuso # All rights reserved. # BSD license. from networkx.algorithms.approximation import clique_removal __all__ = ["maximum_independent_set"] __author__ = """Nicholas Mancuso (nick.mancuso@gmail.com)""" def maximum_independent_set(G): """Return an approximate maximum independent set. Parameters ---------- G : NetworkX graph Undirected graph Returns ------- iset : Set The apx-maximum independent set Notes ----- Finds the `O(|V|/(log|V|)^2)` apx of independent set in the worst case. References ---------- .. [1] Boppana, R., & Halldórsson, M. M. (1992). Approximating maximum independent sets by excluding subgraphs. BIT Numerical Mathematics, 32(2), 180–196. Springer. """ iset, _ = clique_removal(G) return iset networkx-1.11/networkx/algorithms/approximation/ramsey.py0000644000175000017500000000155612637544500024006 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Ramsey numbers. """ # Copyright (C) 2011 by # Nicholas Mancuso # All rights reserved. # BSD license. 
import networkx as nx __all__ = ["ramsey_R2"] __author__ = """Nicholas Mancuso (nick.mancuso@gmail.com)""" def ramsey_R2(G): r"""Approximately computes the Ramsey number `R(2;s,t)` for graph. Parameters ---------- G : NetworkX graph Undirected graph Returns ------- max_pair : (set, set) tuple Maximum clique, Maximum independent set. """ if not G: return (set([]), set([])) node = next(G.nodes_iter()) nbrs = nx.all_neighbors(G, node) nnbrs = nx.non_neighbors(G, node) c_1, i_1 = ramsey_R2(G.subgraph(nbrs)) c_2, i_2 = ramsey_R2(G.subgraph(nnbrs)) c_1.add(node) i_2.add(node) return (max([c_1, c_2]), max([i_1, i_2])) networkx-1.11/networkx/algorithms/approximation/kcomponents.py0000644000175000017500000003136412637544500025046 0ustar aricaric00000000000000""" Fast approximation for k-component structure """ # Copyright (C) 2015 by # Jordi Torrents # All rights reserved. # BSD license. import itertools import collections import networkx as nx from networkx.exception import NetworkXError from networkx.utils import not_implemented_for from networkx.algorithms.approximation import local_node_connectivity from networkx.algorithms.connectivity import \ local_node_connectivity as exact_local_node_connectivity from networkx.algorithms.connectivity import build_auxiliary_node_connectivity from networkx.algorithms.flow import build_residual_network __author__ = """\n""".join(['Jordi Torrents ']) __all__ = ['k_components'] not_implemented_for('directed') def k_components(G, min_density=0.95): r"""Returns the approximate k-component structure of a graph G. A `k`-component is a maximal subgraph of a graph G that has, at least, node connectivity `k`: we need to remove at least `k` nodes to break it into more components. `k`-components have an inherent hierarchical structure because they are nested in terms of connectivity: a connected graph can contain several 2-components, each of which can contain one or more 3-components, and so forth. 
This implementation is based on the fast heuristics to approximate the `k`-component structure of a graph [1]_. This, in turn, is based on a fast approximation algorithm for finding good lower bounds of the number of node independent paths between two nodes [2]_. Parameters ---------- G : NetworkX graph Undirected graph min_density : Float Density relaxation threshold. Default value 0.95 Returns ------- k_components : dict Dictionary with connectivity level `k` as key and a list of sets of nodes that form a k-component of level `k` as values. Examples -------- >>> # Petersen graph has 10 nodes and it is triconnected, thus all >>> # nodes are in a single component on all three connectivity levels >>> from networkx.algorithms import approximation as apxa >>> G = nx.petersen_graph() >>> k_components = apxa.k_components(G) Notes ----- The logic of the approximation algorithm for computing the `k`-component structure [1]_ is based on repeatedly applying simple and fast algorithms for `k`-cores and biconnected components in order to narrow down the number of pairs of nodes over which we have to compute White and Newman's approximation algorithm for finding node independent paths [2]_. More formally, this algorithm is based on Whitney's theorem, which states an inclusion relation among node connectivity, edge connectivity, and minimum degree for any graph G. This theorem implies that every `k`-component is nested inside a `k`-edge-component, which in turn, is contained in a `k`-core. Thus, this algorithm computes node independent paths among pairs of nodes in each biconnected part of each `k`-core, and repeats this procedure for each `k` from 3 to the maximal core number of a node in the input graph. Because, in practice, many nodes of the core of level `k` inside a bicomponent actually are part of a component of level k, the auxiliary graph needed for the algorithm is likely to be very dense.
Thus, we use a complement graph data structure (see `AntiGraph`) to save memory. AntiGraph only stores information of the edges that are *not* present in the actual auxiliary graph. When applying algorithms to this complement graph data structure, it behaves as if it were the dense version. See also -------- k_components References ---------- .. [1] Torrents, J. and F. Ferraro (2015) Structural Cohesion: Visualization and Heuristics for Fast Computation. http://arxiv.org/pdf/1503.04476v1 .. [2] White, Douglas R., and Mark Newman (2001) A Fast Algorithm for Node-Independent Paths. Santa Fe Institute Working Paper #01-07-035 http://eclectic.ss.uci.edu/~drwhite/working.pdf .. [3] Moody, J. and D. White (2003). Social cohesion and embeddedness: A hierarchical conception of social groups. American Sociological Review 68(1), 103--28. http://www2.asanet.org/journals/ASRFeb03MoodyWhite.pdf """ # Dictionary with connectivity level (k) as keys and a list of # sets of nodes that form a k-component as values k_components = collections.defaultdict(list) # make a few functions local for speed node_connectivity = local_node_connectivity k_core = nx.k_core core_number = nx.core_number biconnected_components = nx.biconnected_components density = nx.density combinations = itertools.combinations # Exact solution for k = {1,2} # There is a linear time algorithm for triconnectivity, if we had an # implementation available we could start from k = 4. 
for component in nx.connected_components(G): # isolated nodes have connectivity 0 comp = set(component) if len(comp) > 1: k_components[1].append(comp) for bicomponent in nx.biconnected_components(G): # avoid considering dyads as bicomponents bicomp = set(bicomponent) if len(bicomp) > 2: k_components[2].append(bicomp) # There is no k-component of k > maximum core number # \kappa(G) <= \lambda(G) <= \delta(G) g_cnumber = core_number(G) max_core = max(g_cnumber.values()) for k in range(3, max_core + 1): C = k_core(G, k, core_number=g_cnumber) for nodes in biconnected_components(C): # Build a subgraph SG induced by the nodes that are part of # each biconnected component of the k-core subgraph C. if len(nodes) < k: continue SG = G.subgraph(nodes) # Build auxiliary graph H = _AntiGraph() H.add_nodes_from(SG.nodes_iter()) for u,v in combinations(SG, 2): K = node_connectivity(SG, u, v, cutoff=k) if k > K: H.add_edge(u,v) for h_nodes in biconnected_components(H): if len(h_nodes) <= k: continue SH = H.subgraph(h_nodes) for Gc in _cliques_heuristic(SG, SH, k, min_density): for k_nodes in biconnected_components(Gc): Gk = nx.k_core(SG.subgraph(k_nodes), k) if len(Gk) <= k: continue k_components[k].append(set(Gk)) return k_components def _cliques_heuristic(G, H, k, min_density): h_cnumber = nx.core_number(H) for i, c_value in enumerate(sorted(set(h_cnumber.values()), reverse=True)): cands = set(n for n, c in h_cnumber.items() if c == c_value) # Skip checking for overlap for the highest core value if i == 0: overlap = False else: overlap = set.intersection(*[ set(x for x in H[n] if x not in cands) for n in cands]) if overlap and len(overlap) < k: SH = H.subgraph(cands | overlap) else: SH = H.subgraph(cands) sh_cnumber = nx.core_number(SH) SG = nx.k_core(G.subgraph(SH), k) while not (_same(sh_cnumber) and nx.density(SH) >= min_density): SH = H.subgraph(SG) if len(SH) <= k: break sh_cnumber = nx.core_number(SH) sh_deg = SH.degree() min_deg = min(sh_deg.values()) 
SH.remove_nodes_from(n for n, d in sh_deg.items() if d == min_deg) SG = nx.k_core(G.subgraph(SH), k) else: yield SG def _same(measure, tol=0): vals = set(measure.values()) if (max(vals) - min(vals)) <= tol: return True return False class _AntiGraph(nx.Graph): """ Class for complement graphs. The main goal is to be able to work with big and dense graphs with a low memory footprint. In this class you add the edges that *do not exist* in the dense graph; the report methods of the class return the neighbors, the edges and the degree as if it were the dense graph. Thus it's possible to use an instance of this class with some of NetworkX functions. In this case we only use k-core, connected_components, and biconnected_components. """ all_edge_dict = {'weight': 1} def single_edge_dict(self): return self.all_edge_dict edge_attr_dict_factory = single_edge_dict def __getitem__(self, n): """Return a dict of neighbors of node n in the dense graph. Parameters ---------- n : node A node in the graph. Returns ------- adj_dict : dictionary The adjacency dictionary for nodes connected to n. """ all_edge_dict = self.all_edge_dict return dict((node, all_edge_dict) for node in set(self.adj) - set(self.adj[n]) - set([n])) def neighbors(self, n): """Return a list of the nodes connected to the node n in the dense graph. Parameters ---------- n : node A node in the graph Returns ------- nlist : list A list of nodes that are adjacent to n. Raises ------ NetworkXError If the node n is not in the graph. """ try: return list(set(self.adj) - set(self.adj[n]) - set([n])) except KeyError: raise NetworkXError("The node %s is not in the graph."%(n,)) def neighbors_iter(self, n): """Return an iterator over all neighbors of node n in the dense graph. """ try: return iter(set(self.adj) - set(self.adj[n]) - set([n])) except KeyError: raise NetworkXError("The node %s is not in the graph."%(n,)) def degree(self, nbunch=None, weight=None): """Return the degree of a node or nodes in the dense graph.
""" if nbunch in self: # return a single node return next(self.degree_iter(nbunch,weight))[1] else: # return a dict return dict(self.degree_iter(nbunch,weight)) def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree) in the dense graph. The node degree is the number of edges adjacent to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). See Also -------- degree Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs = ((n, {v: self.all_edge_dict for v in set(self.adj) - set(self.adj[n]) - set([n])}) for n in self.nodes_iter()) else: nodes_nbrs = ((n, {v: self.all_edge_dict for v in set(self.nodes()) - set(self.adj[n]) - set([n])}) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: yield (n,len(nbrs)+(n in nbrs)) # return tuple (n,degree) else: # AntiGraph is a ThinGraph so all edges have weight 1 for n,nbrs in nodes_nbrs: yield (n, sum((nbrs[nbr].get(weight, 1) for nbr in nbrs)) + (n in nbrs and nbrs[n].get(weight, 1))) def adjacency_iter(self): """Return an iterator of (node, adjacency set) tuples for all nodes in the dense graph. This is the fastest way to look at every edge. For directed graphs, only outgoing adjacencies are included. Returns ------- adj_iter : iterator An iterator of (node, adjacency set) for all nodes in the graph. 
""" for n in self.adj: yield (n, set(self.adj) - set(self.adj[n]) - set([n])) networkx-1.11/networkx/algorithms/approximation/matching.py0000644000175000017500000000246312637544450024302 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ ************** Graph Matching ************** Given a graph G = (V,E), a matching M in G is a set of pairwise non-adjacent edges; that is, no two edges share a common vertex. http://en.wikipedia.org/wiki/Matching_(graph_theory) """ # Copyright (C) 2011-2012 by # Nicholas Mancuso # All rights reserved. # BSD license. import networkx as nx __all__ = ["min_maximal_matching"] __author__ = """Nicholas Mancuso (nick.mancuso@gmail.com)""" def min_maximal_matching(G): r"""Returns the minimum maximal matching of G. That is, out of all maximal matchings of the graph G, the smallest is returned. Parameters ---------- G : NetworkX graph Undirected graph Returns ------- min_maximal_matching : set Returns a set of edges such that no two edges share a common endpoint and every edge not in the set shares some common endpoint in the set. Cardinality will be 2*OPT in the worst case. Notes ----- The algorithm computes an approximate solution fo the minimum maximal cardinality matching problem. The solution is no more than 2 * OPT in size. Runtime is `O(|E|)`. References ---------- .. 
[1] Vazirani, Vijay Approximation Algorithms (2001) """ return nx.maximal_matching(G) networkx-1.11/networkx/algorithms/approximation/__init__.py0000644000175000017500000000104312637544450024240 0ustar aricaric00000000000000from networkx.algorithms.approximation.clustering_coefficient import * from networkx.algorithms.approximation.clique import * from networkx.algorithms.approximation.connectivity import * from networkx.algorithms.approximation.dominating_set import * from networkx.algorithms.approximation.kcomponents import * from networkx.algorithms.approximation.independent_set import * from networkx.algorithms.approximation.matching import * from networkx.algorithms.approximation.ramsey import * from networkx.algorithms.approximation.vertex_cover import * networkx-1.11/networkx/algorithms/approximation/dominating_set.py0000644000175000017500000001046412637544450025514 0ustar aricaric00000000000000# -*- coding: utf-8 -*- # Copyright (C) 2011-2012 by # Nicholas Mancuso # All rights reserved. # BSD license. """Functions for finding node and edge dominating sets. A *`dominating set`_[1] for an undirected graph *G* with vertex set *V* and edge set *E* is a subset *D* of *V* such that every vertex not in *D* is adjacent to at least one member of *D*. An *`edge dominating set`_[2] is a subset *F* of *E* such that every edge not in *F* is incident to an endpoint of at least one edge in *F*. .. [1] dominating set: https://en.wikipedia.org/wiki/Dominating_set .. [2] edge dominating set: https://en.wikipedia.org/wiki/Edge_dominating_set """ from __future__ import division from ..matching import maximal_matching from ...utils import not_implemented_for __all__ = ["min_weighted_dominating_set", "min_edge_dominating_set"] __author__ = """Nicholas Mancuso (nick.mancuso@gmail.com)""" # TODO Why doesn't this algorithm work for directed graphs? 
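As a quick illustration of `min_maximal_matching` above (this snippet is not part of the package; the names `G`, `M`, and `endpoints` are chosen just for this example):

```python
import networkx as nx
from networkx.algorithms.approximation import min_maximal_matching

# On a 4-node path the greedy maximal matching picks edge (0, 1),
# skips (1, 2) because node 1 is already matched, then picks (2, 3).
G = nx.path_graph(4)
M = min_maximal_matching(G)
assert len(M) == 2
# matched edges are pairwise disjoint, as a matching requires
endpoints = [v for edge in M for v in edge]
assert len(endpoints) == len(set(endpoints))
```

Since `min_maximal_matching` simply delegates to `nx.maximal_matching`, any maximal matching it returns is guaranteed to be within a factor of 2 of the optimum.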
@not_implemented_for('directed') def min_weighted_dominating_set(G, weight=None): """Returns a dominating set that approximates the minimum weight node dominating set. Parameters ---------- G : NetworkX graph Undirected graph. weight : string The node attribute storing the weight of a node. If provided, the node attribute with this key must be a number for each node. If not provided, each node is assumed to have weight one. Returns ------- min_weight_dominating_set : set A set of nodes, the sum of whose weights is no more than `(\log w(V)) w(V^*)`, where `w(V)` denotes the sum of the weights of each node in the graph and `w(V^*)` denotes the sum of the weights of each node in the minimum weight dominating set. Notes ----- This algorithm computes an approximate minimum weighted dominating set for the graph ``G``. The returned solution has weight `(\log w(V)) w(V^*)`, where `w(V)` denotes the sum of the weights of each node in the graph and `w(V^*)` denotes the sum of the weights of each node in the minimum weight dominating set for the graph. This implementation of the algorithm runs in `O(m)` time, where `m` is the number of edges in the graph. References ---------- .. [1] Vazirani, Vijay V. *Approximation Algorithms*. Springer Science & Business Media, 2001. """ # The unique dominating set for the null graph is the empty set. if len(G) == 0: return set() # This is the dominating set that will eventually be returned. dom_set = set() def _cost(node_and_neighborhood): """Returns the cost-effectiveness of greedily choosing the given node. `node_and_neighborhood` is a two-tuple comprising a node and its closed neighborhood. """ v, neighborhood = node_and_neighborhood return G.node[v].get(weight, 1) / len(neighborhood - dom_set) # This is a set of all vertices not already covered by the # dominating set. vertices = set(G) # This is a dictionary mapping each node to the closed neighborhood # of that node.
neighborhoods = {v: {v} | set(G[v]) for v in G} # Continue until all vertices are adjacent to some node in the # dominating set. while vertices: # Find the most cost-effective node to add, along with its # closed neighborhood. dom_node, min_set = min(neighborhoods.items(), key=_cost) # Add the node to the dominating set and reduce the remaining # set of nodes to cover. dom_set.add(dom_node) del neighborhoods[dom_node] vertices -= min_set return dom_set def min_edge_dominating_set(G): r"""Return minimum cardinality edge dominating set. Parameters ---------- G : NetworkX graph Undirected graph Returns ------- min_edge_dominating_set : set Returns a set of dominating edges whose size is no more than 2 * OPT. Notes ----- The algorithm computes an approximate solution to the edge dominating set problem. The result is no more than 2 * OPT in terms of size of the set. Runtime of the algorithm is `O(|E|)`. """ if not G: raise ValueError("Expected non-empty NetworkX graph!") return maximal_matching(G) networkx-1.11/networkx/algorithms/approximation/clique.py0000644000175000017500000000554412637544500023771 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Cliques. """ # Copyright (C) 2011-2012 by # Nicholas Mancuso # All rights reserved. # BSD license. import networkx as nx from networkx.algorithms.approximation import ramsey __author__ = """Nicholas Mancuso (nick.mancuso@gmail.com)""" __all__ = ["clique_removal","max_clique"] def max_clique(G): r"""Find the Maximum Clique Finds the `O(|V|/(log|V|)^2)` apx of maximum clique/independent set in the worst case. Parameters ---------- G : NetworkX graph Undirected graph Returns ------- clique : set The apx-maximum clique of the graph Notes ------ A clique in an undirected graph G = (V, E) is a subset of the vertex set `C \subseteq V`, such that for every two vertices in C, there exists an edge connecting the two. 
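The greedy rule implemented above repeatedly picks the node minimizing weight divided by the number of newly covered vertices. A small sketch, outside the package, showing both the unweighted and weighted behaviour (the graphs, the `weight` attribute key, and the variable names are chosen for this example; the weighted assertion assumes the deterministic tie-breaking of Python's insertion-ordered dicts):

```python
import networkx as nx
from networkx.algorithms.approximation import min_weighted_dominating_set

# Unweighted star: the center covers every vertex at once, so the
# greedy rule picks it immediately.
star = nx.star_graph(4)
assert min_weighted_dominating_set(star) == {0}

# Weighted star: a heavy center loses to cheap leaves.
H = nx.Graph()
H.add_node(0, weight=10)
H.add_node(1, weight=1)
H.add_node(2, weight=1)
H.add_edge(0, 1)
H.add_edge(0, 2)
dom = min_weighted_dominating_set(H, weight="weight")
assert dom == {1, 2}  # total weight 2, versus 10 for the center alone
```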
This is equivalent to saying that the subgraph induced by C is complete (in some cases, the term clique may also refer to the subgraph). A maximum clique is a clique of the largest possible size in a given graph. The clique number `\omega(G)` of a graph G is the number of vertices in a maximum clique in G. The intersection number of G is the smallest number of cliques that together cover all edges of G. http://en.wikipedia.org/wiki/Maximum_clique References ---------- .. [1] Boppana, R., & Halldórsson, M. M. (1992). Approximating maximum independent sets by excluding subgraphs. BIT Numerical Mathematics, 32(2), 180–196. Springer. doi:10.1007/BF01994876 """ if G is None: raise ValueError("Expected NetworkX graph!") # finding the maximum clique in a graph is equivalent to finding # the independent set in the complementary graph cgraph = nx.complement(G) iset, _ = clique_removal(cgraph) return iset def clique_removal(G): """ Repeatedly remove cliques from the graph. Results in a `O(|V|/(\log |V|)^2)` approximation of maximum clique & independent set. Returns the largest independent set found, along with found maximal cliques. Parameters ---------- G : NetworkX graph Undirected graph Returns ------- max_ind_cliques : (set, list) tuple Maximal independent set and list of maximal cliques (sets) in the graph. References ---------- .. [1] Boppana, R., & Halldórsson, M. M. (1992). Approximating maximum independent sets by excluding subgraphs. BIT Numerical Mathematics, 32(2), 180–196. Springer. 
""" graph = G.copy() c_i, i_i = ramsey.ramsey_R2(graph) cliques = [c_i] isets = [i_i] while graph: graph.remove_nodes_from(c_i) c_i, i_i = ramsey.ramsey_R2(graph) if c_i: cliques.append(c_i) if i_i: isets.append(i_i) maxiset = max(isets) return maxiset, cliques networkx-1.11/networkx/algorithms/approximation/tests/0000755000175000017500000000000012653231454023266 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/approximation/tests/test_independent_set.py0000644000175000017500000000032412637544450030053 0ustar aricaric00000000000000from nose.tools import * import networkx as nx import networkx.algorithms.approximation as a def test_independent_set(): # smoke test G = nx.Graph() assert_equal(len(a.maximum_independent_set(G)),0) networkx-1.11/networkx/algorithms/approximation/tests/test_vertex_cover.py0000644000175000017500000000213012637544500027407 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from networkx.algorithms import approximation as a class TestMWVC: def test_min_vertex_cover(self): # create a simple star graph size = 50 sg = nx.star_graph(size) cover = a.min_weighted_vertex_cover(sg) assert_equals(2, len(cover)) for u, v in sg.edges_iter(): ok_((u in cover or v in cover), "Node node covered!") wg = nx.Graph() wg.add_node(0, weight=10) wg.add_node(1, weight=1) wg.add_node(2, weight=1) wg.add_node(3, weight=1) wg.add_node(4, weight=1) wg.add_edge(0, 1) wg.add_edge(0, 2) wg.add_edge(0, 3) wg.add_edge(0, 4) wg.add_edge(1,2) wg.add_edge(2,3) wg.add_edge(3,4) wg.add_edge(4,1) cover = a.min_weighted_vertex_cover(wg, weight="weight") csum = sum(wg.node[node]["weight"] for node in cover) assert_equals(4, csum) for u, v in wg.edges_iter(): ok_((u in cover or v in cover), "Node node covered!") networkx-1.11/networkx/algorithms/approximation/tests/test_approx_clust_coeff.py0000644000175000017500000000214312637544450030571 0ustar aricaric00000000000000from nose.tools import assert_equal import networkx 
as nx from networkx.algorithms.approximation import average_clustering # This approximation has to be exact in regular graphs # with no triangles or with all possible triangles. def test_petersen(): # Actual coefficient is 0 G = nx.petersen_graph() assert_equal(average_clustering(G, trials=int(len(G)/2)), nx.average_clustering(G)) def test_tetrahedral(): # Actual coefficient is 1 G = nx.tetrahedral_graph() assert_equal(average_clustering(G, trials=int(len(G)/2)), nx.average_clustering(G)) def test_dodecahedral(): # Actual coefficient is 0 G = nx.dodecahedral_graph() assert_equal(average_clustering(G, trials=int(len(G)/2)), nx.average_clustering(G)) def test_empty(): G = nx.empty_graph(5) assert_equal(average_clustering(G, trials=int(len(G)/2)), 0) def test_complete(): G = nx.complete_graph(5) assert_equal(average_clustering(G, trials=int(len(G)/2)), 1) G = nx.complete_graph(7) assert_equal(average_clustering(G, trials=int(len(G)/2)), 1) networkx-1.11/networkx/algorithms/approximation/tests/test_dominating_set.py0000644000175000017500000000455212637544500027712 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import ok_ from nose.tools import eq_ import networkx as nx from networkx.algorithms.approximation import min_weighted_dominating_set from networkx.algorithms.approximation import min_edge_dominating_set class TestMinWeightDominatingSet: def test_min_weighted_dominating_set(self): graph = nx.Graph() graph.add_edge(1, 2) graph.add_edge(1, 5) graph.add_edge(2, 3) graph.add_edge(2, 5) graph.add_edge(3, 4) graph.add_edge(3, 6) graph.add_edge(5, 6) vertices = set([1, 2, 3, 4, 5, 6]) # due to ties, tight bounds are hard to test here dom_set = min_weighted_dominating_set(graph) for vertex in vertices - dom_set: neighbors = set(graph.neighbors(vertex)) ok_(len(neighbors & dom_set) > 0, "Non dominating set found!") def test_star_graph(self): """Tests that an approximate dominating set for the star graph, even when the center node does not have
the smallest integer label, gives just the center node. For more information, see #1527. """ # Create a star graph in which the center node has the highest # label instead of the lowest. G = nx.star_graph(10) G = nx.relabel_nodes(G, {0: 9, 9: 0}) eq_(min_weighted_dominating_set(G), {9}) def test_min_edge_dominating_set(self): graph = nx.path_graph(5) dom_set = min_edge_dominating_set(graph) # this is a crappy way to test, but good enough for now. for edge in graph.edges_iter(): if edge in dom_set: continue else: u, v = edge found = False for dom_edge in dom_set: found |= u == dom_edge[0] or u == dom_edge[1] ok_(found, "Non adjacent edge found!") graph = nx.complete_graph(10) dom_set = min_edge_dominating_set(graph) # this is a crappy way to test, but good enough for now. for edge in graph.edges_iter(): if edge in dom_set: continue else: u, v = edge found = False for dom_edge in dom_set: found |= u == dom_edge[0] or u == dom_edge[1] ok_(found, "Non adjacent edge found!") networkx-1.11/networkx/algorithms/approximation/tests/test_matching.py0000644000175000017500000000032612637544450026477 0ustar aricaric00000000000000from nose.tools import * import networkx as nx import networkx.algorithms.approximation as a def test_min_maximal_matching(): # smoke test G = nx.Graph() assert_equal(len(a.min_maximal_matching(G)),0) networkx-1.11/networkx/algorithms/approximation/tests/test_ramsey.py0000644000175000017500000000175612637544450026205 0ustar aricaric00000000000000from nose.tools import * import networkx as nx import networkx.algorithms.approximation as apxa def test_ramsey(): # this should only find the complete graph graph = nx.complete_graph(10) c, i = apxa.ramsey_R2(graph) cdens = nx.density(graph.subgraph(c)) eq_(cdens, 1.0, "clique not found by ramsey!") idens = nx.density(graph.subgraph(i)) eq_(idens, 0.0, "i-set not found by ramsey!") # this trivial graph has no cliques.
should just find i-sets graph = nx.trivial_graph(nx.Graph()) c, i = apxa.ramsey_R2(graph) cdens = nx.density(graph.subgraph(c)) eq_(cdens, 0.0, "clique not found by ramsey!") idens = nx.density(graph.subgraph(i)) eq_(idens, 0.0, "i-set not found by ramsey!") graph = nx.barbell_graph(10, 5, nx.Graph()) c, i = apxa.ramsey_R2(graph) cdens = nx.density(graph.subgraph(c)) eq_(cdens, 1.0, "clique not found by ramsey!") idens = nx.density(graph.subgraph(i)) eq_(idens, 0.0, "i-set not found by ramsey!") networkx-1.11/networkx/algorithms/approximation/tests/test_kcomponents.py0000644000175000017500000002202212637544500027236 0ustar aricaric00000000000000# Test for approximation to k-components algorithm from nose.tools import assert_equal, assert_true, assert_false, assert_raises, raises import networkx as nx from networkx.algorithms.approximation import k_components from networkx.algorithms.approximation.kcomponents import _AntiGraph, _same def build_k_number_dict(k_components): k_num = {} for k, comps in sorted(k_components.items()): for comp in comps: for node in comp: k_num[node] = k return k_num ## ## Some nice synthetic graphs ## def graph_example_1(): G = nx.convert_node_labels_to_integers(nx.grid_graph([5,5]), label_attribute='labels') rlabels = nx.get_node_attributes(G, 'labels') labels = dict((v, k) for k, v in rlabels.items()) for nodes in [(labels[(0,0)], labels[(1,0)]), (labels[(0,4)], labels[(1,4)]), (labels[(3,0)], labels[(4,0)]), (labels[(3,4)], labels[(4,4)]) ]: new_node = G.order()+1 # Petersen graph is triconnected P = nx.petersen_graph() G = nx.disjoint_union(G,P) # Add two edges between the grid and P G.add_edge(new_node+1, nodes[0]) G.add_edge(new_node, nodes[1]) # K5 is 4-connected K = nx.complete_graph(5) G = nx.disjoint_union(G,K) # Add three edges between P and K5 G.add_edge(new_node+2,new_node+11) G.add_edge(new_node+3,new_node+12) G.add_edge(new_node+4,new_node+13) # Add another K5 sharing a node G = nx.disjoint_union(G,K) nbrs = G[new_node+10] 
G.remove_node(new_node+10) for nbr in nbrs: G.add_edge(new_node+17, nbr) G.add_edge(new_node+16, new_node+5) G.name = 'Example graph for connectivity' return G def torrents_and_ferraro_graph(): G = nx.convert_node_labels_to_integers(nx.grid_graph([5,5]), label_attribute='labels') rlabels = nx.get_node_attributes(G, 'labels') labels = dict((v, k) for k, v in rlabels.items()) for nodes in [ (labels[(0,4)], labels[(1,4)]), (labels[(3,4)], labels[(4,4)]) ]: new_node = G.order()+1 # Petersen graph is triconnected P = nx.petersen_graph() G = nx.disjoint_union(G,P) # Add two edges between the grid and P G.add_edge(new_node+1, nodes[0]) G.add_edge(new_node, nodes[1]) # K5 is 4-connected K = nx.complete_graph(5) G = nx.disjoint_union(G,K) # Add three edges between P and K5 G.add_edge(new_node+2,new_node+11) G.add_edge(new_node+3,new_node+12) G.add_edge(new_node+4,new_node+13) # Add another K5 sharing a node G = nx.disjoint_union(G,K) nbrs = G[new_node+10] G.remove_node(new_node+10) for nbr in nbrs: G.add_edge(new_node+17, nbr) # Commenting this makes the graph not biconnected !! 
# This stupid mistake make one reviewer very angry :P G.add_edge(new_node+16, new_node+8) for nodes in [(labels[(0,0)], labels[(1,0)]), (labels[(3,0)], labels[(4,0)])]: new_node = G.order()+1 # Petersen graph is triconnected P = nx.petersen_graph() G = nx.disjoint_union(G,P) # Add two edges between the grid and P G.add_edge(new_node+1, nodes[0]) G.add_edge(new_node, nodes[1]) # K5 is 4-connected K = nx.complete_graph(5) G = nx.disjoint_union(G,K) # Add three edges between P and K5 G.add_edge(new_node+2,new_node+11) G.add_edge(new_node+3,new_node+12) G.add_edge(new_node+4,new_node+13) # Add another K5 sharing two nodes G = nx.disjoint_union(G,K) nbrs = G[new_node+10] G.remove_node(new_node+10) for nbr in nbrs: G.add_edge(new_node+17, nbr) nbrs2 = G[new_node+9] G.remove_node(new_node+9) for nbr in nbrs2: G.add_edge(new_node+18, nbr) G.name = 'Example graph for connectivity' return G # Helper function def _check_connectivity(G): result = k_components(G) for k, components in result.items(): if k < 3: continue for component in components: C = G.subgraph(component) K = nx.node_connectivity(C) assert_true(K >= k) def test_torrents_and_ferraro_graph(): G = torrents_and_ferraro_graph() _check_connectivity(G) def test_example_1(): G = graph_example_1() _check_connectivity(G) def test_karate_0(): G = nx.karate_club_graph() _check_connectivity(G) def test_karate_1(): karate_k_num = {0: 4, 1: 4, 2: 4, 3: 4, 4: 3, 5: 3, 6: 3, 7: 4, 8: 4, 9: 2, 10: 3, 11: 1, 12: 2, 13: 4, 14: 2, 15: 2, 16: 2, 17: 2, 18: 2, 19: 3, 20: 2, 21: 2, 22: 2, 23: 3, 24: 3, 25: 3, 26: 2, 27: 3, 28: 3, 29: 3, 30: 4, 31: 3, 32: 4, 33: 4} G = nx.karate_club_graph() k_comps = k_components(G) k_num = build_k_number_dict(k_comps) assert_equal(karate_k_num, k_num) def test_example_1_detail_3_and_4(): solution = { 3: [set([40, 41, 42, 43, 39]), set([32, 33, 34, 35, 36, 37, 38, 42, 25, 26, 27, 28, 29, 30, 31]), set([58, 59, 60, 61, 62]), set([44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 61]), set([80, 
81, 77, 78, 79]), set([64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 80, 63]), set([97, 98, 99, 100, 101]), set([96, 100, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 94, 95]) ], 4: [set([40, 41, 42, 43, 39]), set([42, 35, 36, 37, 38]), set([58, 59, 60, 61, 62]), set([56, 57, 61, 54, 55]), set([80, 81, 77, 78, 79]), set([80, 73, 74, 75, 76]), set([97, 98, 99, 100, 101]), set([96, 100, 92, 94, 95]) ], } G = graph_example_1() result = k_components(G) for k, components in result.items(): if k < 3: continue for component in components: assert_true(component in solution[k]) @raises(nx.NetworkXNotImplemented) def test_directed(): G = nx.gnp_random_graph(10, 0.4, directed=True) kc = k_components(G) def test_same(): equal = {'A': 2, 'B': 2, 'C': 2} slightly_different = {'A': 2, 'B': 1, 'C': 2} different = {'A': 2, 'B': 8, 'C': 18} assert_true(_same(equal)) assert_false(_same(slightly_different)) assert_true(_same(slightly_different, tol=1)) assert_false(_same(different)) assert_false(_same(different, tol=4)) class TestAntiGraph: def setUp(self): self.Gnp = nx.gnp_random_graph(20,0.8) self.Anp = _AntiGraph(nx.complement(self.Gnp)) self.Gd = nx.davis_southern_women_graph() self.Ad = _AntiGraph(nx.complement(self.Gd)) self.Gk = nx.karate_club_graph() self.Ak = _AntiGraph(nx.complement(self.Gk)) self.GA = [(self.Gnp, self.Anp), (self.Gd,self.Ad), (self.Gk, self.Ak)] def test_size(self): for G, A in self.GA: n = G.order() s = len(G.edges())+len(A.edges()) assert_true(s == (n*(n-1))/2) def test_degree(self): for G, A in self.GA: assert_equal(G.degree(), A.degree()) def test_core_number(self): for G, A in self.GA: assert_equal(nx.core_number(G), nx.core_number(A)) def test_connected_components(self): for G, A in self.GA: gc = [set(c) for c in nx.connected_components(G)] ac = [set(c) for c in nx.connected_components(A)] for comp in ac: assert_true(comp in gc) def test_adjacency_iter(self): for G, A in self.GA: a_adj = list(A.adjacency_iter()) for n, nbrs in 
G.adjacency_iter(): assert_true((n, set(nbrs)) in a_adj) def test_neighbors(self): for G, A in self.GA: node = list(G.nodes())[0] assert_equal(set(G.neighbors(node)), set(A.neighbors(node))) def test_node_not_in_graph(self): for G, A in self.GA: node = 'non_existent_node' assert_raises(nx.NetworkXError, A.neighbors, node) assert_raises(nx.NetworkXError, A.neighbors_iter, node) assert_raises(nx.NetworkXError, G.neighbors, node) assert_raises(nx.NetworkXError, G.neighbors_iter, node) def test_degree(self): for G, A in self.GA: node = list(G.nodes())[0] nodes = list(G.nodes())[1:4] assert_equal(G.degree(node), A.degree(node)) assert_equal(sum(G.degree().values()), sum(A.degree().values())) # AntiGraph is a ThinGraph, so all the weights are 1 assert_equal(sum(A.degree().values()), sum(A.degree(weight='weight').values())) assert_equal(sum(G.degree(nodes).values()), sum(A.degree(nodes).values())) networkx-1.11/networkx/algorithms/approximation/tests/test_clique.py0000644000175000017500000000261412637544500026165 0ustar aricaric00000000000000from nose.tools import * import networkx as nx import networkx.algorithms.approximation as apxa def test_clique_removal(): graph = nx.complete_graph(10) i, cs = apxa.clique_removal(graph) idens = nx.density(graph.subgraph(i)) eq_(idens, 0.0, "i-set not found by clique_removal!") for clique in cs: cdens = nx.density(graph.subgraph(clique)) eq_(cdens, 1.0, "clique not found by clique_removal!") graph = nx.trivial_graph(nx.Graph()) i, cs = apxa.clique_removal(graph) idens = nx.density(graph.subgraph(i)) eq_(idens, 0.0, "i-set not found by ramsey!") # we should only have 1-cliques. Just singleton nodes. 
for clique in cs: cdens = nx.density(graph.subgraph(clique)) eq_(cdens, 0.0, "clique not found by clique_removal!") graph = nx.barbell_graph(10, 5, nx.Graph()) i, cs = apxa.clique_removal(graph) idens = nx.density(graph.subgraph(i)) eq_(idens, 0.0, "i-set not found by ramsey!") for clique in cs: cdens = nx.density(graph.subgraph(clique)) eq_(cdens, 1.0, "clique not found by clique_removal!") def test_max_clique_smoke(): # smoke test G = nx.Graph() assert_equal(len(apxa.max_clique(G)),0) def test_max_clique(): # create a complete graph graph = nx.complete_graph(30) # this should return the entire graph mc = apxa.max_clique(graph) assert_equals(30, len(mc)) networkx-1.11/networkx/algorithms/approximation/tests/test_connectivity.py0000644000175000017500000001322412637544450027424 0ustar aricaric00000000000000import itertools from nose.tools import assert_true, assert_equal, assert_raises import networkx as nx from networkx.algorithms import approximation as approx def test_global_node_connectivity(): # Figure 1 chapter on Connectivity G = nx.Graph() G.add_edges_from([(1,2),(1,3),(1,4),(1,5),(2,3),(2,6),(3,4), (3,6),(4,6),(4,7),(5,7),(6,8),(6,9),(7,8), (7,10),(8,11),(9,10),(9,11),(10,11)]) assert_equal(2, approx.local_node_connectivity(G,1,11)) assert_equal(2, approx.node_connectivity(G)) assert_equal(2, approx.node_connectivity(G,1,11)) def test_white_harary1(): # Figure 1b white and harary (2001) # A graph with high adhesion (edge connectivity) and low cohesion # (node connectivity) G = nx.disjoint_union(nx.complete_graph(4), nx.complete_graph(4)) G.remove_node(7) for i in range(4,7): G.add_edge(0,i) G = nx.disjoint_union(G, nx.complete_graph(4)) G.remove_node(G.order()-1) for i in range(7,10): G.add_edge(0,i) assert_equal(1, approx.node_connectivity(G)) def test_complete_graphs(): for n in range(5, 25, 5): G = nx.complete_graph(n) assert_equal(n-1, approx.node_connectivity(G)) assert_equal(n-1, approx.node_connectivity(G, 0, 3)) def test_empty_graphs(): for k in 
range(5, 25, 5): G = nx.empty_graph(k) assert_equal(0, approx.node_connectivity(G)) assert_equal(0, approx.node_connectivity(G, 0, 3)) def test_petersen(): G = nx.petersen_graph() assert_equal(3, approx.node_connectivity(G)) assert_equal(3, approx.node_connectivity(G, 0, 5)) # Approximation fails with tutte graph #def test_tutte(): # G = nx.tutte_graph() # assert_equal(3, approx.node_connectivity(G)) def test_dodecahedral(): G = nx.dodecahedral_graph() assert_equal(3, approx.node_connectivity(G)) assert_equal(3, approx.node_connectivity(G, 0, 5)) def test_octahedral(): G=nx.octahedral_graph() assert_equal(4, approx.node_connectivity(G)) assert_equal(4, approx.node_connectivity(G, 0, 5)) # Approximation can fail with icosahedral graph depending # on iteration order. #def test_icosahedral(): # G=nx.icosahedral_graph() # assert_equal(5, approx.node_connectivity(G)) # assert_equal(5, approx.node_connectivity(G, 0, 5)) def test_only_source(): G = nx.complete_graph(5) assert_raises(nx.NetworkXError, approx.node_connectivity, G, s=0) def test_only_target(): G = nx.complete_graph(5) assert_raises(nx.NetworkXError, approx.node_connectivity, G, t=0) def test_missing_source(): G = nx.path_graph(4) assert_raises(nx.NetworkXError, approx.node_connectivity, G, 10, 1) def test_missing_target(): G = nx.path_graph(4) assert_raises(nx.NetworkXError, approx.node_connectivity, G, 1, 10) def test_source_equals_target(): G = nx.complete_graph(5) assert_raises(nx.NetworkXError, approx.local_node_connectivity, G, 0, 0) def test_directed_node_connectivity(): G = nx.cycle_graph(10, create_using=nx.DiGraph()) # only one direction D = nx.cycle_graph(10).to_directed() # 2 reciprocal edges assert_equal(1, approx.node_connectivity(G)) assert_equal(1, approx.node_connectivity(G, 1, 4)) assert_equal(2, approx.node_connectivity(D)) assert_equal(2, approx.node_connectivity(D, 1, 4)) class TestAllPairsNodeConnectivityApprox: def setUp(self): self.path = nx.path_graph(7) self.directed_path = 
nx.path_graph(7, create_using=nx.DiGraph()) self.cycle = nx.cycle_graph(7) self.directed_cycle = nx.cycle_graph(7, create_using=nx.DiGraph()) self.gnp = nx.gnp_random_graph(30, 0.1) self.directed_gnp = nx.gnp_random_graph(30, 0.1, directed=True) self.K20 = nx.complete_graph(20) self.K10 = nx.complete_graph(10) self.K5 = nx.complete_graph(5) self.G_list = [self.path, self.directed_path, self.cycle, self.directed_cycle, self.gnp, self.directed_gnp, self.K10, self.K5, self.K20] def test_cycles(self): K_undir = approx.all_pairs_node_connectivity(self.cycle) for source in K_undir: for target, k in K_undir[source].items(): assert_true(k == 2) K_dir = approx.all_pairs_node_connectivity(self.directed_cycle) for source in K_dir: for target, k in K_dir[source].items(): assert_true(k == 1) def test_complete(self): for G in [self.K10, self.K5, self.K20]: K = approx.all_pairs_node_connectivity(G) for source in K: for target, k in K[source].items(): assert_true(k == len(G)-1) def test_paths(self): K_undir = approx.all_pairs_node_connectivity(self.path) for source in K_undir: for target, k in K_undir[source].items(): assert_true(k == 1) K_dir = approx.all_pairs_node_connectivity(self.directed_path) for source in K_dir: for target, k in K_dir[source].items(): if source < target: assert_true(k == 1) else: assert_true(k == 0) def test_cutoff(self): for G in [self.K10, self.K5, self.K20]: for mp in [2, 3, 4]: paths = approx.all_pairs_node_connectivity(G, cutoff=mp) for source in paths: for target, K in paths[source].items(): assert_true(K == mp) def test_all_pairs_connectivity_nbunch(self): G = nx.complete_graph(5) nbunch = [0, 2, 3] C = approx.all_pairs_node_connectivity(G, nbunch=nbunch) assert_equal(len(C), len(nbunch)) networkx-1.11/networkx/algorithms/approximation/connectivity.py0000644000175000017500000003146212637544500025223 0ustar aricaric00000000000000""" Fast approximation for node connectivity """ # Copyright (C) 2015 by # Jordi Torrents # All rights reserved. 
# BSD license. import itertools from operator import itemgetter import networkx as nx __author__ = """\n""".join(['Jordi Torrents ']) __all__ = ['local_node_connectivity', 'node_connectivity', 'all_pairs_node_connectivity'] INF = float('inf') def local_node_connectivity(G, source, target, cutoff=None): """Compute node connectivity between source and target. Pairwise or local node connectivity between two distinct and nonadjacent nodes is the minimum number of nodes that must be removed (minimum separating cutset) to disconnect them. By Menger's theorem, this is equal to the number of node independent paths (paths that share no nodes other than source and target), which is what this function computes. This algorithm is a fast approximation that gives a strict lower bound on the actual number of node independent paths between two nodes [1]_. It works for both directed and undirected graphs. Parameters ---------- G : NetworkX graph source : node Starting node for node connectivity target : node Ending node for node connectivity cutoff : integer Maximum node connectivity to consider. If None, the minimum degree of source or target is used as a cutoff. Default value None. Returns ------- k: integer pairwise node connectivity Examples -------- >>> # Platonic icosahedral graph has node connectivity 5 >>> # for each non adjacent node pair >>> from networkx.algorithms import approximation as approx >>> G = nx.icosahedral_graph() >>> approx.local_node_connectivity(G, 0, 6) 5 Notes ----- This algorithm [1]_ finds node independent paths between two nodes by computing their shortest path using BFS, marking the nodes of the path found as 'used' and then searching other shortest paths excluding the nodes marked as used until no more paths exist. It is not exact because a shortest path could use nodes that, if the path were longer, may belong to two different node independent paths. Thus it only guarantees a strict lower bound on node connectivity.
Note that the authors propose a further refinement, losing accuracy and gaining speed, which is not implemented yet. See also -------- all_pairs_node_connectivity node_connectivity References ---------- .. [1] White, Douglas R., and Mark Newman. 2001 A Fast Algorithm for Node-Independent Paths. Santa Fe Institute Working Paper #01-07-035 http://eclectic.ss.uci.edu/~drwhite/working.pdf """ if target == source: raise nx.NetworkXError("source and target have to be different nodes.") # Maximum possible node independent paths if G.is_directed(): possible = min(G.out_degree(source), G.in_degree(target)) else: possible = min(G.degree(source), G.degree(target)) K = 0 if not possible: return K if cutoff is None: cutoff = INF exclude = set() for i in range(min(possible, cutoff)): try: path = _bidirectional_shortest_path(G, source, target, exclude) exclude.update(set(path)) K += 1 except nx.NetworkXNoPath: break return K def node_connectivity(G, s=None, t=None): r"""Returns an approximation for node connectivity for a graph or digraph G. Node connectivity is equal to the minimum number of nodes that must be removed to disconnect G or render it trivial. By Menger's theorem, this is equal to the number of node independent paths (paths that share no nodes other than source and target). If source and target nodes are provided, this function returns the local node connectivity: the minimum number of nodes that must be removed to break all paths from source to target in G. This algorithm is based on a fast approximation that gives a strict lower bound on the actual number of node independent paths between two nodes [1]_. It works for both directed and undirected graphs. Parameters ---------- G : NetworkX graph Undirected graph s : node Source node. Optional. Default value: None. t : node Target node. Optional. Default value: None. Returns ------- K : integer Node connectivity of G, or local node connectivity if source and target are provided.
Examples -------- >>> # Platonic icosahedral graph is 5-node-connected >>> from networkx.algorithms import approximation as approx >>> G = nx.icosahedral_graph() >>> approx.node_connectivity(G) 5 Notes ----- This algorithm [1]_ finds node-independent paths between two nodes by computing their shortest path using BFS, marking the nodes of the path found as 'used' and then searching other shortest paths excluding the nodes marked as used until no more paths exist. It is not exact because a shortest path could use nodes that, if the path were longer, may belong to two different node-independent paths. Thus it only guarantees a strict lower bound on node connectivity. See also -------- all_pairs_node_connectivity local_node_connectivity References ---------- .. [1] White, Douglas R., and Mark Newman. 2001 A Fast Algorithm for Node-Independent Paths. Santa Fe Institute Working Paper #01-07-035 http://eclectic.ss.uci.edu/~drwhite/working.pdf """ if (s is not None and t is None) or (s is None and t is not None): raise nx.NetworkXError('Both source and target must be specified.') # Local node connectivity if s is not None and t is not None: if s not in G: raise nx.NetworkXError('node %s not in graph' % s) if t not in G: raise nx.NetworkXError('node %s not in graph' % t) return local_node_connectivity(G, s, t) # Global node connectivity if G.is_directed(): connected_func = nx.is_weakly_connected iter_func = itertools.permutations def neighbors(v): return itertools.chain.from_iterable([G.predecessors_iter(v), G.successors_iter(v)]) else: connected_func = nx.is_connected iter_func = itertools.combinations neighbors = G.neighbors_iter if not connected_func(G): return 0 # Choose a node with minimum degree v, minimum_degree = min(G.degree().items(), key=itemgetter(1)) # Node connectivity is bounded by minimum degree K = minimum_degree # compute local node connectivity with all non-neighbor nodes # and store the minimum for w in set(G) - set(neighbors(v)) - set([v]): K =
min(K, local_node_connectivity(G, v, w, cutoff=K)) # Same for non-adjacent pairs of neighbors of v for x, y in iter_func(neighbors(v), 2): if y not in G[x] and x != y: K = min(K, local_node_connectivity(G, x, y, cutoff=K)) return K def all_pairs_node_connectivity(G, nbunch=None, cutoff=None): """ Compute node connectivity between all pairs of nodes. Pairwise or local node connectivity between two distinct and nonadjacent nodes is the minimum number of nodes that must be removed (minimum separating cutset) to disconnect them. By Menger's theorem, this is equal to the number of node-independent paths (paths that share no nodes other than source and target), which is what this function computes. This algorithm is a fast approximation that gives a strict lower bound on the actual number of node-independent paths between two nodes [1]_. It works for both directed and undirected graphs. Parameters ---------- G : NetworkX graph nbunch: container Container of nodes. If provided, node connectivity will be computed only over pairs of nodes in nbunch. cutoff : integer Maximum node connectivity to consider. If None, the minimum degree of source or target is used as a cutoff in each pair of nodes. Default value None. Returns ------- K : dictionary Dictionary, keyed by source and target, of pairwise node connectivity See Also -------- local_node_connectivity node_connectivity References ---------- .. [1] White, Douglas R., and Mark Newman. 2001 A Fast Algorithm for Node-Independent Paths.
Santa Fe Institute Working Paper #01-07-035 http://eclectic.ss.uci.edu/~drwhite/working.pdf """ if nbunch is None: nbunch = G else: nbunch = set(nbunch) directed = G.is_directed() if directed: iter_func = itertools.permutations else: iter_func = itertools.combinations all_pairs = {n: {} for n in nbunch} for u, v in iter_func(nbunch, 2): k = local_node_connectivity(G, u, v, cutoff=cutoff) all_pairs[u][v] = k if not directed: all_pairs[v][u] = k return all_pairs def _bidirectional_shortest_path(G, source, target, exclude): """Return shortest path between source and target ignoring nodes in the container 'exclude'. Parameters ---------- G : NetworkX graph source : node Starting node for path target : node Ending node for path exclude: container Container for nodes to exclude from the search for shortest paths Returns ------- path: list Shortest path between source and target ignoring nodes in 'exclude' Raises ------ NetworkXNoPath: exception If there is no path or if nodes are adjacent and have only one path between them Notes ----- This function and its helper are originally from networkx.algorithms.shortest_paths.unweighted and are modified to accept the extra parameter 'exclude', which is a container for nodes already used in other paths that should be ignored. References ---------- .. [1] White, Douglas R., and Mark Newman. 2001 A Fast Algorithm for Node-Independent Paths.
Santa Fe Institute Working Paper #01-07-035 http://eclectic.ss.uci.edu/~drwhite/working.pdf """ # call helper to do the real work results = _bidirectional_pred_succ(G, source, target, exclude) pred, succ, w = results # build path from pred+w+succ path = [] # from source to w while w is not None: path.append(w) w = pred[w] path.reverse() # from w to target w = succ[path[-1]] while w is not None: path.append(w) w = succ[w] return path def _bidirectional_pred_succ(G, source, target, exclude): # does BFS from both source and target and meets in the middle # excludes nodes in the container "exclude" from the search if source is None or target is None: raise nx.NetworkXException(\ "Bidirectional shortest path called without source or target") if target == source: return ({target:None},{source:None},source) # handle either directed or undirected if G.is_directed(): Gpred = G.predecessors_iter Gsucc = G.successors_iter else: Gpred = G.neighbors_iter Gsucc = G.neighbors_iter # predecessors and successors in search pred = {source: None} succ = {target: None} # initialize fringes, start with forward forward_fringe = [source] reverse_fringe = [target] level = 0 while forward_fringe and reverse_fringe: # Make sure that we iterate one step forward and one step backwards # thus source and target will only trigger "found path" when they are # adjacent and then they can be safely included in the container 'exclude' level += 1 if not level % 2 == 0: this_level = forward_fringe forward_fringe = [] for v in this_level: for w in Gsucc(v): if w in exclude: continue if w not in pred: forward_fringe.append(w) pred[w] = v if w in succ: return pred, succ, w # found path else: this_level = reverse_fringe reverse_fringe = [] for v in this_level: for w in Gpred(v): if w in exclude: continue if w not in succ: succ[w] = v reverse_fringe.append(w) if w in pred: return pred, succ, w # found path raise nx.NetworkXNoPath("No path between %s and %s."
% (source, target)) networkx-1.11/networkx/algorithms/approximation/vertex_cover.py0000644000175000017500000000405012637544500025211 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ ************ Vertex Cover ************ Given an undirected graph `G = (V, E)` and a function w assigning nonnegative weights to its vertices, find a minimum weight subset of V such that each edge in E is incident to at least one vertex in the subset. http://en.wikipedia.org/wiki/Vertex_cover """ # Copyright (C) 2011-2012 by # Nicholas Mancuso # All rights reserved. # BSD license. from networkx.utils import * __all__ = ["min_weighted_vertex_cover"] __author__ = """Nicholas Mancuso (nick.mancuso@gmail.com)""" @not_implemented_for('directed') def min_weighted_vertex_cover(G, weight=None): r"""2-OPT Local Ratio for Minimum Weighted Vertex Cover Find an approximate minimum weighted vertex cover of a graph. Parameters ---------- G : NetworkX graph Undirected graph weight : None or string, optional (default = None) If None, every edge has weight/distance/cost 1. If a string, use this edge attribute as the edge weight. Any edge attribute not present defaults to 1. Returns ------- min_weighted_cover : set Returns a set of vertices whose weight sum is no more than 2 * OPT. Notes ----- Local-Ratio algorithm for computing an approximate vertex cover. Algorithm greedily reduces the costs over edges and iteratively builds a cover. Worst-case runtime is `O(|E|)`. References ---------- .. [1] Bar-Yehuda, R., & Even, S. (1985). A local-ratio theorem for approximating the weighted vertex cover problem. 
Annals of Discrete Mathematics, 25, 27–46 http://www.cs.technion.ac.il/~reuven/PDF/vc_lr.pdf """ weight_func = lambda nd: nd.get(weight, 1) cost = dict((n, weight_func(nd)) for n, nd in G.nodes(data=True)) # while there are edges uncovered, continue for u,v in G.edges_iter(): # select some uncovered edge min_cost = min([cost[u], cost[v]]) cost[u] -= min_cost cost[v] -= min_cost return set(u for u in cost if cost[u] == 0) networkx-1.11/networkx/algorithms/swap.py0000644000175000017500000002311712637544500020563 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """Swap edges in a graph. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from __future__ import division import math import random import networkx as nx __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult (dschult@colgate.edu)', 'Joel Miller (joel.c.miller.research@gmail.com)', 'Ben Edwards']) __all__ = ['double_edge_swap', 'connected_double_edge_swap'] def double_edge_swap(G, nswap=1, max_tries=100): """Swap two edges in the graph while keeping the node degrees fixed. A double-edge swap removes two randomly chosen edges u-v and x-y and creates the new edges u-x and v-y:: u--v u v becomes | | x--y x y If either the edge u-x or v-y already exist no swap is performed and another attempt is made to find a suitable edge pair. Parameters ---------- G : graph An undirected graph nswap : integer (optional, default=1) Number of double-edge swaps to perform max_tries : integer (optional) Maximum number of attempts to swap edges Returns ------- G : graph The graph after double edge swaps. Notes ----- Does not enforce any connectivity constraints. The graph G is modified in place. 
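The degree-preserving swap described in this docstring can be sketched standalone on a dict-of-sets adjacency structure. All names here are illustrative; the actual routine below adds degree-weighted node sampling and retry accounting.

```python
# Minimal sketch of one double-edge swap attempt: remove u-v and x-y,
# add u-x and v-y, rejecting self-loops and parallel edges.
import random

def try_double_edge_swap(adj, rng=random):
    u, x = rng.sample(list(adj), 2)
    if not adj[u] or not adj[x]:
        return False
    v = rng.choice(sorted(adj[u]))
    y = rng.choice(sorted(adj[x]))
    # reject swaps that would create self-loops or parallel edges
    if v == y or x in adj[u] or y in adj[v]:
        return False
    adj[u].remove(v); adj[v].remove(u)
    adj[x].remove(y); adj[y].remove(x)
    adj[u].add(x); adj[x].add(u)
    adj[v].add(y); adj[y].add(v)
    return True

# Degrees are unchanged whether or not any particular attempt succeeds.
G = {0: {1}, 1: {0}, 2: {3}, 3: {2}}
before = {n: len(G[n]) for n in G}
for _ in range(20):
    try_double_edge_swap(G)
after = {n: len(G[n]) for n in G}
print(before == after)  # -> True
```

Note the `v == y` and parallel-edge checks also catch the cases `v == x` and `u == y`, since those imply the edge u-x already exists.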
""" if G.is_directed(): raise nx.NetworkXError(\ "double_edge_swap() not defined for directed graphs.") if nswap>max_tries: raise nx.NetworkXError("Number of swaps > number of tries allowed.") if len(G) < 4: raise nx.NetworkXError("Graph has less than four nodes.") # Instead of choosing uniformly at random from a generated edge list, # this algorithm chooses nonuniformly from the set of nodes with # probability weighted by degree. n=0 swapcount=0 keys,degrees=zip(*G.degree().items()) # keys, degree cdf=nx.utils.cumulative_distribution(degrees) # cdf of degree while swapcount < nswap: # if random.random() < 0.5: continue # trick to avoid periodicities? # pick two random edges without creating edge list # choose source node indices from discrete distribution (ui,xi)=nx.utils.discrete_sequence(2,cdistribution=cdf) if ui==xi: continue # same source, skip u=keys[ui] # convert index to label x=keys[xi] # choose target uniformly from neighbors v=random.choice(list(G[u])) y=random.choice(list(G[x])) if v==y: continue # same target, skip if (x not in G[u]) and (y not in G[v]): # don't create parallel edges G.add_edge(u,x) G.add_edge(v,y) G.remove_edge(u,v) G.remove_edge(x,y) swapcount+=1 if n >= max_tries: e=('Maximum number of swap attempts (%s) exceeded '%n + 'before desired swaps achieved (%s).'%nswap) raise nx.NetworkXAlgorithmError(e) n+=1 return G def connected_double_edge_swap(G, nswap=1, _window_threshold=3): """Attempts the specified number of double-edge swaps in the graph ``G``. A double-edge swap removes two randomly chosen edges ``(u, v)`` and ``(x, y)`` and creates the new edges ``(u, x)`` and ``(v, y)``:: u--v u v becomes | | x--y x y If either ``(u, x)`` or ``(v, y)`` already exist, then no swap is performed so the actual number of swapped edges is always *at most* ``nswap``. 
Parameters ---------- G : graph An undirected graph nswap : integer (optional, default=1) Number of double-edge swaps to perform _window_threshold : integer The window size below which connectedness of the graph will be checked after each swap. The "window" in this function is a dynamically updated integer that represents the number of swap attempts to make before checking if the graph remains connected. It is an optimization used to decrease the running time of the algorithm in exchange for increased complexity of implementation. If the window size is below this threshold, then the algorithm checks after each swap if the graph remains connected by checking if there is a path joining the two nodes whose edge was just removed. If the window size is above this threshold, then the algorithm performs all the swaps in the window and only then checks if the graph is still connected. Returns ------- int The number of successful swaps Raises ------ NetworkXError If the input graph is not connected, or if the graph has fewer than four nodes. Notes ----- The initial graph ``G`` must be connected, and the resulting graph is connected. The graph ``G`` is modified in place. References ---------- .. [1] C. Gkantsidis and M. Mihail and E. Zegura, The Markov chain simulation method for generating connected power law random graphs, 2003. http://citeseer.ist.psu.edu/gkantsidis03markov.html """ if not nx.is_connected(G): raise nx.NetworkXError("Graph not connected") if len(G) < 4: raise nx.NetworkXError("Graph has less than four nodes.") n = 0 swapcount = 0 deg = G.degree() # Label key for nodes dk = list(deg.keys()) cdf = nx.utils.cumulative_distribution(list(G.degree().values())) window = 1 while n < nswap: wcount = 0 swapped = [] # If the window is small, we just check each time whether the graph is # connected by checking if the nodes that were just separated are still # connected. if window < _window_threshold: # This Boolean keeps track of whether there was a failure or not.
fail = False while wcount < window and n < nswap: # Pick two random edges without creating the edge list. Choose # source nodes from the discrete degree distribution. (ui, xi) = nx.utils.discrete_sequence(2, cdistribution=cdf) # If the source nodes are the same, skip this pair. if ui == xi: continue # Convert an index to a node label. u = dk[ui] x = dk[xi] # Choose targets uniformly from neighbors. v = random.choice(G.neighbors(u)) y = random.choice(G.neighbors(x)) # If the target nodes are the same, skip this pair. if v == y: continue if x not in G[u] and y not in G[v]: G.remove_edge(u, v) G.remove_edge(x, y) G.add_edge(u, x) G.add_edge(v, y) swapped.append((u, v, x, y)) swapcount += 1 n += 1 # If G remains connected... if nx.has_path(G, u, v): wcount += 1 # Otherwise, undo the changes. else: G.add_edge(u, v) G.add_edge(x, y) G.remove_edge(u, x) G.remove_edge(v, y) swapcount -= 1 fail = True # If one of the swaps failed, reduce the window size. if fail: window = int(math.ceil(window / 2)) else: window += 1 # If the window is large, then there is a good chance that a bunch of # swaps will work. It's quicker to do all those swaps first and then # check if the graph remains connected. else: while wcount < window and n < nswap: # Pick two random edges without creating the edge list. Choose # source nodes from the discrete degree distribution. (ui, xi) = nx.utils.discrete_sequence(2, cdistribution=cdf) # If the source nodes are the same, skip this pair. if ui == xi: continue # Convert an index to a node label. u = dk[ui] x = dk[xi] # Choose targets uniformly from neighbors. v = random.choice(G.neighbors(u)) y = random.choice(G.neighbors(x)) # If the target nodes are the same, skip this pair. if v == y: continue if x not in G[u] and y not in G[v]: G.remove_edge(u, v) G.remove_edge(x, y) G.add_edge(u, x) G.add_edge(v, y) swapped.append((u, v, x, y)) swapcount += 1 n += 1 wcount += 1 # If the graph remains connected, increase the window size. 
if nx.is_connected(G): window += 1 # Otherwise, undo the changes from the previous window and decrease # the window size. else: while swapped: (u, v, x, y) = swapped.pop() G.add_edge(u, v) G.add_edge(x, y) G.remove_edge(u, x) G.remove_edge(v, y) swapcount -= 1 window = int(math.ceil(window / 2)) return swapcount networkx-1.11/networkx/algorithms/boundary.py0000644000175000017500000000503112637544450021433 0ustar aricaric00000000000000""" Routines to find the boundary of a set of nodes. Edge boundaries are edges that have only one end in the set of nodes. Node boundaries are nodes outside the set of nodes that have an edge to a node in the set. """ __author__ = """Aric Hagberg (hagberg@lanl.gov)\nPieter Swart (swart@lanl.gov)\nDan Schult (dschult@colgate.edu)""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__=['edge_boundary','node_boundary'] def edge_boundary(G, nbunch1, nbunch2=None): """Return the edge boundary. Edge boundaries are edges that have only one end in the given set of nodes. Parameters ---------- G : graph A networkx graph nbunch1 : list, container Interior node set nbunch2 : list, container Exterior node set. If None then it is set to all of the nodes in G not in nbunch1. Returns ------- elist : list List of edges Notes ----- Nodes in nbunch1 and nbunch2 that are not in G are ignored. nbunch1 and nbunch2 are usually meant to be disjoint, but in the interest of speed and generality, that is not required here. """ if nbunch2 is None: # Then nbunch2 is complement of nbunch1 nset1=set((n for n in nbunch1 if n in G)) return [(n1,n2) for n1 in nset1 for n2 in G[n1] \ if n2 not in nset1] nset2=set(nbunch2) return [(n1,n2) for n1 in nbunch1 if n1 in G for n2 in G[n1] \ if n2 in nset2] def node_boundary(G, nbunch1, nbunch2=None): """Return the node boundary. The node boundary is all nodes in the edge boundary of a given set of nodes that are in the set. 
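The edge boundary defined above can be sketched standalone on a dict-of-sets graph (names here are illustrative, not the NetworkX API):

```python
# Edge boundary: edges with exactly one endpoint in the node set S.
def edge_boundary(adj, S):
    S = set(S)
    return sorted((u, v) for u in S for v in adj[u] if v not in S)

# 4-cycle 0-1-2-3-0; the boundary of {0, 1} is the two crossing edges.
C4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(edge_boundary(C4, {0, 1}))  # -> [(0, 3), (1, 2)]
```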
Parameters ---------- G : graph A networkx graph nbunch1 : list, container Interior node set nbunch2 : list, container Exterior node set. If None then it is set to all of the nodes in G not in nbunch1. Returns ------- nlist : list List of nodes. Notes ----- Nodes in nbunch1 and nbunch2 that are not in G are ignored. nbunch1 and nbunch2 are usually meant to be disjoint, but in the interest of speed and generality, that is not required here. """ nset1=set(n for n in nbunch1 if n in G) bdy=set() for n1 in nset1: bdy.update(G[n1]) bdy -= nset1 if nbunch2 is not None: # else nbunch2 is complement of nbunch1 bdy &= set(nbunch2) return list(bdy) networkx-1.11/networkx/algorithms/clique.py0000644000175000017500000003753612637544500021105 0ustar aricaric00000000000000""" ======= Cliques ======= Find and manipulate cliques of graphs. Note that finding the largest clique of a graph has been shown to be an NP-complete problem; the algorithms here could take a long time to run. http://en.wikipedia.org/wiki/Clique_problem """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from collections import deque from itertools import chain, islice try: from itertools import ifilter as filter except ImportError: pass import networkx from networkx.utils.decorators import * __author__ = """Dan Schult (dschult@colgate.edu)""" __all__ = ['find_cliques', 'find_cliques_recursive', 'make_max_clique_graph', 'make_clique_bipartite' ,'graph_clique_number', 'graph_number_of_cliques', 'node_clique_number', 'number_of_cliques', 'cliques_containing_node', 'project_down', 'project_up', 'enumerate_all_cliques'] @not_implemented_for('directed') def enumerate_all_cliques(G): """Returns all cliques in an undirected graph. This method returns cliques of size (cardinality) k = 1, 2, 3, ..., maxDegree - 1. Where maxDegree is the maximal degree of any node in the graph. 
Parameters ---------- G: undirected graph Returns ------- generator of lists: generator of list for each clique. Notes ----- To obtain a list of all cliques, use :samp:`list(enumerate_all_cliques(G))`. Based on the algorithm published by Zhang et al. (2005) [1]_ and adapted to output all cliques discovered. This algorithm is not applicable on directed graphs. This algorithm ignores self-loops and parallel edges as clique is not conventionally defined with such edges. There are often many cliques in graphs. This algorithm however, hopefully, does not run out of memory since it only keeps candidate sublists in memory and continuously removes exhausted sublists. References ---------- .. [1] Yun Zhang, Abu-Khzam, F.N., Baldwin, N.E., Chesler, E.J., Langston, M.A., Samatova, N.F., Genome-Scale Computational Approaches to Memory-Intensive Applications in Systems Biology. Supercomputing, 2005. Proceedings of the ACM/IEEE SC 2005 Conference, pp. 12, 12-18 Nov. 2005. doi: 10.1109/SC.2005.29. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1559964&isnumber=33129 """ index = {} nbrs = {} for u in G: index[u] = len(index) # Neighbors of u that appear after u in the iteration order of G. nbrs[u] = {v for v in G[u] if v not in index} queue = deque(([u], sorted(nbrs[u], key=index.__getitem__)) for u in G) # Loop invariants: # 1. len(base) is nondecreasing. # 2. (base + cnbrs) is sorted with respect to the iteration order of G. # 3. cnbrs is a set of common neighbors of nodes in base. while queue: base, cnbrs = map(list, queue.popleft()) yield base for i, u in enumerate(cnbrs): # Use generators to reduce memory consumption. queue.append((chain(base, [u]), filter(nbrs[u].__contains__, islice(cnbrs, i + 1, None)))) @not_implemented_for('directed') def find_cliques(G): """Search for all maximal cliques in a graph. Maximal cliques are the largest complete subgraph containing a given node. The largest maximal clique is sometimes called the maximum clique. 
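`find_cliques` unrolls the classic Bron-Kerbosch recursion. The underlying recursive scheme (without the pivoting the implementation adds) can be sketched standalone on a dict-of-sets graph; this is an illustrative sketch, not the library algorithm.

```python
# Classic Bron-Kerbosch: report R when both P (candidates) and
# X (already-explored nodes) are empty; otherwise branch on each v in P.
def bron_kerbosch(adj, R=frozenset(), P=None, X=frozenset()):
    if P is None:
        P = frozenset(adj)
    if not P and not X:
        yield sorted(R)
        return
    for v in sorted(P):
        yield from bron_kerbosch(adj, R | {v},
                                 P & frozenset(adj[v]),
                                 X & frozenset(adj[v]))
        P = P - {v}
        X = X | {v}

# A triangle has a single maximal clique.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(list(bron_kerbosch(triangle)))  # -> [[0, 1, 2]]
```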
Returns ------- generator of lists: generator of member lists, one for each maximal clique See Also -------- find_cliques_recursive : A recursive version of the same algorithm Notes ----- To obtain a list of cliques, use list(find_cliques(G)). Based on the algorithm published by Bron & Kerbosch (1973) [1]_ as adapted by Tomita, Tanaka and Takahashi (2006) [2]_ and discussed in Cazals and Karande (2008) [3]_. The method essentially unrolls the recursion used in the references to avoid issues of recursion stack depth. This algorithm is not suitable for directed graphs. This algorithm ignores self-loops and parallel edges as clique is not conventionally defined with such edges. There are often many cliques in graphs. This algorithm can run out of memory for large graphs. References ---------- .. [1] Bron, C. and Kerbosch, J. 1973. Algorithm 457: finding all cliques of an undirected graph. Commun. ACM 16, 9 (Sep. 1973), 575-577. http://portal.acm.org/citation.cfm?doid=362342.362367 .. [2] Etsuji Tomita, Akira Tanaka, Haruhisa Takahashi, The worst-case time complexity for generating all maximal cliques and computational experiments, Theoretical Computer Science, Volume 363, Issue 1, Computing and Combinatorics, 10th Annual International Conference on Computing and Combinatorics (COCOON 2004), 25 October 2006, Pages 28-42 http://dx.doi.org/10.1016/j.tcs.2006.06.015 .. [3] F. Cazals, C.
Karande, A note on the problem of reporting maximal cliques, Theoretical Computer Science, Volume 407, Issues 1-3, 6 November 2008, Pages 564-568, http://dx.doi.org/10.1016/j.tcs.2008.05.010 """ if len(G) == 0: return adj = {u: {v for v in G[u] if v != u} for u in G} Q = [None] subg = set(G) cand = set(G) u = max(subg, key=lambda u: len(cand & adj[u])) ext_u = cand - adj[u] stack = [] try: while True: if ext_u: q = ext_u.pop() cand.remove(q) Q[-1] = q adj_q = adj[q] subg_q = subg & adj_q if not subg_q: yield Q[:] else: cand_q = cand & adj_q if cand_q: stack.append((subg, cand, ext_u)) Q.append(None) subg = subg_q cand = cand_q u = max(subg, key=lambda u: len(cand & adj[u])) ext_u = cand - adj[u] else: Q.pop() subg, cand, ext_u = stack.pop() except IndexError: pass def find_cliques_recursive(G): """Recursive search for all maximal cliques in a graph. Maximal cliques are the largest complete subgraph containing a given point. The largest maximal clique is sometimes called the maximum clique. Returns ------- list of lists: list of members in each maximal clique See Also -------- find_cliques : An nonrecursive version of the same algorithm Notes ----- Based on the algorithm published by Bron & Kerbosch (1973) [1]_ as adapted by Tomita, Tanaka and Takahashi (2006) [2]_ and discussed in Cazals and Karande (2008) [3]_. This implementation returns a list of lists each of which contains the members of a maximal clique. This algorithm ignores self-loops and parallel edges as clique is not conventionally defined with such edges. References ---------- .. [1] Bron, C. and Kerbosch, J. 1973. Algorithm 457: finding all cliques of an undirected graph. Commun. ACM 16, 9 (Sep. 1973), 575-577. http://portal.acm.org/citation.cfm?doid=362342.362367 .. 
[2] Etsuji Tomita, Akira Tanaka, Haruhisa Takahashi, The worst-case time complexity for generating all maximal cliques and computational experiments, Theoretical Computer Science, Volume 363, Issue 1, Computing and Combinatorics, 10th Annual International Conference on Computing and Combinatorics (COCOON 2004), 25 October 2006, Pages 28-42 http://dx.doi.org/10.1016/j.tcs.2006.06.015 .. [3] F. Cazals, C. Karande, A note on the problem of reporting maximal cliques, Theoretical Computer Science, Volume 407, Issues 1-3, 6 November 2008, Pages 564-568, http://dx.doi.org/10.1016/j.tcs.2008.05.010 """ if len(G) == 0: return iter([]) adj = {u: {v for v in G[u] if v != u} for u in G} Q = [] def expand(subg, cand): u = max(subg, key=lambda u: len(cand & adj[u])) for q in cand - adj[u]: cand.remove(q) Q.append(q) adj_q = adj[q] subg_q = subg & adj_q if not subg_q: yield Q[:] else: cand_q = cand & adj_q if cand_q: for clique in expand(subg_q, cand_q): yield clique Q.pop() return expand(set(G), set(G)) def make_max_clique_graph(G,create_using=None,name=None): """ Create the maximal clique graph of a graph. Finds the maximal cliques and treats these as nodes. The nodes are connected if they have common members in the original graph. Theory has done a lot with clique graphs, but I haven't seen much on maximal clique graphs. Notes ----- This should be the same as make_clique_bipartite followed by project_up, but it saves all the intermediate steps. """ cliq=list(map(set,find_cliques(G))) if create_using: B=create_using B.clear() else: B=networkx.Graph() if name is not None: B.name=name for i,cl in enumerate(cliq): B.add_node(i+1) for j,other_cl in enumerate(cliq[:i]): # if not cl.isdisjoint(other_cl): #Requires 2.6 intersect=cl & other_cl if intersect: # Not empty B.add_edge(i+1,j+1) return B def make_clique_bipartite(G,fpos=None,create_using=None,name=None): """Create a bipartite clique graph from a graph G. 
Nodes of G are retained as the "bottom nodes" of B and cliques of G become "top nodes" of B. Edges are present if a bottom node belongs to the clique represented by the top node. Returns a Graph with additional attribute dict B.node_type which is keyed by nodes to "Bottom" or "Top" appropriately. if fpos is not None, a second additional attribute dict B.pos is created to hold the position tuple of each node for viewing the bipartite graph. """ cliq=list(find_cliques(G)) if create_using: B=create_using B.clear() else: B=networkx.Graph() if name is not None: B.name=name B.add_nodes_from(G) B.node_type={} # New Attribute for B for n in B: B.node_type[n]="Bottom" if fpos: B.pos={} # New Attribute for B delta_cpos=1./len(cliq) delta_ppos=1./G.order() cpos=0. ppos=0. for i,cl in enumerate(cliq): name= -i-1 # Top nodes get negative names B.add_node(name) B.node_type[name]="Top" if fpos: if name not in B.pos: B.pos[name]=(0.2,cpos) cpos +=delta_cpos for v in cl: B.add_edge(name,v) if fpos is not None: if v not in B.pos: B.pos[v]=(0.8,ppos) ppos +=delta_ppos return B def project_down(B,create_using=None,name=None): """Project a bipartite graph B down onto its "bottom nodes". The nodes retain their names and are connected if they share a common top node in the bipartite graph. Returns a Graph. """ if create_using: G=create_using G.clear() else: G=networkx.Graph() if name is not None: G.name=name for v,Bvnbrs in B.adjacency_iter(): if B.node_type[v]=="Bottom": G.add_node(v) for cv in Bvnbrs: G.add_edges_from([(v,u) for u in B[cv] if u!=v]) return G def project_up(B,create_using=None,name=None): """Project a bipartite graph B down onto its "bottom nodes". The nodes retain their names and are connected if they share a common Bottom Node in the Bipartite Graph. Returns a Graph. 
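Both bipartite projections (`project_down` and `project_up`) amount to joining nodes on one side whenever they share a neighbor on the other side. A standalone sketch on a dict-of-sets bipartite graph (all names here are illustrative):

```python
# Project a bipartite graph onto the nodes in 'keep': two kept nodes are
# joined whenever they share a neighbor outside 'keep'.
def project(adj, keep):
    keep = set(keep)
    proj = {v: set() for v in keep}
    for v in keep:
        for shared in adj[v]:
            for u in adj[shared]:
                if u != v and u in keep:
                    proj[v].add(u)
    return proj

# Bottom nodes 'a','b','c'; top nodes 1,2. a,b share top 1; b,c share top 2.
B = {'a': {1}, 'b': {1, 2}, 'c': {2}, 1: {'a', 'b'}, 2: {'b', 'c'}}
p = project(B, {'a', 'b', 'c'})
print(sorted((v, sorted(p[v])) for v in p))  # -> [('a', ['b']), ('b', ['a', 'c']), ('c', ['b'])]
```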
""" if create_using: G=create_using G.clear() else: G=networkx.Graph() if name is not None: G.name=name for v,Bvnbrs in B.adjacency_iter(): if B.node_type[v]=="Top": vname= -v #Change sign of name for Top Nodes G.add_node(vname) for cv in Bvnbrs: # Note: -u changes the name (not Top node anymore) G.add_edges_from([(vname,-u) for u in B[cv] if u!=v]) return G def graph_clique_number(G,cliques=None): """Return the clique number (size of the largest clique) for G. An optional list of cliques can be input if already computed. """ if cliques is None: cliques=find_cliques(G) return max( [len(c) for c in cliques] ) def graph_number_of_cliques(G,cliques=None): """Returns the number of maximal cliques in G. An optional list of cliques can be input if already computed. """ if cliques is None: cliques=list(find_cliques(G)) return len(cliques) def node_clique_number(G,nodes=None,cliques=None): """ Returns the size of the largest maximal clique containing each given node. Returns a single or list depending on input nodes. Optional list of cliques can be input if already computed. 
""" if cliques is None: if nodes is not None: # Use ego_graph to decrease size of graph if isinstance(nodes,list): d={} for n in nodes: H=networkx.ego_graph(G,n) d[n]=max( (len(c) for c in find_cliques(H)) ) else: H=networkx.ego_graph(G,nodes) d=max( (len(c) for c in find_cliques(H)) ) return d # nodes is None--find all cliques cliques=list(find_cliques(G)) if nodes is None: nodes=G.nodes() # none, get entire graph if not isinstance(nodes, list): # check for a list v=nodes # assume it is a single value d=max([len(c) for c in cliques if v in c]) else: d={} for v in nodes: d[v]=max([len(c) for c in cliques if v in c]) return d # if nodes is None: # none, use entire graph # nodes=G.nodes() # elif not isinstance(nodes, list): # check for a list # nodes=[nodes] # assume it is a single value # if cliques is None: # cliques=list(find_cliques(G)) # d={} # for v in nodes: # d[v]=max([len(c) for c in cliques if v in c]) # if nodes in G: # return d[v] #return single value # return d def number_of_cliques(G,nodes=None,cliques=None): """Returns the number of maximal cliques for each node. Returns a single or list depending on input nodes. Optional list of cliques can be input if already computed. """ if cliques is None: cliques=list(find_cliques(G)) if nodes is None: nodes=G.nodes() # none, get entire graph if not isinstance(nodes, list): # check for a list v=nodes # assume it is a single value numcliq=len([1 for c in cliques if v in c]) else: numcliq={} for v in nodes: numcliq[v]=len([1 for c in cliques if v in c]) return numcliq def cliques_containing_node(G,nodes=None,cliques=None): """Returns a list of cliques containing the given node. Returns a single list or list of lists depending on input nodes. Optional list of cliques can be input if already computed. 
""" if cliques is None: cliques=list(find_cliques(G)) if nodes is None: nodes=G.nodes() # none, get entire graph if not isinstance(nodes, list): # check for a list v=nodes # assume it is a single value vcliques=[c for c in cliques if v in c] else: vcliques={} for v in nodes: vcliques[v]=[c for c in cliques if v in c] return vcliques networkx-1.11/networkx/algorithms/distance_regular.py0000644000175000017500000001244112637544500023122 0ustar aricaric00000000000000""" ======================= Distance-regular graphs ======================= """ # Copyright (C) 2011 by # Dheeraj M R # Aric Hagberg # All rights reserved. # BSD license. import networkx as nx __author__ = """\n""".join(['Dheeraj M R ', 'Aric Hagberg ']) __all__ = ['is_distance_regular','intersection_array','global_parameters'] def is_distance_regular(G): """Returns True if the graph is distance regular, False otherwise. A connected graph G is distance-regular if for any nodes x,y and any integers i,j=0,1,...,d (where d is the graph diameter), the number of vertices at distance i from x and distance j from y depends only on i,j and the graph distance between x and y, independently of the choice of x and y. Parameters ---------- G: Networkx graph (undirected) Returns ------- bool True if the graph is Distance Regular, False otherwise Examples -------- >>> G=nx.hypercube_graph(6) >>> nx.is_distance_regular(G) True See Also -------- intersection_array, global_parameters Notes ----- For undirected and simple graphs only References ---------- .. [1] Brouwer, A. E.; Cohen, A. M.; and Neumaier, A. Distance-Regular Graphs. New York: Springer-Verlag, 1989. .. [2] Weisstein, Eric W. "Distance-Regular Graph." http://mathworld.wolfram.com/Distance-RegularGraph.html """ try: a=intersection_array(G) return True except nx.NetworkXError: return False def global_parameters(b,c): """Return global parameters for a given intersection array. 
Given a distance-regular graph G with integers b_i, c_i, i = 0,...,d such that for any 2 vertices x,y in G at a distance i=d(x,y), there are exactly c_i neighbors of y at a distance of i-1 from x and b_i neighbors of y at a distance of i+1 from x. Thus, a distance regular graph has the global parameters, [[c_0,a_0,b_0],[c_1,a_1,b_1],...,[c_d,a_d,b_d]] for the intersection array [b_0,b_1,...,b_{d-1};c_1,c_2,...,c_d] where a_i+b_i+c_i=k, k = degree of every vertex. Parameters ---------- b,c: tuple of lists Returns ------- p : list of three-tuples Examples -------- >>> G=nx.dodecahedral_graph() >>> b,c=nx.intersection_array(G) >>> list(nx.global_parameters(b,c)) [(0, 0, 3), (1, 0, 2), (1, 1, 1), (1, 1, 1), (2, 0, 1), (3, 0, 0)] References ---------- .. [1] Weisstein, Eric W. "Global Parameters." From MathWorld--A Wolfram Web Resource. http://mathworld.wolfram.com/GlobalParameters.html See Also -------- intersection_array """ d=len(b) ba=b[:] ca=c[:] ba.append(0) ca.insert(0,0) k = ba[0] aa = [k-x-y for x,y in zip(ba,ca)] return zip(*[ca,aa,ba]) def intersection_array(G): """Returns the intersection array of a distance-regular graph. Given a distance-regular graph G with integers b_i, c_i, i = 0,...,d such that for any 2 vertices x,y in G at a distance i=d(x,y), there are exactly c_i neighbors of y at a distance of i-1 from x and b_i neighbors of y at a distance of i+1 from x. A distance regular graph's intersection array is given by, [b_0,b_1,...,b_{d-1};c_1,c_2,...,c_d] Parameters ---------- G: Networkx graph (undirected) Returns ------- b,c: tuple of lists Examples -------- >>> G=nx.icosahedral_graph() >>> nx.intersection_array(G) ([5, 2, 1], [1, 2, 5]) References ---------- .. [1] Weisstein, Eric W. "Intersection Array." From MathWorld--A Wolfram Web Resource.
http://mathworld.wolfram.com/IntersectionArray.html See Also -------- global_parameters """ if G.is_multigraph() or G.is_directed(): raise nx.NetworkXError('Not implemented for directed or multiedge graphs.') # test for regular graph (all degrees must be equal) degree = G.degree_iter() (_,k) = next(degree) for _,knext in degree: if knext != k: raise nx.NetworkXError('Graph is not distance regular.') k = knext path_length = nx.all_pairs_shortest_path_length(G) diameter = max([max(path_length[n].values()) for n in path_length]) bint = {} # 'b' intersection array cint = {} # 'c' intersection array for u in G: for v in G: try: i = path_length[u][v] except KeyError: # graph must be connected raise nx.NetworkXError('Graph is not distance regular.') # number of neighbors of v at a distance of i-1 from u c = len([n for n in G[v] if path_length[n][u]==i-1]) # number of neighbors of v at a distance of i+1 from u b = len([n for n in G[v] if path_length[n][u]==i+1]) # b,c are independent of u and v if cint.get(i,c) != c or bint.get(i,b) != b: raise nx.NetworkXError('Graph is not distance regular') bint[i] = b cint[i] = c return ([bint.get(i,0) for i in range(diameter)], [cint.get(i+1,0) for i in range(diameter)]) networkx-1.11/networkx/algorithms/minors.py0000644000175000017500000002622612637544500021124 0ustar aricaric00000000000000# minors.py - functions for computing minors of graphs # # Copyright 2015 Jeffrey Finkelstein . # # This file is part of NetworkX. # # NetworkX is distributed under a BSD license; see LICENSE.txt for more # information. """Provides functions for computing minors of a graph.""" from itertools import chain from itertools import combinations from itertools import permutations from itertools import product __all__ = ['contracted_edge', 'contracted_nodes', 'identified_nodes', 'quotient_graph'] def peek(iterable): """Returns an arbitrary element of ``iterable`` without removing it.
This is most useful for peeking at an arbitrary element of a set:: >>> peek({3, 2, 1}) 1 >>> peek('hello') 'h' """ return next(iter(iterable)) def equivalence_classes(iterable, relation): """Returns the set of equivalence classes of the given ``iterable`` under the specified equivalence relation. ``relation`` must be a Boolean-valued function that takes two arguments. It must represent an equivalence relation (that is, the relation induced by the function must be reflexive, symmetric, and transitive). The return value is a set of sets. It is a partition of the elements of ``iterable``; duplicate elements will be ignored so it makes the most sense for ``iterable`` to be a :class:`set`. """ # For simplicity of implementation, we initialize the return value as a # list of lists, then convert it to a set of sets at the end of the # function. blocks = [] # Determine the equivalence class for each element of the iterable. for y in iterable: # Each element y must be in *exactly one* equivalence class. # # Each block is guaranteed to be non-empty for block in blocks: x = peek(block) if relation(x, y): block.append(y) break else: # If the element y is not part of any known equivalence class, it # must be in its own, so we create a new singleton equivalence # class for it. blocks.append([y]) return {frozenset(block) for block in blocks} def quotient_graph(G, node_relation, edge_relation=None, create_using=None): """Returns the quotient graph of ``G`` under the specified equivalence relation on nodes. Parameters ---------- G : NetworkX graph The graph for which to return the quotient graph with the specified node relation. node_relation : Boolean function with two arguments This function must represent an equivalence relation on the nodes of ``G``. It must take two arguments *u* and *v* and return ``True`` exactly when *u* and *v* are in the same equivalence class. The equivalence classes form the nodes in the returned graph.
edge_relation : Boolean function with two arguments This function must represent an edge relation on the *blocks* of ``G`` in the partition induced by ``node_relation``. It must take two arguments, *B* and *C*, each one a set of nodes, and return ``True`` exactly when there should be an edge joining block *B* to block *C* in the returned graph. If ``edge_relation`` is not specified, it is assumed to be the following relation. Block *B* is related to block *C* if and only if some node in *B* is adjacent to some node in *C*, according to the edge set of ``G``. create_using : NetworkX graph If specified, this must be an instance of a NetworkX graph class. The nodes and edges of the quotient graph will be added to this graph and returned. If not specified, the returned graph will have the same type as the input graph. Returns ------- NetworkX graph The quotient graph of ``G`` under the equivalence relation specified by ``node_relation``. Examples -------- The quotient graph of the complete bipartite graph under the "same neighbors" equivalence relation is `K_2`. Under this relation, two nodes are equivalent if they are not adjacent but have the same neighbor set:: >>> import networkx as nx >>> G = nx.complete_bipartite_graph(2, 3) >>> same_neighbors = lambda u, v: (u not in G[v] and v not in G[u] ... and G[u] == G[v]) >>> Q = nx.quotient_graph(G, same_neighbors) >>> K2 = nx.complete_graph(2) >>> nx.is_isomorphic(Q, K2) True The quotient graph of a directed graph under the "same strongly connected component" equivalence relation is the condensation of the graph (see :func:`condensation`). This example comes from the Wikipedia article *`Strongly connected component`_*:: >>> import networkx as nx >>> G = nx.DiGraph() >>> edges = ['ab', 'be', 'bf', 'bc', 'cg', 'cd', 'dc', 'dh', 'ea', ... 
'ef', 'fg', 'gf', 'hd', 'hf'] >>> G.add_edges_from(tuple(x) for x in edges) >>> components = list(nx.strongly_connected_components(G)) >>> sorted(sorted(component) for component in components) [['a', 'b', 'e'], ['c', 'd', 'h'], ['f', 'g']] >>> >>> C = nx.condensation(G, components) >>> component_of = C.graph['mapping'] >>> same_component = lambda u, v: component_of[u] == component_of[v] >>> Q = nx.quotient_graph(G, same_component) >>> nx.is_isomorphic(C, Q) True Node identification can be represented as the quotient of a graph under the equivalence relation that places the two nodes in one block and each other node in its own singleton block:: >>> import networkx as nx >>> K24 = nx.complete_bipartite_graph(2, 4) >>> K34 = nx.complete_bipartite_graph(3, 4) >>> C = nx.contracted_nodes(K34, 1, 2) >>> nodes = {1, 2} >>> is_contracted = lambda u, v: u in nodes and v in nodes >>> Q = nx.quotient_graph(K34, is_contracted) >>> nx.is_isomorphic(Q, C) True >>> nx.is_isomorphic(Q, K24) True .. _Strongly connected component: https://en.wikipedia.org/wiki/Strongly_connected_component """ H = type(create_using)() if create_using is not None else type(G)() # Compute the blocks of the partition on the nodes of G induced by the # equivalence relation R. H.add_nodes_from(equivalence_classes(G, node_relation)) # By default, the edge relation is the relation defined as follows. B is # adjacent to C if a node in B is adjacent to a node in C, according to the # edge set of G. # # This is not a particularly efficient implementation of this relation: # there are O(n^2) pairs to check and each check may require O(log n) time # (to check set membership). This can certainly be parallelized. 
if edge_relation is None: edge_relation = lambda b, c: any(v in G[u] for u, v in product(b, c)) block_pairs = permutations(H, 2) if H.is_directed() else combinations(H, 2) H.add_edges_from((b, c) for (b, c) in block_pairs if edge_relation(b, c)) return H def contracted_nodes(G, u, v, self_loops=True): """Returns the graph that results from contracting ``u`` and ``v``. Node contraction identifies the two nodes as a single node incident to any edge that was incident to the original two nodes. Parameters ---------- G : NetworkX graph The graph whose nodes will be contracted. u, v : nodes Must be nodes in ``G``. self_loops : Boolean If this is ``True``, any edges joining ``u`` and ``v`` in ``G`` become self-loops on the new node in the returned graph. Returns ------- Networkx graph A new graph object of the same type as ``G`` (leaving ``G`` unmodified) with ``u`` and ``v`` identified in a single node. The right node ``v`` will be merged into the node ``u``, so only ``u`` will appear in the returned graph. Examples -------- Contracting two nonadjacent nodes of the cycle graph on four nodes `C_4` yields the path graph (ignoring parallel edges):: >>> import networkx as nx >>> G = nx.cycle_graph(4) >>> M = nx.contracted_nodes(G, 1, 3) >>> P3 = nx.path_graph(3) >>> nx.is_isomorphic(M, P3) True See also -------- contracted_edge quotient_graph Notes ----- This function is also available as ``identified_nodes``. 
""" H = G.copy() if H.is_directed(): in_edges = ((w, u, d) for w, x, d in G.in_edges(v, data=True) if self_loops or w != u) out_edges = ((u, w, d) for x, w, d in G.out_edges(v, data=True) if self_loops or w != u) new_edges = chain(in_edges, out_edges) else: new_edges = ((u, w, d) for x, w, d in G.edges(v, data=True) if self_loops or w != u) v_data = H.node[v] H.remove_node(v) H.add_edges_from(new_edges) if 'contraction' in H.node[u]: H.node[u]['contraction'][v] = v_data else: H.node[u]['contraction'] = {v: v_data} return H identified_nodes = contracted_nodes def contracted_edge(G, edge, self_loops=True): """Returns the graph that results from contracting the specified edge. Edge contraction identifies the two endpoints of the edge as a single node incident to any edge that was incident to the original two nodes. A graph that results from edge contraction is called a *minor* of the original graph. Parameters ---------- G : NetworkX graph The graph whose edge will be contracted. edge : tuple Must be a pair of nodes in ``G``. self_loops : Boolean If this is ``True``, any edges (including ``edge``) joining the endpoints of ``edge`` in ``G`` become self-loops on the new node in the returned graph. Returns ------- Networkx graph A new graph object of the same type as ``G`` (leaving ``G`` unmodified) with endpoints of ``edge`` identified in a single node. The right node of ``edge`` will be merged into the left one, so only the left one will appear in the returned graph. Raises ------ ValueError If ``edge`` is not an edge in ``G``. Examples -------- Attempting to contract two nonadjacent nodes yields an error:: >>> import networkx as nx >>> G = nx.cycle_graph(4) >>> nx.contracted_edge(G, (1, 3)) Traceback (most recent call last): ... 
ValueError: Edge (1, 3) does not exist in graph G; cannot contract it Contracting two adjacent nodes in the cycle graph on *n* nodes yields the cycle graph on *n - 1* nodes:: >>> import networkx as nx >>> C5 = nx.cycle_graph(5) >>> C4 = nx.cycle_graph(4) >>> M = nx.contracted_edge(C5, (0, 1), self_loops=False) >>> nx.is_isomorphic(M, C4) True See also -------- contracted_nodes quotient_graph """ if not G.has_edge(*edge): raise ValueError('Edge {0} does not exist in graph G; cannot contract' ' it'.format(edge)) return contracted_nodes(G, *edge, self_loops=self_loops) networkx-1.11/networkx/algorithms/euler.py0000644000175000017500000000737212637544500020732 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Eulerian circuits and graphs. """ import networkx as nx __author__ = """\n""".join(['Nima Mohammadi (nima.irt[AT]gmail.com)', 'Aric Hagberg ']) # Copyright (C) 2010 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['is_eulerian', 'eulerian_circuit'] def is_eulerian(G): """Return True if G is an Eulerian graph, False otherwise. An Eulerian graph is a graph with an Eulerian circuit. Parameters ---------- G : graph A NetworkX Graph Examples -------- >>> nx.is_eulerian(nx.DiGraph({0:[3], 1:[2], 2:[3], 3:[0, 1]})) True >>> nx.is_eulerian(nx.complete_graph(5)) True >>> nx.is_eulerian(nx.petersen_graph()) False Notes ----- This implementation requires the graph to be connected (or strongly connected for directed graphs). 
""" if G.is_directed(): # Every node must have equal in degree and out degree for n in G.nodes_iter(): if G.in_degree(n) != G.out_degree(n): return False # Must be strongly connected if not nx.is_strongly_connected(G): return False else: # An undirected Eulerian graph has no vertices of odd degrees for v,d in G.degree_iter(): if d % 2 != 0: return False # Must be connected if not nx.is_connected(G): return False return True def eulerian_circuit(G, source=None): """Return the edges of an Eulerian circuit in G. An Eulerian circuit is a path that crosses every edge in G exactly once and finishes at the starting node. Parameters ---------- G : NetworkX Graph or DiGraph A directed or undirected graph source : node, optional Starting node for circuit. Returns ------- edges : generator A generator that produces edges in the Eulerian circuit. Raises ------ NetworkXError If the graph is not Eulerian. See Also -------- is_eulerian Notes ----- Linear time algorithm, adapted from [1]_. General information about Euler tours [2]_. References ---------- .. [1] J. Edmonds, E. L. Johnson. Matching, Euler tours and the Chinese postman. Mathematical programming, Volume 5, Issue 1 (1973), 111-114. .. 
[2] http://en.wikipedia.org/wiki/Eulerian_path Examples -------- >>> G=nx.complete_graph(3) >>> list(nx.eulerian_circuit(G)) [(0, 2), (2, 1), (1, 0)] >>> list(nx.eulerian_circuit(G,source=1)) [(1, 2), (2, 0), (0, 1)] >>> [u for u,v in nx.eulerian_circuit(G)] # nodes in circuit [0, 2, 1] """ from operator import itemgetter if not is_eulerian(G): raise nx.NetworkXError("G is not Eulerian.") g = G.__class__(G) # copy graph structure (not attributes) # set starting node if source is None: v = next(g.nodes_iter()) else: v = source if g.is_directed(): degree = g.in_degree edges = g.in_edges_iter get_vertex = itemgetter(0) else: degree = g.degree edges = g.edges_iter get_vertex = itemgetter(1) vertex_stack = [v] last_vertex = None while vertex_stack: current_vertex = vertex_stack[-1] if degree(current_vertex) == 0: if last_vertex is not None: yield (last_vertex, current_vertex) last_vertex = current_vertex vertex_stack.pop() else: random_edge = next(edges(current_vertex)) vertex_stack.append(get_vertex(random_edge)) g.remove_edge(*random_edge) networkx-1.11/networkx/algorithms/coloring/0000755000175000017500000000000012653231454021046 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/coloring/greedy_coloring_with_interchange.py0000644000175000017500000001501612637544500030201 0ustar aricaric00000000000000import networkx as nx import itertools __all__ = ['greedy_coloring_with_interchange'] class Node(object): __slots__ = ['node_id', 'color', 'adj_list', 'adj_color'] def __init__(self, node_id, n): self.node_id = node_id self.color = -1 self.adj_list = None self.adj_color = [None for _ in range(n)] def __repr__(self): return "Node_id: {0}, Color: {1}, Adj_list: ({2}), \ adj_color: ({3})".format( self.node_id, self.color, self.adj_list, self.adj_color) def assign_color(self, adj_entry, color): adj_entry.col_prev = None adj_entry.col_next = self.adj_color[color] self.adj_color[color] = adj_entry if adj_entry.col_next != None: adj_entry.col_next.col_prev = 
adj_entry def clear_color(self, adj_entry, color): if adj_entry.col_prev == None: self.adj_color[color] = adj_entry.col_next else: adj_entry.col_prev.col_next = adj_entry.col_next if adj_entry.col_next != None: adj_entry.col_next.col_prev = adj_entry.col_prev def iter_neighbors(self): adj_node = self.adj_list while adj_node != None: yield adj_node adj_node = adj_node.next def iter_neighbors_color(self, color): adj_color_node = self.adj_color[color] while adj_color_node != None: yield adj_color_node.node_id adj_color_node = adj_color_node.col_next class AdjEntry(object): __slots__ = ['node_id', 'next', 'mate', 'col_next', 'col_prev'] def __init__(self, node_id): self.node_id = node_id self.next = None self.mate = None self.col_next = None self.col_prev = None def __repr__(self): return "Node_id: {0}, Next: ({1}), Mate: ({2}), \ col_next: ({3}), col_prev: ({4})".format( self.node_id, self.next, self.mate.node_id, None if self.col_next == None else self.col_next.node_id, None if self.col_prev == None else self.col_prev.node_id ) def greedy_coloring_with_interchange(original_graph, nodes): """ This procedure is an adaption of the algorithm described by [1]_, and is an implementation of coloring with interchange. Please be advised, that the datastructures used are rather complex because they are optimized to minimize the time spent identifying subcomponents of the graph, which are possible candidates for color interchange. References ---------- .. [1] Maciej M. Syslo, Marsingh Deo, Janusz S. Kowalik, Discrete Optimization Algorithms with Pascal Programs, 415-424, 1983. ISBN 0-486-45353-7. 
""" n = len(original_graph) graph = {node_id: Node(node_id, n) for node_id in original_graph} for (node1, node2) in original_graph.edges_iter(): adj_entry1 = AdjEntry(node2) adj_entry2 = AdjEntry(node1) adj_entry1.mate = adj_entry2 adj_entry2.mate = adj_entry1 node1_head = graph[node1].adj_list adj_entry1.next = node1_head graph[node1].adj_list = adj_entry1 node2_head = graph[node2].adj_list adj_entry2.next = node2_head graph[node2].adj_list = adj_entry2 k = 0 for node in nodes: # Find the smallest possible, unused color neighbors = graph[node].iter_neighbors() col_used = {graph[adj_node.node_id].color for adj_node in neighbors} col_used.discard(-1) k1 = next(itertools.dropwhile( lambda x: x in col_used, itertools.count())) # k1 is now the lowest available color if k1 > k: connected = True visited = set() col1 = -1 col2 = -1 while connected and col1 < k: col1 += 1 neighbor_cols = ( graph[node].iter_neighbors_color(col1)) col1_adj = [it for it in neighbor_cols] col2 = col1 while connected and col2 < k: col2 += 1 visited = set(col1_adj) frontier = list(col1_adj) i = 0 while i < len(frontier): search_node = frontier[i] i += 1 col_opp = ( col2 if graph[search_node].color == col1 else col1) neighbor_cols = ( graph[search_node].iter_neighbors_color(col_opp)) for neighbor in neighbor_cols: if neighbor not in visited: visited.add(neighbor) frontier.append(neighbor) # Search if node is not adj to any col2 vertex connected = len(visited.intersection( graph[node].iter_neighbors_color(col2))) > 0 # If connected is false then we can swap !!! 
if not connected: # Update all the nodes in the component for search_node in visited: graph[search_node].color = ( col2 if graph[search_node].color == col1 else col1) col2_adj = graph[search_node].adj_color[col2] graph[search_node].adj_color[col2] = ( graph[search_node].adj_color[col1]) graph[search_node].adj_color[col1] = col2_adj # Update all the neighboring nodes for search_node in visited: col = graph[search_node].color col_opp = col1 if col == col2 else col2 for adj_node in graph[search_node].iter_neighbors(): if graph[adj_node.node_id].color != col_opp: # Direct reference to entry adj_mate = adj_node.mate graph[adj_node.node_id].clear_color( adj_mate, col_opp) graph[adj_node.node_id].assign_color(adj_mate, col) k1 = col1 # We can color this node color k1 graph[node].color = k1 k = max(k1, k) # Update the neighbors of this node for adj_node in graph[node].iter_neighbors(): adj_mate = adj_node.mate graph[adj_node.node_id].assign_color(adj_mate, k1) return {node.node_id: node.color for node in graph.values()} networkx-1.11/networkx/algorithms/coloring/greedy_coloring.py0000644000175000017500000002272312637544500024602 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Greedy graph coloring using various strategies. """ # Copyright (C) 2014 by # Christian Olsson # Jan Aagaard Meier # Henrik Haugbølle # All rights reserved. # BSD license. import networkx as nx import random import itertools from . 
import greedy_coloring_with_interchange as _interchange __author__ = "\n".join(["Christian Olsson ", "Jan Aagaard Meier ", "Henrik Haugbølle "]) __all__ = [ 'greedy_color', 'strategy_largest_first', 'strategy_random_sequential', 'strategy_smallest_last', 'strategy_independent_set', 'strategy_connected_sequential', 'strategy_connected_sequential_dfs', 'strategy_connected_sequential_bfs', 'strategy_saturation_largest_first' ] def min_degree_node(G): return min(G, key=G.degree) def max_degree_node(G): return max(G, key=G.degree) def strategy_largest_first(G, colors): """ Largest first (lf) ordering. Ordering the nodes by largest degree first. """ nodes = G.nodes() nodes.sort(key=lambda node: -G.degree(node)) return nodes def strategy_random_sequential(G, colors): """ Random sequential (RS) ordering. Scrambles nodes into random ordering. """ nodes = G.nodes() random.shuffle(nodes) return nodes def strategy_smallest_last(G, colors): """ Smallest last (sl). Picking the node with smallest degree first, subtracting it from the graph, and starting over with the new smallest degree node. When the graph is empty, the reverse ordering of the one built is returned. """ len_g = len(G) available_g = G.copy() nodes = [None] * len_g for i in range(len_g): node = min_degree_node(available_g) available_g.remove_node(node) nodes[len_g - i - 1] = node return nodes def strategy_independent_set(G, colors): """ Greedy independent set ordering (GIS). Generates a maximal independent set of nodes, and assigns color C to all nodes in this set. This set of nodes is now removed from the graph, and the algorithm runs again. 
""" len_g = len(G) no_colored = 0 k = 0 uncolored_g = G.copy() while no_colored < len_g: # While there are uncolored nodes available_g = uncolored_g.copy() while len(available_g): # While there are still nodes available node = min_degree_node(available_g) colors[node] = k # assign color to values no_colored += 1 uncolored_g.remove_node(node) # Remove node and its neighbors from available available_g.remove_nodes_from(available_g.neighbors(node) + [node]) k += 1 return None def strategy_connected_sequential_bfs(G, colors): """ Connected sequential ordering (CS). Yield nodes in such an order, that each node, except the first one, has at least one neighbour in the preceeding sequence. The sequence is generated using BFS) """ return strategy_connected_sequential(G, colors, 'bfs') def strategy_connected_sequential_dfs(G, colors): """ Connected sequential ordering (CS). Yield nodes in such an order, that each node, except the first one, has at least one neighbour in the preceeding sequence. The sequence is generated using DFS) """ return strategy_connected_sequential(G, colors, 'dfs') def strategy_connected_sequential(G, colors, traversal='bfs'): """ Connected sequential ordering (CS). Yield nodes in such an order, that each node, except the first one, has at least one neighbour in the preceeding sequence. The sequence can be generated using both BFS and DFS search (using the strategy_connected_sequential_bfs and strategy_connected_sequential_dfs method). The default is bfs. 
""" for component_graph in nx.connected_component_subgraphs(G): source = component_graph.nodes()[0] yield source # Pick the first node as source if traversal == 'bfs': tree = nx.bfs_edges(component_graph, source) elif traversal == 'dfs': tree = nx.dfs_edges(component_graph, source) else: raise nx.NetworkXError( 'Please specify bfs or dfs for connected sequential ordering') for (_, end) in tree: # Then yield nodes in the order traversed by either BFS or DFS yield end def strategy_saturation_largest_first(G, colors): """ Saturation largest first (SLF). Also known as degree saturation (DSATUR). """ len_g = len(G) no_colored = 0 distinct_colors = {} for node in G.nodes_iter(): distinct_colors[node] = set() while no_colored != len_g: if no_colored == 0: # When sat. for all nodes is 0, yield the node with highest degree no_colored += 1 node = max_degree_node(G) yield node for neighbour in G.neighbors_iter(node): distinct_colors[neighbour].add(0) else: highest_saturation = -1 highest_saturation_nodes = [] for node, distinct in distinct_colors.items(): if node not in colors: # If the node is not already colored saturation = len(distinct) if saturation > highest_saturation: highest_saturation = saturation highest_saturation_nodes = [node] elif saturation == highest_saturation: highest_saturation_nodes.append(node) if len(highest_saturation_nodes) == 1: node = highest_saturation_nodes[0] else: # Return the node with highest degree max_degree = -1 max_node = None for node in highest_saturation_nodes: degree = G.degree(node) if degree > max_degree: max_node = node max_degree = degree node = max_node no_colored += 1 yield node color = colors[node] for neighbour in G.neighbors_iter(node): distinct_colors[neighbour].add(color) def greedy_color(G, strategy=strategy_largest_first, interchange=False): """Color a graph using various strategies of greedy graph coloring. The strategies are described in [1]_. 
Attempts to color a graph using as few colors as possible, where no neighbours of a node can have same color as the node itself. Parameters ---------- G : NetworkX graph strategy : function(G, colors) A function that provides the coloring strategy, by returning nodes in the ordering they should be colored. G is the graph, and colors is a dict of the currently assigned colors, keyed by nodes. You can pass your own ordering function, or use one of the built in: * strategy_largest_first * strategy_random_sequential * strategy_smallest_last * strategy_independent_set * strategy_connected_sequential_bfs * strategy_connected_sequential_dfs * strategy_connected_sequential (alias of strategy_connected_sequential_bfs) * strategy_saturation_largest_first (also known as DSATUR) interchange: bool Will use the color interchange algorithm described by [2]_ if set to true. Note that saturation largest first and independent set do not work with interchange. Furthermore, if you use interchange with your own strategy function, you cannot rely on the values in the colors argument. Returns ------- A dictionary with keys representing nodes and values representing corresponding coloring. Examples -------- >>> G = nx.cycle_graph(4) >>> d = nx.coloring.greedy_color(G, strategy=nx.coloring.strategy_largest_first) >>> d in [{0: 0, 1: 1, 2: 0, 3: 1}, {0: 1, 1: 0, 2: 1, 3: 0}] True References ---------- .. [1] Adrian Kosowski, and Krzysztof Manuszewski, Classical Coloring of Graphs, Graph Colorings, 2-19, 2004. ISBN 0-8218-3458-4. .. [2] Maciej M. Syslo, Marsingh Deo, Janusz S. Kowalik, Discrete Optimization Algorithms with Pascal Programs, 415-424, 1983. ISBN 0-486-45353-7. 
""" colors = {} # dictionary to keep track of the colors of the nodes if len(G): if interchange and ( strategy == strategy_independent_set or strategy == strategy_saturation_largest_first): raise nx.NetworkXPointlessConcept( 'Interchange is not applicable for GIS and SLF') nodes = strategy(G, colors) if nodes: if interchange: return (_interchange .greedy_coloring_with_interchange(G, nodes)) else: for node in nodes: # set to keep track of colors of neighbours neighbour_colors = set() for neighbour in G.neighbors_iter(node): if neighbour in colors: neighbour_colors.add(colors[neighbour]) for color in itertools.count(): if color not in neighbour_colors: break # assign the node the newly found color colors[node] = color return colors networkx-1.11/networkx/algorithms/coloring/__init__.py0000644000175000017500000000012612637544450023163 0ustar aricaric00000000000000from networkx.algorithms.coloring.greedy_coloring import * __all__ = ['greedy_color'] networkx-1.11/networkx/algorithms/coloring/tests/0000755000175000017500000000000012653231454022210 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/coloring/tests/test_coloring.py0000644000175000017500000005461512637544500025451 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """Greedy coloring test suite. 
Run with nose: nosetests -v test_coloring.py """ __author__ = "\n".join(["Christian Olsson ", "Jan Aagaard Meier ", "Henrik Haugbølle "]) import networkx as nx from nose.tools import * class TestColoring: ############################## RS tests ############################## def test_rs_empty(self): graph = emptyGraph() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_random_sequential, interchange=False) assert_true(verify_length(coloring, 0)) assert_true(verify_coloring(graph, coloring)) def test_rs_oneNode(self): graph = oneNodeGraph() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_random_sequential, interchange=False) assert_true(verify_length(coloring, 1)) assert_true(verify_coloring(graph, coloring)) def test_rs_twoNodes(self): graph = twoNodesGraph() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_random_sequential, interchange=False) assert_true(verify_length(coloring, 2)) assert_true(verify_coloring(graph, coloring)) def test_rs_threeNodeClique(self): graph = threeNodeClique() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_random_sequential, interchange=False) assert_true(verify_length(coloring, 3)) assert_true(verify_coloring(graph, coloring)) def test_rs_shc(self): graph = rs_shc() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_random_sequential, interchange=False) assert_true(verify_length(coloring, 2) or verify_length(coloring, 3)) assert_true(verify_coloring(graph, coloring)) ############################## SLF tests ############################## def test_slf_empty(self): graph = emptyGraph() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_saturation_largest_first, interchange=False) assert_true(verify_length(coloring, 0)) assert_true(verify_coloring(graph, coloring)) def test_slf_oneNode(self): graph = oneNodeGraph() coloring = nx.coloring.greedy_color(graph, 
strategy=nx.coloring.strategy_saturation_largest_first, interchange=False) assert_true(verify_length(coloring, 1)) assert_true(verify_coloring(graph, coloring)) def test_slf_twoNodes(self): graph = twoNodesGraph() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_saturation_largest_first, interchange=False) assert_true(verify_length(coloring, 2)) assert_true(verify_coloring(graph, coloring)) def test_slf_threeNodeClique(self): graph = threeNodeClique() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_saturation_largest_first, interchange=False) assert_true(verify_length(coloring, 3)) assert_true(verify_coloring(graph, coloring)) def test_slf_shc(self): graph = slf_shc() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_saturation_largest_first, interchange=False) assert_true(verify_length(coloring, 3) or verify_length(coloring, 4)) assert_true(verify_coloring(graph, coloring)) def test_slf_hc(self): graph = slf_hc() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_saturation_largest_first, interchange=False) assert_true(verify_length(coloring, 4)) assert_true(verify_coloring(graph, coloring)) ############################## LF tests ############################## def test_lf_empty(self): graph = emptyGraph() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_largest_first, interchange=False) assert_true(verify_length(coloring, 0)) assert_true(verify_coloring(graph, coloring)) def test_lf_oneNode(self): graph = oneNodeGraph() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_largest_first, interchange=False) assert_true(verify_length(coloring, 1)) assert_true(verify_coloring(graph, coloring)) def test_lf_twoNodes(self): graph = twoNodesGraph() coloring = nx.coloring.greedy_color(graph, strategy=nx.coloring.strategy_largest_first, interchange=False) assert_true(verify_length(coloring, 2)) assert_true(verify_coloring(graph, 
                                    coloring))

    def test_lf_threeNodeClique(self):
        graph = threeNodeClique()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_largest_first,
            interchange=False)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_lf_shc(self):
        graph = lf_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_largest_first,
            interchange=False)
        assert_true(verify_length(coloring, 2) or verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_lf_hc(self):
        graph = lf_hc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_largest_first,
            interchange=False)
        assert_true(verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    ############################## SL tests ##############################
    def test_sl_empty(self):
        graph = emptyGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=False)
        assert_true(verify_length(coloring, 0))
        assert_true(verify_coloring(graph, coloring))

    def test_sl_oneNode(self):
        graph = oneNodeGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=False)
        assert_true(verify_length(coloring, 1))
        assert_true(verify_coloring(graph, coloring))

    def test_sl_twoNodes(self):
        graph = twoNodesGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=False)
        assert_true(verify_length(coloring, 2))
        assert_true(verify_coloring(graph, coloring))

    def test_sl_threeNodeClique(self):
        graph = threeNodeClique()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=False)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_sl_shc(self):
        graph = sl_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=False)
        assert_true(verify_length(coloring, 3) or
                    verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    def test_sl_hc(self):
        graph = sl_hc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=False)
        assert_true(verify_length(coloring, 5))
        assert_true(verify_coloring(graph, coloring))

    ############################## GIS tests ##############################
    def test_gis_empty(self):
        graph = emptyGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_independent_set,
            interchange=False)
        assert_true(verify_length(coloring, 0))
        assert_true(verify_coloring(graph, coloring))

    def test_gis_oneNode(self):
        graph = oneNodeGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_independent_set,
            interchange=False)
        assert_true(verify_length(coloring, 1))
        assert_true(verify_coloring(graph, coloring))

    def test_gis_twoNodes(self):
        graph = twoNodesGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_independent_set,
            interchange=False)
        assert_true(verify_length(coloring, 2))
        assert_true(verify_coloring(graph, coloring))

    def test_gis_threeNodeClique(self):
        graph = threeNodeClique()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_independent_set,
            interchange=False)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_gis_shc(self):
        graph = gis_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_independent_set,
            interchange=False)
        assert_true(verify_length(coloring, 2) or verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_gis_hc(self):
        graph = gis_hc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_independent_set,
            interchange=False)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    ############################## CS tests ##############################
    def test_cs_empty(self):
        graph = emptyGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential,
            interchange=False)
        assert_true(verify_length(coloring, 0))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_oneNode(self):
        graph = oneNodeGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential,
            interchange=False)
        assert_true(verify_length(coloring, 1))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_twoNodes(self):
        graph = twoNodesGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential,
            interchange=False)
        assert_true(verify_length(coloring, 2))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_threeNodeClique(self):
        graph = threeNodeClique()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential,
            interchange=False)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_shc(self):
        graph = cs_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential,
            interchange=False)
        assert_true(verify_length(coloring, 3) or verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_dfs_empty(self):
        graph = emptyGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential_dfs,
            interchange=False)
        assert_true(verify_length(coloring, 0))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_dfs_oneNode(self):
        graph = oneNodeGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential_dfs,
            interchange=False)
        assert_true(verify_length(coloring, 1))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_dfs_twoNodes(self):
        graph = twoNodesGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential_dfs,
            interchange=False)
        assert_true(verify_length(coloring, 2))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_dfs_threeNodeClique(self):
        graph = threeNodeClique()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential_dfs,
            interchange=False)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_dfs_shc(self):
        graph = cs_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential_dfs,
            interchange=False)
        assert_true(verify_length(coloring, 3) or verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    def test_cs_disconnected(self):
        # _connected_ sequential should still work on disconnected graphs
        graph = nx.Graph()
        graph.add_edges_from([(1, 2), (2, 3), (4, 5), (5, 6)])
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential,
            interchange=False)
        assert_true(verify_length(coloring, 2))
        assert_true(verify_coloring(graph, coloring))

    ############################## Interchange tests ##############################
    # RSI
    def test_rsi_empty(self):
        graph = emptyGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_random_sequential,
            interchange=True)
        assert_true(verify_length(coloring, 0))
        assert_true(verify_coloring(graph, coloring))

    def test_rsi_oneNode(self):
        graph = oneNodeGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_random_sequential,
            interchange=True)
        assert_true(verify_length(coloring, 1))
        assert_true(verify_coloring(graph, coloring))

    def test_rsi_twoNodes(self):
        graph = twoNodesGraph()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_random_sequential,
            interchange=True)
        assert_true(verify_length(coloring, 2))
        assert_true(verify_coloring(graph, coloring))

    def test_rsi_threeNodeClique(self):
        graph = threeNodeClique()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_random_sequential,
            interchange=True)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))
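# All of the strategies exercised above feed the same greedy scheme: visit the
# nodes in some order and give each one the smallest color not already used by
# a colored neighbor. The standalone sketch below illustrates that scheme
# without depending on NetworkX; the name `greedy_color_sketch` and the
# plain-dict adjacency representation are our own, not part of the library.

```python
def greedy_color_sketch(adj, order):
    """Greedily color the graph given as an adjacency dict, visiting
    nodes in the given order; returns a node -> color (int) mapping."""
    coloring = {}
    for node in order:
        # Colors already taken by colored neighbors of this node.
        taken = {coloring[n] for n in adj[node] if n in coloring}
        color = 0
        while color in taken:
            color += 1
        coloring[node] = color
    return coloring

# Path graph 1-2-3-4 (the rs_shc fixture above): two colors suffice.
adj = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
coloring = greedy_color_sketch(adj, order=[1, 2, 3, 4])
# A proper coloring never gives both endpoints of an edge the same color.
assert all(coloring[u] != coloring[v] for u in adj for v in adj[u])
```

# Visiting the path in order 1, 2, 3, 4 alternates colors 0 and 1, which is
# exactly the two-color outcome `test_rs_shc` accepts as optimal.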
    def test_rsi_rsshc(self):
        graph = rs_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_random_sequential,
            interchange=True)
        assert_true(verify_length(coloring, 2))
        assert_true(verify_coloring(graph, coloring))

    def test_rsi_shc(self):
        graph = rsi_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_random_sequential,
            interchange=True)
        assert_true(verify_length(coloring, 3) or verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    # SLFI
    def test_slfi_slfshc(self):
        graph = oneNodeGraph()
        assert_raises(nx.NetworkXPointlessConcept,
                      nx.coloring.greedy_color, graph,
                      strategy=nx.coloring.strategy_saturation_largest_first,
                      interchange=True)

    # LFI
    def test_lfi_lfshc(self):
        graph = lf_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_largest_first,
            interchange=True)
        assert_true(verify_length(coloring, 2))
        assert_true(verify_coloring(graph, coloring))

    def test_lfi_lfhc(self):
        graph = lf_hc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_largest_first,
            interchange=True)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_lfi_shc(self):
        graph = lfi_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_largest_first,
            interchange=True)
        assert_true(verify_length(coloring, 3) or verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    def test_lfi_hc(self):
        graph = lfi_hc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_largest_first,
            interchange=True)
        assert_true(verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    # SLI
    def test_sli_slshc(self):
        graph = sl_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=True)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))

    def test_sli_slhc(self):
        graph = sl_hc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=True)
        assert_true(verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    def test_sli_shc(self):
        graph = sli_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=True)
        assert_true(verify_length(coloring, 3) or verify_length(coloring, 4))
        assert_true(verify_coloring(graph, coloring))

    def test_sli_hc(self):
        graph = sli_hc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_smallest_last,
            interchange=True)
        assert_true(verify_length(coloring, 5))
        assert_true(verify_coloring(graph, coloring))

    # GISI
    def test_gisi_oneNode(self):
        graph = oneNodeGraph()
        assert_raises(nx.NetworkXPointlessConcept,
                      nx.coloring.greedy_color, graph,
                      strategy=nx.coloring.strategy_independent_set,
                      interchange=True)

    # CS
    def test_csi_csshc(self):
        graph = cs_shc()
        coloring = nx.coloring.greedy_color(graph,
            strategy=nx.coloring.strategy_connected_sequential,
            interchange=True)
        assert_true(verify_length(coloring, 3))
        assert_true(verify_coloring(graph, coloring))


############################## Utility functions ##############################
def verify_coloring(graph, coloring):
    for node in graph.nodes_iter():
        if node not in coloring:
            return False
        color = coloring[node]
        for neighbor in graph.neighbors(node):
            if coloring[neighbor] == color:
                return False
    return True


def verify_length(coloring, expected):
    coloring = dict_to_sets(coloring)
    return len(coloring) == expected


def dict_to_sets(colors):
    if len(colors) == 0:
        return []
    k = max(colors.values()) + 1
    sets = [set() for _ in range(k)]
    for (node, color) in colors.items():
        sets[color].add(node)
    return sets


############################## Graphs ##############################
def emptyGraph():
    return nx.Graph()


def oneNodeGraph():
    graph = nx.Graph()
    graph.add_nodes_from([1])
    return graph


def twoNodesGraph():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2])
    graph.add_edges_from([(1, 2)])
    return graph


def threeNodeClique():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3])
    graph.add_edges_from([(1, 2), (1, 3), (2, 3)])
    return graph


def rs_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4])
    graph.add_edges_from([(1, 2), (2, 3), (3, 4)])
    return graph


def slf_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6, 7])
    graph.add_edges_from([(1, 2), (1, 5), (1, 6), (2, 3), (2, 7), (3, 4),
                          (3, 7), (4, 5), (4, 6), (5, 6)])
    return graph


def slf_hc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6, 7, 8])
    graph.add_edges_from([(1, 2), (1, 3), (1, 4), (1, 5), (2, 3), (2, 4),
                          (2, 6), (5, 7), (5, 8), (6, 7), (6, 8), (7, 8)])
    return graph


def lf_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6])
    graph.add_edges_from([(6, 1), (1, 4), (4, 3), (3, 2), (2, 5)])
    return graph


def lf_hc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6, 7])
    graph.add_edges_from([(1, 7), (1, 6), (1, 3), (1, 4), (7, 2), (2, 6),
                          (2, 3), (2, 5), (5, 3), (5, 4), (4, 3)])
    return graph


def sl_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6])
    graph.add_edges_from([(1, 2), (1, 3), (2, 3), (1, 4), (2, 5), (3, 6),
                          (4, 5), (4, 6), (5, 6)])
    return graph


def sl_hc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6, 7, 8])
    graph.add_edges_from([(1, 2), (1, 3), (1, 5), (1, 7), (2, 3), (2, 4),
                          (2, 8), (8, 4), (8, 6), (8, 7), (7, 5), (7, 6),
                          (3, 4), (4, 6), (6, 5), (5, 3)])
    return graph


def gis_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4])
    graph.add_edges_from([(1, 2), (2, 3), (3, 4)])
    return graph


def gis_hc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6])
    graph.add_edges_from([(1, 5), (2, 5), (3, 6), (4, 6), (5, 6)])
    return graph


def cs_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5])
    graph.add_edges_from([(1, 2), (1, 5), (2, 3), (2, 4), (2, 5), (3, 4),
                          (4, 5)])
    return graph


def rsi_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6])
    graph.add_edges_from([(1, 2), (1, 5), (1, 6), (2, 3), (3, 4), (4, 5),
                          (4, 6), (5, 6)])
    return graph


def lfi_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6, 7])
    graph.add_edges_from([(1, 2), (1, 5), (1, 6), (2, 3), (2, 7), (3, 4),
                          (3, 7), (4, 5), (4, 6), (5, 6)])
    return graph


def lfi_hc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6, 7, 8, 9])
    graph.add_edges_from([(1, 2), (1, 5), (1, 6), (1, 7), (2, 3), (2, 8),
                          (2, 9), (3, 4), (3, 8), (3, 9), (4, 5), (4, 6),
                          (4, 7), (5, 6)])
    return graph


def sli_shc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6, 7])
    graph.add_edges_from([(1, 2), (1, 3), (1, 5), (1, 7), (2, 3), (2, 6),
                          (3, 4), (4, 5), (4, 6), (5, 7), (6, 7)])
    return graph


def sli_hc():
    graph = nx.Graph()
    graph.add_nodes_from([1, 2, 3, 4, 5, 6, 7, 8, 9])
    graph.add_edges_from([(1, 2), (1, 3), (1, 4), (1, 5), (2, 3), (2, 7),
                          (2, 8), (2, 9), (3, 6), (3, 7), (3, 9), (4, 5),
                          (4, 6), (4, 8), (4, 9), (5, 6), (5, 7), (5, 8),
                          (6, 7), (6, 9), (7, 8), (8, 9)])
    return graph

networkx-1.11/networkx/algorithms/dag.py

# -*- coding: utf-8 -*-
from fractions import gcd
import networkx as nx
from networkx.utils.decorators import *
"""Algorithms for directed acyclic graphs (DAGs)."""
#    Copyright (C) 2006-2011 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
__author__ = """\n""".join(['Aric Hagberg ',
                            'Dan Schult (dschult@colgate.edu)',
                            'Ben Edwards (bedwards@cs.unm.edu)'])
__all__ = ['descendants',
           'ancestors',
           'topological_sort',
           'topological_sort_recursive',
           'is_directed_acyclic_graph',
           'is_aperiodic',
           'transitive_closure',
           'antichains',
           'dag_longest_path',
           'dag_longest_path_length']


def descendants(G, source):
    """Return all nodes reachable from `source` in G.
    Parameters
    ----------
    G : NetworkX DiGraph
    source : node in G

    Returns
    -------
    des : set()
        The descendants of source in G
    """
    if not G.has_node(source):
        raise nx.NetworkXError("The node %s is not in the graph." % source)
    des = set(nx.shortest_path_length(G, source=source).keys()) - set([source])
    return des


def ancestors(G, source):
    """Return all nodes having a path to `source` in G.

    Parameters
    ----------
    G : NetworkX DiGraph
    source : node in G

    Returns
    -------
    ancestors : set()
        The ancestors of source in G
    """
    if not G.has_node(source):
        raise nx.NetworkXError("The node %s is not in the graph." % source)
    anc = set(nx.shortest_path_length(G, target=source).keys()) - set([source])
    return anc


def is_directed_acyclic_graph(G):
    """Return True if the graph G is a directed acyclic graph (DAG) or
    False if not.

    Parameters
    ----------
    G : NetworkX graph
        A graph

    Returns
    -------
    is_dag : bool
        True if G is a DAG, false otherwise
    """
    if not G.is_directed():
        return False
    try:
        topological_sort(G, reverse=True)
        return True
    except nx.NetworkXUnfeasible:
        return False


def topological_sort(G, nbunch=None, reverse=False):
    """Return a list of nodes in topological sort order.

    A topological sort is a nonunique permutation of the nodes such that
    an edge from u to v implies that u appears before v in the
    topological sort order.

    Parameters
    ----------
    G : NetworkX digraph
        A directed graph

    nbunch : container of nodes (optional)
        Explore graph in specified order given in nbunch

    reverse : bool, optional
        Return postorder instead of preorder if True.
        Reverse mode is a bit more efficient.

    Raises
    ------
    NetworkXError
        Topological sort is defined for directed graphs only. If the
        graph G is undirected, a NetworkXError is raised.

    NetworkXUnfeasible
        If G is not a directed acyclic graph (DAG) no topological sort
        exists and a NetworkXUnfeasible exception is raised.

    Notes
    -----
    This algorithm is based on a description and proof in
    The Algorithm Design Manual [1]_ .
    See also
    --------
    is_directed_acyclic_graph

    References
    ----------
    .. [1] Skiena, S. S. The Algorithm Design Manual (Springer-Verlag, 1998).
        http://www.amazon.com/exec/obidos/ASIN/0387948600/ref=ase_thealgorithmrepo/
    """
    if not G.is_directed():
        raise nx.NetworkXError(
            "Topological sort not defined on undirected graphs.")

    # nonrecursive version
    seen = set()
    order = []
    explored = set()

    if nbunch is None:
        nbunch = G.nodes_iter()
    for v in nbunch:     # process all vertices in G
        if v in explored:
            continue
        fringe = [v]     # nodes yet to look at
        while fringe:
            w = fringe[-1]     # depth first search
            if w in explored:  # already looked down this branch
                fringe.pop()
                continue
            seen.add(w)        # mark as seen
            # Check successors for cycles and for new nodes
            new_nodes = []
            for n in G[w]:
                if n not in explored:
                    if n in seen:  # CYCLE !!
                        raise nx.NetworkXUnfeasible("Graph contains a cycle.")
                    new_nodes.append(n)
            if new_nodes:   # Add new_nodes to fringe
                fringe.extend(new_nodes)
            else:           # No new nodes so w is fully explored
                explored.add(w)
                order.append(w)
                fringe.pop()  # done considering this node
    if reverse:
        return order
    else:
        return list(reversed(order))


def topological_sort_recursive(G, nbunch=None, reverse=False):
    """Return a list of nodes in topological sort order.

    A topological sort is a nonunique permutation of the nodes such that
    an edge from u to v implies that u appears before v in the
    topological sort order.

    Parameters
    ----------
    G : NetworkX digraph

    nbunch : container of nodes (optional)
        Explore graph in specified order given in nbunch

    reverse : bool, optional
        Return postorder instead of preorder if True.
        Reverse mode is a bit more efficient.

    Raises
    ------
    NetworkXError
        Topological sort is defined for directed graphs only. If the
        graph G is undirected, a NetworkXError is raised.

    NetworkXUnfeasible
        If G is not a directed acyclic graph (DAG) no topological sort
        exists and a NetworkXUnfeasible exception is raised.

    Notes
    -----
    This is a recursive version of topological sort.
    See also
    --------
    topological_sort
    is_directed_acyclic_graph

    """
    if not G.is_directed():
        raise nx.NetworkXError(
            "Topological sort not defined on undirected graphs.")

    def _dfs(v):
        ancestors.add(v)
        for w in G[v]:
            if w in ancestors:
                raise nx.NetworkXUnfeasible("Graph contains a cycle.")
            if w not in explored:
                _dfs(w)
        ancestors.remove(v)
        explored.add(v)
        order.append(v)

    ancestors = set()
    explored = set()
    order = []

    if nbunch is None:
        nbunch = G.nodes_iter()

    for v in nbunch:
        if v not in explored:
            _dfs(v)

    if reverse:
        return order
    else:
        return list(reversed(order))


def is_aperiodic(G):
    """Return True if G is aperiodic.

    A directed graph is aperiodic if there is no integer k > 1 that
    divides the length of every cycle in the graph.

    Parameters
    ----------
    G : NetworkX DiGraph
        Graph

    Returns
    -------
    aperiodic : boolean
        True if the graph is aperiodic, False otherwise

    Raises
    ------
    NetworkXError
        If G is not directed

    Notes
    -----
    This uses the method outlined in [1]_, which runs in O(m) time given
    m edges in G. Note that a graph is not aperiodic if it is acyclic, as
    every integer trivially divides length-0 cycles.

    References
    ----------
    .. [1] Jarvis, J. P.; Shier, D. R. (1996),
       Graph-theoretic analysis of finite Markov chains,
       in Shier, D. R.; Wallenius, K. T., Applied Mathematical Modeling:
       A Multidisciplinary Approach, CRC Press.
""" if not G.is_directed(): raise nx.NetworkXError( "is_aperiodic not defined for undirected graphs") s = next(G.nodes_iter()) levels = {s: 0} this_level = [s] g = 0 l = 1 while this_level: next_level = [] for u in this_level: for v in G[u]: if v in levels: # Non-Tree Edge g = gcd(g, levels[u] - levels[v] + 1) else: # Tree Edge next_level.append(v) levels[v] = l this_level = next_level l += 1 if len(levels) == len(G): # All nodes in tree return g == 1 else: return g == 1 and nx.is_aperiodic(G.subgraph(set(G) - set(levels))) @not_implemented_for('undirected') def transitive_closure(G): """ Returns transitive closure of a directed graph The transitive closure of G = (V,E) is a graph G+ = (V,E+) such that for all v,w in V there is an edge (v,w) in E+ if and only if there is a non-null path from v to w in G. Parameters ---------- G : NetworkX DiGraph Graph Returns ------- TC : NetworkX DiGraph Graph Raises ------ NetworkXNotImplemented If G is not directed References ---------- .. [1] http://www.ics.uci.edu/~eppstein/PADS/PartialOrder.py """ TC = nx.DiGraph() TC.add_nodes_from(G.nodes_iter()) TC.add_edges_from(G.edges_iter()) for v in G: TC.add_edges_from((v, u) for u in nx.dfs_preorder_nodes(G, source=v) if v != u) return TC @not_implemented_for('undirected') def antichains(G): """Generates antichains from a DAG. An antichain is a subset of a partially ordered set such that any two elements in the subset are incomparable. Parameters ---------- G : NetworkX DiGraph Graph Returns ------- antichain : generator object Raises ------ NetworkXNotImplemented If G is not directed NetworkXUnfeasible If G contains a cycle Notes ----- This function was originally developed by Peter Jipsen and Franco Saliola for the SAGE project. It's included in NetworkX with permission from the authors. Original SAGE code at: https://sage.informatik.uni-goettingen.de/src/combinat/posets/hasse_diagram.py References ---------- .. [1] Free Lattices, by R. Freese, J. Jezek and J. B. 
       Nation, AMS, Vol 42, 1995, p. 226.
    """
    TC = nx.transitive_closure(G)
    antichains_stacks = [([], nx.topological_sort(G, reverse=True))]
    while antichains_stacks:
        (antichain, stack) = antichains_stacks.pop()
        # Invariant:
        #  - the elements of antichain are independent
        #  - the elements of stack are independent from those of antichain
        yield antichain
        while stack:
            x = stack.pop()
            new_antichain = antichain + [x]
            new_stack = [
                t for t in stack if not ((t in TC[x]) or (x in TC[t]))]
            antichains_stacks.append((new_antichain, new_stack))


@not_implemented_for('undirected')
def dag_longest_path(G):
    """Returns the longest path in a DAG

    Parameters
    ----------
    G : NetworkX DiGraph
        Graph

    Returns
    -------
    path : list
        Longest path

    Raises
    ------
    NetworkXNotImplemented
        If G is not directed

    See also
    --------
    dag_longest_path_length
    """
    dist = {}  # stores (distance, node) pair per node
    for node in nx.topological_sort(G):
        # pairs of dist,node for all incoming edges
        pairs = [(dist[v][0] + 1, v) for v in G.pred[node]]
        if pairs:
            dist[node] = max(pairs)
        else:
            dist[node] = (0, node)
    node, (length, _) = max(dist.items(), key=lambda x: x[1])
    path = []
    while length > 0:
        path.append(node)
        length, node = dist[node]
    return list(reversed(path))


@not_implemented_for('undirected')
def dag_longest_path_length(G):
    """Returns the longest path length in a DAG

    Parameters
    ----------
    G : NetworkX DiGraph
        Graph

    Returns
    -------
    path_length : int
        Longest path length

    Raises
    ------
    NetworkXNotImplemented
        If G is not directed

    See also
    --------
    dag_longest_path
    """
    path_length = len(nx.dag_longest_path(G)) - 1
    return path_length

networkx-1.11/networkx/algorithms/connectivity/cuts.py

# -*- coding: utf-8 -*-
"""
Flow based cut algorithms
"""
import itertools
import networkx as nx

# Define the default maximum flow function to use in
# all flow based cut algorithms.
from networkx.algorithms.flow import edmonds_karp, shortest_augmenting_path
from networkx.algorithms.flow import build_residual_network
default_flow_func = edmonds_karp

from .utils import (build_auxiliary_node_connectivity,
                    build_auxiliary_edge_connectivity)

__author__ = '\n'.join(['Jordi Torrents '])

__all__ = ['minimum_st_node_cut',
           'minimum_node_cut',
           'minimum_st_edge_cut',
           'minimum_edge_cut']


def minimum_st_edge_cut(G, s, t, flow_func=None, auxiliary=None,
                        residual=None):
    """Returns the edges of the cut-set of a minimum (s, t)-cut.

    This function returns the set of edges of minimum cardinality that,
    if removed, would destroy all paths between source and target in G.
    Edge weights are not considered.

    Parameters
    ----------
    G : NetworkX graph
        Edges of the graph are expected to have an attribute called
        'capacity'. If this attribute is not present, the edge is
        considered to have infinite capacity.

    s : node
        Source node for the flow.

    t : node
        Sink node for the flow.

    auxiliary : NetworkX DiGraph
        Auxiliary digraph to compute flow based node connectivity. It has
        to have a graph attribute called mapping with a dictionary mapping
        node names in G and in the auxiliary digraph. If provided
        it will be reused instead of recreated. Default value: None.

    flow_func : function
        A function for computing the maximum flow among a pair of nodes.
        The function has to accept at least three parameters: a Digraph,
        a source node, and a target node. And return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See :meth:`node_connectivity` for
        details. The choice of the default function may change from
        version to version and should not be relied on. Default value:
        None.

    residual : NetworkX DiGraph
        Residual network to compute maximum flow. If provided it will be
        reused instead of recreated. Default value: None.
    Returns
    -------
    cutset : set
        Set of edges that, if removed from the graph, will disconnect it.

    See also
    --------
    :meth:`minimum_cut`
    :meth:`minimum_node_cut`
    :meth:`minimum_edge_cut`
    :meth:`stoer_wagner`
    :meth:`node_connectivity`
    :meth:`edge_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    Examples
    --------
    This function is not imported in the base NetworkX namespace, so you
    have to explicitly import it from the connectivity package:

    >>> from networkx.algorithms.connectivity import minimum_st_edge_cut

    We use in this example the platonic icosahedral graph, which has edge
    connectivity 5.

    >>> G = nx.icosahedral_graph()
    >>> len(minimum_st_edge_cut(G, 0, 6))
    5

    If you need to compute local edge cuts on several pairs of nodes in
    the same graph, it is recommended that you reuse the data structures
    that NetworkX uses in the computation: the auxiliary digraph for edge
    connectivity, and the residual network for the underlying maximum
    flow computation.

    Example of how to compute local edge cuts among all pairs of nodes of
    the platonic icosahedral graph reusing the data structures.

    >>> import itertools
    >>> # You also have to explicitly import the function for
    >>> # building the auxiliary digraph from the connectivity package
    >>> from networkx.algorithms.connectivity import (
    ...     build_auxiliary_edge_connectivity)
    >>> H = build_auxiliary_edge_connectivity(G)
    >>> # And the function for building the residual network from the
    >>> # flow package
    >>> from networkx.algorithms.flow import build_residual_network
    >>> # Note that the auxiliary digraph has an edge attribute named capacity
    >>> R = build_residual_network(H, 'capacity')
    >>> result = dict.fromkeys(G, dict())
    >>> # Reuse the auxiliary digraph and the residual network by passing them
    >>> # as parameters
    >>> for u, v in itertools.combinations(G, 2):
    ...     k = len(minimum_st_edge_cut(G, u, v, auxiliary=H, residual=R))
    ...     result[u][v] = k
    >>> all(result[u][v] == 5 for u, v in itertools.combinations(G, 2))
    True

    You can also use alternative flow algorithms for computing edge cuts.
    For instance, in dense networks the algorithm
    :meth:`shortest_augmenting_path` will usually perform better than the
    default :meth:`edmonds_karp`, which is faster for sparse networks
    with highly skewed degree distributions. Alternative flow functions
    have to be explicitly imported from the flow package.

    >>> from networkx.algorithms.flow import shortest_augmenting_path
    >>> len(minimum_st_edge_cut(G, 0, 6, flow_func=shortest_augmenting_path))
    5

    """
    if flow_func is None:
        flow_func = default_flow_func

    if auxiliary is None:
        H = build_auxiliary_edge_connectivity(G)
    else:
        H = auxiliary

    kwargs = dict(capacity='capacity', flow_func=flow_func, residual=residual)

    cut_value, partition = nx.minimum_cut(H, s, t, **kwargs)
    reachable, non_reachable = partition
    # Any edge in the original graph linking the two sets in the
    # partition is part of the edge cutset
    cutset = set()
    for u, nbrs in ((n, G[n]) for n in reachable):
        cutset.update((u, v) for v in nbrs if v in non_reachable)

    return cutset


def minimum_st_node_cut(G, s, t, flow_func=None, auxiliary=None,
                        residual=None):
    r"""Returns a set of nodes of minimum cardinality that disconnect source
    from target in G.

    This function returns the set of nodes of minimum cardinality that,
    if removed, would destroy all paths among source and target in G.

    Parameters
    ----------
    G : NetworkX graph

    s : node
        Source node.

    t : node
        Target node.

    flow_func : function
        A function for computing the maximum flow among a pair of nodes.
        The function has to accept at least three parameters: a Digraph,
        a source node, and a target node. And return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See below for details.
        The choice of the default function may change from version to
        version and should not be relied on. Default value: None.

    auxiliary : NetworkX DiGraph
        Auxiliary digraph to compute flow based node connectivity. It has
        to have a graph attribute called mapping with a dictionary mapping
        node names in G and in the auxiliary digraph. If provided
        it will be reused instead of recreated. Default value: None.

    residual : NetworkX DiGraph
        Residual network to compute maximum flow. If provided it will be
        reused instead of recreated. Default value: None.

    Returns
    -------
    cutset : set
        Set of nodes that, if removed, would destroy all paths between
        source and target in G.

    Examples
    --------
    This function is not imported in the base NetworkX namespace, so you
    have to explicitly import it from the connectivity package:

    >>> from networkx.algorithms.connectivity import minimum_st_node_cut

    We use in this example the platonic icosahedral graph, which has node
    connectivity 5.

    >>> G = nx.icosahedral_graph()
    >>> len(minimum_st_node_cut(G, 0, 6))
    5

    If you need to compute local st cuts between several pairs of nodes
    in the same graph, it is recommended that you reuse the data
    structures that NetworkX uses in the computation: the auxiliary
    digraph for node connectivity and node cuts, and the residual network
    for the underlying maximum flow computation.

    Example of how to compute local st node cuts reusing the data
    structures:

    >>> # You also have to explicitly import the function for
    >>> # building the auxiliary digraph from the connectivity package
    >>> from networkx.algorithms.connectivity import (
build_auxiliary_node_connectivity)
    >>> H = build_auxiliary_node_connectivity(G)
    >>> # And the function for building the residual network from the
    >>> # flow package
    >>> from networkx.algorithms.flow import build_residual_network
    >>> # Note that the auxiliary digraph has an edge attribute named capacity
    >>> R = build_residual_network(H, 'capacity')
    >>> # Reuse the auxiliary digraph and the residual network by passing them
    >>> # as parameters
    >>> len(minimum_st_node_cut(G, 0, 6, auxiliary=H, residual=R))
    5

    You can also use alternative flow algorithms for computing minimum st
    node cuts. For instance, in dense networks the algorithm
    :meth:`shortest_augmenting_path` will usually perform better than the
    default :meth:`edmonds_karp`, which is faster for sparse networks with
    highly skewed degree distributions. Alternative flow functions have to
    be explicitly imported from the flow package.

    >>> from networkx.algorithms.flow import shortest_augmenting_path
    >>> len(minimum_st_node_cut(G, 0, 6, flow_func=shortest_augmenting_path))
    5

    Notes
    -----
    This is a flow based implementation of minimum node cut. The algorithm
    is based on solving a number of maximum flow problems to determine the
    capacity of the minimum cut on an auxiliary directed network that
    corresponds to the minimum node cut of G. It handles both directed
    and undirected graphs. This implementation is based on algorithm 11
    in [1]_.

    See also
    --------
    :meth:`minimum_node_cut`
    :meth:`minimum_edge_cut`
    :meth:`stoer_wagner`
    :meth:`node_connectivity`
    :meth:`edge_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    References
    ----------
    .. [1] Abdol-Hossein Esfahanian. Connectivity Algorithms.
        http://www.cse.msu.edu/~cse835/Papers/Graph_connectivity_revised.pdf

    """
    if auxiliary is None:
        H = build_auxiliary_node_connectivity(G)
    else:
        H = auxiliary

    mapping = H.graph.get('mapping', None)
    if mapping is None:
        raise nx.NetworkXError('Invalid auxiliary digraph.')

    kwargs = dict(flow_func=flow_func, residual=residual, auxiliary=H)

    # The edge cut in the auxiliary digraph corresponds to the node cut in
    # the original graph.
    edge_cut = minimum_st_edge_cut(H, '%sB' % mapping[s], '%sA' % mapping[t],
                                   **kwargs)
    # Each node in the original graph maps to two nodes of the auxiliary graph
    node_cut = set(H.node[node]['id'] for edge in edge_cut for node in edge)

    return node_cut - set([s, t])


def minimum_node_cut(G, s=None, t=None, flow_func=None):
    r"""Returns a set of nodes of minimum cardinality that disconnects G.

    If source and target nodes are provided, this function returns the set
    of nodes of minimum cardinality that, if removed, would destroy all
    paths between source and target in G. If not, it returns a set of
    nodes of minimum cardinality that disconnects G.

    Parameters
    ----------
    G : NetworkX graph

    s : node
        Source node. Optional. Default value: None.

    t : node
        Target node. Optional. Default value: None.

    flow_func : function
        A function for computing the maximum flow between a pair of nodes.
        The function has to accept at least three parameters: a DiGraph,
        a source node, and a target node, and return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See below for details. The choice
        of the default function may change from version to version and
        should not be relied on. Default value: None.

    Returns
    -------
    cutset : set
        Set of nodes that, if removed, would disconnect G. If source and
        target nodes are provided, the set contains the nodes that, if
        removed, would destroy all paths between source and target.
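The behavior documented above can be sanity-checked with a short script. This is an illustrative sketch using only the public API, not part of the library; the expected cut size follows from the icosahedral graph being 5-connected, as the doctests in this module note.

```python
import networkx as nx

# The icosahedral graph is 5-connected, so the global minimum node cut
# contains 5 nodes, and removing them disconnects the graph.
G = nx.icosahedral_graph()
cut = nx.minimum_node_cut(G)
assert len(cut) == 5

H = G.copy()
H.remove_nodes_from(cut)
assert not nx.is_connected(H)
```

The same check works for the local variant by passing a source and a target node.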
    Examples
    --------
    >>> # Platonic icosahedral graph has node connectivity 5
    >>> G = nx.icosahedral_graph()
    >>> node_cut = nx.minimum_node_cut(G)
    >>> len(node_cut)
    5

    You can use alternative flow algorithms for the underlying maximum
    flow computation. In dense networks the algorithm
    :meth:`shortest_augmenting_path` will usually perform better than the
    default :meth:`edmonds_karp`, which is faster for sparse networks with
    highly skewed degree distributions. Alternative flow functions have to
    be explicitly imported from the flow package.

    >>> from networkx.algorithms.flow import shortest_augmenting_path
    >>> node_cut == nx.minimum_node_cut(G, flow_func=shortest_augmenting_path)
    True

    If you specify a pair of nodes (source and target) as parameters, this
    function returns a local st node cut.

    >>> len(nx.minimum_node_cut(G, 3, 7))
    5

    If you need to perform several local st cuts for different pairs of
    nodes on the same graph, it is recommended that you reuse the data
    structures used in the maximum flow computations. See
    :meth:`minimum_st_node_cut` for details.

    Notes
    -----
    This is a flow based implementation of minimum node cut. The algorithm
    is based on solving a number of maximum flow problems to determine the
    capacity of the minimum cut on an auxiliary directed network that
    corresponds to the minimum node cut of G. It handles both directed
    and undirected graphs. This implementation is based on algorithm 11
    in [1]_.

    See also
    --------
    :meth:`minimum_st_node_cut`
    :meth:`minimum_cut`
    :meth:`minimum_edge_cut`
    :meth:`stoer_wagner`
    :meth:`node_connectivity`
    :meth:`edge_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    References
    ----------
    .. [1] Abdol-Hossein Esfahanian. Connectivity Algorithms.
        http://www.cse.msu.edu/~cse835/Papers/Graph_connectivity_revised.pdf

    """
    if (s is not None and t is None) or (s is None and t is not None):
        raise nx.NetworkXError('Both source and target must be specified.')

    # Local minimum node cut.
    if s is not None and t is not None:
        if s not in G:
            raise nx.NetworkXError('node %s not in graph' % s)
        if t not in G:
            raise nx.NetworkXError('node %s not in graph' % t)
        return minimum_st_node_cut(G, s, t, flow_func=flow_func)

    # Global minimum node cut.
    # Analogous to algorithm 11 for global node connectivity in [1].
    if G.is_directed():
        if not nx.is_weakly_connected(G):
            raise nx.NetworkXError('Input graph is not connected')
        iter_func = itertools.permutations

        def neighbors(v):
            return itertools.chain.from_iterable([G.predecessors_iter(v),
                                                  G.successors_iter(v)])
    else:
        if not nx.is_connected(G):
            raise nx.NetworkXError('Input graph is not connected')
        iter_func = itertools.combinations
        neighbors = G.neighbors_iter

    # Reuse the auxiliary digraph and the residual network.
    H = build_auxiliary_node_connectivity(G)
    R = build_residual_network(H, 'capacity')
    kwargs = dict(flow_func=flow_func, auxiliary=H, residual=R)

    # Choose a node with minimum degree.
    v = min(G, key=G.degree)
    # The initial node cutset is all neighbors of the node with minimum degree.
    min_cut = set(G[v])
    # Compute st node cuts between v and all its non-neighbor nodes in G.
    for w in set(G) - set(neighbors(v)) - set([v]):
        this_cut = minimum_st_node_cut(G, v, w, **kwargs)
        if len(min_cut) >= len(this_cut):
            min_cut = this_cut
    # Also for non-adjacent pairs of neighbors of v.
    for x, y in iter_func(neighbors(v), 2):
        if y in G[x]:
            continue
        this_cut = minimum_st_node_cut(G, x, y, **kwargs)
        if len(min_cut) >= len(this_cut):
            min_cut = this_cut

    return min_cut


def minimum_edge_cut(G, s=None, t=None, flow_func=None):
    r"""Returns a set of edges of minimum cardinality that disconnects G.
    If source and target nodes are provided, this function returns the set
    of edges of minimum cardinality that, if removed, would break all
    paths between source and target in G. If not, it returns a set of
    edges of minimum cardinality that disconnects G.

    Parameters
    ----------
    G : NetworkX graph

    s : node
        Source node. Optional. Default value: None.

    t : node
        Target node. Optional. Default value: None.

    flow_func : function
        A function for computing the maximum flow between a pair of nodes.
        The function has to accept at least three parameters: a DiGraph,
        a source node, and a target node, and return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See below for details. The choice
        of the default function may change from version to version and
        should not be relied on. Default value: None.

    Returns
    -------
    cutset : set
        Set of edges that, if removed, would disconnect G. If source and
        target nodes are provided, the set contains the edges that, if
        removed, would destroy all paths between source and target.

    Examples
    --------
    >>> # Platonic icosahedral graph has edge connectivity 5
    >>> G = nx.icosahedral_graph()
    >>> len(nx.minimum_edge_cut(G))
    5

    You can use alternative flow algorithms for the underlying maximum
    flow computation. In dense networks the algorithm
    :meth:`shortest_augmenting_path` will usually perform better than the
    default :meth:`edmonds_karp`, which is faster for sparse networks with
    highly skewed degree distributions. Alternative flow functions have to
    be explicitly imported from the flow package.

    >>> from networkx.algorithms.flow import shortest_augmenting_path
    >>> len(nx.minimum_edge_cut(G, flow_func=shortest_augmenting_path))
    5

    If you specify a pair of nodes (source and target) as parameters, this
    function returns a local st edge cut; its cardinality equals the local
    edge connectivity.
    >>> len(nx.minimum_edge_cut(G, 3, 7))
    5

    If you need to perform several local computations among different
    pairs of nodes on the same graph, it is recommended that you reuse the
    data structures used in the maximum flow computations. See
    :meth:`local_edge_connectivity` for details.

    Notes
    -----
    This is a flow based implementation of minimum edge cut. For
    undirected graphs the algorithm works by finding a 'small' dominating
    set of nodes of G (see algorithm 7 in [1]_) and computing the maximum
    flow between an arbitrary node in the dominating set and the rest of
    the nodes in it. This is an implementation of algorithm 6 in [1]_.
    For directed graphs, the algorithm makes n calls to the max flow
    function. It is an implementation of algorithm 8 in [1]_.

    See also
    --------
    :meth:`minimum_st_edge_cut`
    :meth:`minimum_node_cut`
    :meth:`stoer_wagner`
    :meth:`node_connectivity`
    :meth:`edge_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    References
    ----------
    .. [1] Abdol-Hossein Esfahanian. Connectivity Algorithms.
        http://www.cse.msu.edu/~cse835/Papers/Graph_connectivity_revised.pdf

    """
    if (s is not None and t is None) or (s is None and t is not None):
        raise nx.NetworkXError('Both source and target must be specified.')

    # Reuse the auxiliary digraph and the residual network.
    H = build_auxiliary_edge_connectivity(G)
    R = build_residual_network(H, 'capacity')
    kwargs = dict(flow_func=flow_func, residual=R, auxiliary=H)

    # Local minimum edge cut if both s and t are not None.
    if s is not None and t is not None:
        if s not in G:
            raise nx.NetworkXError('node %s not in graph' % s)
        if t not in G:
            raise nx.NetworkXError('node %s not in graph' % t)
        return minimum_st_edge_cut(H, s, t, **kwargs)

    # Global minimum edge cut.
    # Analogous to the algorithm for global edge connectivity.
    if G.is_directed():
        # Based on algorithm 8 in [1].
        if not nx.is_weakly_connected(G):
            raise nx.NetworkXError('Input graph is not connected')

        # The initial cutset is all edges of a node with minimum degree.
        node = min(G, key=G.degree)
        min_cut = G.edges(node)
        nodes = G.nodes()
        n = len(nodes)
        for i in range(n):
            try:
                this_cut = minimum_st_edge_cut(H, nodes[i], nodes[i+1],
                                               **kwargs)
                if len(this_cut) <= len(min_cut):
                    min_cut = this_cut
            except IndexError:  # Last node!
this_cut = minimum_st_edge_cut(H, nodes[i], nodes[0], **kwargs) if len(this_cut) <= len(min_cut): min_cut = this_cut return min_cut else: # undirected # Based on algorithm 6 in [1] if not nx.is_connected(G): raise nx.NetworkXError('Input graph is not connected') # Initial cutset is all edges of a node with minimum degree node = min(G, key=G.degree) min_cut = G.edges(node) # A dominating set is \lambda-covering # We need a dominating set with at least two nodes for node in G: D = nx.dominating_set(G, start_with=node) v = D.pop() if D: break else: # in complete graphs the dominating set will always be of one node # thus we return min_cut, which now contains the edges of a node # with minimum degree return min_cut for w in D: this_cut = minimum_st_edge_cut(H, v, w, **kwargs) if len(this_cut) <= len(min_cut): min_cut = this_cut return min_cut networkx-1.11/networkx/algorithms/connectivity/utils.py0000644000175000017500000000640112637544500023464 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Utilities for connectivity package """ import networkx as nx __author__ = '\n'.join(['Jordi Torrents ']) __all__ = ['build_auxiliary_node_connectivity', 'build_auxiliary_edge_connectivity'] def build_auxiliary_node_connectivity(G): r"""Creates a directed graph D from an undirected graph G to compute flow based node connectivity. For an undirected graph G having `n` nodes and `m` edges we derive a directed graph D with `2n` nodes and `2m+n` arcs by replacing each original node `v` with two nodes `vA`, `vB` linked by an (internal) arc in D. Then for each edge (`u`, `v`) in G we add two arcs (`uB`, `vA`) and (`vB`, `uA`) in D. Finally we set the attribute capacity = 1 for each arc in D [1]_. For a directed graph having `n` nodes and `m` arcs we derive a directed graph D with `2n` nodes and `m+n` arcs by replacing each original node `v` with two nodes `vA`, `vB` linked by an (internal) arc (`vA`, `vB`) in D. Then for each arc (`u`, `v`) in G we add one arc (`uB`, `vA`) in D. 
Finally we set the attribute capacity = 1 for each arc in D. A dictionary with a mapping between nodes in the original graph and the auxiliary digraph is stored as a graph attribute: H.graph['mapping']. References ---------- .. [1] Kammer, Frank and Hanjo Taubig. Graph Connectivity. in Brandes and Erlebach, 'Network Analysis: Methodological Foundations', Lecture Notes in Computer Science, Volume 3418, Springer-Verlag, 2005. http://www.informatik.uni-augsburg.de/thi/personen/kammer/Graph_Connectivity.pdf """ directed = G.is_directed() mapping = {} H = nx.DiGraph() for i, node in enumerate(G): mapping[node] = i H.add_node('%dA' % i, id=node) H.add_node('%dB' % i, id=node) H.add_edge('%dA' % i, '%dB' % i, capacity=1) edges = [] for (source, target) in G.edges_iter(): edges.append(('%sB' % mapping[source], '%sA' % mapping[target])) if not directed: edges.append(('%sB' % mapping[target], '%sA' % mapping[source])) H.add_edges_from(edges, capacity=1) # Store mapping as graph attribute H.graph['mapping'] = mapping return H def build_auxiliary_edge_connectivity(G): """Auxiliary digraph for computing flow based edge connectivity If the input graph is undirected, we replace each edge (`u`,`v`) with two reciprocal arcs (`u`, `v`) and (`v`, `u`) and then we set the attribute 'capacity' for each arc to 1. If the input graph is directed we simply add the 'capacity' attribute. Part of algorithm 1 in [1]_ . References ---------- .. [1] Abdol-Hossein Esfahanian. Connectivity Algorithms. (this is a chapter, look for the reference of the book). 
        http://www.cse.msu.edu/~cse835/Papers/Graph_connectivity_revised.pdf

    """
    if G.is_directed():
        H = nx.DiGraph()
        H.add_nodes_from(G.nodes_iter())
        H.add_edges_from(G.edges_iter(), capacity=1)
        return H
    else:
        H = nx.DiGraph()
        H.add_nodes_from(G.nodes_iter())
        for (source, target) in G.edges_iter():
            H.add_edges_from([(source, target), (target, source)],
                             capacity=1)
        return H

networkx-1.11/networkx/algorithms/connectivity/kcutsets.py
# -*- coding: utf-8 -*-
"""
Kanevsky all minimum node k cutsets algorithm.
"""
from operator import itemgetter

import networkx as nx
from .utils import build_auxiliary_node_connectivity
from networkx.algorithms.flow import (
    build_residual_network,
    edmonds_karp,
    shortest_augmenting_path,
)
default_flow_func = edmonds_karp

__author__ = '\n'.join(['Jordi Torrents '])

__all__ = ['all_node_cuts']


def all_node_cuts(G, k=None, flow_func=None):
    r"""Returns all minimum k cutsets of an undirected graph G.

    This implementation is based on Kanevsky's algorithm [1]_ for finding
    all minimum-size node cut-sets of an undirected graph G; i.e., the set
    (or sets) of nodes of cardinality equal to the node connectivity of G.
    These cutsets, if removed, break G into two or more connected
    components.

    Parameters
    ----------
    G : NetworkX graph
        Undirected graph

    k : Integer
        Node connectivity of the input graph. If k is None, then it is
        computed. Default value: None.

    flow_func : function
        Function to perform the underlying flow computations. Default
        value edmonds_karp. This function performs better in sparse graphs
        with right tailed degree distributions. shortest_augmenting_path
        will perform better in denser graphs.

    Returns
    -------
    cuts : a generator of node cutsets
        Each node cutset has cardinality equal to the node connectivity of
        the input graph.
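Since each yielded cutset is a minimum-size separating set, it can be verified directly by removal. An illustrative sketch (not part of the library), using a two-dimensional grid graph:

```python
import networkx as nx

# Each minimum-size node cutset yielded by all_node_cuts disconnects
# the graph when its nodes are removed; a 5x5 grid has 4 such cutsets
# of cardinality 2 (the two neighbors of each corner).
G = nx.grid_2d_graph(5, 5)
for cutset in nx.all_node_cuts(G):
    H = G.copy()
    H.remove_nodes_from(cutset)
    assert not nx.is_connected(H)
```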
Examples -------- >>> # A two-dimensional grid graph has 4 cutsets of cardinality 2 >>> G = nx.grid_2d_graph(5, 5) >>> cutsets = list(nx.all_node_cuts(G)) >>> len(cutsets) 4 >>> all(2 == len(cutset) for cutset in cutsets) True >>> nx.node_connectivity(G) 2 Notes ----- This implementation is based on the sequential algorithm for finding all minimum-size separating vertex sets in a graph [1]_. The main idea is to compute minimum cuts using local maximum flow computations among a set of nodes of highest degree and all other non-adjacent nodes in the Graph. Once we find a minimum cut, we add an edge between the high degree node and the target node of the local maximum flow computation to make sure that we will not find that minimum cut again. See also -------- node_connectivity edmonds_karp shortest_augmenting_path References ---------- .. [1] Kanevsky, A. (1993). Finding all minimum-size separating vertex sets in a graph. Networks 23(6), 533--541. http://onlinelibrary.wiley.com/doi/10.1002/net.3230230604/abstract """ if not nx.is_connected(G): raise nx.NetworkXError('Input graph is disconnected.') # Initialize data structures. # Keep track of the cuts already computed so we do not repeat them. seen = [] # Even-Tarjan reduction is what we call auxiliary digraph # for node connectivity. 
    H = build_auxiliary_node_connectivity(G)
    mapping = H.graph['mapping']
    R = build_residual_network(H, 'capacity')
    kwargs = dict(capacity='capacity', residual=R)

    # Define default flow function
    if flow_func is None:
        flow_func = default_flow_func
    if flow_func is shortest_augmenting_path:
        kwargs['two_phase'] = True

    # Begin the actual algorithm
    # step 1: Find node connectivity k of G
    if k is None:
        k = nx.node_connectivity(G, flow_func=flow_func)

    # step 2: Find k nodes with top degree, call it X:
    degree = G.degree().items()
    X = {n for n, d in sorted(degree, key=itemgetter(1), reverse=True)[:k]}
    # Check if X is a k-node-cutset
    if _is_separating_set(G, X):
        seen.append(X)
        yield X

    for x in X:
        # step 3: Compute local connectivity flow of x with all other
        # non-adjacent nodes in G
        non_adjacent = set(G) - X - set(G[x])
        for v in non_adjacent:
            # step 4: compute maximum flow in an Even-Tarjan reduction H of G
            # and step 5: build the associated residual network R
            R = flow_func(H, '%sB' % mapping[x], '%sA' % mapping[v], **kwargs)
            flow_value = R.graph['flow_value']

            if flow_value == k:
                # Remove saturated edges from the residual network
                saturated_edges = [(u, w, d) for (u, w, d) in
                                   R.edges_iter(data=True)
                                   if d['capacity'] == d['flow']]
                R.remove_edges_from(saturated_edges)

                # step 6: shrink the strongly connected components of the
                # residual flow network R and call it L
                L = nx.condensation(R)
                cmap = L.graph['mapping']
                # step 7: Compute antichains of L; they map to closed sets
                # in H. Any edge in H that links a closed set is part of a
                # cutset
                for antichain in nx.antichains(L):
                    # Nodes in an antichain of the condensation graph of
                    # the residual network map to a closed set of nodes that
                    # define a node partition of the auxiliary digraph H.
                    S = {n for n, scc in cmap.items() if scc in antichain}
                    # Find the cutset that links the node partition (S,~S) in H
                    cutset = set()
                    for u in S:
                        cutset.update((u, w) for w in H[u] if w not in S)
                    # The edges in H that form the cutset are internal edges
                    # (ie edges that represent a node of the original graph G)
                    node_cut = {H.node[n]['id'] for edge in cutset
                                for n in edge}

                    if len(node_cut) == k:
                        if node_cut not in seen:
                            yield node_cut
                            seen.append(node_cut)
                        # Add an edge (x, v) to make sure that we do not
                        # find this cutset again. This is equivalent to
                        # adding the edge in the input graph
                        # G.add_edge(x, v) and then regenerating H and R:
                        # Add edges to the auxiliary digraph.
                        H.add_edge('%sB' % mapping[x], '%sA' % mapping[v],
                                   capacity=1)
                        H.add_edge('%sB' % mapping[v], '%sA' % mapping[x],
                                   capacity=1)
                        # Add edges to the residual network.
                        R.add_edge('%sB' % mapping[x], '%sA' % mapping[v],
                                   capacity=1)
                        R.add_edge('%sA' % mapping[v], '%sB' % mapping[x],
                                   capacity=1)
                        break
                # Add the saturated edges back to reuse the residual network
                R.add_edges_from(saturated_edges)


def _is_separating_set(G, cut):
    """Assumes that the input graph is connected"""
    if len(cut) == len(G) - 1:
        return True

    H = G.copy()
    H.remove_nodes_from(cut)
    if nx.is_connected(H):
        return False
    return True

networkx-1.11/networkx/algorithms/connectivity/kcomponents.py
# -*- coding: utf-8 -*-
"""
Moody and White algorithm for k-components
"""
from collections import defaultdict
from itertools import combinations
from operator import itemgetter

import networkx as nx
from networkx.utils import not_implemented_for

# Define the default maximum flow function.
from networkx.algorithms.flow import edmonds_karp
default_flow_func = edmonds_karp

__author__ = '\n'.join(['Jordi Torrents '])

__all__ = ['k_components']


@not_implemented_for('directed')
def k_components(G, flow_func=None):
    r"""Returns the k-component structure of a graph G.
A `k`-component is a maximal subgraph of a graph G that has, at least, node connectivity `k`: we need to remove at least `k` nodes to break it into more components. `k`-components have an inherent hierarchical structure because they are nested in terms of connectivity: a connected graph can contain several 2-components, each of which can contain one or more 3-components, and so forth. Parameters ---------- G : NetworkX graph flow_func : function Function to perform the underlying flow computations. Default value :meth:`edmonds_karp`. This function performs better in sparse graphs with right tailed degree distributions. :meth:`shortest_augmenting_path` will perform better in denser graphs. Returns ------- k_components : dict Dictionary with all connectivity levels `k` in the input Graph as keys and a list of sets of nodes that form a k-component of level `k` as values. Raises ------ NetworkXNotImplemented: If the input graph is directed. Examples -------- >>> # Petersen graph has 10 nodes and it is triconnected, thus all >>> # nodes are in a single component on all three connectivity levels >>> G = nx.petersen_graph() >>> k_components = nx.k_components(G) Notes ----- Moody and White [1]_ (appendix A) provide an algorithm for identifying k-components in a graph, which is based on Kanevsky's algorithm [2]_ for finding all minimum-size node cut-sets of a graph (implemented in :meth:`all_node_cuts` function): 1. Compute node connectivity, k, of the input graph G. 2. Identify all k-cutsets at the current level of connectivity using Kanevsky's algorithm. 3. Generate new graph components based on the removal of these cutsets. Nodes in a cutset belong to both sides of the induced cut. 4. If the graph is neither complete nor trivial, return to 1; else end. This implementation also uses some heuristics (see [3]_ for details) to speed up the computation. See also -------- node_connectivity all_node_cuts References ---------- .. [1] Moody, J. and D. White (2003). 
       Social cohesion and embeddedness: A hierarchical conception of
       social groups. American Sociological Review 68(1), 103--28.
       http://www2.asanet.org/journals/ASRFeb03MoodyWhite.pdf

    .. [2] Kanevsky, A. (1993). Finding all minimum-size separating vertex
       sets in a graph. Networks 23(6), 533--541.
       http://onlinelibrary.wiley.com/doi/10.1002/net.3230230604/abstract

    .. [3] Torrents, J. and F. Ferraro (2015). Structural Cohesion:
       Visualization and Heuristics for Fast Computation.
       http://arxiv.org/pdf/1503.04476v1

    """
    # Dictionary with connectivity level (k) as keys and a list of
    # sets of nodes that form a k-component as values. Note that
    # k-components can overlap (but only in k - 1 nodes).
    k_components = defaultdict(list)
    # Define default flow function
    if flow_func is None:
        flow_func = default_flow_func
    # Bicomponents as a base to check for higher order k-components
    for component in nx.connected_components(G):
        # isolated nodes have connectivity 0
        comp = set(component)
        if len(comp) > 1:
            k_components[1].append(comp)
    bicomponents = list(nx.biconnected_component_subgraphs(G))
    for bicomponent in bicomponents:
        bicomp = set(bicomponent)
        # avoid considering dyads as bicomponents
        if len(bicomp) > 2:
            k_components[2].append(bicomp)
    for B in bicomponents:
        if len(B) <= 2:
            continue
        k = nx.node_connectivity(B, flow_func=flow_func)
        if k > 2:
            k_components[k].append(set(B.nodes_iter()))
        # Perform cuts in a DFS like order.
        cuts = list(nx.all_node_cuts(B, k=k, flow_func=flow_func))
        stack = [(k, _generate_partition(B, cuts, k))]
        while stack:
            (parent_k, partition) = stack[-1]
            try:
                nodes = next(partition)
                C = B.subgraph(nodes)
                this_k = nx.node_connectivity(C, flow_func=flow_func)
                if this_k > parent_k and this_k > 2:
                    k_components[this_k].append(set(C.nodes_iter()))
                cuts = list(nx.all_node_cuts(C, k=this_k,
                                             flow_func=flow_func))
                if cuts:
                    stack.append((this_k,
                                  _generate_partition(C, cuts, this_k)))
            except StopIteration:
                stack.pop()

    # This is necessary because k-components may only be reported at their
    # maximum k level. But we want to return a dictionary in which keys are
    # connectivity levels and values are lists of sets of components,
    # without skipping any connectivity level. Also, it's possible that
    # subsets of an already detected k-component appear at a level k.
    # Checking for this in the while loop above penalizes the common case.
    # Thus we also have to _consolidate all connectivity levels in
    # _reconstruct_k_components.
    return _reconstruct_k_components(k_components)


def _consolidate(sets, k):
    """Merge sets that share k or more elements.

    See: http://rosettacode.org/wiki/Set_consolidation

    The iterative Python implementation posted there is faster than this
    one because of the overhead of building a Graph and calling
    nx.connected_components, but it's not clear to us if we can use it in
    NetworkX because there is no license for the code.
""" G = nx.Graph() nodes = {i: s for i, s in enumerate(sets)} G.add_nodes_from(nodes) G.add_edges_from((u, v) for u, v in combinations(nodes, 2) if len(nodes[u] & nodes[v]) >= k) for component in nx.connected_components(G): yield set.union(*[nodes[n] for n in component]) def _generate_partition(G, cuts, k): def has_nbrs_in_partition(G, node, partition): for n in G[node]: if n in partition: return True return False components = [] nodes = ({n for n, d in G.degree().items() if d > k} - {n for cut in cuts for n in cut}) H = G.subgraph(nodes) for cc in nx.connected_components(H): component = set(cc) for cut in cuts: for node in cut: if has_nbrs_in_partition(G, node, cc): component.add(node) if len(component) < G.order(): components.append(component) for component in _consolidate(components, k+1): yield component def _reconstruct_k_components(k_comps): result = dict() max_k = max(k_comps) for k in reversed(range(1, max_k+1)): if k == max_k: result[k] = list(_consolidate(k_comps[k], k)) elif k not in k_comps: result[k] = list(_consolidate(result[k+1], k)) else: nodes_at_k = set.union(*k_comps[k]) to_add = [c for c in result[k+1] if any(n not in nodes_at_k for n in c)] if to_add: result[k] = list(_consolidate(k_comps[k] + to_add, k)) else: result[k] = list(_consolidate(k_comps[k], k)) return result def build_k_number_dict(kcomps): result = {} for k, comps in sorted(kcomps.items(), key=itemgetter(0)): for comp in comps: for node in comp: result[node] = k return result networkx-1.11/networkx/algorithms/connectivity/stoerwagner.py0000644000175000017500000001254512637544500024672 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Stoer-Wagner minimum cut algorithm. """ from itertools import islice import networkx as nx from networkx.utils import * __author__ = 'ysitu ' # Copyright (C) 2014 # ysitu # All rights reserved. # BSD license. 
__all__ = ['stoer_wagner']


@not_implemented_for('directed')
@not_implemented_for('multigraph')
def stoer_wagner(G, weight='weight', heap=BinaryHeap):
    """Returns the weighted minimum edge cut using the Stoer-Wagner algorithm.

    Determine the minimum edge cut of a connected graph using the
    Stoer-Wagner algorithm. In weighted cases, all weights must be
    nonnegative.

    The running time of the algorithm depends on the type of heaps used:

    ============== =============================================
    Type of heap   Running time
    ============== =============================================
    Binary heap    `O(n (m + n) \log n)`
    Fibonacci heap `O(nm + n^2 \log n)`
    Pairing heap   `O(2^{2 \sqrt{\log \log n}} nm + n^2 \log n)`
    ============== =============================================

    Parameters
    ----------
    G : NetworkX graph
        Edges of the graph are expected to have an attribute named by the
        weight parameter below. If this attribute is not present, the edge
        is considered to have unit weight.

    weight : string
        Name of the weight attribute of the edges. If the attribute is not
        present, unit weight is assumed. Default value: 'weight'.

    heap : class
        Type of heap to be used in the algorithm. It should be a subclass
        of :class:`MinHeap` or implement a compatible interface.

        If a stock heap implementation is to be used, :class:`BinaryHeap`
        is recommended over :class:`PairingHeap` for Python implementations
        without optimized attribute accesses (e.g., CPython) despite a
        slower asymptotic running time. For Python implementations with
        optimized attribute accesses (e.g., PyPy), :class:`PairingHeap`
        provides better performance. Default value: :class:`BinaryHeap`.

    Returns
    -------
    cut_value : integer or float
        The sum of weights of edges in a minimum cut.

    partition : pair of node lists
        A partitioning of the nodes that defines a minimum cut.

    Raises
    ------
    NetworkXNotImplemented
        If the graph is directed or a multigraph.
NetworkXError If the graph has less than two nodes, is not connected or has a negative-weighted edge. Examples -------- >>> G = nx.Graph() >>> G.add_edge('x','a', weight=3) >>> G.add_edge('x','b', weight=1) >>> G.add_edge('a','c', weight=3) >>> G.add_edge('b','c', weight=5) >>> G.add_edge('b','d', weight=4) >>> G.add_edge('d','e', weight=2) >>> G.add_edge('c','y', weight=2) >>> G.add_edge('e','y', weight=3) >>> cut_value, partition = nx.stoer_wagner(G) >>> cut_value 4 """ n = len(G) if n < 2: raise nx.NetworkXError('graph has less than two nodes.') if not nx.is_connected(G): raise nx.NetworkXError('graph is not connected.') # Make a copy of the graph for internal use. G = nx.Graph((u, v, {'weight': e.get(weight, 1)}) for u, v, e in G.edges_iter(data=True) if u != v) for u, v, e, in G.edges_iter(data=True): if e['weight'] < 0: raise nx.NetworkXError('graph has a negative-weighted edge.') cut_value = float('inf') nodes = set(G) contractions = [] # contracted node pairs # Repeatedly pick a pair of nodes to contract until only one node is left. for i in range(n - 1): # Pick an arbitrary node u and create a set A = {u}. u = next(iter(G)) A = set([u]) # Repeatedly pick the node "most tightly connected" to A and add it to # A. The tightness of connectivity of a node not in A is defined by the # of edges connecting it to nodes in A. h = heap() # min-heap emulating a max-heap for v, e in G[u].items(): h.insert(v, -e['weight']) # Repeat until all but one node has been added to A. for j in range(n - i - 2): u = h.pop()[0] A.add(u) for v, e, in G[u].items(): if v not in A: h.insert(v, h.get(v, 0) - e['weight']) # A and the remaining node v define a "cut of the phase". There is a # minimum cut of the original graph that is also a cut of the phase. # Due to contractions in earlier phases, v may in fact represent # multiple nodes in the original graph. v, w = h.min() w = -w if w < cut_value: cut_value = w best_phase = i # Contract v and the last node added to A. 
contractions.append((u, v)) for w, e in G[v].items(): if w != u: if w not in G[u]: G.add_edge(u, w, weight=e['weight']) else: G[u][w]['weight'] += e['weight'] G.remove_node(v) # Recover the optimal partitioning from the contractions. G = nx.Graph(islice(contractions, best_phase)) v = contractions[best_phase][1] G.add_node(v) reachable = set(nx.single_source_shortest_path_length(G, v)) partition = (list(reachable), list(nodes - reachable)) return cut_value, partition networkx-1.11/networkx/algorithms/connectivity/__init__.py0000644000175000017500000000063112637544450024066 0ustar aricaric00000000000000"""Connectivity and cut algorithms """ from .connectivity import * from .cuts import * from .kcomponents import * from .kcutsets import * from .stoerwagner import * from .utils import * __all__ = sum([connectivity.__all__, cuts.__all__, kcomponents.__all__, kcutsets.__all__, stoerwagner.__all__, utils.__all__, ], []) networkx-1.11/networkx/algorithms/connectivity/tests/0000755000175000017500000000000012653231454023112 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/connectivity/tests/test_kcutsets.py0000644000175000017500000001727012637544450026404 0ustar aricaric00000000000000# Jordi Torrents # Test for k-cutsets from operator import itemgetter from nose.tools import assert_equal, assert_false, assert_true, assert_raises import networkx as nx from networkx.algorithms.connectivity.kcutsets import _is_separating_set from networkx.algorithms.flow import ( edmonds_karp, shortest_augmenting_path, preflow_push, ) ## ## Some nice synthetic graphs ## def graph_example_1(): G = nx.convert_node_labels_to_integers(nx.grid_graph([5,5]), label_attribute='labels') rlabels = nx.get_node_attributes(G, 'labels') labels = dict((v, k) for k, v in rlabels.items()) for nodes in [(labels[(0,0)], labels[(1,0)]), (labels[(0,4)], labels[(1,4)]), (labels[(3,0)], labels[(4,0)]), (labels[(3,4)], labels[(4,4)]) ]: new_node = G.order()+1 # Petersen graph is triconnected P = 
nx.petersen_graph() G = nx.disjoint_union(G,P) # Add two edges between the grid and P G.add_edge(new_node+1, nodes[0]) G.add_edge(new_node, nodes[1]) # K5 is 4-connected K = nx.complete_graph(5) G = nx.disjoint_union(G,K) # Add three edges between P and K5 G.add_edge(new_node+2,new_node+11) G.add_edge(new_node+3,new_node+12) G.add_edge(new_node+4,new_node+13) # Add another K5 sharing a node G = nx.disjoint_union(G,K) nbrs = G[new_node+10] G.remove_node(new_node+10) for nbr in nbrs: G.add_edge(new_node+17, nbr) G.add_edge(new_node+16, new_node+5) G.name = 'Example graph for connectivity' return G def torrents_and_ferraro_graph(): G = nx.convert_node_labels_to_integers(nx.grid_graph([5,5]), label_attribute='labels') rlabels = nx.get_node_attributes(G, 'labels') labels = dict((v, k) for k, v in rlabels.items()) for nodes in [ (labels[(0,4)], labels[(1,4)]), (labels[(3,4)], labels[(4,4)]) ]: new_node = G.order()+1 # Petersen graph is triconnected P = nx.petersen_graph() G = nx.disjoint_union(G,P) # Add two edges between the grid and P G.add_edge(new_node+1, nodes[0]) G.add_edge(new_node, nodes[1]) # K5 is 4-connected K = nx.complete_graph(5) G = nx.disjoint_union(G,K) # Add three edges between P and K5 G.add_edge(new_node+2,new_node+11) G.add_edge(new_node+3,new_node+12) G.add_edge(new_node+4,new_node+13) # Add another K5 sharing a node G = nx.disjoint_union(G,K) nbrs = G[new_node+10] G.remove_node(new_node+10) for nbr in nbrs: G.add_edge(new_node+17, nbr) # Commenting this makes the graph not biconnected !! 
# This stupid mistake made one reviewer very angry :P G.add_edge(new_node+16, new_node+8) for nodes in [(labels[(0,0)], labels[(1,0)]), (labels[(3,0)], labels[(4,0)])]: new_node = G.order()+1 # Petersen graph is triconnected P = nx.petersen_graph() G = nx.disjoint_union(G,P) # Add two edges between the grid and P G.add_edge(new_node+1, nodes[0]) G.add_edge(new_node, nodes[1]) # K5 is 4-connected K = nx.complete_graph(5) G = nx.disjoint_union(G,K) # Add three edges between P and K5 G.add_edge(new_node+2,new_node+11) G.add_edge(new_node+3,new_node+12) G.add_edge(new_node+4,new_node+13) # Add another K5 sharing two nodes G = nx.disjoint_union(G,K) nbrs = G[new_node+10] G.remove_node(new_node+10) for nbr in nbrs: G.add_edge(new_node+17, nbr) nbrs2 = G[new_node+9] G.remove_node(new_node+9) for nbr in nbrs2: G.add_edge(new_node+18, nbr) G.name = 'Example graph for connectivity' return G # Helper function def _check_separating_sets(G): for Gc in nx.connected_component_subgraphs(G): if len(Gc) < 3: continue for cut in nx.all_node_cuts(Gc): assert_equal(nx.node_connectivity(Gc), len(cut)) H = Gc.copy() H.remove_nodes_from(cut) assert_false(nx.is_connected(H)) def test_torrents_and_ferraro_graph(): G = torrents_and_ferraro_graph() _check_separating_sets(G) def test_example_1(): G = graph_example_1() _check_separating_sets(G) def test_random_gnp(): G = nx.gnp_random_graph(100, 0.1) _check_separating_sets(G) def test_shell(): constructor=[(20,80,0.8),(80,180,0.6)] G = nx.random_shell_graph(constructor) _check_separating_sets(G) def test_configuration(): deg_seq = nx.utils.create_degree_sequence(100,nx.utils.powerlaw_sequence) G = nx.Graph(nx.configuration_model(deg_seq)) G.remove_edges_from(G.selfloop_edges()) _check_separating_sets(G) def test_karate(): G = nx.karate_club_graph() _check_separating_sets(G) def _generate_no_biconnected(max_attempts=50): attempts = 0 while True: G = nx.fast_gnp_random_graph(100,0.0575) if nx.is_connected(G) and not nx.is_biconnected(G): attempts
= 0 yield G else: if attempts >= max_attempts: msg = "Tried %d times: no suitable Graph." raise Exception(msg % max_attempts) else: attempts += 1 def test_articulation_points(): Ggen = _generate_no_biconnected() for i in range(2): G = next(Ggen) articulation_points = list({a} for a in nx.articulation_points(G)) for cut in nx.all_node_cuts(G): assert_true(cut in articulation_points) def test_grid_2d_graph(): # All minimum node cuts of a 2d grid # are the four pairs of nodes that are # neighbors of the four corner nodes. G = nx.grid_2d_graph(5, 5) solution = [ set([(0, 1), (1, 0)]), set([(3, 0), (4, 1)]), set([(3, 4), (4, 3)]), set([(0, 3), (1, 4)]), ] for cut in nx.all_node_cuts(G): assert_true(cut in solution) def test_disconnected_graph(): G = nx.fast_gnp_random_graph(100, 0.01) cuts = nx.all_node_cuts(G) assert_raises(nx.NetworkXError, next, cuts) def test_alternative_flow_functions(): flow_funcs = [edmonds_karp, shortest_augmenting_path, preflow_push] graph_funcs = [graph_example_1, nx.davis_southern_women_graph] for graph_func in graph_funcs: G = graph_func() for flow_func in flow_funcs: for cut in nx.all_node_cuts(G, flow_func=flow_func): assert_equal(nx.node_connectivity(G), len(cut)) H = G.copy() H.remove_nodes_from(cut) assert_false(nx.is_connected(H)) def test_is_separating_set_complete_graph(): G = nx.complete_graph(5) assert_true(_is_separating_set(G, {0, 1, 2, 3})) def test_is_separating_set(): for i in [5, 10, 15]: G = nx.star_graph(i) max_degree_node = max(G, key=G.degree) assert_true(_is_separating_set(G, {max_degree_node})) def test_non_repeated_cuts(): # The algorithm was repeating the cut {0, 1} for the giant biconnected # component of the Karate club graph.
K = nx.karate_club_graph() G = max(list(nx.biconnected_component_subgraphs(K)), key=len) solution = [{32, 33}, {2, 33}, {0, 3}, {0, 1}, {29, 33}] cuts = list(nx.all_node_cuts(G)) if len(solution) != len(cuts): print(nx.info(G)) print("Solution: {}".format(solution)) print("Result: {}".format(cuts)) assert_true(len(solution) == len(cuts)) for cut in cuts: assert_true(cut in solution) networkx-1.11/networkx/algorithms/connectivity/tests/test_cuts.py0000644000175000017500000002430612637544500025507 0ustar aricaric00000000000000from nose.tools import assert_equal, assert_true, assert_false, assert_raises import networkx as nx from networkx.algorithms.flow import (edmonds_karp, preflow_push, shortest_augmenting_path) flow_funcs = [edmonds_karp, preflow_push, shortest_augmenting_path] # import connectivity functions not in base namespace from networkx.algorithms.connectivity import (minimum_st_edge_cut, minimum_st_node_cut) msg = "Assertion failed in function: {0}" # Tests for node and edge cutsets def _generate_no_biconnected(max_attempts=50): attempts = 0 while True: G = nx.fast_gnp_random_graph(100,0.0575) if nx.is_connected(G) and not nx.is_biconnected(G): attempts = 0 yield G else: if attempts >= max_attempts: msg = "Tried %d times: no suitable Graph." raise Exception(msg % max_attempts) else: attempts += 1 def test_articulation_points(): Ggen = _generate_no_biconnected() for flow_func in flow_funcs: for i in range(3): G = next(Ggen) cut = nx.minimum_node_cut(G, flow_func=flow_func) assert_true(len(cut) == 1, msg=msg.format(flow_func.__name__)) assert_true(cut.pop() in set(nx.articulation_points(G)), msg=msg.format(flow_func.__name__)) def test_brandes_erlebach_book(): # Figure 1 chapter 7: Connectivity # http://www.informatik.uni-augsburg.de/thi/personen/kammer/Graph_Connectivity.pdf G = nx.Graph() G.add_edges_from([(1, 2), (1, 3), (1, 4), (1, 5), (2, 3), (2, 6), (3, 4), (3, 6), (4, 6), (4, 7), (5, 7), (6, 8), (6, 9), (7, 8), (7, 10), (8, 11), (9, 10), (9, 
11), (10, 11)]) for flow_func in flow_funcs: kwargs = dict(flow_func=flow_func) # edge cutsets assert_equal(3, len(nx.minimum_edge_cut(G, 1, 11, **kwargs)), msg=msg.format(flow_func.__name__)) edge_cut = nx.minimum_edge_cut(G, **kwargs) # Node 5 has only two edges assert_equal(2, len(edge_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_edges_from(edge_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) # node cuts assert_equal(set([6, 7]), minimum_st_node_cut(G, 1, 11, **kwargs), msg=msg.format(flow_func.__name__)) assert_equal(set([6, 7]), nx.minimum_node_cut(G, 1, 11, **kwargs), msg=msg.format(flow_func.__name__)) node_cut = nx.minimum_node_cut(G, **kwargs) assert_equal(2, len(node_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_nodes_from(node_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) def test_white_harary_paper(): # Figure 1b white and harary (2001) # http://eclectic.ss.uci.edu/~drwhite/sm-w23.PDF # A graph with high adhesion (edge connectivity) and low cohesion # (node connectivity) G = nx.disjoint_union(nx.complete_graph(4), nx.complete_graph(4)) G.remove_node(7) for i in range(4,7): G.add_edge(0,i) G = nx.disjoint_union(G, nx.complete_graph(4)) G.remove_node(G.order()-1) for i in range(7,10): G.add_edge(0,i) for flow_func in flow_funcs: kwargs = dict(flow_func=flow_func) # edge cuts edge_cut = nx.minimum_edge_cut(G, **kwargs) assert_equal(3, len(edge_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_edges_from(edge_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) # node cuts node_cut = nx.minimum_node_cut(G, **kwargs) assert_equal(set([0]), node_cut, msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_nodes_from(node_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) def test_petersen_cutset(): G = nx.petersen_graph() for flow_func in flow_funcs: kwargs = dict(flow_func=flow_func) # edge cuts edge_cut = 
nx.minimum_edge_cut(G, **kwargs) assert_equal(3, len(edge_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_edges_from(edge_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) # node cuts node_cut = nx.minimum_node_cut(G, **kwargs) assert_equal(3, len(node_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_nodes_from(node_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) def test_octahedral_cutset(): G=nx.octahedral_graph() for flow_func in flow_funcs: kwargs = dict(flow_func=flow_func) # edge cuts edge_cut = nx.minimum_edge_cut(G, **kwargs) assert_equal(4, len(edge_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_edges_from(edge_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) # node cuts node_cut = nx.minimum_node_cut(G, **kwargs) assert_equal(4, len(node_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_nodes_from(node_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) def test_icosahedral_cutset(): G=nx.icosahedral_graph() for flow_func in flow_funcs: kwargs = dict(flow_func=flow_func) # edge cuts edge_cut = nx.minimum_edge_cut(G, **kwargs) assert_equal(5, len(edge_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_edges_from(edge_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) # node cuts node_cut = nx.minimum_node_cut(G, **kwargs) assert_equal(5, len(node_cut), msg=msg.format(flow_func.__name__)) H = G.copy() H.remove_nodes_from(node_cut) assert_false(nx.is_connected(H), msg=msg.format(flow_func.__name__)) def test_node_cutset_exception(): G=nx.Graph() G.add_edges_from([(1, 2), (3, 4)]) for flow_func in flow_funcs: assert_raises(nx.NetworkXError, nx.minimum_node_cut, G, flow_func=flow_func) def test_node_cutset_random_graphs(): for flow_func in flow_funcs: for i in range(3): G = nx.fast_gnp_random_graph(50, 0.25) if not nx.is_connected(G): ccs = 
iter(nx.connected_components(G)) start = next(ccs)[0] G.add_edges_from((start, c[0]) for c in ccs) cutset = nx.minimum_node_cut(G, flow_func=flow_func) assert_equal(nx.node_connectivity(G), len(cutset), msg=msg.format(flow_func.__name__)) G.remove_nodes_from(cutset) assert_false(nx.is_connected(G), msg=msg.format(flow_func.__name__)) def test_edge_cutset_random_graphs(): for flow_func in flow_funcs: for i in range(3): G = nx.fast_gnp_random_graph(50, 0.25) if not nx.is_connected(G): ccs = iter(nx.connected_components(G)) start = next(ccs)[0] G.add_edges_from( (start,c[0]) for c in ccs ) cutset = nx.minimum_edge_cut(G, flow_func=flow_func) assert_equal(nx.edge_connectivity(G), len(cutset), msg=msg.format(flow_func.__name__)) G.remove_edges_from(cutset) assert_false(nx.is_connected(G), msg=msg.format(flow_func.__name__)) def test_empty_graphs(): G = nx.Graph() D = nx.DiGraph() for interface_func in [nx.minimum_node_cut, nx.minimum_edge_cut]: for flow_func in flow_funcs: assert_raises(nx.NetworkXPointlessConcept, interface_func, G, flow_func=flow_func) assert_raises(nx.NetworkXPointlessConcept, interface_func, D, flow_func=flow_func) def test_unbounded(): G = nx.complete_graph(5) for flow_func in flow_funcs: assert_equal(4, len(minimum_st_edge_cut(G, 1, 4, flow_func=flow_func))) def test_missing_source(): G = nx.path_graph(4) for interface_func in [nx.minimum_edge_cut, nx.minimum_node_cut]: for flow_func in flow_funcs: assert_raises(nx.NetworkXError, interface_func, G, 10, 1, flow_func=flow_func) def test_missing_target(): G = nx.path_graph(4) for interface_func in [nx.minimum_edge_cut, nx.minimum_node_cut]: for flow_func in flow_funcs: assert_raises(nx.NetworkXError, interface_func, G, 1, 10, flow_func=flow_func) def test_not_weakly_connected(): G = nx.DiGraph() G.add_path([1, 2, 3]) G.add_path([4, 5]) for interface_func in [nx.minimum_edge_cut, nx.minimum_node_cut]: for flow_func in flow_funcs: assert_raises(nx.NetworkXError, interface_func, G, flow_func=flow_func) 
def test_not_connected(): G = nx.Graph() G.add_path([1, 2, 3]) G.add_path([4, 5]) for interface_func in [nx.minimum_edge_cut, nx.minimum_node_cut]: for flow_func in flow_funcs: assert_raises(nx.NetworkXError, interface_func, G, flow_func=flow_func) def tests_min_cut_complete(): G = nx.complete_graph(5) for interface_func in [nx.minimum_edge_cut, nx.minimum_node_cut]: for flow_func in flow_funcs: assert_equal(4, len(interface_func(G, flow_func=flow_func))) def tests_min_cut_complete_directed(): G = nx.complete_graph(5) G = G.to_directed() for interface_func in [nx.minimum_edge_cut, nx.minimum_node_cut]: for flow_func in flow_funcs: assert_equal(4, len(interface_func(G, flow_func=flow_func))) def test_invalid_auxiliary(): G = nx.complete_graph(5) assert_raises(nx.NetworkXError, minimum_st_node_cut, G, 0, 3, auxiliary=G) def test_interface_only_source(): G = nx.complete_graph(5) for interface_func in [nx.minimum_node_cut, nx.minimum_edge_cut]: assert_raises(nx.NetworkXError, interface_func, G, s=0) def test_interface_only_target(): G = nx.complete_graph(5) for interface_func in [nx.minimum_node_cut, nx.minimum_edge_cut]: assert_raises(nx.NetworkXError, interface_func, G, t=3) networkx-1.11/networkx/algorithms/connectivity/tests/test_kcomponents.py0000644000175000017500000002031312637544450027067 0ustar aricaric00000000000000# Test for Moody and White k-components algorithm from nose.tools import assert_equal, assert_true, raises import networkx as nx from networkx.algorithms.connectivity.kcomponents import ( build_k_number_dict, _consolidate, ) ## ## A nice synthetic graph ## def torrents_and_ferraro_graph(): # Graph from http://arxiv.org/pdf/1503.04476v1 p.26 G = nx.convert_node_labels_to_integers( nx.grid_graph([5, 5]), label_attribute='labels', ) rlabels = nx.get_node_attributes(G, 'labels') labels = {v: k for k, v in rlabels.items()} for nodes in [(labels[(0,4)], labels[(1,4)]), (labels[(3,4)], labels[(4,4)])]: new_node = G.order() + 1 # Petersen graph is 
triconnected P = nx.petersen_graph() G = nx.disjoint_union(G, P) # Add two edges between the grid and P G.add_edge(new_node+1, nodes[0]) G.add_edge(new_node, nodes[1]) # K5 is 4-connected K = nx.complete_graph(5) G = nx.disjoint_union(G, K) # Add three edges between P and K5 G.add_edge(new_node+2, new_node+11) G.add_edge(new_node+3, new_node+12) G.add_edge(new_node+4, new_node+13) # Add another K5 sharing a node G = nx.disjoint_union(G, K) nbrs = G[new_node+10] G.remove_node(new_node+10) for nbr in nbrs: G.add_edge(new_node+17, nbr) # This edge makes the graph biconnected; it's # needed because K5s share only one node. G.add_edge(new_node+16, new_node+8) for nodes in [(labels[(0, 0)], labels[(1, 0)]), (labels[(3, 0)], labels[(4, 0)])]: new_node = G.order() + 1 # Petersen graph is triconnected P = nx.petersen_graph() G = nx.disjoint_union(G, P) # Add two edges between the grid and P G.add_edge(new_node+1, nodes[0]) G.add_edge(new_node, nodes[1]) # K5 is 4-connected K = nx.complete_graph(5) G = nx.disjoint_union(G, K) # Add three edges between P and K5 G.add_edge(new_node+2, new_node+11) G.add_edge(new_node+3, new_node+12) G.add_edge(new_node+4, new_node+13) # Add another K5 sharing two nodes G = nx.disjoint_union(G,K) nbrs = G[new_node+10] G.remove_node(new_node+10) for nbr in nbrs: G.add_edge(new_node+17, nbr) nbrs2 = G[new_node+9] G.remove_node(new_node+9) for nbr in nbrs2: G.add_edge(new_node+18, nbr) G.name = 'Example graph for connectivity' return G @raises(nx.NetworkXNotImplemented) def test_directed(): G = nx.gnp_random_graph(10, 0.2, directed=True) nx.k_components(G) # Helper function def _check_connectivity(G): result = nx.k_components(G) for k, components in result.items(): if k < 3: continue for component in components: C = G.subgraph(component) assert_true(nx.node_connectivity(C) >= k) def test_torrents_and_ferraro_graph(): G = torrents_and_ferraro_graph() _check_connectivity(G) def test_random_gnp(): G = nx.gnp_random_graph(50, 0.2) 
_check_connectivity(G) def test_shell(): constructor=[(20, 80, 0.8), (80, 180, 0.6)] G = nx.random_shell_graph(constructor) _check_connectivity(G) def test_configuration(): deg_seq = nx.utils.create_degree_sequence(100, nx.utils.powerlaw_sequence) G = nx.Graph(nx.configuration_model(deg_seq)) G.remove_edges_from(G.selfloop_edges()) _check_connectivity(G) def test_karate(): G = nx.karate_club_graph() _check_connectivity(G) def test_karate_component_number(): karate_k_num = { 0: 4, 1: 4, 2: 4, 3: 4, 4: 3, 5: 3, 6: 3, 7: 4, 8: 4, 9: 2, 10: 3, 11: 1, 12: 2, 13: 4, 14: 2, 15: 2, 16: 2, 17: 2, 18: 2, 19: 3, 20: 2, 21: 2, 22: 2, 23: 3, 24: 3, 25: 3, 26: 2, 27: 3, 28: 3, 29: 3, 30: 4, 31: 3, 32: 4, 33: 4 } G = nx.karate_club_graph() k_components = nx.k_components(G) k_num = build_k_number_dict(k_components) assert_equal(karate_k_num, k_num) def test_torrents_and_ferraro_detail_3_and_4(): solution = { 3: [{25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 42}, {44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 61}, {63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 79, 80}, {81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 93, 94, 95, 99, 100}, {39, 40, 41, 42, 43}, {58, 59, 60, 61, 62}, {76, 77, 78, 79, 80}, {96, 97, 98, 99, 100}, ], 4: [{35, 36, 37, 38, 42}, {39, 40, 41, 42, 43}, {54, 55, 56, 57, 61}, {58, 59, 60, 61, 62}, {73, 74, 75, 79, 80}, {76, 77, 78, 79, 80}, {93, 94, 95, 99, 100}, {96, 97, 98, 99, 100}, ], } G = torrents_and_ferraro_graph() result = nx.k_components(G) for k, components in result.items(): if k < 3: continue assert_true(len(components) == len(solution[k])) for component in components: assert_true(component in solution[k]) def test_davis_southern_women(): G = nx.davis_southern_women_graph() _check_connectivity(G) def test_davis_southern_women_detail_3_and_4(): solution = { 3: [{ 'Nora Fayette', 'E10', 'Myra Liddel', 'E12', 'E14', 'Frances Anderson', 'Evelyn Jefferson', 'Ruth DeSand', 'Helen Lloyd', 'Eleanor Nye', 'E9', 'E8', 'E5', 'E4', 'E7', 
'E6', 'E1', 'Verne Sanderson', 'E3', 'E2', 'Theresa Anderson', 'Pearl Oglethorpe', 'Katherina Rogers', 'Brenda Rogers', 'E13', 'Charlotte McDowd', 'Sylvia Avondale', 'Laura Mandeville', }, ], 4: [{ 'Nora Fayette', 'E10', 'Verne Sanderson', 'E12', 'Frances Anderson', 'Evelyn Jefferson', 'Ruth DeSand', 'Helen Lloyd', 'Eleanor Nye', 'E9', 'E8', 'E5', 'E4', 'E7', 'E6', 'Myra Liddel', 'E3', 'Theresa Anderson', 'Katherina Rogers', 'Brenda Rogers', 'Charlotte McDowd', 'Sylvia Avondale', 'Laura Mandeville', }, ], } G = nx.davis_southern_women_graph() result = nx.k_components(G) for k, components in result.items(): if k < 3: continue assert_true(len(components) == len(solution[k])) for component in components: assert_true(component in solution[k]) def test_set_consolidation_rosettacode(): # Tests from http://rosettacode.org/wiki/Set_consolidation def list_of_sets_equal(result, solution): assert_equal( {frozenset(s) for s in result}, {frozenset(s) for s in solution} ) question = [{'A', 'B'}, {'C', 'D'}] solution = [{'A', 'B'}, {'C', 'D'}] list_of_sets_equal(_consolidate(question, 1), solution) question = [{'A', 'B'}, {'B', 'C'}] solution = [{'A', 'B', 'C'}] list_of_sets_equal(_consolidate(question, 1), solution) question = [{'A', 'B'}, {'C', 'D'}, {'D', 'B'}] solution = [{'A', 'C', 'B', 'D'}] list_of_sets_equal(_consolidate(question, 1), solution) question = [{'H', 'I', 'K'}, {'A', 'B'}, {'C', 'D'}, {'D', 'B'}, {'F', 'G', 'H'}] solution = [{'A', 'C', 'B', 'D'}, {'G', 'F', 'I', 'H', 'K'}] list_of_sets_equal(_consolidate(question, 1), solution) question = [{'A','H'}, {'H','I','K'}, {'A','B'}, {'C','D'}, {'D','B'}, {'F','G','H'}] solution = [{'A', 'C', 'B', 'D', 'G', 'F', 'I', 'H', 'K'}] list_of_sets_equal(_consolidate(question, 1), solution) question = [{'H','I','K'}, {'A','B'}, {'C','D'}, {'D','B'}, {'F','G','H'}, {'A','H'}] solution = [{'A', 'C', 'B', 'D', 'G', 'F', 'I', 'H', 'K'}] list_of_sets_equal(_consolidate(question, 1), solution) 
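The rosettacode cases above pin down what set consolidation must do: repeatedly merge any sets that share elements until the collection is pairwise disjoint. A minimal stdlib sketch of that operation follows; it is an illustrative re-implementation, not networkx's internal `_consolidate`, and it hard-codes the "any shared element" behavior that the `_consolidate(question, 1)` calls above exercise.

```python
def consolidate(sets):
    """Merge input sets until the result is pairwise disjoint.

    Single pass per input set: because the accumulated result is kept
    pairwise disjoint, each new set only needs one sweep over it.
    """
    result = []
    for s in map(set, sets):
        merged = set(s)
        disjoint = []
        for t in result:
            if merged & t:           # overlap: absorb t into the new set
                merged |= t
            else:
                disjoint.append(t)   # no overlap: t survives unchanged
        result = disjoint + [merged]
    return result


# One of the rosettacode cases: {A,B}, {C,D}, {D,B} collapse into one set.
blocks = consolidate([{'A', 'B'}, {'C', 'D'}, {'D', 'B'}])
```

This is essentially connected components on the "shares an element" relation, which is why the tests compare results as sets of frozensets rather than relying on order.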
networkx-1.11/networkx/algorithms/connectivity/tests/test_stoer_wagner.py0000644000175000017500000000605312637544500027227 0ustar aricaric00000000000000from itertools import chain import networkx as nx from nose.tools import * def _check_partition(G, cut_value, partition, weight): ok_(isinstance(partition, tuple)) assert_equal(len(partition), 2) ok_(isinstance(partition[0], list)) ok_(isinstance(partition[1], list)) ok_(len(partition[0]) > 0) ok_(len(partition[1]) > 0) assert_equal(sum(map(len, partition)), len(G)) assert_equal(set(chain.from_iterable(partition)), set(G)) partition = tuple(map(set, partition)) w = 0 for u, v, e in G.edges_iter(data=True): if (u in partition[0]) == (v in partition[1]): w += e.get(weight, 1) assert_equal(w, cut_value) def _test_stoer_wagner(G, answer, weight='weight'): cut_value, partition = nx.stoer_wagner(G, weight, heap=nx.utils.PairingHeap) assert_equal(cut_value, answer) _check_partition(G, cut_value, partition, weight) cut_value, partition = nx.stoer_wagner(G, weight, heap=nx.utils.BinaryHeap) assert_equal(cut_value, answer) _check_partition(G, cut_value, partition, weight) def test_graph1(): G = nx.Graph() G.add_edge('x','a', weight=3) G.add_edge('x','b', weight=1) G.add_edge('a','c', weight=3) G.add_edge('b','c', weight=5) G.add_edge('b','d', weight=4) G.add_edge('d','e', weight=2) G.add_edge('c','y', weight=2) G.add_edge('e','y', weight=3) _test_stoer_wagner(G, 4) def test_graph2(): G = nx.Graph() G.add_edge('x','a') G.add_edge('x','b') G.add_edge('a','c') G.add_edge('b','c') G.add_edge('b','d') G.add_edge('d','e') G.add_edge('c','y') G.add_edge('e','y') _test_stoer_wagner(G, 2) def test_graph3(): # Source: # Stoer, M. and Wagner, F. (1997). "A simple min-cut algorithm". Journal of # the ACM 44 (4), 585-591. 
G = nx.Graph() G.add_edge(1, 2, weight=2) G.add_edge(1, 5, weight=3) G.add_edge(2, 3, weight=3) G.add_edge(2, 5, weight=2) G.add_edge(2, 6, weight=2) G.add_edge(3, 4, weight=4) G.add_edge(3, 7, weight=2) G.add_edge(4, 7, weight=2) G.add_edge(4, 8, weight=2) G.add_edge(5, 6, weight=3) G.add_edge(6, 7, weight=1) G.add_edge(7, 8, weight=3) _test_stoer_wagner(G, 4) def test_weight_name(): G = nx.Graph() G.add_edge(1, 2, weight=1, cost=8) G.add_edge(1, 3, cost=2) G.add_edge(2, 3, cost=4) _test_stoer_wagner(G, 6, weight='cost') def test_exceptions(): G = nx.Graph() assert_raises(nx.NetworkXError, nx.stoer_wagner, G) G.add_node(1) assert_raises(nx.NetworkXError, nx.stoer_wagner, G) G.add_node(2) assert_raises(nx.NetworkXError, nx.stoer_wagner, G) G.add_edge(1, 2, weight=-2) assert_raises(nx.NetworkXError, nx.stoer_wagner, G) G = nx.DiGraph() assert_raises(nx.NetworkXNotImplemented, nx.stoer_wagner, G) G = nx.MultiGraph() assert_raises(nx.NetworkXNotImplemented, nx.stoer_wagner, G) G = nx.MultiDiGraph() assert_raises(nx.NetworkXNotImplemented, nx.stoer_wagner, G) networkx-1.11/networkx/algorithms/connectivity/tests/test_connectivity.py0000644000175000017500000003677712637544450027272 0ustar aricaric00000000000000import itertools from nose.tools import assert_equal, assert_true, assert_false, assert_raises import networkx as nx from networkx.algorithms.flow import (edmonds_karp, preflow_push, shortest_augmenting_path) flow_funcs = [edmonds_karp, preflow_push, shortest_augmenting_path] # connectivity functions not imported to the base namespace from networkx.algorithms.connectivity import (local_edge_connectivity, local_node_connectivity) msg = "Assertion failed in function: {0}" # helper functions for tests def _generate_no_biconnected(max_attempts=50): attempts = 0 while True: G = nx.fast_gnp_random_graph(100, 0.0575) if nx.is_connected(G) and not nx.is_biconnected(G): attempts = 0 yield G else: if attempts >= max_attempts: msg = "Tried %d times: no suitable Graph." 
raise Exception(msg % max_attempts) else: attempts += 1 def test_average_connectivity(): # figure 1 from: # Beineke, L., O. Oellermann, and R. Pippert (2002). The average # connectivity of a graph. Discrete mathematics 252(1-3), 31-45 # http://www.sciencedirect.com/science/article/pii/S0012365X01001807 G1 = nx.path_graph(3) G1.add_edges_from([(1, 3),(1, 4)]) G2 = nx.path_graph(3) G2.add_edges_from([(1, 3),(1, 4),(0, 3),(0, 4),(3, 4)]) G3 = nx.Graph() for flow_func in flow_funcs: kwargs = dict(flow_func=flow_func) assert_equal(nx.average_node_connectivity(G1, **kwargs), 1, msg=msg.format(flow_func.__name__)) assert_equal(nx.average_node_connectivity(G2, **kwargs), 2.2, msg=msg.format(flow_func.__name__)) assert_equal(nx.average_node_connectivity(G3, **kwargs), 0, msg=msg.format(flow_func.__name__)) def test_average_connectivity_directed(): G = nx.DiGraph([(1,3),(1,4),(1,5)]) for flow_func in flow_funcs: assert_equal(nx.average_node_connectivity(G), 0.25, msg=msg.format(flow_func.__name__)) def test_articulation_points(): Ggen = _generate_no_biconnected() for flow_func in flow_funcs: for i in range(3): G = next(Ggen) assert_equal(nx.node_connectivity(G, flow_func=flow_func), 1, msg=msg.format(flow_func.__name__)) def test_brandes_erlebach(): # Figure 1 chapter 7: Connectivity # http://www.informatik.uni-augsburg.de/thi/personen/kammer/Graph_Connectivity.pdf G = nx.Graph() G.add_edges_from([(1, 2), (1, 3), (1, 4), (1, 5), (2, 3), (2, 6), (3, 4), (3, 6), (4, 6), (4, 7), (5, 7), (6, 8), (6, 9), (7, 8), (7, 10), (8, 11), (9, 10), (9, 11), (10, 11)]) for flow_func in flow_funcs: kwargs = dict(flow_func=flow_func) assert_equal(3, local_edge_connectivity(G, 1, 11, **kwargs), msg=msg.format(flow_func.__name__)) assert_equal(3, nx.edge_connectivity(G, 1, 11, **kwargs), msg=msg.format(flow_func.__name__)) assert_equal(2, local_node_connectivity(G, 1, 11, **kwargs), msg=msg.format(flow_func.__name__)) assert_equal(2, nx.node_connectivity(G, 1, 11, **kwargs), 
                     msg=msg.format(flow_func.__name__))
        # node 5 has degree 2
        assert_equal(2, nx.edge_connectivity(G, **kwargs),
                     msg=msg.format(flow_func.__name__))
        assert_equal(2, nx.node_connectivity(G, **kwargs),
                     msg=msg.format(flow_func.__name__))


def test_white_harary_1():
    # Figure 1b white and harary (2001)
    # http://eclectic.ss.uci.edu/~drwhite/sm-w23.PDF
    # A graph with high adhesion (edge connectivity) and low cohesion
    # (vertex connectivity)
    G = nx.disjoint_union(nx.complete_graph(4), nx.complete_graph(4))
    G.remove_node(7)
    for i in range(4, 7):
        G.add_edge(0, i)
    G = nx.disjoint_union(G, nx.complete_graph(4))
    G.remove_node(G.order() - 1)
    for i in range(7, 10):
        G.add_edge(0, i)
    for flow_func in flow_funcs:
        assert_equal(1, nx.node_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(3, nx.edge_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))


def test_white_harary_2():
    # Figure 8 white and harary (2001)
    # http://eclectic.ss.uci.edu/~drwhite/sm-w23.PDF
    G = nx.disjoint_union(nx.complete_graph(4), nx.complete_graph(4))
    G.add_edge(0, 4)
    # kappa <= lambda <= delta
    assert_equal(3, min(nx.core_number(G).values()))
    for flow_func in flow_funcs:
        assert_equal(1, nx.node_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(1, nx.edge_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))


def test_complete_graphs():
    for n in range(5, 20, 5):
        for flow_func in flow_funcs:
            G = nx.complete_graph(n)
            assert_equal(n - 1, nx.node_connectivity(G, flow_func=flow_func),
                         msg=msg.format(flow_func.__name__))
            assert_equal(n - 1, nx.node_connectivity(G.to_directed(),
                                                     flow_func=flow_func),
                         msg=msg.format(flow_func.__name__))
            assert_equal(n - 1, nx.edge_connectivity(G, flow_func=flow_func),
                         msg=msg.format(flow_func.__name__))
            assert_equal(n - 1, nx.edge_connectivity(G.to_directed(),
                                                     flow_func=flow_func),
                         msg=msg.format(flow_func.__name__))


def test_empty_graphs():
    for k in range(5, 25, 5):
        G = nx.empty_graph(k)
        for flow_func in flow_funcs:
            assert_equal(0, nx.node_connectivity(G, flow_func=flow_func),
                         msg=msg.format(flow_func.__name__))
            assert_equal(0, nx.edge_connectivity(G, flow_func=flow_func),
                         msg=msg.format(flow_func.__name__))


def test_petersen():
    G = nx.petersen_graph()
    for flow_func in flow_funcs:
        assert_equal(3, nx.node_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(3, nx.edge_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))


def test_tutte():
    G = nx.tutte_graph()
    for flow_func in flow_funcs:
        assert_equal(3, nx.node_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(3, nx.edge_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))


def test_dodecahedral():
    G = nx.dodecahedral_graph()
    for flow_func in flow_funcs:
        assert_equal(3, nx.node_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(3, nx.edge_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))


def test_octahedral():
    G = nx.octahedral_graph()
    for flow_func in flow_funcs:
        assert_equal(4, nx.node_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(4, nx.edge_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))


def test_icosahedral():
    G = nx.icosahedral_graph()
    for flow_func in flow_funcs:
        assert_equal(5, nx.node_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(5, nx.edge_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))


def test_missing_source():
    G = nx.path_graph(4)
    for flow_func in flow_funcs:
        assert_raises(nx.NetworkXError, nx.node_connectivity, G, 10, 1,
                      flow_func=flow_func)


def test_missing_target():
    G = nx.path_graph(4)
    for flow_func in flow_funcs:
        assert_raises(nx.NetworkXError, nx.node_connectivity, G, 1, 10,
                      flow_func=flow_func)


def test_edge_missing_source():
    G = nx.path_graph(4)
    for flow_func in flow_funcs:
        assert_raises(nx.NetworkXError, nx.edge_connectivity, G, 10, 1,
                      flow_func=flow_func)


def test_edge_missing_target():
    G = nx.path_graph(4)
    for flow_func in flow_funcs:
        assert_raises(nx.NetworkXError, nx.edge_connectivity, G, 1, 10,
                      flow_func=flow_func)


def test_not_weakly_connected():
    G = nx.DiGraph()
    G.add_path([1, 2, 3])
    G.add_path([4, 5])
    for flow_func in flow_funcs:
        assert_equal(nx.node_connectivity(G, flow_func=flow_func), 0,
                     msg=msg.format(flow_func.__name__))
        assert_equal(nx.edge_connectivity(G, flow_func=flow_func), 0,
                     msg=msg.format(flow_func.__name__))


def test_not_connected():
    G = nx.Graph()
    G.add_path([1, 2, 3])
    G.add_path([4, 5])
    for flow_func in flow_funcs:
        assert_equal(nx.node_connectivity(G, flow_func=flow_func), 0,
                     msg=msg.format(flow_func.__name__))
        assert_equal(nx.edge_connectivity(G, flow_func=flow_func), 0,
                     msg=msg.format(flow_func.__name__))


def test_directed_edge_connectivity():
    G = nx.cycle_graph(10, create_using=nx.DiGraph())  # only one direction
    D = nx.cycle_graph(10).to_directed()  # 2 reciprocal edges
    for flow_func in flow_funcs:
        assert_equal(1, nx.edge_connectivity(G, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(1, local_edge_connectivity(G, 1, 4, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(1, nx.edge_connectivity(G, 1, 4, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(2, nx.edge_connectivity(D, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(2, local_edge_connectivity(D, 1, 4, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))
        assert_equal(2, nx.edge_connectivity(D, 1, 4, flow_func=flow_func),
                     msg=msg.format(flow_func.__name__))


def test_cutoff():
    G = nx.complete_graph(5)
    for local_func in [local_edge_connectivity, local_node_connectivity]:
        for flow_func in flow_funcs:
            if flow_func is preflow_push:
                # cutoff is not supported by preflow_push
                continue
            for cutoff in [3, 2, 1]:
                result = local_func(G, 0, 4, flow_func=flow_func,
                                    cutoff=cutoff)
                assert_equal(cutoff, result,
                             msg="cutoff error in {0}".format(
                                 flow_func.__name__))


def test_invalid_auxiliary():
    G = nx.complete_graph(5)
    assert_raises(nx.NetworkXError, local_node_connectivity, G, 0, 3,
                  auxiliary=G)


def test_interface_only_source():
    G = nx.complete_graph(5)
    for interface_func in [nx.node_connectivity, nx.edge_connectivity]:
        assert_raises(nx.NetworkXError, interface_func, G, s=0)


def test_interface_only_target():
    G = nx.complete_graph(5)
    for interface_func in [nx.node_connectivity, nx.edge_connectivity]:
        assert_raises(nx.NetworkXError, interface_func, G, t=3)


def test_edge_connectivity_flow_vs_stoer_wagner():
    graph_funcs = [
        nx.icosahedral_graph,
        nx.octahedral_graph,
        nx.dodecahedral_graph,
    ]
    for graph_func in graph_funcs:
        G = graph_func()
        assert_equal(nx.stoer_wagner(G)[0], nx.edge_connectivity(G))


class TestAllPairsNodeConnectivity:

    def setUp(self):
        self.path = nx.path_graph(7)
        self.directed_path = nx.path_graph(7, create_using=nx.DiGraph())
        self.cycle = nx.cycle_graph(7)
        self.directed_cycle = nx.cycle_graph(7, create_using=nx.DiGraph())
        self.gnp = nx.gnp_random_graph(30, 0.1)
        self.directed_gnp = nx.gnp_random_graph(30, 0.1, directed=True)
        self.K20 = nx.complete_graph(20)
        self.K10 = nx.complete_graph(10)
        self.K5 = nx.complete_graph(5)
        self.G_list = [self.path, self.directed_path, self.cycle,
                       self.directed_cycle, self.gnp, self.directed_gnp,
                       self.K10, self.K5, self.K20]

    def test_cycles(self):
        K_undir = nx.all_pairs_node_connectivity(self.cycle)
        for source in K_undir:
            for target, k in K_undir[source].items():
                assert_true(k == 2)
        K_dir = nx.all_pairs_node_connectivity(self.directed_cycle)
        for source in K_dir:
            for target, k in K_dir[source].items():
                assert_true(k == 1)

    def test_complete(self):
        for G in [self.K10, self.K5, self.K20]:
            K = nx.all_pairs_node_connectivity(G)
            for source in K:
                for target, k in K[source].items():
                    assert_true(k == len(G) - 1)

    def test_paths(self):
        K_undir = nx.all_pairs_node_connectivity(self.path)
        for source in K_undir:
            for target, k in K_undir[source].items():
                assert_true(k == 1)
        K_dir = nx.all_pairs_node_connectivity(self.directed_path)
        for source in K_dir:
            for target, k in K_dir[source].items():
                if source < target:
                    assert_true(k == 1)
                else:
                    assert_true(k == 0)

    def test_all_pairs_connectivity_nbunch_len(self):
        G = nx.complete_graph(5)
        nbunch = [0, 2, 3]
        C = nx.all_pairs_node_connectivity(G, nbunch=nbunch)
        assert_equal(len(C), len(nbunch))

    def test_all_pairs_connectivity_icosahedral(self):
        G = nx.icosahedral_graph()
        C = nx.all_pairs_node_connectivity(G)
        assert_true(all(5 == C[u][v]
                        for u, v in itertools.combinations(G, 2)))

    def test_all_pairs_connectivity(self):
        G = nx.Graph()
        nodes = [0, 1, 2, 3]
        G.add_path(nodes)
        A = {n: {} for n in G}
        for u, v in itertools.combinations(nodes, 2):
            A[u][v] = A[v][u] = nx.node_connectivity(G, u, v)
        C = nx.all_pairs_node_connectivity(G)
        assert_equal(sorted((k, sorted(v)) for k, v in A.items()),
                     sorted((k, sorted(v)) for k, v in C.items()))

    def test_all_pairs_connectivity_directed(self):
        G = nx.DiGraph()
        nodes = [0, 1, 2, 3]
        G.add_path(nodes)
        A = {n: {} for n in G}
        for u, v in itertools.permutations(nodes, 2):
            A[u][v] = nx.node_connectivity(G, u, v)
        C = nx.all_pairs_node_connectivity(G)
        assert_equal(sorted((k, sorted(v)) for k, v in A.items()),
                     sorted((k, sorted(v)) for k, v in C.items()))

    def test_all_pairs_connectivity_nbunch(self):
        G = nx.complete_graph(5)
        nbunch = [0, 2, 3]
        A = {n: {} for n in nbunch}
        for u, v in itertools.combinations(nbunch, 2):
            A[u][v] = A[v][u] = nx.node_connectivity(G, u, v)
        C = nx.all_pairs_node_connectivity(G, nbunch=nbunch)
        assert_equal(sorted((k, sorted(v)) for k, v in A.items()),
                     sorted((k, sorted(v)) for k, v in C.items()))

    def test_all_pairs_connectivity_nbunch_iter(self):
        G = nx.complete_graph(5)
        nbunch = [0, 2, 3]
        A = {n: {} for n in nbunch}
        for u, v in itertools.combinations(nbunch, 2):
            A[u][v] = A[v][u] = nx.node_connectivity(G, u, v)
        C = nx.all_pairs_node_connectivity(G, nbunch=iter(nbunch))
        assert_equal(sorted((k, sorted(v)) for k, v in A.items()),
                     sorted((k, sorted(v))
                     for k, v in C.items()))
networkx-1.11/networkx/algorithms/connectivity/connectivity.py0000644000175000017500000007121512637544500025047 0ustar aricaric00000000000000
# -*- coding: utf-8 -*-
"""
Flow based connectivity algorithms
"""
from __future__ import division

import itertools
import networkx as nx

# Define the default maximum flow function to use in all flow based
# connectivity algorithms.
from networkx.algorithms.flow import edmonds_karp, shortest_augmenting_path
from networkx.algorithms.flow import build_residual_network
default_flow_func = edmonds_karp

from .utils import (build_auxiliary_node_connectivity,
                    build_auxiliary_edge_connectivity)

__author__ = '\n'.join(['Jordi Torrents '])

__all__ = ['average_node_connectivity',
           'local_node_connectivity',
           'node_connectivity',
           'local_edge_connectivity',
           'edge_connectivity',
           'all_pairs_node_connectivity']


def local_node_connectivity(G, s, t, flow_func=None, auxiliary=None,
                            residual=None, cutoff=None):
    r"""Computes local node connectivity for nodes s and t.

    Local node connectivity for two non-adjacent nodes s and t is the
    minimum number of nodes that must be removed (along with their
    incident edges) to disconnect them.

    This is a flow based implementation of node connectivity. We compute
    the maximum flow on an auxiliary digraph built from the original
    input graph (see below for details).

    Parameters
    ----------
    G : NetworkX graph
        Undirected graph

    s : node
        Source node

    t : node
        Target node

    flow_func : function
        A function for computing the maximum flow among a pair of nodes.
        The function has to accept at least three parameters: a Digraph,
        a source node, and a target node, and return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See below for details. The choice
        of the default function may change from version to version and
        should not be relied on. Default value: None.
    auxiliary : NetworkX DiGraph
        Auxiliary digraph to compute flow based node connectivity. It has
        to have a graph attribute called mapping with a dictionary mapping
        node names in G and in the auxiliary digraph. If provided it will
        be reused instead of recreated. Default value: None.

    residual : NetworkX DiGraph
        Residual network to compute maximum flow. If provided it will be
        reused instead of recreated. Default value: None.

    cutoff : integer, float
        If specified, the maximum flow algorithm will terminate when the
        flow value reaches or exceeds the cutoff. This is only for the
        algorithms that support the cutoff parameter: :meth:`edmonds_karp`
        and :meth:`shortest_augmenting_path`. Other algorithms will ignore
        this parameter. Default value: None.

    Returns
    -------
    K : integer
        local node connectivity for nodes s and t

    Examples
    --------
    This function is not imported in the base NetworkX namespace, so you
    have to explicitly import it from the connectivity package:

    >>> from networkx.algorithms.connectivity import local_node_connectivity

    We use in this example the platonic icosahedral graph, which has node
    connectivity 5.

    >>> G = nx.icosahedral_graph()
    >>> local_node_connectivity(G, 0, 6)
    5

    If you need to compute local connectivity on several pairs of nodes
    in the same graph, it is recommended that you reuse the data
    structures that NetworkX uses in the computation: the auxiliary
    digraph for node connectivity, and the residual network for the
    underlying maximum flow computation.

    Example of how to compute local node connectivity among all pairs of
    nodes of the platonic icosahedral graph reusing the data structures.

    >>> import itertools
    >>> # You also have to explicitly import the function for
    >>> # building the auxiliary digraph from the connectivity package
    >>> from networkx.algorithms.connectivity import (
    ...     build_auxiliary_node_connectivity)
    ...
    >>> H = build_auxiliary_node_connectivity(G)
    >>> # And the function for building the residual network from the
    >>> # flow package
    >>> from networkx.algorithms.flow import build_residual_network
    >>> # Note that the auxiliary digraph has an edge attribute named capacity
    >>> R = build_residual_network(H, 'capacity')
    >>> # Each node needs its own inner dictionary; dict.fromkeys(G, dict())
    >>> # would make every key share a single dictionary.
    >>> result = {n: {} for n in G}
    >>> # Reuse the auxiliary digraph and the residual network by passing them
    >>> # as parameters
    >>> for u, v in itertools.combinations(G, 2):
    ...     k = local_node_connectivity(G, u, v, auxiliary=H, residual=R)
    ...     result[u][v] = k
    ...
    >>> all(result[u][v] == 5 for u, v in itertools.combinations(G, 2))
    True

    You can also use alternative flow algorithms for computing node
    connectivity. For instance, in dense networks the algorithm
    :meth:`shortest_augmenting_path` will usually perform better than
    the default :meth:`edmonds_karp`, which is faster for sparse
    networks with highly skewed degree distributions. Alternative flow
    functions have to be explicitly imported from the flow package.

    >>> from networkx.algorithms.flow import shortest_augmenting_path
    >>> local_node_connectivity(G, 0, 6, flow_func=shortest_augmenting_path)
    5

    Notes
    -----
    This is a flow based implementation of node connectivity. We compute
    the maximum flow using, by default, the :meth:`edmonds_karp` algorithm
    (see: :meth:`maximum_flow`) on an auxiliary digraph built from the
    original input graph:

    For an undirected graph G having `n` nodes and `m` edges we derive a
    directed graph H with `2n` nodes and `2m+n` arcs by replacing each
    original node `v` with two nodes `v_A`, `v_B` linked by an (internal)
    arc in H. Then for each edge (`u`, `v`) in G we add two arcs
    (`u_B`, `v_A`) and (`v_B`, `u_A`) in H. Finally we set the attribute
    capacity = 1 for each arc in H [1]_ .
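The counts stated above for the undirected construction can be checked directly with the helper this module imports from `.utils` (a minimal sketch; it only verifies the sizes and capacities of the auxiliary digraph, not the flow computation):

```python
# For undirected G with n nodes and m edges, the auxiliary digraph H
# has 2n nodes and 2m + n unit-capacity arcs.
import networkx as nx
from networkx.algorithms.connectivity import build_auxiliary_node_connectivity

G = nx.cycle_graph(5)  # n = 5 nodes, m = 5 edges
H = build_auxiliary_node_connectivity(G)

assert len(H) == 2 * len(G)                                     # 2n nodes
assert H.number_of_edges() == 2 * G.number_of_edges() + len(G)  # 2m + n arcs
assert all(d['capacity'] == 1 for _, _, d in H.edges(data=True))
```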
    For a directed graph G having `n` nodes and `m` arcs we derive a
    directed graph H with `2n` nodes and `m+n` arcs by replacing each
    original node `v` with two nodes `v_A`, `v_B` linked by an (internal)
    arc (`v_A`, `v_B`) in H. Then for each arc (`u`, `v`) in G we add one
    arc (`u_B`, `v_A`) in H. Finally we set the attribute capacity = 1
    for each arc in H.

    This is equal to the local node connectivity because the value of a
    maximum s-t-flow is equal to the capacity of a minimum s-t-cut.

    See also
    --------
    :meth:`local_edge_connectivity`
    :meth:`node_connectivity`
    :meth:`minimum_node_cut`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    References
    ----------
    .. [1] Kammer, Frank and Hanjo Taubig. Graph Connectivity. in Brandes
        and Erlebach, 'Network Analysis: Methodological Foundations',
        Lecture Notes in Computer Science, Volume 3418, Springer-Verlag,
        2005.
        http://www.informatik.uni-augsburg.de/thi/personen/kammer/Graph_Connectivity.pdf

    """
    if flow_func is None:
        flow_func = default_flow_func

    if auxiliary is None:
        H = build_auxiliary_node_connectivity(G)
    else:
        H = auxiliary

    mapping = H.graph.get('mapping', None)
    if mapping is None:
        raise nx.NetworkXError('Invalid auxiliary digraph.')

    kwargs = dict(flow_func=flow_func, residual=residual)
    if flow_func is shortest_augmenting_path:
        kwargs['cutoff'] = cutoff
        kwargs['two_phase'] = True
    elif flow_func is edmonds_karp:
        kwargs['cutoff'] = cutoff

    return nx.maximum_flow_value(H, '%sB' % mapping[s], '%sA' % mapping[t],
                                 **kwargs)


def node_connectivity(G, s=None, t=None, flow_func=None):
    r"""Returns node connectivity for a graph or digraph G.

    Node connectivity is equal to the minimum number of nodes that
    must be removed to disconnect G or render it trivial. If source
    and target nodes are provided, this function returns the local node
    connectivity: the minimum number of nodes that must be removed to
    break all paths from source to target in G.
    Parameters
    ----------
    G : NetworkX graph
        Undirected graph

    s : node
        Source node. Optional. Default value: None.

    t : node
        Target node. Optional. Default value: None.

    flow_func : function
        A function for computing the maximum flow among a pair of nodes.
        The function has to accept at least three parameters: a Digraph,
        a source node, and a target node, and return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See below for details. The choice
        of the default function may change from version to version and
        should not be relied on. Default value: None.

    Returns
    -------
    K : integer
        Node connectivity of G, or local node connectivity if source
        and target are provided.

    Examples
    --------
    >>> # Platonic icosahedral graph is 5-node-connected
    >>> G = nx.icosahedral_graph()
    >>> nx.node_connectivity(G)
    5

    You can use alternative flow algorithms for the underlying maximum
    flow computation. In dense networks the algorithm
    :meth:`shortest_augmenting_path` will usually perform better than
    the default :meth:`edmonds_karp`, which is faster for sparse
    networks with highly skewed degree distributions. Alternative flow
    functions have to be explicitly imported from the flow package.

    >>> from networkx.algorithms.flow import shortest_augmenting_path
    >>> nx.node_connectivity(G, flow_func=shortest_augmenting_path)
    5

    If you specify a pair of nodes (source and target) as parameters,
    this function returns the value of local node connectivity.

    >>> nx.node_connectivity(G, 3, 7)
    5

    If you need to perform several local computations among different
    pairs of nodes on the same graph, it is recommended that you reuse
    the data structures used in the maximum flow computations. See
    :meth:`local_node_connectivity` for details.

    Notes
    -----
    This is a flow based implementation of node connectivity.
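The implementation starts from a minimum-degree node because connectivity obeys Whitney's inequality, `kappa(G) <= lambda(G) <= delta(G)`. The bound can be checked in a standalone sketch on two copies of `K4` joined by a bridge (the same construction the test suite uses in `test_white_harary_2`):

```python
# kappa <= lambda <= delta on two K4s joined by a single bridge.
import networkx as nx

G = nx.disjoint_union(nx.complete_graph(4), nx.complete_graph(4))
G.add_edge(0, 4)  # the bridge between the two cliques

kappa = nx.node_connectivity(G)          # removing node 0 (or 4) disconnects G
lam = nx.edge_connectivity(G)            # removing the bridge disconnects G
delta = min(dict(G.degree()).values())   # interior K4 nodes have degree 3

assert kappa <= lam <= delta
assert (kappa, lam, delta) == (1, 1, 3)
```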
    The algorithm works by solving `O((n-\delta-1+\delta(\delta-1)/2))`
    maximum flow problems on an auxiliary digraph, where `\delta` is the
    minimum degree of G. For details about the auxiliary digraph and the
    computation of local node connectivity see
    :meth:`local_node_connectivity`. This implementation is based on
    algorithm 11 in [1]_.

    See also
    --------
    :meth:`local_node_connectivity`
    :meth:`edge_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    References
    ----------
    .. [1] Abdol-Hossein Esfahanian. Connectivity Algorithms.
        http://www.cse.msu.edu/~cse835/Papers/Graph_connectivity_revised.pdf

    """
    if (s is not None and t is None) or (s is None and t is not None):
        raise nx.NetworkXError('Both source and target must be specified.')

    # Local node connectivity
    if s is not None and t is not None:
        if s not in G:
            raise nx.NetworkXError('node %s not in graph' % s)
        if t not in G:
            raise nx.NetworkXError('node %s not in graph' % t)
        return local_node_connectivity(G, s, t, flow_func=flow_func)

    # Global node connectivity
    if G.is_directed():
        if not nx.is_weakly_connected(G):
            return 0
        iter_func = itertools.permutations
        # It is necessary to consider both predecessors
        # and successors for directed graphs
        def neighbors(v):
            return itertools.chain.from_iterable([G.predecessors_iter(v),
                                                  G.successors_iter(v)])
    else:
        if not nx.is_connected(G):
            return 0
        iter_func = itertools.combinations
        neighbors = G.neighbors_iter

    # Reuse the auxiliary digraph and the residual network
    H = build_auxiliary_node_connectivity(G)
    R = build_residual_network(H, 'capacity')
    kwargs = dict(flow_func=flow_func, auxiliary=H, residual=R)

    # Pick a node with minimum degree
    degree = G.degree()
    minimum_degree = min(degree.values())
    v = next(n for n, d in degree.items() if d == minimum_degree)
    # Node connectivity is bounded by degree.
    K = minimum_degree
    # Compute local node connectivity between v and all its non-neighbors
    for w in set(G) - set(neighbors(v)) - set([v]):
        kwargs['cutoff'] = K
        K = min(K, local_node_connectivity(G, v, w, **kwargs))
    # Also for non-adjacent pairs of neighbors of v
    for x, y in iter_func(neighbors(v), 2):
        if y in G[x]:
            continue
        kwargs['cutoff'] = K
        K = min(K, local_node_connectivity(G, x, y, **kwargs))

    return K


def average_node_connectivity(G, flow_func=None):
    r"""Returns the average connectivity of a graph G.

    The average connectivity `\bar{\kappa}` of a graph G is the average
    of local node connectivity over all pairs of nodes of G [1]_ .

    .. math::

        \bar{\kappa}(G) = \frac{\sum_{u,v} \kappa_{G}(u,v)}{{n \choose 2}}

    Parameters
    ----------
    G : NetworkX graph
        Undirected graph

    flow_func : function
        A function for computing the maximum flow among a pair of nodes.
        The function has to accept at least three parameters: a Digraph,
        a source node, and a target node, and return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See
        :meth:`local_node_connectivity` for details. The choice of the
        default function may change from version to version and should
        not be relied on. Default value: None.

    Returns
    -------
    K : float
        Average node connectivity

    See also
    --------
    :meth:`local_node_connectivity`
    :meth:`node_connectivity`
    :meth:`edge_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    References
    ----------
    .. [1] Beineke, L., O. Oellermann, and R. Pippert (2002). The average
           connectivity of a graph. Discrete mathematics 252(1-3), 31-45.
           http://www.sciencedirect.com/science/article/pii/S0012365X01001807

    """
    if G.is_directed():
        iter_func = itertools.permutations
    else:
        iter_func = itertools.combinations

    # Reuse the auxiliary digraph and the residual network
    H = build_auxiliary_node_connectivity(G)
    R = build_residual_network(H, 'capacity')
    kwargs = dict(flow_func=flow_func, auxiliary=H, residual=R)

    num, den = 0, 0
    for u, v in iter_func(G, 2):
        num += local_node_connectivity(G, u, v, **kwargs)
        den += 1

    if den == 0:  # Null graph
        return 0
    return num / den


def all_pairs_node_connectivity(G, nbunch=None, flow_func=None):
    """Compute node connectivity between all pairs of nodes of G.

    Parameters
    ----------
    G : NetworkX graph
        Undirected graph

    nbunch: container
        Container of nodes. If provided, node connectivity will be
        computed only over pairs of nodes in nbunch.

    flow_func : function
        A function for computing the maximum flow among a pair of nodes.
        The function has to accept at least three parameters: a Digraph,
        a source node, and a target node, and return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See below for details. The choice
        of the default function may change from version to version and
        should not be relied on. Default value: None.

    Returns
    -------
    all_pairs : dict
        A dictionary with node connectivity between all pairs of nodes
        in G, or in nbunch if provided.
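A minimal usage sketch for the function described here, assuming only the public `nx.all_pairs_node_connectivity` entry point; in a path graph every pair of distinct nodes has node connectivity 1:

```python
import networkx as nx

# All-pairs node connectivity of a 4-node path: every pair of nodes is
# either adjacent or separated by removing a single interior node.
C = nx.all_pairs_node_connectivity(nx.path_graph(4))

assert set(C) == {0, 1, 2, 3}
assert all(k == 1 for d in C.values() for k in d.values())
```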
    See also
    --------
    :meth:`local_node_connectivity`
    :meth:`edge_connectivity`
    :meth:`local_edge_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    """
    if nbunch is None:
        nbunch = G
    else:
        nbunch = set(nbunch)

    directed = G.is_directed()
    if directed:
        iter_func = itertools.permutations
    else:
        iter_func = itertools.combinations

    all_pairs = {n: {} for n in nbunch}

    # Reuse auxiliary digraph and residual network
    H = build_auxiliary_node_connectivity(G)
    mapping = H.graph['mapping']
    R = build_residual_network(H, 'capacity')
    kwargs = dict(flow_func=flow_func, auxiliary=H, residual=R)

    for u, v in iter_func(nbunch, 2):
        K = local_node_connectivity(G, u, v, **kwargs)
        all_pairs[u][v] = K
        if not directed:
            all_pairs[v][u] = K

    return all_pairs


def local_edge_connectivity(G, u, v, flow_func=None, auxiliary=None,
                            residual=None, cutoff=None):
    r"""Returns local edge connectivity for nodes u and v in G.

    Local edge connectivity for two nodes u and v is the minimum number
    of edges that must be removed to disconnect them.

    This is a flow based implementation of edge connectivity. We compute
    the maximum flow on an auxiliary digraph built from the original
    network (see below for details). This is equal to the local edge
    connectivity because the value of a maximum s-t-flow is equal to the
    capacity of a minimum s-t-cut (Ford and Fulkerson theorem) [1]_ .

    Parameters
    ----------
    G : NetworkX graph
        Undirected or directed graph

    u : node
        Source node

    v : node
        Target node

    flow_func : function
        A function for computing the maximum flow among a pair of nodes.
        The function has to accept at least three parameters: a Digraph,
        a source node, and a target node, and return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See below for details.
        The choice of the default function may change from version to
        version and should not be relied on. Default value: None.

    auxiliary : NetworkX DiGraph
        Auxiliary digraph for computing flow based edge connectivity. If
        provided it will be reused instead of recreated. Default value:
        None.

    residual : NetworkX DiGraph
        Residual network to compute maximum flow. If provided it will be
        reused instead of recreated. Default value: None.

    cutoff : integer, float
        If specified, the maximum flow algorithm will terminate when the
        flow value reaches or exceeds the cutoff. This is only for the
        algorithms that support the cutoff parameter: :meth:`edmonds_karp`
        and :meth:`shortest_augmenting_path`. Other algorithms will ignore
        this parameter. Default value: None.

    Returns
    -------
    K : integer
        local edge connectivity for nodes u and v.

    Examples
    --------
    This function is not imported in the base NetworkX namespace, so you
    have to explicitly import it from the connectivity package:

    >>> from networkx.algorithms.connectivity import local_edge_connectivity

    We use in this example the platonic icosahedral graph, which has edge
    connectivity 5.

    >>> G = nx.icosahedral_graph()
    >>> local_edge_connectivity(G, 0, 6)
    5

    If you need to compute local connectivity on several pairs of nodes
    in the same graph, it is recommended that you reuse the data
    structures that NetworkX uses in the computation: the auxiliary
    digraph for edge connectivity, and the residual network for the
    underlying maximum flow computation.

    Example of how to compute local edge connectivity among all pairs of
    nodes of the platonic icosahedral graph reusing the data structures.

    >>> import itertools
    >>> # You also have to explicitly import the function for
    >>> # building the auxiliary digraph from the connectivity package
    >>> from networkx.algorithms.connectivity import (
    ...     build_auxiliary_edge_connectivity)
    >>> H = build_auxiliary_edge_connectivity(G)
    >>> # And the function for building the residual network from the
    >>> # flow package
    >>> from networkx.algorithms.flow import build_residual_network
    >>> # Note that the auxiliary digraph has an edge attribute named capacity
    >>> R = build_residual_network(H, 'capacity')
    >>> # Each node needs its own inner dictionary; dict.fromkeys(G, dict())
    >>> # would make every key share a single dictionary.
    >>> result = {n: {} for n in G}
    >>> # Reuse the auxiliary digraph and the residual network by passing them
    >>> # as parameters
    >>> for u, v in itertools.combinations(G, 2):
    ...     k = local_edge_connectivity(G, u, v, auxiliary=H, residual=R)
    ...     result[u][v] = k
    >>> all(result[u][v] == 5 for u, v in itertools.combinations(G, 2))
    True

    You can also use alternative flow algorithms for computing edge
    connectivity. For instance, in dense networks the algorithm
    :meth:`shortest_augmenting_path` will usually perform better than
    the default :meth:`edmonds_karp`, which is faster for sparse
    networks with highly skewed degree distributions. Alternative flow
    functions have to be explicitly imported from the flow package.

    >>> from networkx.algorithms.flow import shortest_augmenting_path
    >>> local_edge_connectivity(G, 0, 6, flow_func=shortest_augmenting_path)
    5

    Notes
    -----
    This is a flow based implementation of edge connectivity. We compute
    the maximum flow using, by default, the :meth:`edmonds_karp` algorithm
    on an auxiliary digraph built from the original input graph:

    If the input graph is undirected, we replace each edge (`u`, `v`) with
    two reciprocal arcs (`u`, `v`) and (`v`, `u`) and then we set the
    attribute 'capacity' for each arc to 1. If the input graph is directed
    we simply add the 'capacity' attribute. This is an implementation of
    algorithm 1 in [1]_.

    The maximum flow in the auxiliary network is equal to the local edge
    connectivity because the value of a maximum s-t-flow is equal to the
    capacity of a minimum s-t-cut (Ford and Fulkerson theorem).
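The reciprocal-arc construction described above can be checked directly with the package helper `build_auxiliary_edge_connectivity` (a sketch verifying only arc counts and capacities):

```python
# For an undirected graph, the auxiliary digraph replaces each edge
# with two reciprocal unit-capacity arcs.
import networkx as nx
from networkx.algorithms.connectivity import build_auxiliary_edge_connectivity

G = nx.cycle_graph(5)
H = build_auxiliary_edge_connectivity(G)

assert H.is_directed()
assert H.number_of_edges() == 2 * G.number_of_edges()
assert all(d['capacity'] == 1 for _, _, d in H.edges(data=True))
```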
    See also
    --------
    :meth:`edge_connectivity`
    :meth:`local_node_connectivity`
    :meth:`node_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    References
    ----------
    .. [1] Abdol-Hossein Esfahanian. Connectivity Algorithms.
        http://www.cse.msu.edu/~cse835/Papers/Graph_connectivity_revised.pdf

    """
    if flow_func is None:
        flow_func = default_flow_func

    if auxiliary is None:
        H = build_auxiliary_edge_connectivity(G)
    else:
        H = auxiliary

    kwargs = dict(flow_func=flow_func, residual=residual)
    if flow_func is shortest_augmenting_path:
        kwargs['cutoff'] = cutoff
        kwargs['two_phase'] = True
    elif flow_func is edmonds_karp:
        kwargs['cutoff'] = cutoff

    return nx.maximum_flow_value(H, u, v, **kwargs)


def edge_connectivity(G, s=None, t=None, flow_func=None):
    r"""Returns the edge connectivity of the graph or digraph G.

    The edge connectivity is equal to the minimum number of edges that
    must be removed to disconnect G or render it trivial. If source
    and target nodes are provided, this function returns the local edge
    connectivity: the minimum number of edges that must be removed to
    break all paths from source to target in G.

    Parameters
    ----------
    G : NetworkX graph
        Undirected or directed graph

    s : node
        Source node. Optional. Default value: None.

    t : node
        Target node. Optional. Default value: None.

    flow_func : function
        A function for computing the maximum flow among a pair of nodes.
        The function has to accept at least three parameters: a Digraph,
        a source node, and a target node, and return a residual network
        that follows NetworkX conventions (see :meth:`maximum_flow` for
        details). If flow_func is None, the default maximum flow function
        (:meth:`edmonds_karp`) is used. See below for details. The choice
        of the default function may change from version to version and
        should not be relied on. Default value: None.

    Returns
    -------
    K : integer
        Edge connectivity for G, or local edge connectivity if source
        and target were provided

    Examples
    --------
    >>> # Platonic icosahedral graph is 5-edge-connected
    >>> G = nx.icosahedral_graph()
    >>> nx.edge_connectivity(G)
    5

    You can use alternative flow algorithms for the underlying maximum
    flow computation. In dense networks the algorithm
    :meth:`shortest_augmenting_path` will usually perform better than
    the default :meth:`edmonds_karp`, which is faster for sparse
    networks with highly skewed degree distributions. Alternative flow
    functions have to be explicitly imported from the flow package.

    >>> from networkx.algorithms.flow import shortest_augmenting_path
    >>> nx.edge_connectivity(G, flow_func=shortest_augmenting_path)
    5

    If you specify a pair of nodes (source and target) as parameters,
    this function returns the value of local edge connectivity.

    >>> nx.edge_connectivity(G, 3, 7)
    5

    If you need to perform several local computations among different
    pairs of nodes on the same graph, it is recommended that you reuse
    the data structures used in the maximum flow computations. See
    :meth:`local_edge_connectivity` for details.

    Notes
    -----
    This is a flow based implementation of global edge connectivity.
    For undirected graphs the algorithm works by finding a 'small'
    dominating set of nodes of G (see algorithm 7 in [1]_ ) and computing
    local maximum flow (see :meth:`local_edge_connectivity`) between an
    arbitrary node in the dominating set and the rest of nodes in it.
    This is an implementation of algorithm 6 in [1]_ . For directed
    graphs, the algorithm does n calls to the maximum flow function.
    This is an implementation of algorithm 8 in [1]_ .

    See also
    --------
    :meth:`local_edge_connectivity`
    :meth:`local_node_connectivity`
    :meth:`node_connectivity`
    :meth:`maximum_flow`
    :meth:`edmonds_karp`
    :meth:`preflow_push`
    :meth:`shortest_augmenting_path`

    References
    ----------
    .. [1] Abdol-Hossein Esfahanian. Connectivity Algorithms.
        http://www.cse.msu.edu/~cse835/Papers/Graph_connectivity_revised.pdf

    """
    if (s is not None and t is None) or (s is None and t is not None):
        raise nx.NetworkXError('Both source and target must be specified.')

    # Local edge connectivity
    if s is not None and t is not None:
        if s not in G:
            raise nx.NetworkXError('node %s not in graph' % s)
        if t not in G:
            raise nx.NetworkXError('node %s not in graph' % t)
        return local_edge_connectivity(G, s, t, flow_func=flow_func)

    # Global edge connectivity
    # reuse auxiliary digraph and residual network
    H = build_auxiliary_edge_connectivity(G)
    R = build_residual_network(H, 'capacity')
    kwargs = dict(flow_func=flow_func, auxiliary=H, residual=R)

    if G.is_directed():
        # Algorithm 8 in [1]
        if not nx.is_weakly_connected(G):
            return 0
        # Initial value for \lambda is the minimum degree
        L = min(G.degree().values())
        nodes = G.nodes()
        n = len(nodes)
        for i in range(n):
            kwargs['cutoff'] = L
            # Wrap around to the first node for the last pair
            L = min(L, local_edge_connectivity(G, nodes[i],
                                               nodes[(i + 1) % n], **kwargs))
        return L
    else:  # undirected
        # Algorithm 6 in [1]
        if not nx.is_connected(G):
            return 0
        # Initial value for \lambda is the minimum degree
        L = min(G.degree().values())
        # A dominating set is \lambda-covering
        # We need a dominating set with at least two nodes
        for node in G:
            D = nx.dominating_set(G, start_with=node)
            v = D.pop()
            if D:
                break
        else:
            # In complete graphs the dominating set is always a single
            # node, thus we return the minimum degree
            return L
        for w in D:
            kwargs['cutoff'] = L
            L = min(L, local_edge_connectivity(G, v, w, **kwargs))
        return L
networkx-1.11/networkx/algorithms/tests/0000755000175000017500000000000012653231454020374 5ustar aricaric00000000000000
networkx-1.11/networkx/algorithms/tests/test_hierarchy.py0000644000175000017500000000167212637544450023776 0ustar aricaric00000000000000
#!/usr/bin/env python
from nose.tools import *
import networkx as nx


def test_hierarchy_exception():
    G = nx.cycle_graph(5)
    assert_raises(nx.NetworkXError, nx.flow_hierarchy, G)


def test_hierarchy_cycle():
    G = nx.cycle_graph(5, create_using=nx.DiGraph())
    assert_equal(nx.flow_hierarchy(G), 0.0)


def test_hierarchy_tree():
    G = nx.full_rary_tree(2, 16, create_using=nx.DiGraph())
    assert_equal(nx.flow_hierarchy(G), 1.0)


def test_hierarchy_1():
    G = nx.DiGraph()
    G.add_edges_from([(0, 1), (1, 2), (2, 3), (3, 1), (3, 4), (0, 4)])
    assert_equal(nx.flow_hierarchy(G), 0.5)


def test_hierarchy_weight():
    G = nx.DiGraph()
    G.add_edges_from([(0, 1, {'weight': .3}),
                      (1, 2, {'weight': .1}),
                      (2, 3, {'weight': .1}),
                      (3, 1, {'weight': .1}),
                      (3, 4, {'weight': .3}),
                      (0, 4, {'weight': .3})])
    assert_equal(nx.flow_hierarchy(G, weight='weight'), .75)
networkx-1.11/networkx/algorithms/tests/test_vitality.py0000644000175000017500000000214412637544450023660 0ustar aricaric00000000000000
#!/usr/bin/env python
from nose.tools import *
import networkx as nx


class TestVitality:

    def test_closeness_vitality_unweighted(self):
        G = nx.cycle_graph(3)
        v = nx.closeness_vitality(G)
        assert_equal(v, {0: 4.0,
1:4.0, 2:4.0}) assert_equal(v[0],4.0) def test_closeness_vitality_weighted(self): G=nx.Graph() G.add_cycle([0,1,2],weight=2) v=nx.closeness_vitality(G,weight='weight') assert_equal(v,{0:8.0, 1:8.0, 2:8.0}) def test_closeness_vitality_unweighted_digraph(self): G=nx.DiGraph() G.add_cycle([0,1,2]) v=nx.closeness_vitality(G) assert_equal(v,{0:8.0, 1:8.0, 2:8.0}) def test_closeness_vitality_weighted_digraph(self): G=nx.DiGraph() G.add_cycle([0,1,2],weight=2) v=nx.closeness_vitality(G,weight='weight') assert_equal(v,{0:16.0, 1:16.0, 2:16.0}) def test_closeness_vitality_weighted_multidigraph(self): G=nx.MultiDiGraph() G.add_cycle([0,1,2],weight=2) v=nx.closeness_vitality(G,weight='weight') assert_equal(v,{0:16.0, 1:16.0, 2:16.0}) networkx-1.11/networkx/algorithms/tests/test_swap.py0000644000175000017500000000241112637544500022756 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from networkx import * import networkx as nx import random random.seed(0) def test_double_edge_swap(): graph = barabasi_albert_graph(200,1) degrees = sorted(graph.degree().values()) G = double_edge_swap(graph, 40) assert_equal(degrees, sorted(graph.degree().values())) def test_connected_double_edge_swap(): graph = barabasi_albert_graph(200,1) degrees = sorted(graph.degree().values()) G = connected_double_edge_swap(graph, 40) assert_true(is_connected(graph)) assert_equal(degrees, sorted(graph.degree().values())) @raises(NetworkXError) def test_double_edge_swap_small(): G = nx.double_edge_swap(nx.path_graph(3)) @raises(NetworkXError) def test_double_edge_swap_tries(): G = nx.double_edge_swap(nx.path_graph(10),nswap=1,max_tries=0) @raises(NetworkXError) def test_connected_double_edge_swap_small(): G = nx.connected_double_edge_swap(nx.path_graph(3)) @raises(NetworkXError) def test_connected_double_edge_swap_not_connected(): G = nx.path_graph(3) G.add_path([10,11,12]) G = nx.connected_double_edge_swap(G) def test_degree_seq_c4(): G = cycle_graph(4) degrees = sorted(G.degree().values()) G =
double_edge_swap(G,1,100) assert_equal(degrees, sorted(G.degree().values())) networkx-1.11/networkx/algorithms/tests/test_mis.py0000644000175000017500000000667012637544500022607 0ustar aricaric00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- # $Id: test_maximal_independent_set.py 577 2011-03-01 06:07:53Z lleeoo $ """ Tests for maximal (not maximum) independent sets. """ # Copyright (C) 2004-2015 by # Leo Lopes # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __author__ = """Leo Lopes (leo.lopes@monash.edu)""" from nose.tools import * import networkx as nx import random class TestMaximalIndependentSet(object): def setup(self): self.florentine = nx.Graph() self.florentine.add_edge('Acciaiuoli','Medici') self.florentine.add_edge('Castellani','Peruzzi') self.florentine.add_edge('Castellani','Strozzi') self.florentine.add_edge('Castellani','Barbadori') self.florentine.add_edge('Medici','Barbadori') self.florentine.add_edge('Medici','Ridolfi') self.florentine.add_edge('Medici','Tornabuoni') self.florentine.add_edge('Medici','Albizzi') self.florentine.add_edge('Medici','Salviati') self.florentine.add_edge('Salviati','Pazzi') self.florentine.add_edge('Peruzzi','Strozzi') self.florentine.add_edge('Peruzzi','Bischeri') self.florentine.add_edge('Strozzi','Ridolfi') self.florentine.add_edge('Strozzi','Bischeri') self.florentine.add_edge('Ridolfi','Tornabuoni') self.florentine.add_edge('Tornabuoni','Guadagni') self.florentine.add_edge('Albizzi','Ginori') self.florentine.add_edge('Albizzi','Guadagni') self.florentine.add_edge('Bischeri','Guadagni') self.florentine.add_edge('Guadagni','Lamberteschi') def test_K5(self): """Maximal independent set: K5""" G = nx.complete_graph(5) for node in G: assert_equal(nx.maximal_independent_set(G, [node]), [node]) def test_K55(self): """Maximal independent set: K55""" G = nx.complete_graph(55) for node in G: assert_equal(nx.maximal_independent_set(G, [node]), [node]) def test_exception(self): """Bad
input should raise exception.""" G = self.florentine assert_raises(nx.NetworkXUnfeasible, nx.maximal_independent_set, G, ["Smith"]) assert_raises(nx.NetworkXUnfeasible, nx.maximal_independent_set, G, ["Salviati", "Pazzi"]) def test_florentine_family(self): G = self.florentine indep = nx.maximal_independent_set(G, ["Medici", "Bischeri"]) assert_equal(sorted(indep), sorted(["Medici", "Bischeri", "Castellani", "Pazzi", "Ginori", "Lamberteschi"])) def test_bipartite(self): G = nx.complete_bipartite_graph(12, 34) indep = nx.maximal_independent_set(G, [4, 5, 9, 10]) assert_equal(sorted(indep), list(range(12))) def test_random_graphs(self): """Generate 50 random graphs of different types and sizes and make sure that all sets are independent and maximal.""" for i in range(0, 50, 10): G = nx.random_graphs.erdos_renyi_graph(i*10+1, random.random()) IS = nx.maximal_independent_set(G) assert_false(G.subgraph(IS).edges()) neighbors_of_MIS = set.union(*(set(G.neighbors(v)) for v in IS)) for v in set(G.nodes()).difference(IS): assert_true(v in neighbors_of_MIS) networkx-1.11/networkx/algorithms/tests/test_distance_regular.py0000644000175000017500000000330112637544450025322 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestDistanceRegular: def test_is_distance_regular(self): assert_true(nx.is_distance_regular(nx.icosahedral_graph())) assert_true(nx.is_distance_regular(nx.petersen_graph())) assert_true(nx.is_distance_regular(nx.cubical_graph())) assert_true(nx.is_distance_regular(nx.complete_bipartite_graph(3,3))) assert_true(nx.is_distance_regular(nx.tetrahedral_graph())) assert_true(nx.is_distance_regular(nx.dodecahedral_graph())) assert_true(nx.is_distance_regular(nx.pappus_graph())) assert_true(nx.is_distance_regular(nx.heawood_graph())) assert_true(nx.is_distance_regular(nx.cycle_graph(3))) # no distance regular assert_false(nx.is_distance_regular(nx.path_graph(4))) def test_not_connected(self): G=nx.cycle_graph(4) 
G.add_cycle([5,6,7]) assert_false(nx.is_distance_regular(G)) def test_global_parameters(self): b,c=nx.intersection_array(nx.cycle_graph(5)) g=nx.global_parameters(b,c) assert_equal(list(g),[(0, 0, 2), (1, 0, 1), (1, 1, 0)]) b,c=nx.intersection_array(nx.cycle_graph(3)) g=nx.global_parameters(b,c) assert_equal(list(g),[(0, 0, 2), (1, 1, 0)]) def test_intersection_array(self): b,c=nx.intersection_array(nx.cycle_graph(5)) assert_equal(b,[2, 1]) assert_equal(c,[1, 1]) b,c=nx.intersection_array(nx.dodecahedral_graph()) assert_equal(b,[3, 2, 1, 1, 1]) assert_equal(c,[1, 1, 1, 2, 3]) b,c=nx.intersection_array(nx.icosahedral_graph()) assert_equal(b,[5, 2, 1]) assert_equal(c,[1, 2, 5]) networkx-1.11/networkx/algorithms/tests/test_block.py0000644000175000017500000000716012637544450023110 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx class TestBlock: def test_path(self): G=networkx.path_graph(6) partition=[[0,1],[2,3],[4,5]] M=networkx.blockmodel(G,partition) assert_equal(sorted(M.nodes()),[0,1,2]) assert_equal(sorted(M.edges()),[(0,1),(1,2)]) for n in M.nodes(): assert_equal(M.node[n]['nedges'],1) assert_equal(M.node[n]['nnodes'],2) assert_equal(M.node[n]['density'],1.0) def test_multigraph_path(self): G=networkx.MultiGraph(networkx.path_graph(6)) partition=[[0,1],[2,3],[4,5]] M=networkx.blockmodel(G,partition,multigraph=True) assert_equal(sorted(M.nodes()),[0,1,2]) assert_equal(sorted(M.edges()),[(0,1),(1,2)]) for n in M.nodes(): assert_equal(M.node[n]['nedges'],1) assert_equal(M.node[n]['nnodes'],2) assert_equal(M.node[n]['density'],1.0) def test_directed_path(self): G = networkx.DiGraph() G.add_path(list(range(6))) partition=[[0,1],[2,3],[4,5]] M=networkx.blockmodel(G,partition) assert_equal(sorted(M.nodes()),[0,1,2]) assert_equal(sorted(M.edges()),[(0,1),(1,2)]) for n in M.nodes(): assert_equal(M.node[n]['nedges'],1) assert_equal(M.node[n]['nnodes'],2) assert_equal(M.node[n]['density'],0.5) def 
test_directed_multigraph_path(self): G = networkx.MultiDiGraph() G.add_path(list(range(6))) partition=[[0,1],[2,3],[4,5]] M=networkx.blockmodel(G,partition,multigraph=True) assert_equal(sorted(M.nodes()),[0,1,2]) assert_equal(sorted(M.edges()),[(0,1),(1,2)]) for n in M.nodes(): assert_equal(M.node[n]['nedges'],1) assert_equal(M.node[n]['nnodes'],2) assert_equal(M.node[n]['density'],0.5) @raises(networkx.NetworkXException) def test_overlapping(self): G=networkx.path_graph(6) partition=[[0,1,2],[2,3],[4,5]] M=networkx.blockmodel(G,partition) def test_weighted_path(self): G=networkx.path_graph(6) G[0][1]['weight']=1 G[1][2]['weight']=2 G[2][3]['weight']=3 G[3][4]['weight']=4 G[4][5]['weight']=5 partition=[[0,1],[2,3],[4,5]] M=networkx.blockmodel(G,partition) assert_equal(sorted(M.nodes()),[0,1,2]) assert_equal(sorted(M.edges()),[(0,1),(1,2)]) assert_equal(M[0][1]['weight'],2) assert_equal(M[1][2]['weight'],4) for n in M.nodes(): assert_equal(M.node[n]['nedges'],1) assert_equal(M.node[n]['nnodes'],2) assert_equal(M.node[n]['density'],1.0) def test_barbell(self): G=networkx.barbell_graph(3,0) partition=[[0,1,2],[3,4,5]] M=networkx.blockmodel(G,partition) assert_equal(sorted(M.nodes()),[0,1]) assert_equal(sorted(M.edges()),[(0,1)]) for n in M.nodes(): assert_equal(M.node[n]['nedges'],3) assert_equal(M.node[n]['nnodes'],3) assert_equal(M.node[n]['density'],1.0) def test_barbell_plus(self): G=networkx.barbell_graph(3,0) G.add_edge(0,5) # add extra edge between bells partition=[[0,1,2],[3,4,5]] M=networkx.blockmodel(G,partition) assert_equal(sorted(M.nodes()),[0,1]) assert_equal(sorted(M.edges()),[(0,1)]) assert_equal(M[0][1]['weight'],2) for n in M.nodes(): assert_equal(M.node[n]['nedges'],3) assert_equal(M.node[n]['nnodes'],3) assert_equal(M.node[n]['density'],1.0) networkx-1.11/networkx/algorithms/tests/test_cycles.py0000644000175000017500000001715612637544450023306 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx import 
networkx as nx from networkx.algorithms import find_cycle FORWARD = nx.algorithms.edgedfs.FORWARD REVERSE = nx.algorithms.edgedfs.REVERSE class TestCycles: def setUp(self): G=networkx.Graph() G.add_cycle([0,1,2,3]) G.add_cycle([0,3,4,5]) G.add_cycle([0,1,6,7,8]) G.add_edge(8,9) self.G=G def is_cyclic_permutation(self,a,b): n=len(a) if len(b)!=n: return False l=a+a return any(l[i:i+n]==b for i in range(2*n-n+1)) def test_cycle_basis(self): G=self.G cy=networkx.cycle_basis(G,0) sort_cy= sorted( sorted(c) for c in cy ) assert_equal(sort_cy, [[0,1,2,3],[0,1,6,7,8],[0,3,4,5]]) cy=networkx.cycle_basis(G,1) sort_cy= sorted( sorted(c) for c in cy ) assert_equal(sort_cy, [[0,1,2,3],[0,1,6,7,8],[0,3,4,5]]) cy=networkx.cycle_basis(G,9) sort_cy= sorted( sorted(c) for c in cy ) assert_equal(sort_cy, [[0,1,2,3],[0,1,6,7,8],[0,3,4,5]]) # test disconnected graphs G.add_cycle(list("ABC")) cy=networkx.cycle_basis(G,9) sort_cy= sorted(sorted(c) for c in cy[:-1]) + [sorted(cy[-1])] assert_equal(sort_cy, [[0,1,2,3],[0,1,6,7,8],[0,3,4,5],['A','B','C']]) @raises(nx.NetworkXNotImplemented) def test_cycle_basis_digraph(self): G=nx.DiGraph() cy=networkx.cycle_basis(G,0) @raises(nx.NetworkXNotImplemented) def test_cycle_basis_multigraph(self): G=nx.MultiGraph() cy=networkx.cycle_basis(G,0) def test_simple_cycles(self): G = nx.DiGraph([(0, 0), (0, 1), (0, 2), (1, 2), (2, 0), (2, 1), (2, 2)]) cc=sorted(nx.simple_cycles(G)) ca=[[0], [0, 1, 2], [0, 2], [1, 2], [2]] for c in cc: assert_true(any(self.is_cyclic_permutation(c,rc) for rc in ca)) @raises(nx.NetworkXNotImplemented) def test_simple_cycles_graph(self): G = nx.Graph() c = sorted(nx.simple_cycles(G)) def test_unsortable(self): # TODO What does this test do?
das 6/2013 G=nx.DiGraph() G.add_cycle(['a',1]) c=list(nx.simple_cycles(G)) def test_simple_cycles_small(self): G = nx.DiGraph() G.add_cycle([1,2,3]) c=sorted(nx.simple_cycles(G)) assert_equal(len(c),1) assert_true(self.is_cyclic_permutation(c[0],[1,2,3])) G.add_cycle([10,20,30]) cc=sorted(nx.simple_cycles(G)) ca=[[1,2,3],[10,20,30]] for c in cc: assert_true(any(self.is_cyclic_permutation(c,rc) for rc in ca)) def test_simple_cycles_empty(self): G = nx.DiGraph() assert_equal(list(nx.simple_cycles(G)),[]) def test_complete_directed_graph(self): # see table 2 in Johnson's paper ncircuits=[1,5,20,84,409,2365,16064] for n,c in zip(range(2,9),ncircuits): G=nx.DiGraph(nx.complete_graph(n)) assert_equal(len(list(nx.simple_cycles(G))),c) def worst_case_graph(self,k): # see figure 1 in Johnson's paper # this graph has exactly 3k simple cycles G=nx.DiGraph() for n in range(2,k+2): G.add_edge(1,n) G.add_edge(n,k+2) G.add_edge(2*k+1,1) for n in range(k+2,2*k+2): G.add_edge(n,2*k+2) G.add_edge(n,n+1) G.add_edge(2*k+3,k+2) for n in range(2*k+3,3*k+3): G.add_edge(2*k+2,n) G.add_edge(n,3*k+3) G.add_edge(3*k+3,2*k+2) return G def test_worst_case_graph(self): # see figure 1 in Johnson's paper for k in range(3,10): G=self.worst_case_graph(k) l=len(list(nx.simple_cycles(G))) assert_equal(l,3*k) def test_recursive_simple_and_not(self): for k in range(2,10): G=self.worst_case_graph(k) cc=sorted(nx.simple_cycles(G)) rcc=sorted(nx.recursive_simple_cycles(G)) assert_equal(len(cc),len(rcc)) for c in cc: assert_true(any(self.is_cyclic_permutation(c,rc) for rc in rcc)) for rc in rcc: assert_true(any(self.is_cyclic_permutation(rc,c) for c in cc)) def test_simple_graph_with_reported_bug(self): G=nx.DiGraph() edges = [(0, 2), (0, 3), (1, 0), (1, 3), (2, 1), (2, 4), \ (3, 2), (3, 4), (4, 0), (4, 1), (4, 5), (5, 0), \ (5, 1), (5, 2), (5, 3)] G.add_edges_from(edges) cc=sorted(nx.simple_cycles(G)) assert_equal(len(cc),26) rcc=sorted(nx.recursive_simple_cycles(G)) assert_equal(len(cc),len(rcc)) for c
in cc: assert_true(any(self.is_cyclic_permutation(c,rc) for rc in rcc)) for rc in rcc: assert_true(any(self.is_cyclic_permutation(rc,c) for c in cc)) # These tests might fail with hash randomization since they depend on # edge_dfs. For more information, see the comments in: # networkx/algorithms/traversal/tests/test_edgedfs.py class TestFindCycle(object): def setUp(self): self.nodes = [0, 1, 2, 3] self.edges = [(-1, 0), (0, 1), (1, 0), (1, 0), (2, 1), (3, 1)] def test_graph(self): G = nx.Graph(self.edges) assert_raises(nx.exception.NetworkXNoCycle, find_cycle, G, self.nodes) def test_digraph(self): G = nx.DiGraph(self.edges) x = list(find_cycle(G, self.nodes)) x_= [(0, 1), (1, 0)] assert_equal(x, x_) def test_multigraph(self): G = nx.MultiGraph(self.edges) x = list(find_cycle(G, self.nodes)) x_ = [(0, 1, 0), (1, 0, 1)] # or (1, 0, 2) # Hash randomization...could be any edge. assert_equal(x[0], x_[0]) assert_equal(x[1][:2], x_[1][:2]) def test_multidigraph(self): G = nx.MultiDiGraph(self.edges) x = list(find_cycle(G, self.nodes)) x_ = [(0, 1, 0), (1, 0, 0)] # (1, 0, 1) assert_equal(x[0], x_[0]) assert_equal(x[1][:2], x_[1][:2]) def test_digraph_ignore(self): G = nx.DiGraph(self.edges) x = list(find_cycle(G, self.nodes, orientation='ignore')) x_ = [(0, 1, FORWARD), (1, 0, FORWARD)] assert_equal(x, x_) def test_multidigraph_ignore(self): G = nx.MultiDiGraph(self.edges) x = list(find_cycle(G, self.nodes, orientation='ignore')) x_ = [(0, 1, 0, FORWARD), (1, 0, 0, FORWARD)] # or (1, 0, 1, 1) assert_equal(x[0], x_[0]) assert_equal(x[1][:2], x_[1][:2]) assert_equal(x[1][3], x_[1][3]) def test_multidigraph_ignore2(self): # Loop traversed an edge while ignoring its orientation. G = nx.MultiDiGraph([(0,1), (1,2), (1,2)]) x = list(find_cycle(G, [0,1,2], orientation='ignore')) x_ = [(1,2,0,FORWARD), (1,2,1,REVERSE)] assert_equal(x, x_) def test_multidigraph_original(self): # Node 2 doesn't need to be searched again once it has been visited from 4.
# The goal here is to cover the case where 2 would be re-searched from 4 # when 4 is visited for the first time (so we must make sure that 4 # is not visited from 2, and hence we respect the edge orientation). G = nx.MultiDiGraph([(0,1), (1,2), (2,3), (4,2)]) assert_raises(nx.exception.NetworkXNoCycle, find_cycle, G, [0,1,2,3,4], orientation='original') def test_dag(self): G = nx.DiGraph([(0,1), (0,2), (1,2)]) assert_raises(nx.exception.NetworkXNoCycle, find_cycle, G, orientation='original') x = list(find_cycle(G, orientation='ignore')) assert_equal(x, [(0,1,FORWARD), (1,2,FORWARD), (0,2,REVERSE)]) networkx-1.11/networkx/algorithms/tests/test_cluster.py0000644000175000017500000001623112637544450023476 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestTriangles: def test_empty(self): G = nx.Graph() assert_equal(list(nx.triangles(G).values()),[]) def test_path(self): G = nx.path_graph(10) assert_equal(list(nx.triangles(G).values()), [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]) assert_equal(nx.triangles(G), {0: 0, 1: 0, 2: 0, 3: 0, 4: 0, 5: 0, 6: 0, 7: 0, 8: 0, 9: 0}) def test_cubical(self): G = nx.cubical_graph() assert_equal(list(nx.triangles(G).values()), [0, 0, 0, 0, 0, 0, 0, 0]) assert_equal(nx.triangles(G,1),0) assert_equal(list(nx.triangles(G,[1,2]).values()),[0, 0]) assert_equal(nx.triangles(G,1),0) assert_equal(nx.triangles(G,[1,2]),{1: 0, 2: 0}) def test_k5(self): G = nx.complete_graph(5) assert_equal(list(nx.triangles(G).values()),[6, 6, 6, 6, 6]) assert_equal(sum(nx.triangles(G).values())/3.0,10) assert_equal(nx.triangles(G,1),6) G.remove_edge(1,2) assert_equal(list(nx.triangles(G).values()),[5, 3, 3, 5, 5]) assert_equal(nx.triangles(G,1),3) class TestWeightedClustering: def test_clustering(self): G = nx.Graph() assert_equal(list(nx.clustering(G,weight='weight').values()),[]) assert_equal(nx.clustering(G),{}) def test_path(self): G = nx.path_graph(10) assert_equal(list(nx.clustering(G,weight='weight').values()),
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]) assert_equal(nx.clustering(G,weight='weight'), {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0, 5: 0.0, 6: 0.0, 7: 0.0, 8: 0.0, 9: 0.0}) def test_cubical(self): G = nx.cubical_graph() assert_equal(list(nx.clustering(G,weight='weight').values()), [0, 0, 0, 0, 0, 0, 0, 0]) assert_equal(nx.clustering(G,1),0) assert_equal(list(nx.clustering(G,[1,2],weight='weight').values()),[0, 0]) assert_equal(nx.clustering(G,1,weight='weight'),0) assert_equal(nx.clustering(G,[1,2],weight='weight'),{1: 0, 2: 0}) def test_k5(self): G = nx.complete_graph(5) assert_equal(list(nx.clustering(G,weight='weight').values()),[1, 1, 1, 1, 1]) assert_equal(nx.average_clustering(G,weight='weight'),1) G.remove_edge(1,2) assert_equal(list(nx.clustering(G,weight='weight').values()), [5./6., 1.0, 1.0, 5./6., 5./6.]) assert_equal(nx.clustering(G,[1,4],weight='weight'),{1: 1.0, 4: 0.83333333333333337}) def test_triangle_and_edge(self): G=nx.Graph() G.add_cycle([0,1,2]) G.add_edge(0,4,weight=2) assert_equal(nx.clustering(G)[0],1.0/3.0) assert_equal(nx.clustering(G,weight='weight')[0],1.0/6.0) class TestClustering: def test_clustering(self): G = nx.Graph() assert_equal(list(nx.clustering(G).values()),[]) assert_equal(nx.clustering(G),{}) def test_path(self): G = nx.path_graph(10) assert_equal(list(nx.clustering(G).values()), [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]) assert_equal(nx.clustering(G), {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0, 5: 0.0, 6: 0.0, 7: 0.0, 8: 0.0, 9: 0.0}) def test_cubical(self): G = nx.cubical_graph() assert_equal(list(nx.clustering(G).values()), [0, 0, 0, 0, 0, 0, 0, 0]) assert_equal(nx.clustering(G,1),0) assert_equal(list(nx.clustering(G,[1,2]).values()),[0, 0]) assert_equal(nx.clustering(G,1),0) assert_equal(nx.clustering(G,[1,2]),{1: 0, 2: 0}) def test_k5(self): G = nx.complete_graph(5) assert_equal(list(nx.clustering(G).values()),[1, 1, 1, 1, 1]) assert_equal(nx.average_clustering(G),1) G.remove_edge(1,2) 
assert_equal(list(nx.clustering(G).values()), [5./6., 1.0, 1.0, 5./6., 5./6.]) assert_equal(nx.clustering(G,[1,4]),{1: 1.0, 4: 0.83333333333333337}) class TestTransitivity: def test_transitivity(self): G = nx.Graph() assert_equal(nx.transitivity(G),0.0) def test_path(self): G = nx.path_graph(10) assert_equal(nx.transitivity(G),0.0) def test_cubical(self): G = nx.cubical_graph() assert_equal(nx.transitivity(G),0.0) def test_k5(self): G = nx.complete_graph(5) assert_equal(nx.transitivity(G),1.0) G.remove_edge(1,2) assert_equal(nx.transitivity(G),0.875) # def test_clustering_transitivity(self): # # check that weighted average of clustering is transitivity # G = nx.complete_graph(5) # G.remove_edge(1,2) # t1=nx.transitivity(G) # (cluster_d2,weights)=nx.clustering(G,weights=True) # trans=[] # for v in G.nodes(): # trans.append(cluster_d2[v]*weights[v]) # t2=sum(trans) # assert_almost_equal(abs(t1-t2),0) class TestSquareClustering: def test_clustering(self): G = nx.Graph() assert_equal(list(nx.square_clustering(G).values()),[]) assert_equal(nx.square_clustering(G),{}) def test_path(self): G = nx.path_graph(10) assert_equal(list(nx.square_clustering(G).values()), [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]) assert_equal(nx.square_clustering(G), {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0, 5: 0.0, 6: 0.0, 7: 0.0, 8: 0.0, 9: 0.0}) def test_cubical(self): G = nx.cubical_graph() assert_equal(list(nx.square_clustering(G).values()), [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]) assert_equal(list(nx.square_clustering(G,[1,2]).values()),[0.5, 0.5]) assert_equal(nx.square_clustering(G,[1])[1],0.5) assert_equal(nx.square_clustering(G,[1,2]),{1: 0.5, 2: 0.5}) def test_k5(self): G = nx.complete_graph(5) assert_equal(list(nx.square_clustering(G).values()),[1, 1, 1, 1, 1]) def test_bipartite_k5(self): G = nx.complete_bipartite_graph(5,5) assert_equal(list(nx.square_clustering(G).values()), [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]) def test_lind_square_clustering(self): """Test C4 for figure 1 
Lind et al (2005)""" G = nx.Graph([(1,2),(1,3),(1,6),(1,7),(2,4),(2,5), (3,4),(3,5),(6,7),(7,8),(6,8),(7,9), (7,10),(6,11),(6,12),(2,13),(2,14),(3,15),(3,16)]) G1 = G.subgraph([1,2,3,4,5,13,14,15,16]) G2 = G.subgraph([1,6,7,8,9,10,11,12]) assert_equal(nx.square_clustering(G, [1])[1], 3/75.0) assert_equal(nx.square_clustering(G1, [1])[1], 2/6.0) assert_equal(nx.square_clustering(G2, [1])[1], 1/5.0) def test_average_clustering(): G=nx.cycle_graph(3) G.add_edge(2,3) assert_equal(nx.average_clustering(G),(1+1+1/3.0)/4.0) assert_equal(nx.average_clustering(G,count_zeros=True),(1+1+1/3.0)/4.0) assert_equal(nx.average_clustering(G,count_zeros=False),(1+1+1/3.0)/3.0) networkx-1.11/networkx/algorithms/tests/test_link_prediction.py0000644000175000017500000004467512637544450025207 0ustar aricaric00000000000000import math from functools import partial from nose.tools import * import networkx as nx def _test_func(G, ebunch, expected, predict_func, **kwargs): result = predict_func(G, ebunch, **kwargs) exp_dict = dict((tuple(sorted([u, v])), score) for u, v, score in expected) res_dict = dict((tuple(sorted([u, v])), score) for u, v, score in result) assert_equal(len(exp_dict), len(res_dict)) for p in exp_dict: assert_almost_equal(exp_dict[p], res_dict[p]) class TestResourceAllocationIndex(): def setUp(self): self.func = nx.resource_allocation_index self.test = partial(_test_func, predict_func=self.func) def test_K5(self): G = nx.complete_graph(5) self.test(G, [(0, 1)], [(0, 1, 0.75)]) def test_P3(self): G = nx.path_graph(3) self.test(G, [(0, 2)], [(0, 2, 0.5)]) def test_S4(self): G = nx.star_graph(4) self.test(G, [(1, 2)], [(1, 2, 0.25)]) @raises(nx.NetworkXNotImplemented) def test_digraph(self): G = nx.DiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multigraph(self): G = nx.MultiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multidigraph(self): G = 
nx.MultiDiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) def test_no_common_neighbor(self): G = nx.Graph() G.add_nodes_from([0, 1]) self.test(G, [(0, 1)], [(0, 1, 0)]) def test_equal_nodes(self): G = nx.complete_graph(4) self.test(G, [(0, 0)], [(0, 0, 1)]) def test_all_nonexistent_edges(self): G = nx.Graph() G.add_edges_from([(0, 1), (0, 2), (2, 3)]) self.test(G, None, [(0, 3, 0.5), (1, 2, 0.5), (1, 3, 0)]) class TestJaccardCoefficient(): def setUp(self): self.func = nx.jaccard_coefficient self.test = partial(_test_func, predict_func=self.func) def test_K5(self): G = nx.complete_graph(5) self.test(G, [(0, 1)], [(0, 1, 0.6)]) def test_P4(self): G = nx.path_graph(4) self.test(G, [(0, 2)], [(0, 2, 0.5)]) @raises(nx.NetworkXNotImplemented) def test_digraph(self): G = nx.DiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multigraph(self): G = nx.MultiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multidigraph(self): G = nx.MultiDiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) def test_no_common_neighbor(self): G = nx.Graph() G.add_edges_from([(0, 1), (2, 3)]) self.test(G, [(0, 2)], [(0, 2, 0)]) def test_isolated_nodes(self): G = nx.Graph() G.add_nodes_from([0, 1]) self.test(G, [(0, 1)], [(0, 1, 0)]) def test_all_nonexistent_edges(self): G = nx.Graph() G.add_edges_from([(0, 1), (0, 2), (2, 3)]) self.test(G, None, [(0, 3, 0.5), (1, 2, 0.5), (1, 3, 0)]) class TestAdamicAdarIndex(): def setUp(self): self.func = nx.adamic_adar_index self.test = partial(_test_func, predict_func=self.func) def test_K5(self): G = nx.complete_graph(5) self.test(G, [(0, 1)], [(0, 1, 3 / math.log(4))]) def test_P3(self): G = nx.path_graph(3) self.test(G, [(0, 2)], [(0, 2, 1 / math.log(2))]) def test_S4(self): G = nx.star_graph(4) self.test(G, [(1, 2)], [(1, 2, 1 / math.log(4))]) @raises(nx.NetworkXNotImplemented) def test_digraph(self): G 
= nx.DiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multigraph(self): G = nx.MultiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multidigraph(self): G = nx.MultiDiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) def test_no_common_neighbor(self): G = nx.Graph() G.add_nodes_from([0, 1]) self.test(G, [(0, 1)], [(0, 1, 0)]) def test_equal_nodes(self): G = nx.complete_graph(4) self.test(G, [(0, 0)], [(0, 0, 3 / math.log(3))]) def test_all_nonexistent_edges(self): G = nx.Graph() G.add_edges_from([(0, 1), (0, 2), (2, 3)]) self.test(G, None, [(0, 3, 1 / math.log(2)), (1, 2, 1 / math.log(2)), (1, 3, 0)]) class TestPreferentialAttachment(): def setUp(self): self.func = nx.preferential_attachment self.test = partial(_test_func, predict_func=self.func) def test_K5(self): G = nx.complete_graph(5) self.test(G, [(0, 1)], [(0, 1, 16)]) def test_P3(self): G = nx.path_graph(3) self.test(G, [(0, 1)], [(0, 1, 2)]) def test_S4(self): G = nx.star_graph(4) self.test(G, [(0, 2)], [(0, 2, 4)]) @raises(nx.NetworkXNotImplemented) def test_digraph(self): G = nx.DiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multigraph(self): G = nx.MultiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multidigraph(self): G = nx.MultiDiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, [(0, 2)]) def test_zero_degrees(self): G = nx.Graph() G.add_nodes_from([0, 1]) self.test(G, [(0, 1)], [(0, 1, 0)]) def test_all_nonexistent_edges(self): G = nx.Graph() G.add_edges_from([(0, 1), (0, 2), (2, 3)]) self.test(G, None, [(0, 3, 2), (1, 2, 2), (1, 3, 1)]) class TestCNSoundarajanHopcroft(): def setUp(self): self.func = nx.cn_soundarajan_hopcroft self.test = partial(_test_func, predict_func=self.func, community='community') def test_K5(self): G = 
nx.complete_graph(5) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[2]['community'] = 0 G.node[3]['community'] = 0 G.node[4]['community'] = 1 self.test(G, [(0, 1)], [(0, 1, 5)]) def test_P3(self): G = nx.path_graph(3) G.node[0]['community'] = 0 G.node[1]['community'] = 1 G.node[2]['community'] = 0 self.test(G, [(0, 2)], [(0, 2, 1)]) def test_S4(self): G = nx.star_graph(4) G.node[0]['community'] = 1 G.node[1]['community'] = 1 G.node[2]['community'] = 1 G.node[3]['community'] = 0 G.node[4]['community'] = 0 self.test(G, [(1, 2)], [(1, 2, 2)]) @raises(nx.NetworkXNotImplemented) def test_digraph(self): G = nx.DiGraph() G.add_edges_from([(0, 1), (1, 2)]) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[2]['community'] = 0 self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multigraph(self): G = nx.MultiGraph() G.add_edges_from([(0, 1), (1, 2)]) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[2]['community'] = 0 self.func(G, [(0, 2)]) @raises(nx.NetworkXNotImplemented) def test_multidigraph(self): G = nx.MultiDiGraph() G.add_edges_from([(0, 1), (1, 2)]) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[2]['community'] = 0 self.func(G, [(0, 2)]) def test_no_common_neighbor(self): G = nx.Graph() G.add_nodes_from([0, 1]) G.node[0]['community'] = 0 G.node[1]['community'] = 0 self.test(G, [(0, 1)], [(0, 1, 0)]) def test_equal_nodes(self): G = nx.complete_graph(3) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[2]['community'] = 0 self.test(G, [(0, 0)], [(0, 0, 4)]) def test_different_community(self): G = nx.Graph() G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)]) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[2]['community'] = 0 G.node[3]['community'] = 1 self.test(G, [(0, 3)], [(0, 3, 2)]) @raises(nx.NetworkXAlgorithmError) def test_no_community_information(self): G = nx.complete_graph(5) list(self.func(G, [(0, 1)])) @raises(nx.NetworkXAlgorithmError) def 
test_insufficient_community_information(self): G = nx.Graph() G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)]) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[3]['community'] = 0 list(self.func(G, [(0, 3)])) def test_sufficient_community_information(self): G = nx.Graph() G.add_edges_from([(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 5)]) G.node[1]['community'] = 0 G.node[2]['community'] = 0 G.node[3]['community'] = 0 G.node[4]['community'] = 0 self.test(G, [(1, 4)], [(1, 4, 4)]) def test_custom_community_attribute_name(self): G = nx.Graph() G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)]) G.node[0]['cmty'] = 0 G.node[1]['cmty'] = 0 G.node[2]['cmty'] = 0 G.node[3]['cmty'] = 1 self.test(G, [(0, 3)], [(0, 3, 2)], community='cmty') def test_all_nonexistent_edges(self): G = nx.Graph() G.add_edges_from([(0, 1), (0, 2), (2, 3)]) G.node[0]['community'] = 0 G.node[1]['community'] = 1 G.node[2]['community'] = 0 G.node[3]['community'] = 0 self.test(G, None, [(0, 3, 2), (1, 2, 1), (1, 3, 0)]) class TestRAIndexSoundarajanHopcroft(): def setUp(self): self.func = nx.ra_index_soundarajan_hopcroft self.test = partial(_test_func, predict_func=self.func, community='community') def test_K5(self): G = nx.complete_graph(5) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[2]['community'] = 0 G.node[3]['community'] = 0 G.node[4]['community'] = 1 self.test(G, [(0, 1)], [(0, 1, 0.5)]) def test_P3(self): G = nx.path_graph(3) G.node[0]['community'] = 0 G.node[1]['community'] = 1 G.node[2]['community'] = 0 self.test(G, [(0, 2)], [(0, 2, 0)]) def test_S4(self): G = nx.star_graph(4) G.node[0]['community'] = 1 G.node[1]['community'] = 1 G.node[2]['community'] = 1 G.node[3]['community'] = 0 G.node[4]['community'] = 0 self.test(G, [(1, 2)], [(1, 2, 0.25)]) @raises(nx.NetworkXNotImplemented) def test_digraph(self): G = nx.DiGraph() G.add_edges_from([(0, 1), (1, 2)]) G.node[0]['community'] = 0 G.node[1]['community'] = 0 G.node[2]['community'] = 0 self.func(G, [(0, 2)]) 
    @raises(nx.NetworkXNotImplemented)
    def test_multigraph(self):
        G = nx.MultiGraph()
        G.add_edges_from([(0, 1), (1, 2)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        self.func(G, [(0, 2)])

    @raises(nx.NetworkXNotImplemented)
    def test_multidigraph(self):
        G = nx.MultiDiGraph()
        G.add_edges_from([(0, 1), (1, 2)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        self.func(G, [(0, 2)])

    def test_no_common_neighbor(self):
        G = nx.Graph()
        G.add_nodes_from([0, 1])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        self.test(G, [(0, 1)], [(0, 1, 0)])

    def test_equal_nodes(self):
        G = nx.complete_graph(3)
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        self.test(G, [(0, 0)], [(0, 0, 1)])

    def test_different_community(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        G.node[3]['community'] = 1
        self.test(G, [(0, 3)], [(0, 3, 0)])

    @raises(nx.NetworkXAlgorithmError)
    def test_no_community_information(self):
        G = nx.complete_graph(5)
        list(self.func(G, [(0, 1)]))

    @raises(nx.NetworkXAlgorithmError)
    def test_insufficient_community_information(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[3]['community'] = 0
        list(self.func(G, [(0, 3)]))

    def test_sufficient_community_information(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 5)])
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        G.node[3]['community'] = 0
        G.node[4]['community'] = 0
        self.test(G, [(1, 4)], [(1, 4, 1)])

    def test_custom_community_attribute_name(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)])
        G.node[0]['cmty'] = 0
        G.node[1]['cmty'] = 0
        G.node[2]['cmty'] = 0
        G.node[3]['cmty'] = 1
        self.test(G, [(0, 3)], [(0, 3, 0)], community='cmty')

    def test_all_nonexistent_edges(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (0, 2), (2, 3)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 1
        G.node[2]['community'] = 0
        G.node[3]['community'] = 0
        self.test(G, None, [(0, 3, 0.5), (1, 2, 0), (1, 3, 0)])


class TestWithinInterCluster():
    def setUp(self):
        self.delta = 0.001
        self.func = nx.within_inter_cluster
        self.test = partial(_test_func, predict_func=self.func,
                            delta=self.delta, community='community')

    def test_K5(self):
        G = nx.complete_graph(5)
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        G.node[3]['community'] = 0
        G.node[4]['community'] = 1
        self.test(G, [(0, 1)], [(0, 1, 2 / (1 + self.delta))])

    def test_P3(self):
        G = nx.path_graph(3)
        G.node[0]['community'] = 0
        G.node[1]['community'] = 1
        G.node[2]['community'] = 0
        self.test(G, [(0, 2)], [(0, 2, 0)])

    def test_S4(self):
        G = nx.star_graph(4)
        G.node[0]['community'] = 1
        G.node[1]['community'] = 1
        G.node[2]['community'] = 1
        G.node[3]['community'] = 0
        G.node[4]['community'] = 0
        self.test(G, [(1, 2)], [(1, 2, 1 / self.delta)])

    @raises(nx.NetworkXNotImplemented)
    def test_digraph(self):
        G = nx.DiGraph()
        G.add_edges_from([(0, 1), (1, 2)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        self.func(G, [(0, 2)])

    @raises(nx.NetworkXNotImplemented)
    def test_multigraph(self):
        G = nx.MultiGraph()
        G.add_edges_from([(0, 1), (1, 2)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        self.func(G, [(0, 2)])

    @raises(nx.NetworkXNotImplemented)
    def test_multidigraph(self):
        G = nx.MultiDiGraph()
        G.add_edges_from([(0, 1), (1, 2)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        self.func(G, [(0, 2)])

    def test_no_common_neighbor(self):
        G = nx.Graph()
        G.add_nodes_from([0, 1])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        self.test(G, [(0, 1)], [(0, 1, 0)])

    def test_equal_nodes(self):
        G = nx.complete_graph(3)
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        self.test(G, [(0, 0)], [(0, 0, 2 / self.delta)])

    def test_different_community(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        G.node[3]['community'] = 1
        self.test(G, [(0, 3)], [(0, 3, 0)])

    def test_no_inter_cluster_common_neighbor(self):
        G = nx.complete_graph(4)
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        G.node[3]['community'] = 0
        self.test(G, [(0, 3)], [(0, 3, 2 / self.delta)])

    @raises(nx.NetworkXAlgorithmError)
    def test_no_community_information(self):
        G = nx.complete_graph(5)
        list(self.func(G, [(0, 1)]))

    @raises(nx.NetworkXAlgorithmError)
    def test_insufficient_community_information(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[3]['community'] = 0
        list(self.func(G, [(0, 3)]))

    def test_sufficient_community_information(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 5)])
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        G.node[3]['community'] = 0
        G.node[4]['community'] = 0
        self.test(G, [(1, 4)], [(1, 4, 2 / self.delta)])

    @raises(nx.NetworkXAlgorithmError)
    def test_zero_delta(self):
        G = nx.complete_graph(3)
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        list(self.func(G, [(0, 1)], 0))

    @raises(nx.NetworkXAlgorithmError)
    def test_negative_delta(self):
        G = nx.complete_graph(3)
        G.node[0]['community'] = 0
        G.node[1]['community'] = 0
        G.node[2]['community'] = 0
        list(self.func(G, [(0, 1)], -0.5))

    def test_custom_community_attribute_name(self):
        G = nx.complete_graph(4)
        G.node[0]['cmty'] = 0
        G.node[1]['cmty'] = 0
        G.node[2]['cmty'] = 0
        G.node[3]['cmty'] = 0
        self.test(G, [(0, 3)], [(0, 3, 2 / self.delta)], community='cmty')

    def test_all_nonexistent_edges(self):
        G = nx.Graph()
        G.add_edges_from([(0, 1), (0, 2), (2, 3)])
        G.node[0]['community'] = 0
        G.node[1]['community'] = 1
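The within-inter-cluster ratio exercised by `TestWithinInterCluster` splits the common neighbors of the two endpoints into "within" neighbors (same community as both endpoints) and "inter" neighbors, then takes their ratio with a small `delta` guarding against division by zero. A hedged, NetworkX-independent sketch; `within_inter_cluster`, `adj`, and `community` are illustrative names:

```python
def within_inter_cluster(adj, community, u, v, delta=0.001):
    """adj: node -> set of neighbors; community: node -> community label."""
    common = adj[u] & adj[v]
    # "Within" common neighbors share a community with both endpoints.
    within = {w for w in common
              if community[w] == community[u] == community[v]}
    inter = common - within
    return len(within) / (len(inter) + delta)

# K5 with node 4 in a different community, mirroring test_K5:
adj = {n: {m for m in range(5) if m != n} for n in range(5)}
community = {0: 0, 1: 0, 2: 0, 3: 0, 4: 1}
score = within_inter_cluster(adj, community, 0, 1)  # 2 / (1 + 0.001)
```

Nodes 2 and 3 are within-cluster common neighbors and node 4 is the lone inter-cluster one, giving the `2 / (1 + delta)` value expected in `test_K5`.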
        G.node[2]['community'] = 0
        G.node[3]['community'] = 0
        self.test(G, None, [(0, 3, 1 / self.delta), (1, 2, 0), (1, 3, 0)])

networkx-1.11/networkx/algorithms/tests/test_dominating.py

from nose.tools import assert_equal, assert_true, assert_false, raises
import networkx as nx


def test_dominating_set():
    G = nx.gnp_random_graph(100, 0.1)
    D = nx.dominating_set(G)
    assert_true(nx.is_dominating_set(G, D))
    D = nx.dominating_set(G, start_with=0)
    assert_true(nx.is_dominating_set(G, D))


def test_complete():
    """In complete graphs each single node forms a dominating set, so the
    dominating set has cardinality 1.
    """
    K4 = nx.complete_graph(4)
    assert_equal(len(nx.dominating_set(K4)), 1)
    K5 = nx.complete_graph(5)
    assert_equal(len(nx.dominating_set(K5)), 1)


@raises(nx.NetworkXError)
def test_dominating_set_error():
    G = nx.path_graph(4)
    D = nx.dominating_set(G, start_with=10)


def test_is_dominating_set():
    G = nx.path_graph(4)
    d = set([1, 3])
    assert_true(nx.is_dominating_set(G, d))
    d = set([0, 2])
    assert_true(nx.is_dominating_set(G, d))
    d = set([1])
    assert_false(nx.is_dominating_set(G, d))


def test_wikipedia_is_dominating_set():
    """Example from http://en.wikipedia.org/wiki/Dominating_set"""
    G = nx.cycle_graph(4)
    G.add_edges_from([(0, 4), (1, 4), (2, 5)])
    assert_true(nx.is_dominating_set(G, set([4, 3, 5])))
    assert_true(nx.is_dominating_set(G, set([0, 2])))
    assert_true(nx.is_dominating_set(G, set([1, 2])))

networkx-1.11/networkx/algorithms/tests/test_graphical.py

#!/usr/bin/env python
from nose.tools import *
from nose import SkipTest
import networkx as nx


def test_valid_degree_sequence1():
    n = 100
    p = .3
    for i in range(10):
        G = nx.erdos_renyi_graph(n, p)
        deg = list(G.degree().values())
        assert_true(nx.is_valid_degree_sequence(deg, method='eg'))
        assert_true(nx.is_valid_degree_sequence(deg, method='hh'))


def test_valid_degree_sequence2():
    n = 100
    for i in range(10):
        G = nx.barabasi_albert_graph(n, 1)
        deg = list(G.degree().values())
        assert_true(nx.is_valid_degree_sequence(deg, method='eg'))
        assert_true(nx.is_valid_degree_sequence(deg, method='hh'))


@raises(nx.NetworkXException)
def test_string_input():
    a = nx.is_valid_degree_sequence([], 'foo')


def test_negative_input():
    assert_false(nx.is_valid_degree_sequence([-1], 'hh'))
    assert_false(nx.is_valid_degree_sequence([-1], 'eg'))
    assert_false(nx.is_valid_degree_sequence([72.5], 'eg'))


class TestAtlas(object):
    @classmethod
    def setupClass(cls):
        global atlas
        import platform
        if platform.python_implementation() == 'Jython':
            raise SkipTest('graph atlas not available under Jython.')
        import networkx.generators.atlas as atlas

    def setUp(self):
        self.GAG = atlas.graph_atlas_g()

    def test_atlas(self):
        for graph in self.GAG:
            deg = list(graph.degree().values())
            assert_true(nx.is_valid_degree_sequence(deg, method='eg'))
            assert_true(nx.is_valid_degree_sequence(deg, method='hh'))


def test_small_graph_true():
    z = [5, 3, 3, 3, 3, 2, 2, 2, 1, 1, 1]
    assert_true(nx.is_valid_degree_sequence(z, method='hh'))
    assert_true(nx.is_valid_degree_sequence(z, method='eg'))
    z = [10, 3, 3, 3, 3, 2, 2, 2, 2, 2, 2]
    assert_true(nx.is_valid_degree_sequence(z, method='hh'))
    assert_true(nx.is_valid_degree_sequence(z, method='eg'))
    z = [1, 1, 1, 1, 1, 2, 2, 2, 3, 4]
    assert_true(nx.is_valid_degree_sequence(z, method='hh'))
    assert_true(nx.is_valid_degree_sequence(z, method='eg'))


def test_small_graph_false():
    z = [1000, 3, 3, 3, 3, 2, 2, 2, 1, 1, 1]
    assert_false(nx.is_valid_degree_sequence(z, method='hh'))
    assert_false(nx.is_valid_degree_sequence(z, method='eg'))
    z = [6, 5, 4, 4, 2, 1, 1, 1]
    assert_false(nx.is_valid_degree_sequence(z, method='hh'))
    assert_false(nx.is_valid_degree_sequence(z, method='eg'))
    z = [1, 1, 1, 1, 1, 1, 2, 2, 2, 3, 4]
    assert_false(nx.is_valid_degree_sequence(z, method='hh'))
    assert_false(nx.is_valid_degree_sequence(z, method='eg'))


def test_directed_degree_sequence():
    # Test a range of valid directed degree sequences
    n, r = 100, 10
    p = 1.0 / r
    for i in range(r):
        G = nx.erdos_renyi_graph(n, p * (i + 1), None, True)
        din = list(G.in_degree().values())
        dout = list(G.out_degree().values())
        assert_true(nx.is_digraphical(din, dout))


def test_small_directed_sequences():
    dout = [5, 3, 3, 3, 3, 2, 2, 2, 1, 1, 1]
    din = [3, 3, 3, 3, 3, 2, 2, 2, 2, 2, 1]
    assert_true(nx.is_digraphical(din, dout))
    # Test nongraphical directed sequence
    dout = [1000, 3, 3, 3, 3, 2, 2, 2, 1, 1, 1]
    din = [103, 102, 102, 102, 102, 102, 102, 102, 102, 102]
    assert_false(nx.is_digraphical(din, dout))
    # Test digraphical small sequence
    dout = [1, 1, 1, 1, 1, 2, 2, 2, 3, 4]
    din = [2, 2, 2, 2, 2, 2, 2, 2, 1, 1]
    assert_true(nx.is_digraphical(din, dout))
    # Test nonmatching sum
    din = [2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1]
    assert_false(nx.is_digraphical(din, dout))
    # Test for negative integer in sequence
    din = [2, 2, 2, -2, 2, 2, 2, 2, 1, 1, 4]
    assert_false(nx.is_digraphical(din, dout))


def test_multi_sequence():
    # Test nongraphical multi sequence
    seq = [1000, 3, 3, 3, 3, 2, 2, 2, 1, 1]
    assert_false(nx.is_multigraphical(seq))
    # Test small graphical multi sequence
    seq = [6, 5, 4, 4, 2, 1, 1, 1]
    assert_true(nx.is_multigraphical(seq))
    # Test for negative integer in sequence
    seq = [6, 5, 4, -4, 2, 1, 1, 1]
    assert_false(nx.is_multigraphical(seq))
    # Test for sequence with odd sum
    seq = [1, 1, 1, 1, 1, 1, 2, 2, 2, 3, 4]
    assert_false(nx.is_multigraphical(seq))


def test_pseudo_sequence():
    # Test small valid pseudo sequence
    seq = [1000, 3, 3, 3, 3, 2, 2, 2, 1, 1]
    assert_true(nx.is_pseudographical(seq))
    # Test for sequence with odd sum
    seq = [1000, 3, 3, 3, 3, 2, 2, 2, 1, 1, 1]
    assert_false(nx.is_pseudographical(seq))
    # Test for negative integer in sequence
    seq = [1000, 3, 3, 3, 3, 2, 2, -2, 1, 1]
    assert_false(nx.is_pseudographical(seq))

networkx-1.11/networkx/algorithms/tests/test_simple_paths.py

#!/usr/bin/env python
import random

from nose.tools import *

import networkx as nx
from networkx import convert_node_labels_to_integers as cnlti
from networkx.algorithms.simple_paths import _bidirectional_shortest_path
from networkx.algorithms.simple_paths import _bidirectional_dijkstra


# Tests for all_simple_paths
def test_all_simple_paths():
    G = nx.path_graph(4)
    paths = nx.all_simple_paths(G,0,3)
    assert_equal(list(list(p) for p in paths),[[0,1,2,3]])


def test_all_simple_paths_cutoff():
    G = nx.complete_graph(4)
    paths = nx.all_simple_paths(G,0,1,cutoff=1)
    assert_equal(list(list(p) for p in paths),[[0,1]])
    paths = nx.all_simple_paths(G,0,1,cutoff=2)
    assert_equal(list(list(p) for p in paths),[[0,1],[0,2,1],[0,3,1]])


def test_all_simple_paths_multigraph():
    G = nx.MultiGraph([(1,2),(1,2)])
    paths = nx.all_simple_paths(G,1,2)
    assert_equal(list(list(p) for p in paths),[[1,2],[1,2]])


def test_all_simple_paths_multigraph_with_cutoff():
    G = nx.MultiGraph([(1,2),(1,2),(1,10),(10,2)])
    paths = nx.all_simple_paths(G,1,2, cutoff=1)
    assert_equal(list(list(p) for p in paths),[[1,2],[1,2]])


def test_all_simple_paths_directed():
    G = nx.DiGraph()
    G.add_path([1,2,3])
    G.add_path([3,2,1])
    paths = nx.all_simple_paths(G,1,3)
    assert_equal(list(list(p) for p in paths),[[1,2,3]])


def test_all_simple_paths_empty():
    G = nx.path_graph(4)
    paths = nx.all_simple_paths(G,0,3,cutoff=2)
    assert_equal(list(list(p) for p in paths),[])


def hamiltonian_path(G,source):
    source = next(G.nodes_iter())
    neighbors = set(G[source])-set([source])
    n = len(G)
    for target in neighbors:
        for path in nx.all_simple_paths(G,source,target):
            if len(path) == n:
                yield path


def test_hamiltonian_path():
    from itertools import permutations
    G = nx.complete_graph(4)
    paths = [list(p) for p in hamiltonian_path(G,0)]
    exact = [[0]+list(p) for p in permutations([1,2,3],3)]
    assert_equal(sorted(paths),sorted(exact))


def test_cutoff_zero():
    G = nx.complete_graph(4)
    paths = nx.all_simple_paths(G,0,3,cutoff=0)
    assert_equal(list(list(p) for p in paths),[])
    paths = nx.all_simple_paths(nx.MultiGraph(G),0,3,cutoff=0)
    assert_equal(list(list(p) for p in paths),[])


@raises(nx.NetworkXError)
def test_source_missing():
    G = nx.Graph()
    G.add_path([1,2,3])
    paths = list(nx.all_simple_paths(nx.MultiGraph(G),0,3))


@raises(nx.NetworkXError)
def test_target_missing():
    G = nx.Graph()
    G.add_path([1,2,3])
    paths = list(nx.all_simple_paths(nx.MultiGraph(G),1,4))


# Tests for shortest_simple_paths
def test_shortest_simple_paths():
    G = cnlti(nx.grid_2d_graph(4, 4), first_label=1, ordering="sorted")
    paths = nx.shortest_simple_paths(G, 1, 12)
    assert_equal(next(paths), [1, 2, 3, 4, 8, 12])
    assert_equal(next(paths), [1, 5, 6, 7, 8, 12])
    assert_equal([len(path) for path in nx.shortest_simple_paths(G, 1, 12)],
                 sorted([len(path) for path in nx.all_simple_paths(G, 1, 12)]))


def test_shortest_simple_paths_directed():
    G = nx.cycle_graph(7, create_using=nx.DiGraph())
    paths = nx.shortest_simple_paths(G, 0, 3)
    assert_equal([path for path in paths], [[0, 1, 2, 3]])


def test_Greg_Bernstein():
    g1 = nx.Graph()
    g1.add_nodes_from(["N0", "N1", "N2", "N3", "N4"])
    g1.add_edge("N4", "N1", weight=10.0, capacity=50, name="L5")
    g1.add_edge("N4", "N0", weight=7.0, capacity=40, name="L4")
    g1.add_edge("N0", "N1", weight=10.0, capacity=45, name="L1")
    g1.add_edge("N3", "N0", weight=10.0, capacity=50, name="L0")
    g1.add_edge("N2", "N3", weight=12.0, capacity=30, name="L2")
    g1.add_edge("N1", "N2", weight=15.0, capacity=42, name="L3")
    solution = [['N1', 'N0', 'N3'], ['N1', 'N2', 'N3'],
                ['N1', 'N4', 'N0', 'N3']]
    result = list(nx.shortest_simple_paths(g1, 'N1', 'N3', weight='weight'))
    assert_equal(result, solution)


def test_weighted_shortest_simple_path():
    def cost_func(path):
        return sum(G.edge[u][v]['weight'] for (u, v) in zip(path, path[1:]))
    G = nx.complete_graph(5)
    weight = {(u, v): random.randint(1, 100) for (u, v) in G.edges()}
    nx.set_edge_attributes(G, 'weight', weight)
    cost = 0
    for path in nx.shortest_simple_paths(G, 0, 3, weight='weight'):
        this_cost = cost_func(path)
        assert_true(cost <= this_cost)
        cost = this_cost


def test_directed_weighted_shortest_simple_path():
    def cost_func(path):
        return sum(G.edge[u][v]['weight'] for (u, v) in zip(path, path[1:]))
    G = nx.complete_graph(5)
    G = G.to_directed()
    weight = {(u, v): random.randint(1, 100) for (u, v) in G.edges()}
    nx.set_edge_attributes(G, 'weight', weight)
    cost = 0
    for path in nx.shortest_simple_paths(G, 0, 3, weight='weight'):
        this_cost = cost_func(path)
        assert_true(cost <= this_cost)
        cost = this_cost


def test_weight_name():
    G = nx.cycle_graph(7)
    nx.set_edge_attributes(G, 'weight', 1)
    nx.set_edge_attributes(G, 'foo', 1)
    G.edge[1][2]['foo'] = 7
    paths = list(nx.shortest_simple_paths(G, 0, 3, weight='foo'))
    solution = [[0, 6, 5, 4, 3], [0, 1, 2, 3]]
    assert_equal(paths, solution)


@raises(nx.NetworkXError)
def test_ssp_source_missing():
    G = nx.Graph()
    G.add_path([1,2,3])
    paths = list(nx.shortest_simple_paths(G, 0, 3))


@raises(nx.NetworkXError)
def test_ssp_target_missing():
    G = nx.Graph()
    G.add_path([1,2,3])
    paths = list(nx.shortest_simple_paths(G, 1, 4))


@raises(nx.NetworkXNotImplemented)
def test_ssp_multigraph():
    G = nx.MultiGraph()
    G.add_path([1,2,3])
    paths = list(nx.shortest_simple_paths(G, 1, 4))


@raises(nx.NetworkXNoPath)
def test_ssp_no_path():
    G = nx.Graph()
    G.add_path([0, 1, 2])
    G.add_path([3, 4, 5])
    paths = list(nx.shortest_simple_paths(G, 0, 3))


def test_bidirectional_shortest_path_restricted():
    grid = cnlti(nx.grid_2d_graph(4,4), first_label=1, ordering="sorted")
    cycle = nx.cycle_graph(7)
    directed_cycle = nx.cycle_graph(7, create_using=nx.DiGraph())
    length, path = _bidirectional_shortest_path(cycle, 0, 3)
    assert_equal(path, [0, 1, 2, 3])
    length, path = _bidirectional_shortest_path(cycle, 0, 3,
                                                ignore_nodes=[1])
    assert_equal(path, [0, 6, 5, 4, 3])
    length, path = _bidirectional_shortest_path(grid, 1, 12)
    assert_equal(path, [1, 2, 3, 4, 8, 12])
    length, path = _bidirectional_shortest_path(grid, 1, 12,
                                                ignore_nodes=[2])
    assert_equal(path, [1, 5, 6, 10, 11, 12])
    length, path = _bidirectional_shortest_path(grid, 1, 12,
                                                ignore_nodes=[2, 6])
    assert_equal(path, [1, 5, 9, 10, 11, 12])
    length, path = _bidirectional_shortest_path(grid, 1, 12,
                                                ignore_nodes=[2, 6],
                                                ignore_edges=[(10, 11)])
    assert_equal(path, [1, 5, 9, 10, 14, 15, 16, 12])
    length, path = _bidirectional_shortest_path(directed_cycle, 0, 3)
    assert_equal(path, [0, 1, 2, 3])
    assert_raises(
        nx.NetworkXNoPath,
        _bidirectional_shortest_path,
        directed_cycle, 0, 3,
        ignore_nodes=[1],
    )
    length, path = _bidirectional_shortest_path(directed_cycle, 0, 3,
                                                ignore_edges=[(2, 1)])
    assert_equal(path, [0, 1, 2, 3])
    assert_raises(
        nx.NetworkXNoPath,
        _bidirectional_shortest_path,
        directed_cycle, 0, 3,
        ignore_edges=[(1, 2)],
    )


def validate_path(G, s, t, soln_len, path):
    assert_equal(path[0], s)
    assert_equal(path[-1], t)
    assert_equal(soln_len, sum(G[u][v].get('weight', 1)
                               for u, v in zip(path[:-1], path[1:])))


def validate_length_path(G, s, t, soln_len, length, path):
    assert_equal(soln_len, length)
    validate_path(G, s, t, length, path)


def test_bidirectional_dijksta_restricted():
    XG = nx.DiGraph()
    XG.add_weighted_edges_from([('s', 'u', 10), ('s', 'x', 5),
                                ('u', 'v', 1), ('u', 'x', 2),
                                ('v', 'y', 1), ('x', 'u', 3),
                                ('x', 'v', 5), ('x', 'y', 2),
                                ('y', 's', 7), ('y', 'v', 6)])
    XG3 = nx.Graph()
    XG3.add_weighted_edges_from([[0, 1, 2], [1, 2, 12], [2, 3, 1],
                                 [3, 4, 5], [4, 5, 1], [5, 0, 10]])
    validate_length_path(XG, 's', 'v', 9,
                         *_bidirectional_dijkstra(XG, 's', 'v'))
    validate_length_path(XG, 's', 'v', 10,
                         *_bidirectional_dijkstra(XG, 's', 'v',
                                                  ignore_nodes=['u']))
    validate_length_path(XG, 's', 'v', 11,
                         *_bidirectional_dijkstra(XG, 's', 'v',
                                                  ignore_edges=[('s', 'x')]))
    assert_raises(
        nx.NetworkXNoPath,
        _bidirectional_dijkstra,
        XG, 's', 'v',
        ignore_nodes=['u'],
        ignore_edges=[('s', 'x')],
    )
    validate_length_path(XG3, 0, 3, 15, *_bidirectional_dijkstra(XG3, 0, 3))
    validate_length_path(XG3, 0, 3, 16,
                         *_bidirectional_dijkstra(XG3, 0, 3,
                                                  ignore_nodes=[1]))
    validate_length_path(XG3, 0, 3, 16,
                         *_bidirectional_dijkstra(XG3, 0, 3,
                                                  ignore_edges=[(2, 3)]))
    assert_raises(
        nx.NetworkXNoPath,
        _bidirectional_dijkstra,
        XG3, 0, 3,
        ignore_nodes=[1],
        ignore_edges=[(5, 4)],
    )


@raises(nx.NetworkXNoPath)
def test_bidirectional_dijkstra_no_path():
    G = nx.Graph()
    G.add_path([1, 2, 3])
    G.add_path([4, 5, 6])
    path = _bidirectional_dijkstra(G, 1, 6)

networkx-1.11/networkx/algorithms/tests/test_smetric.py

from nose.tools import assert_equal,raises

import networkx as nx


def test_smetric():
    g = nx.Graph()
    g.add_edge(1,2)
    g.add_edge(2,3)
    g.add_edge(2,4)
    g.add_edge(1,4)
    sm = nx.s_metric(g,normalized=False)
    assert_equal(sm, 19.0)
#    smNorm = nx.s_metric(g,normalized=True)
#    assert_equal(smNorm, 0.95)


@raises(nx.NetworkXError)
def test_normalized():
    sm = nx.s_metric(nx.Graph(),normalized=True)

networkx-1.11/networkx/algorithms/tests/test_mst.py

#!/usr/bin/env python
from nose.tools import *
import networkx as nx


class TestMST:

    def setUp(self):
        # example from Wikipedia:
        # http://en.wikipedia.org/wiki/Kruskal's_algorithm
        G = nx.Graph()
        edgelist = [(0,3,[('weight',5)]),
                    (0,1,[('weight',7)]),
                    (1,3,[('weight',9)]),
                    (1,2,[('weight',8)]),
                    (1,4,[('weight',7)]),
                    (3,4,[('weight',15)]),
                    (3,5,[('weight',6)]),
                    (2,4,[('weight',5)]),
                    (4,5,[('weight',8)]),
                    (4,6,[('weight',9)]),
                    (5,6,[('weight',11)])]
        G.add_edges_from(edgelist)
        self.G = G
        tree_edgelist = [(0,1,{'weight':7}),
                         (0,3,{'weight':5}),
                         (3,5,{'weight':6}),
                         (1,4,{'weight':7}),
                         (4,2,{'weight':5}),
                         (4,6,{'weight':9})]
        self.tree_edgelist = sorted((sorted((u, v))[0], sorted((u, v))[1], d)
                                    for u,v,d in tree_edgelist)

    def test_mst(self):
        T = nx.minimum_spanning_tree(self.G)
        assert_equal(T.edges(data=True), self.tree_edgelist)

    def test_mst_edges(self):
        edgelist = sorted(nx.minimum_spanning_edges(self.G))
        assert_equal(edgelist, self.tree_edgelist)

    def test_mst_disconnected(self):
        G = nx.Graph()
        G.add_path([1,2])
        G.add_path([10,20])
        T = nx.minimum_spanning_tree(G)
        assert_equal(sorted(map(sorted, T.edges())), [[1, 2], [10, 20]])
        assert_equal(sorted(T.nodes()), [1, 2, 10, 20])

    def test_mst_isolate(self):
        G = nx.Graph()
        G.add_nodes_from([1,2])
        T = nx.minimum_spanning_tree(G)
        assert_equal(sorted(T.nodes()), [1, 2])
        assert_equal(sorted(T.edges()), [])

    def test_mst_attributes(self):
        G = nx.Graph()
        G.add_edge(1,2,weight=1,color='red',distance=7)
        G.add_edge(2,3,weight=1,color='green',distance=2)
        G.add_edge(1,3,weight=10,color='blue',distance=1)
        G.add_node(13,color='purple')
        G.graph['foo'] = 'bar'
        T = nx.minimum_spanning_tree(G)
        assert_equal(T.graph, G.graph)
        assert_equal(T.node[13], G.node[13])
        assert_equal(T.edge[1][2], G.edge[1][2])

    def test_mst_edges_specify_weight(self):
        G = nx.Graph()
        G.add_edge(1,2,weight=1,color='red',distance=7)
        G.add_edge(1,3,weight=30,color='blue',distance=1)
        G.add_edge(2,3,weight=1,color='green',distance=1)
        G.add_node(13,color='purple')
        G.graph['foo'] = 'bar'
        T = nx.minimum_spanning_tree(G)
        assert_equal(sorted(T.nodes()), [1,2,3,13])
        assert_equal(sorted(T.edges()), [(1,2),(2,3)])
        T = nx.minimum_spanning_tree(G, weight='distance')
        assert_equal(sorted(T.edges()), [(1,3),(2,3)])
        assert_equal(sorted(T.nodes()), [1,2,3,13])

    def test_prim_mst(self):
        T = nx.prim_mst(self.G)
        assert_equal(T.edges(data=True), self.tree_edgelist)

    def test_prim_mst_edges(self):
        edgelist = sorted(nx.prim_mst_edges(self.G))
        edgelist = sorted((sorted((u, v))[0], sorted((u, v))[1], d)
                          for u,v,d in edgelist)
        assert_equal(edgelist, self.tree_edgelist)

    def test_prim_mst_disconnected(self):
        G = nx.Graph()
        G.add_path([1,2])
        G.add_path([10,20])
        T = nx.prim_mst(G)
        assert_equal(sorted(map(sorted, T.edges())), [[1, 2], [10, 20]])
        assert_equal(sorted(T.nodes()), [1, 2, 10, 20])

    def test_prim_mst_isolate(self):
        G = nx.Graph()
        G.add_nodes_from([1,2])
        T = nx.prim_mst(G)
        assert_equal(sorted(T.nodes()), [1, 2])
        assert_equal(sorted(T.edges()), [])

    def test_prim_mst_attributes(self):
        G = nx.Graph()
        G.add_edge(1,2,weight=1,color='red',distance=7)
        G.add_edge(2,3,weight=1,color='green',distance=2)
        G.add_edge(1,3,weight=10,color='blue',distance=1)
        G.add_node(13,color='purple')
        G.graph['foo'] = 'bar'
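The minimum spanning trees that `TestMST` checks can be reproduced with a bare-bones Kruskal's algorithm over the same Wikipedia example graph. This is a hedged, NetworkX-independent sketch; `kruskal_mst` is an illustrative helper, not a library function:

```python
def kruskal_mst(nodes, weighted_edges):
    """weighted_edges: iterable of (u, v, w); returns a list of MST edges."""
    parent = {n: n for n in nodes}

    def find(x):
        # Union-find root lookup with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    # Scan edges in order of increasing weight; keep those joining
    # two different components.
    for u, v, w in sorted(weighted_edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

# The Wikipedia example graph from TestMST.setUp:
edges = [(0, 3, 5), (0, 1, 7), (1, 3, 9), (1, 2, 8), (1, 4, 7), (3, 4, 15),
         (3, 5, 6), (2, 4, 5), (4, 5, 8), (4, 6, 9), (5, 6, 11)]
tree = kruskal_mst(range(7), edges)  # 6 edges, total weight 39
```

The resulting tree has the same total weight (7 + 5 + 6 + 7 + 5 + 9 = 39) as `tree_edgelist` in `TestMST.setUp`.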
        T = nx.prim_mst(G)
        assert_equal(T.graph, G.graph)
        assert_equal(T.node[13], G.node[13])
        assert_equal(T.edge[1][2], G.edge[1][2])

    def test_prim_mst_edges_specify_weight(self):
        G = nx.Graph()
        G.add_edge(1,2,weight=1,color='red',distance=7)
        G.add_edge(1,3,weight=30,color='blue',distance=1)
        G.add_edge(2,3,weight=1,color='green',distance=1)
        G.add_node(13,color='purple')
        G.graph['foo'] = 'bar'
        T = nx.prim_mst(G)
        assert_equal(sorted(T.nodes()), [1,2,3,13])
        assert_equal(sorted(T.edges()), [(1,2),(2,3)])
        T = nx.prim_mst(G, weight='distance')
        assert_equal(sorted(T.edges()), [(1,3),(2,3)])
        assert_equal(sorted(T.nodes()), [1,2,3,13])

networkx-1.11/networkx/algorithms/tests/test_triads.py

# test_triads.py - unit tests for the triads module
#
# Copyright 2015 NetworkX developers.
# Copyright 2009 Diederik van Liere.
#
# This file is part of NetworkX.
#
# NetworkX is distributed under a BSD license; see LICENSE.txt for more
# information.
"""Unit tests for the :mod:`networkx.algorithms.triads` module."""
from nose.tools import assert_equal

import networkx as nx


def test_triadic_census():
    """Tests the triadic census function."""
    G = nx.DiGraph()
    G.add_edges_from(['01', '02', '03', '04', '05', '12', '16', '51',
                      '56', '65'])
    expected = {'030T': 2, '120C': 1, '210': 0, '120U': 0, '012': 9,
                '102': 3, '021U': 0, '111U': 0, '003': 8, '030C': 0,
                '021D': 9, '201': 0, '111D': 1, '300': 0, '120D': 0,
                '021C': 2}
    actual = nx.triadic_census(G)
    assert_equal(expected, actual)

networkx-1.11/networkx/algorithms/tests/test_richclub.py

import networkx as nx
from nose.tools import *


def test_richclub():
    G = nx.Graph([(0,1),(0,2),(1,2),(1,3),(1,4),(4,5)])
    rc = nx.richclub.rich_club_coefficient(G,normalized=False)
    assert_equal(rc, {0: 12.0/30, 1: 8.0/12})

    # test single value
    rc0 = nx.richclub.rich_club_coefficient(G,normalized=False)[0]
    assert_equal(rc0, 12.0/30.0)


def test_richclub_normalized():
    G = nx.Graph([(0,1),(0,2),(1,2),(1,3),(1,4),(4,5)])
    rcNorm = nx.richclub.rich_club_coefficient(G,Q=2)
    assert_equal(rcNorm, {0: 1.0, 1: 1.0})


def test_richclub2():
    T = nx.balanced_tree(2,10)
    rc = nx.richclub.rich_club_coefficient(T,normalized=False)
    assert_equal(rc, {0: 4092/(2047*2046.0),
                      1: (2044.0/(1023*1022)),
                      2: (2040.0/(1022*1021))})

#def test_richclub2_normalized():
#    T = nx.balanced_tree(2,10)
#    rcNorm = nx.richclub.rich_club_coefficient(T,Q=2)
#    assert_true(rcNorm[0] == 1.0 and rcNorm[1] < 0.9 and rcNorm[2] < 0.9)

networkx-1.11/networkx/algorithms/tests/test_dag.py

#!/usr/bin/env python
from itertools import combinations
from nose.tools import *
from networkx.testing.utils import assert_edges_equal
import networkx as nx


class TestDAG:

    def setUp(self):
        pass

    def test_topological_sort1(self):
        DG = nx.DiGraph()
        DG.add_edges_from([(1, 2), (1, 3), (2, 3)])
        assert_equal(nx.topological_sort(DG), [1, 2, 3])
        assert_equal(nx.topological_sort_recursive(DG), [1, 2, 3])
        DG.add_edge(3, 2)
        assert_raises(nx.NetworkXUnfeasible, nx.topological_sort, DG)
        assert_raises(nx.NetworkXUnfeasible,
                      nx.topological_sort_recursive, DG)
        DG.remove_edge(2, 3)
        assert_equal(nx.topological_sort(DG), [1, 3, 2])
        assert_equal(nx.topological_sort_recursive(DG), [1, 3, 2])

    def test_reverse_topological_sort1(self):
        DG = nx.DiGraph()
        DG.add_edges_from([(1, 2), (1, 3), (2, 3)])
        assert_equal(nx.topological_sort(DG, reverse=True), [3, 2, 1])
        assert_equal(
            nx.topological_sort_recursive(DG, reverse=True), [3, 2, 1])
        DG.add_edge(3, 2)
        assert_raises(nx.NetworkXUnfeasible,
                      nx.topological_sort, DG, reverse=True)
        assert_raises(nx.NetworkXUnfeasible,
                      nx.topological_sort_recursive, DG, reverse=True)
        DG.remove_edge(2, 3)
        assert_equal(nx.topological_sort(DG, reverse=True), [2, 3, 1])
        assert_equal(
            nx.topological_sort_recursive(DG, reverse=True), [2, 3, 1])

    def test_is_directed_acyclic_graph(self):
        G = nx.generators.complete_graph(2)
        assert_false(nx.is_directed_acyclic_graph(G))
        assert_false(nx.is_directed_acyclic_graph(G.to_directed()))
        assert_false(nx.is_directed_acyclic_graph(nx.Graph([(3, 4), (4, 5)])))
        assert_true(nx.is_directed_acyclic_graph(nx.DiGraph([(3, 4), (4, 5)])))

    def test_topological_sort2(self):
        DG = nx.DiGraph({1: [2], 2: [3], 3: [4],
                         4: [5], 5: [1], 11: [12],
                         12: [13], 13: [14], 14: [15]})
        assert_raises(nx.NetworkXUnfeasible, nx.topological_sort, DG)
        assert_raises(nx.NetworkXUnfeasible,
                      nx.topological_sort_recursive, DG)
        assert_false(nx.is_directed_acyclic_graph(DG))
        DG.remove_edge(1, 2)
        assert_equal(nx.topological_sort_recursive(DG),
                     [11, 12, 13, 14, 15, 2, 3, 4, 5, 1])
        assert_equal(nx.topological_sort(DG),
                     [11, 12, 13, 14, 15, 2, 3, 4, 5, 1])
        assert_true(nx.is_directed_acyclic_graph(DG))

    def test_topological_sort3(self):
        DG = nx.DiGraph()
        DG.add_edges_from([(1, i) for i in range(2, 5)])
        DG.add_edges_from([(2, i) for i in range(5, 9)])
        DG.add_edges_from([(6, i) for i in range(9, 12)])
        DG.add_edges_from([(4, i) for i in range(12, 15)])

        def validate(order):
            ok_(isinstance(order, list))
            assert_equal(set(order), set(DG))
            for u, v in combinations(order, 2):
                assert_false(nx.has_path(DG, v, u))
        validate(nx.topological_sort_recursive(DG))
        validate(nx.topological_sort(DG))

        DG.add_edge(14, 1)
        assert_raises(nx.NetworkXUnfeasible, nx.topological_sort, DG)
        assert_raises(nx.NetworkXUnfeasible,
                      nx.topological_sort_recursive, DG)

    def test_topological_sort4(self):
        G = nx.Graph()
        G.add_edge(1, 2)
        assert_raises(nx.NetworkXError, nx.topological_sort, G)
        assert_raises(nx.NetworkXError, nx.topological_sort_recursive, G)

    def test_topological_sort5(self):
        G = nx.DiGraph()
        G.add_edge(0, 1)
        assert_equal(nx.topological_sort_recursive(G), [0, 1])
        assert_equal(nx.topological_sort(G), [0, 1])

    def test_nbunch_argument(self):
        G = nx.DiGraph()
        G.add_edges_from([(1, 2), (2, 3), (1, 4), (1, 5), (2, 6)])
        assert_equal(nx.topological_sort(G), [1, 2, 3, 6, 4, 5])
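The behavior checked by the `topological_sort` tests can be sketched with Kahn's algorithm: repeatedly emit a node with no remaining incoming edges, and fail if a cycle prevents every node from being emitted. A hedged, NetworkX-independent sketch; `kahn_topological_sort`, `nodes`, and `edges` are illustrative names:

```python
from collections import deque

def kahn_topological_sort(nodes, edges):
    """Return a topological order of `nodes`, or raise on a cycle."""
    indegree = {n: 0 for n in nodes}
    succ = {n: [] for n in nodes}
    for u, v in edges:
        succ[u].append(v)
        indegree[v] += 1
    # Start from all nodes with no incoming edges.
    queue = deque(n for n in nodes if indegree[n] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in succ[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                queue.append(v)
    if len(order) != len(indegree):
        raise ValueError("graph contains a cycle")
    return order

# The DAG from test_topological_sort1:
order = kahn_topological_sort([1, 2, 3], [(1, 2), (1, 3), (2, 3)])  # [1, 2, 3]
```

On this input the only valid order is `[1, 2, 3]`, matching the first assertion in `test_topological_sort1`; adding the back edge `(3, 2)` would make the function raise, analogous to `NetworkXUnfeasible`.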
        assert_equal(nx.topological_sort_recursive(G), [1, 5, 4, 2, 6, 3])
        assert_equal(nx.topological_sort(G, [1]), [1, 2, 3, 6, 4, 5])
        assert_equal(nx.topological_sort_recursive(G, [1]),
                     [1, 5, 4, 2, 6, 3])
        assert_equal(nx.topological_sort(G, [5]), [5])
        assert_equal(nx.topological_sort_recursive(G, [5]), [5])

    def test_ancestors(self):
        G = nx.DiGraph()
        ancestors = nx.algorithms.dag.ancestors
        G.add_edges_from([
            (1, 2), (1, 3), (4, 2), (4, 3), (4, 5), (2, 6), (5, 6)])
        assert_equal(ancestors(G, 6), set([1, 2, 4, 5]))
        assert_equal(ancestors(G, 3), set([1, 4]))
        assert_equal(ancestors(G, 1), set())
        assert_raises(nx.NetworkXError, ancestors, G, 8)

    def test_descendants(self):
        G = nx.DiGraph()
        descendants = nx.algorithms.dag.descendants
        G.add_edges_from([
            (1, 2), (1, 3), (4, 2), (4, 3), (4, 5), (2, 6), (5, 6)])
        assert_equal(descendants(G, 1), set([2, 3, 6]))
        assert_equal(descendants(G, 4), set([2, 3, 5, 6]))
        assert_equal(descendants(G, 3), set())
        assert_raises(nx.NetworkXError, descendants, G, 8)

    def test_transitive_closure(self):
        G = nx.DiGraph([(1, 2), (2, 3), (3, 4)])
        transitive_closure = nx.algorithms.dag.transitive_closure
        solution = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
        assert_edges_equal(transitive_closure(G).edges(), solution)
        G = nx.DiGraph([(1, 2), (2, 3), (2, 4)])
        solution = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4)]
        assert_edges_equal(transitive_closure(G).edges(), solution)
        G = nx.Graph([(1, 2), (2, 3), (3, 4)])
        assert_raises(nx.NetworkXNotImplemented, transitive_closure, G)

    def _check_antichains(self, solution, result):
        sol = [frozenset(a) for a in solution]
        res = [frozenset(a) for a in result]
        assert_true(set(sol) == set(res))

    def test_antichains(self):
        antichains = nx.algorithms.dag.antichains
        G = nx.DiGraph([(1, 2), (2, 3), (3, 4)])
        solution = [[], [4], [3], [2], [1]]
        self._check_antichains(list(antichains(G)), solution)
        G = nx.DiGraph([(1, 2), (2, 3), (2, 4), (3, 5), (5, 6), (5, 7)])
        solution = [[], [4], [7], [7, 4], [6], [6, 4], [6, 7], [6, 7, 4],
                    [5], [5, 4], [3], [3, 4], [2], [1]]
        self._check_antichains(list(antichains(G)), solution)
        G = nx.DiGraph([(1, 2), (1, 3), (3, 4), (3, 5), (5, 6)])
        solution = [[], [6], [5], [4], [4, 6], [4, 5], [3], [2], [2, 6],
                    [2, 5], [2, 4], [2, 4, 6], [2, 4, 5], [2, 3], [1]]
        self._check_antichains(list(antichains(G)), solution)
        G = nx.DiGraph({0: [1, 2], 1: [4], 2: [3], 3: [4]})
        solution = [[], [4], [3], [2], [1], [1, 3], [1, 2], [0]]
        self._check_antichains(list(antichains(G)), solution)
        G = nx.DiGraph()
        self._check_antichains(list(antichains(G)), [[]])
        G = nx.DiGraph()
        G.add_nodes_from([0, 1, 2])
        solution = [[], [0], [1], [1, 0], [2], [2, 0], [2, 1], [2, 1, 0]]
        self._check_antichains(list(antichains(G)), solution)
        f = lambda x: list(antichains(x))
        G = nx.Graph([(1, 2), (2, 3), (3, 4)])
        assert_raises(nx.NetworkXNotImplemented, f, G)
        G = nx.DiGraph([(1, 2), (2, 3), (3, 1)])
        assert_raises(nx.NetworkXUnfeasible, f, G)

    def test_dag_longest_path(self):
        longest_path = nx.algorithms.dag.dag_longest_path
        G = nx.DiGraph([(1, 2), (2, 3), (2, 4), (3, 5), (5, 6), (5, 7)])
        assert_equal(longest_path(G), [1, 2, 3, 5, 6])
        G = nx.DiGraph(
            [(1, 2), (2, 3), (3, 4), (4, 5), (1, 3), (1, 5), (3, 5)])
        assert_equal(longest_path(G), [1, 2, 3, 4, 5])
        G = nx.Graph()
        assert_raises(nx.NetworkXNotImplemented, longest_path, G)

    def test_dag_longest_path_length(self):
        longest_path_length = nx.algorithms.dag.dag_longest_path_length
        G = nx.DiGraph([(1, 2), (2, 3), (2, 4), (3, 5), (5, 6), (5, 7)])
        assert_equal(longest_path_length(G), 4)
        G = nx.DiGraph(
            [(1, 2), (2, 3), (3, 4), (4, 5), (1, 3), (1, 5), (3, 5)])
        assert_equal(longest_path_length(G), 4)
        G = nx.Graph()
        assert_raises(nx.NetworkXNotImplemented, longest_path_length, G)


def test_is_aperiodic_cycle():
    G = nx.DiGraph()
    G.add_cycle([1, 2, 3, 4])
    assert_false(nx.is_aperiodic(G))


def test_is_aperiodic_cycle2():
    G = nx.DiGraph()
    G.add_cycle([1, 2, 3, 4])
    G.add_cycle([3, 4, 5, 6, 7])
    assert_true(nx.is_aperiodic(G))


def test_is_aperiodic_cycle3():
    G = nx.DiGraph()
    G.add_cycle([1, 2, 3, 4])
    G.add_cycle([3, 4, 5, 6])
    assert_false(nx.is_aperiodic(G))


def test_is_aperiodic_cycle4():
    G = nx.DiGraph()
    G.add_cycle([1, 2, 3, 4])
    G.add_edge(1, 3)
    assert_true(nx.is_aperiodic(G))


def test_is_aperiodic_selfloop():
    G = nx.DiGraph()
    G.add_cycle([1, 2, 3, 4])
    G.add_edge(1, 1)
    assert_true(nx.is_aperiodic(G))


def test_is_aperiodic_raise():
    G = nx.Graph()
    assert_raises(nx.NetworkXError, nx.is_aperiodic, G)


def test_is_aperiodic_bipartite():
    # Bipartite graph
    G = nx.DiGraph(nx.davis_southern_women_graph())
    assert_false(nx.is_aperiodic(G))


def test_is_aperiodic_rary_tree():
    G = nx.full_rary_tree(3, 27, create_using=nx.DiGraph())
    assert_false(nx.is_aperiodic(G))


def test_is_aperiodic_disconnected():
    # disconnected graph
    G = nx.DiGraph()
    G.add_cycle([1, 2, 3, 4])
    G.add_cycle([5, 6, 7, 8])
    assert_false(nx.is_aperiodic(G))
    G.add_edge(1, 3)
    G.add_edge(5, 7)
    assert_true(nx.is_aperiodic(G))


def test_is_aperiodic_disconnected2():
    G = nx.DiGraph()
    G.add_cycle([0, 1, 2])
    G.add_edge(3, 3)
    assert_false(nx.is_aperiodic(G))

networkx-1.11/networkx/algorithms/tests/test_matching.py

#!/usr/bin/env python
import math
from nose.tools import *
import networkx as nx


class TestMatching:

    def setUp(self):
        pass

    def test_trivial1(self):
        """Empty graph"""
        G = nx.Graph()
        assert_equal(nx.max_weight_matching(G), {})

    def test_trivial2(self):
        """Self loop"""
        G = nx.Graph()
        G.add_edge(0, 0, weight=100)
        assert_equal(nx.max_weight_matching(G), {})

    def test_trivial3(self):
        """Single edge"""
        G = nx.Graph()
        G.add_edge(0, 1)
        assert_equal(nx.max_weight_matching(G), {0: 1, 1: 0})

    def test_trivial4(self):
        """Small graph"""
        G = nx.Graph()
        G.add_edge('one', 'two', weight=10)
        G.add_edge('two', 'three', weight=11)
        assert_equal(nx.max_weight_matching(G),
                     {'three': 'two', 'two': 'three'})

    def test_trivial5(self):
        """Path"""
        G = nx.Graph()
        G.add_edge(1, 2, weight=5)
        G.add_edge(2, 3, weight=11)
        G.add_edge(3, 4, weight=5)
        assert_equal(nx.max_weight_matching(G), {2: 3, 3: 2})
        assert_equal(nx.max_weight_matching(G, 1), {1: 2, 2: 1, 3: 4, 4: 3})

    def test_floating_point_weights(self):
        """Floating point weights"""
        G = nx.Graph()
        G.add_edge(1, 2, weight=math.pi)
        G.add_edge(2, 3, weight=math.exp(1))
        G.add_edge(1, 3, weight=3.0)
        G.add_edge(1, 4, weight=math.sqrt(2.0))
        assert_equal(nx.max_weight_matching(G), {1: 4, 2: 3, 3: 2, 4: 1})

    def test_negative_weights(self):
        """Negative weights"""
        G = nx.Graph()
        G.add_edge(1, 2, weight=2)
        G.add_edge(1, 3, weight=-2)
        G.add_edge(2, 3, weight=1)
        G.add_edge(2, 4, weight=-1)
        G.add_edge(3, 4, weight=-6)
        assert_equal(nx.max_weight_matching(G), {1: 2, 2: 1})
        assert_equal(nx.max_weight_matching(G, 1), {1: 3, 2: 4, 3: 1, 4: 2})

    def test_s_blossom(self):
        """Create S-blossom and use it for augmentation:"""
        G = nx.Graph()
        G.add_weighted_edges_from([(1, 2, 8), (1, 3, 9),
                                   (2, 3, 10), (3, 4, 7)])
        assert_equal(nx.max_weight_matching(G), {1: 2, 2: 1, 3: 4, 4: 3})
        G.add_weighted_edges_from([(1, 6, 5), (4, 5, 6)])
        assert_equal(nx.max_weight_matching(G),
                     {1: 6, 2: 3, 3: 2, 4: 5, 5: 4, 6: 1})

    def test_s_t_blossom(self):
        """Create S-blossom, relabel as T-blossom, use for augmentation:"""
        G = nx.Graph()
        G.add_weighted_edges_from([(1, 2, 9), (1, 3, 8), (2, 3, 10),
                                   (1, 4, 5), (4, 5, 4), (1, 6, 3)])
        assert_equal(nx.max_weight_matching(G),
                     {1: 6, 2: 3, 3: 2, 4: 5, 5: 4, 6: 1})
        G.add_edge(4, 5, weight=3)
        G.add_edge(1, 6, weight=4)
        assert_equal(nx.max_weight_matching(G),
                     {1: 6, 2: 3, 3: 2, 4: 5, 5: 4, 6: 1})
        G.remove_edge(1, 6)
        G.add_edge(3, 6, weight=4)
        assert_equal(nx.max_weight_matching(G),
                     {1: 2, 2: 1, 3: 6, 4: 5, 5: 4, 6: 3})

    def test_nested_s_blossom(self):
        """Create nested S-blossom, use for augmentation:"""
        G = nx.Graph()
        G.add_weighted_edges_from([(1, 2, 9), (1, 3, 9), (2, 3, 10),
                                   (2, 4, 8), (3, 5, 8), (4, 5, 10),
                                   (5, 6, 6)])
        assert_equal(nx.max_weight_matching(G),
                     {1: 3, 2: 4, 3: 1, 4: 2, 5: 6, 6: 5})

    def test_nested_s_blossom_relabel(self):
        """Create
S-blossom, relabel as S, include in nested S-blossom:""" G = nx.Graph() G.add_weighted_edges_from([ (1, 2, 10), (1, 7, 10), (2, 3, 12), (3, 4, 20), (3, 5, 20), (4, 5, 25), (5, 6, 10), (6, 7, 10), (7, 8, 8) ]) assert_equal(nx.max_weight_matching(G), {1: 2, 2: 1, 3: 4, 4: 3, 5: 6, 6: 5, 7: 8, 8: 7}) def test_nested_s_blossom_expand(self): """Create nested S-blossom, augment, expand recursively:""" G = nx.Graph() G.add_weighted_edges_from([ (1, 2, 8), (1, 3, 8), (2, 3, 10), (2, 4, 12),(3, 5, 12), (4, 5, 14), (4, 6, 12), (5, 7, 12), (6, 7, 14), (7, 8, 12) ]) assert_equal(nx.max_weight_matching(G), {1: 2, 2: 1, 3: 5, 4: 6, 5: 3, 6: 4, 7: 8, 8: 7}) def test_s_blossom_relabel_expand(self): """Create S-blossom, relabel as T, expand:""" G = nx.Graph() G.add_weighted_edges_from([ (1, 2, 23), (1, 5, 22), (1, 6, 15), (2, 3, 25), (3, 4, 22), (4, 5, 25), (4, 8, 14), (5, 7, 13) ]) assert_equal(nx.max_weight_matching(G), {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4}) def test_nested_s_blossom_relabel_expand(self): """Create nested S-blossom, relabel as T, expand:""" G = nx.Graph() G.add_weighted_edges_from([ (1, 2, 19), (1, 3, 20), (1, 8, 8), (2, 3, 25), (2, 4, 18), (3, 5, 18), (4, 5, 13), (4, 7, 7), (5, 6, 7) ]) assert_equal(nx.max_weight_matching(G), {1: 8, 2: 3, 3: 2, 4: 7, 5: 6, 6: 5, 7: 4, 8: 1}) def test_nasty_blossom1(self): """Create blossom, relabel as T in more than one way, expand, augment: """ G = nx.Graph() G.add_weighted_edges_from([ (1, 2, 45), (1, 5, 45), (2, 3, 50), (3, 4, 45), (4, 5, 50), (1, 6, 30), (3, 9, 35), (4, 8, 35), (5, 7, 26), (9, 10, 5) ]) assert_equal(nx.max_weight_matching(G), {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4, 9: 10, 10: 9}) def test_nasty_blossom2(self): """Again but slightly different:""" G = nx.Graph() G.add_weighted_edges_from([ (1, 2, 45), (1, 5, 45), (2, 3, 50), (3, 4, 45), (4, 5, 50), (1, 6, 30), (3, 9, 35), (4, 8, 26), (5, 7, 40), (9, 10, 5) ]) assert_equal(nx.max_weight_matching(G), {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 
                      8: 4, 9: 10, 10: 9})

    def test_nasty_blossom_least_slack(self):
        """Create blossom, relabel as T, expand such that a new
        least-slack S-to-free edge is produced, augment:
        """
        G = nx.Graph()
        G.add_weighted_edges_from([ (1, 2, 45), (1, 5, 45), (2, 3, 50),
                                    (3, 4, 45), (4, 5, 50), (1, 6, 30),
                                    (3, 9, 35), (4, 8, 28), (5, 7, 26),
                                    (9, 10, 5) ])
        assert_equal(nx.max_weight_matching(G),
                     {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4,
                      9: 10, 10: 9})

    def test_nasty_blossom_augmenting(self):
        """Create nested blossom, relabel as T in more than one way"""
        # expand outer blossom such that inner blossom ends up on an
        # augmenting path:
        G = nx.Graph()
        G.add_weighted_edges_from([ (1, 2, 45), (1, 7, 45), (2, 3, 50),
                                    (3, 4, 45), (4, 5, 95), (4, 6, 94),
                                    (5, 6, 94), (6, 7, 50), (1, 8, 30),
                                    (3, 11, 35), (5, 9, 36), (7, 10, 26),
                                    (11, 12, 5) ])
        assert_equal(nx.max_weight_matching(G),
                     {1: 8, 2: 3, 3: 2, 4: 6, 5: 9, 6: 4, 7: 10, 8: 1,
                      9: 5, 10: 7, 11: 12, 12: 11})

    def test_nasty_blossom_expand_recursively(self):
        """Create nested S-blossom, relabel as S, expand recursively:"""
        G = nx.Graph()
        G.add_weighted_edges_from([ (1, 2, 40), (1, 3, 40), (2, 3, 60),
                                    (2, 4, 55), (3, 5, 55), (4, 5, 50),
                                    (1, 8, 15), (5, 7, 30), (7, 6, 10),
                                    (8, 10, 10), (4, 9, 30) ])
        assert_equal(nx.max_weight_matching(G),
                     {1: 2, 2: 1, 3: 5, 4: 9, 5: 3, 6: 7, 7: 6, 8: 10,
                      9: 4, 10: 8})


def test_maximal_matching():
    graph = nx.Graph()
    graph.add_edge(0, 1)
    graph.add_edge(0, 2)
    graph.add_edge(0, 3)
    graph.add_edge(0, 4)
    graph.add_edge(0, 5)
    graph.add_edge(1, 2)
    matching = nx.maximal_matching(graph)
    vset = set(u for u, v in matching)
    vset = vset | set(v for u, v in matching)
    for edge in graph.edges_iter():
        u, v = edge
        ok_(len(set([v]) & vset) > 0 or len(set([u]) & vset) > 0,
            "not a proper matching!")
    eq_(1, len(matching), "matching not length 1!")

    graph = nx.Graph()
    graph.add_edge(1, 2)
    graph.add_edge(1, 5)
    graph.add_edge(2, 3)
    graph.add_edge(2, 5)
    graph.add_edge(3, 4)
    graph.add_edge(3, 6)
    graph.add_edge(5, 6)
    matching = nx.maximal_matching(graph)
    vset = set(u for u, v in matching)
    vset = vset | set(v for u, v in matching)
    for edge in graph.edges_iter():
        u, v = edge
        ok_(len(set([v]) & vset) > 0 or len(set([u]) & vset) > 0,
            "not a proper matching!")


def test_maximal_matching_ordering():
    # check edge ordering
    G = nx.Graph()
    G.add_nodes_from([100, 200, 300])
    G.add_edges_from([(100, 200), (100, 300)])
    matching = nx.maximal_matching(G)
    assert_equal(len(matching), 1)
    G = nx.Graph()
    G.add_nodes_from([200, 100, 300])
    G.add_edges_from([(100, 200), (100, 300)])
    matching = nx.maximal_matching(G)
    assert_equal(len(matching), 1)
    G = nx.Graph()
    G.add_nodes_from([300, 200, 100])
    G.add_edges_from([(100, 200), (100, 300)])
    matching = nx.maximal_matching(G)
    assert_equal(len(matching), 1)

networkx-1.11/networkx/algorithms/tests/test_boundary.py
#!/usr/bin/env python
from nose.tools import *
import networkx as nx
from networkx import convert_node_labels_to_integers as cnlti


class TestBoundary:
    def setUp(self):
        self.null=nx.null_graph()
        self.P10=cnlti(nx.path_graph(10),first_label=1)
        self.K10=cnlti(nx.complete_graph(10),first_label=1)

    def test_null_node_boundary(self):
        """null graph has empty node boundaries"""
        null=self.null
        assert_equal(nx.node_boundary(null,[]),[])
        assert_equal(nx.node_boundary(null,[],[]),[])
        assert_equal(nx.node_boundary(null,[1,2,3]),[])
        assert_equal(nx.node_boundary(null,[1,2,3],[4,5,6]),[])
        assert_equal(nx.node_boundary(null,[1,2,3],[3,4,5]),[])

    def test_null_edge_boundary(self):
        """null graph has empty edge boundaries"""
        null=self.null
        assert_equal(nx.edge_boundary(null,[]),[])
        assert_equal(nx.edge_boundary(null,[],[]),[])
        assert_equal(nx.edge_boundary(null,[1,2,3]),[])
        assert_equal(nx.edge_boundary(null,[1,2,3],[4,5,6]),[])
        assert_equal(nx.edge_boundary(null,[1,2,3],[3,4,5]),[])

    def test_path_node_boundary(self):
        """Check node boundaries in path graph."""
        P10=self.P10
        assert_equal(nx.node_boundary(P10,[]),[])
assert_equal(nx.node_boundary(P10,[],[]),[]) assert_equal(nx.node_boundary(P10,[1,2,3]),[4]) assert_equal(sorted(nx.node_boundary(P10,[4,5,6])),[3, 7]) assert_equal(sorted(nx.node_boundary(P10,[3,4,5,6,7])),[2, 8]) assert_equal(nx.node_boundary(P10,[8,9,10]),[7]) assert_equal(sorted(nx.node_boundary(P10,[4,5,6],[9,10])),[]) def test_path_edge_boundary(self): """Check edge boundaries in path graph.""" P10=self.P10 assert_equal(nx.edge_boundary(P10,[]),[]) assert_equal(nx.edge_boundary(P10,[],[]),[]) assert_equal(nx.edge_boundary(P10,[1,2,3]),[(3, 4)]) assert_equal(sorted(nx.edge_boundary(P10,[4,5,6])),[(4, 3), (6, 7)]) assert_equal(sorted(nx.edge_boundary(P10,[3,4,5,6,7])),[(3, 2), (7, 8)]) assert_equal(nx.edge_boundary(P10,[8,9,10]),[(8, 7)]) assert_equal(sorted(nx.edge_boundary(P10,[4,5,6],[9,10])),[]) assert_equal(nx.edge_boundary(P10,[1,2,3],[3,4,5]) ,[(2, 3), (3, 4)]) def test_k10_node_boundary(self): """Check node boundaries in K10""" K10=self.K10 assert_equal(nx.node_boundary(K10,[]),[]) assert_equal(nx.node_boundary(K10,[],[]),[]) assert_equal(sorted(nx.node_boundary(K10,[1,2,3])), [4, 5, 6, 7, 8, 9, 10]) assert_equal(sorted(nx.node_boundary(K10,[4,5,6])), [1, 2, 3, 7, 8, 9, 10]) assert_equal(sorted(nx.node_boundary(K10,[3,4,5,6,7])), [1, 2, 8, 9, 10]) assert_equal(nx.node_boundary(K10,[4,5,6],[]),[]) assert_equal(nx.node_boundary(K10,K10),[]) assert_equal(nx.node_boundary(K10,[1,2,3],[3,4,5]),[4, 5]) def test_k10_edge_boundary(self): """Check edge boundaries in K10""" K10=self.K10 assert_equal(nx.edge_boundary(K10,[]),[]) assert_equal(nx.edge_boundary(K10,[],[]),[]) assert_equal(len(nx.edge_boundary(K10,[1,2,3])),21) assert_equal(len(nx.edge_boundary(K10,[4,5,6,7])),24) assert_equal(len(nx.edge_boundary(K10,[3,4,5,6,7])),25) assert_equal(len(nx.edge_boundary(K10,[8,9,10])),21) assert_equal(sorted(nx.edge_boundary(K10,[4,5,6],[9,10])), [(4, 9), (4, 10), (5, 9), (5, 10), (6, 9), (6, 10)]) assert_equal(nx.edge_boundary(K10,[1,2,3],[3,4,5]), [(1, 3), (1, 4), 
                      (1, 5), (2, 3), (2, 4), (2, 5), (3, 4), (3, 5)])

    def test_petersen(self):
        """Check boundaries in the petersen graph

        cheeger(G,k)=min(|bdy(S)|/|S| for |S|=k, 0<k<=|V(G)|/2)
        """

networkx-1.11/networkx/algorithms/tests/test_minors.py
# This file is part of NetworkX.
#
# NetworkX is distributed under a BSD license; see LICENSE.txt for more
# information.
"""Unit tests for the :mod:`networkx.algorithms.minors` module."""
from nose.tools import assert_equal
from nose.tools import assert_true
from nose.tools import raises

import networkx as nx


class TestQuotient(object):
    """Unit tests for computing quotient graphs."""

    def test_quotient_graph_complete_multipartite(self):
        """Tests that the quotient graph of the complete *n*-partite graph
        under the "same neighbors" node relation is the complete graph on *n*
        nodes.
        """
        G = nx.complete_multipartite_graph(2, 3, 4)
        # Two nodes are equivalent if they are not adjacent but have the same
        # neighbor set.
        same_neighbors = lambda u, v: (u not in G[v] and v not in G[u]
                                       and G[u] == G[v])
        expected = nx.complete_graph(3)
        actual = nx.quotient_graph(G, same_neighbors)
        # It won't take too long to run a graph isomorphism algorithm on such
        # small graphs.
        assert_true(nx.is_isomorphic(expected, actual))

    def test_quotient_graph_complete_bipartite(self):
        """Tests that the quotient graph of the complete bipartite graph under
        the "same neighbors" node relation is `K_2`.
        """
        G = nx.complete_bipartite_graph(2, 3)
        # Two nodes are equivalent if they are not adjacent but have the same
        # neighbor set.
        same_neighbors = lambda u, v: (u not in G[v] and v not in G[u]
                                       and G[u] == G[v])
        expected = nx.complete_graph(2)
        actual = nx.quotient_graph(G, same_neighbors)
        # It won't take too long to run a graph isomorphism algorithm on such
        # small graphs.
        assert_true(nx.is_isomorphic(expected, actual))

    def test_quotient_graph_edge_relation(self):
        """Tests for specifying an alternate edge relation for the quotient
        graph.
""" G = nx.path_graph(5) identity = lambda u, v: u == v peek = lambda x: next(iter(x)) same_parity = lambda b, c: peek(b) % 2 == peek(c) % 2 actual = nx.quotient_graph(G, identity, same_parity) expected = nx.Graph() expected.add_edges_from([(0, 2), (0, 4), (2, 4)]) expected.add_edge(1, 3) assert_true(nx.is_isomorphic(actual, expected)) def test_condensation_as_quotient(self): """This tests that the condensation of a graph can be viewed as the quotient graph under the "in the same connected component" equivalence relation. """ # This example graph comes from the file `test_strongly_connected.py`. G = nx.DiGraph() G.add_edges_from([(1, 2), (2, 3), (2, 11), (2, 12), (3, 4), (4, 3), (4, 5), (5, 6), (6, 5), (6, 7), (7, 8), (7, 9), (7, 10), (8, 9), (9, 7), (10, 6), (11, 2), (11, 4), (11, 6), (12, 6), (12, 11)]) scc = list(nx.strongly_connected_components(G)) C = nx.condensation(G, scc) component_of = C.graph['mapping'] # Two nodes are equivalent if they are in the same connected component. same_component = lambda u, v: component_of[u] == component_of[v] Q = nx.quotient_graph(G, same_component) assert_true(nx.is_isomorphic(C, Q)) class TestContraction(object): """Unit tests for node and edge contraction functions.""" def test_undirected_node_contraction(self): """Tests for node contraction in an undirected graph.""" G = nx.cycle_graph(4) actual = nx.contracted_nodes(G, 0, 1) expected = nx.complete_graph(3) expected.add_edge(0, 0) assert_true(nx.is_isomorphic(actual, expected)) def test_directed_node_contraction(self): """Tests for node contraction in a directed graph.""" G = nx.DiGraph(nx.cycle_graph(4)) actual = nx.contracted_nodes(G, 0, 1) expected = nx.DiGraph(nx.complete_graph(3)) expected.add_edge(0, 0) expected.add_edge(0, 0) assert_true(nx.is_isomorphic(actual, expected)) def test_node_attributes(self): """Tests that node contraction preserves node attributes.""" G = nx.cycle_graph(4) # Add some data to the two nodes being contracted. 
G.node[0] = dict(foo='bar') G.node[1] = dict(baz='xyzzy') actual = nx.contracted_nodes(G, 0, 1) # We expect that contracting the nodes 0 and 1 in C_4 yields K_3, but # with nodes labeled 0, 2, and 3, and with a self-loop on 0. expected = nx.complete_graph(3) expected = nx.relabel_nodes(expected, {1: 2, 2: 3}) expected.add_edge(0, 0) expected.node[0] = dict(foo='bar', contraction={1: dict(baz='xyzzy')}) assert_true(nx.is_isomorphic(actual, expected)) assert_equal(actual.node, expected.node) def test_without_self_loops(self): """Tests for node contraction without preserving self-loops.""" G = nx.cycle_graph(4) actual = nx.contracted_nodes(G, 0, 1, self_loops=False) expected = nx.complete_graph(3) assert_true(nx.is_isomorphic(actual, expected)) def test_undirected_edge_contraction(self): """Tests for edge contraction in an undirected graph.""" G = nx.cycle_graph(4) actual = nx.contracted_edge(G, (0, 1)) expected = nx.complete_graph(3) expected.add_edge(0, 0) assert_true(nx.is_isomorphic(actual, expected)) @raises(ValueError) def test_nonexistent_edge(self): """Tests that attempting to contract a non-existent edge raises an exception. """ G = nx.cycle_graph(4) nx.contracted_edge(G, (0, 2)) networkx-1.11/networkx/algorithms/tests/test_core.py0000644000175000017500000001030312637544450022737 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestCore: def setUp(self): # G is the example graph in Figure 1 from Batagelj and # Zaversnik's paper titled An O(m) Algorithm for Cores # Decomposition of Networks, 2003, # http://arXiv.org/abs/cs/0310049. With nodes labeled as # shown, the 3-core is given by nodes 1-8, the 2-core by nodes # 9-16, the 1-core by nodes 17-20 and node 21 is in the # 0-core. 
t1=nx.convert_node_labels_to_integers(nx.tetrahedral_graph(),1) t2=nx.convert_node_labels_to_integers(t1,5) G=nx.union(t1,t2) G.add_edges_from( [(3,7), (2,11), (11,5), (11,12), (5,12), (12,19), (12,18), (3,9), (7,9), (7,10), (9,10), (9,20), (17,13), (13,14), (14,15), (15,16), (16,13)]) G.add_node(21) self.G=G # Create the graph H resulting from the degree sequence # [0,1,2,2,2,2,3] when using the Havel-Hakimi algorithm. degseq=[0,1,2,2,2,2,3] H = nx.havel_hakimi_graph(degseq) mapping = {6:0, 0:1, 4:3, 5:6, 3:4, 1:2, 2:5 } self.H = nx.relabel_nodes(H, mapping) def test_trivial(self): """Empty graph""" G = nx.Graph() assert_equal(nx.find_cores(G),{}) def test_find_cores(self): cores=nx.find_cores(self.G) nodes_by_core=[] for val in [0,1,2,3]: nodes_by_core.append( sorted([k for k in cores if cores[k]==val])) assert_equal(nodes_by_core[0],[21]) assert_equal(nodes_by_core[1],[17, 18, 19, 20]) assert_equal(nodes_by_core[2],[9, 10, 11, 12, 13, 14, 15, 16]) assert_equal(nodes_by_core[3], [1, 2, 3, 4, 5, 6, 7, 8]) def test_core_number(self): # smoke test real name cores=nx.core_number(self.G) def test_find_cores2(self): cores=nx.find_cores(self.H) nodes_by_core=[] for val in [0,1,2]: nodes_by_core.append( sorted([k for k in cores if cores[k]==val])) assert_equal(nodes_by_core[0],[0]) assert_equal(nodes_by_core[1],[1, 3]) assert_equal(nodes_by_core[2],[2, 4, 5, 6]) def test_main_core(self): main_core_subgraph=nx.k_core(self.H) assert_equal(sorted(main_core_subgraph.nodes()),[2,4,5,6]) def test_k_core(self): # k=0 k_core_subgraph=nx.k_core(self.H,k=0) assert_equal(sorted(k_core_subgraph.nodes()),sorted(self.H.nodes())) # k=1 k_core_subgraph=nx.k_core(self.H,k=1) assert_equal(sorted(k_core_subgraph.nodes()),[1,2,3,4,5,6]) # k=2 k_core_subgraph=nx.k_core(self.H,k=2) assert_equal(sorted(k_core_subgraph.nodes()),[2,4,5,6]) def test_main_crust(self): main_crust_subgraph=nx.k_crust(self.H) assert_equal(sorted(main_crust_subgraph.nodes()),[0,1,3]) def test_k_crust(self): # k=0 
k_crust_subgraph=nx.k_crust(self.H,k=2) assert_equal(sorted(k_crust_subgraph.nodes()),sorted(self.H.nodes())) # k=1 k_crust_subgraph=nx.k_crust(self.H,k=1) assert_equal(sorted(k_crust_subgraph.nodes()),[0,1,3]) # k=2 k_crust_subgraph=nx.k_crust(self.H,k=0) assert_equal(sorted(k_crust_subgraph.nodes()),[0]) def test_main_shell(self): main_shell_subgraph=nx.k_shell(self.H) assert_equal(sorted(main_shell_subgraph.nodes()),[2,4,5,6]) def test_k_shell(self): # k=0 k_shell_subgraph=nx.k_shell(self.H,k=2) assert_equal(sorted(k_shell_subgraph.nodes()),[2,4,5,6]) # k=1 k_shell_subgraph=nx.k_shell(self.H,k=1) assert_equal(sorted(k_shell_subgraph.nodes()),[1,3]) # k=2 k_shell_subgraph=nx.k_shell(self.H,k=0) assert_equal(sorted(k_shell_subgraph.nodes()),[0]) def test_k_corona(self): # k=0 k_corona_subgraph=nx.k_corona(self.H,k=2) assert_equal(sorted(k_corona_subgraph.nodes()),[2,4,5,6]) # k=1 k_corona_subgraph=nx.k_corona(self.H,k=1) assert_equal(sorted(k_corona_subgraph.nodes()),[1]) # k=2 k_corona_subgraph=nx.k_corona(self.H,k=0) assert_equal(sorted(k_corona_subgraph.nodes()),[0]) networkx-1.11/networkx/algorithms/tests/test_clique.py0000644000175000017500000001735612637544500023304 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from networkx import convert_node_labels_to_integers as cnlti class TestCliques: def setUp(self): z = [3, 4, 3, 4, 2, 4, 2, 1, 1, 1, 1] self.G = cnlti(nx.generators.havel_hakimi_graph(z), first_label=1) self.cl = list(nx.find_cliques(self.G)) H = nx.complete_graph(6) H = nx.relabel_nodes(H, dict([(i, i + 1) for i in range(6)])) H.remove_edges_from([(2, 6), (2, 5), (2, 4), (1, 3), (5, 3)]) self.H = H def test_find_cliques1(self): cl = list(nx.find_cliques(self.G)) rcl = nx.find_cliques_recursive(self.G) expected = [[2, 6, 1, 3], [2, 6, 4], [5, 4, 7], [8, 9], [10, 11]] assert_equal(sorted(map(sorted, cl)), sorted(map(sorted, rcl))) assert_equal(sorted(map(sorted, cl)), sorted(map(sorted, expected))) 
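# The clique tests above pin down the output of nx.find_cliques, which yields
# the maximal cliques of an undirected graph. A minimal standalone sketch of
# the same call on a smaller graph (illustrative only, not part of the
# archived test suite):

```python
import networkx as nx

# Two maximal cliques overlap at node 3: the triangle {1, 2, 3}
# and the single edge {3, 4}.
G = nx.Graph([(1, 2), (1, 3), (2, 3), (3, 4)])
cliques = sorted(map(sorted, nx.find_cliques(G)))
print(cliques)  # [[1, 2, 3], [3, 4]]
```

# Sorting both each clique and the outer list, as the tests do, makes the
# comparison independent of the enumeration order of the generator.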
def test_selfloops(self): self.G.add_edge(1, 1) cl = list(nx.find_cliques(self.G)) rcl = nx.find_cliques_recursive(self.G) assert_equal(sorted(map(sorted, cl)), sorted(map(sorted, rcl))) assert_equal(cl, [[2, 6, 1, 3], [2, 6, 4], [5, 4, 7], [8, 9], [10, 11]]) def test_find_cliques2(self): hcl = list(nx.find_cliques(self.H)) assert_equal(sorted(map(sorted, hcl)), [[1, 2], [1, 4, 5, 6], [2, 3], [3, 4, 6]]) def test_clique_number(self): G = self.G assert_equal(nx.graph_clique_number(G), 4) assert_equal(nx.graph_clique_number(G, cliques=self.cl), 4) def test_number_of_cliques(self): G = self.G assert_equal(nx.graph_number_of_cliques(G), 5) assert_equal(nx.graph_number_of_cliques(G, cliques=self.cl), 5) assert_equal(nx.number_of_cliques(G, 1), 1) assert_equal(list(nx.number_of_cliques(G, [1]).values()), [1]) assert_equal(list(nx.number_of_cliques(G, [1, 2]).values()), [1, 2]) assert_equal(nx.number_of_cliques(G, [1, 2]), {1: 1, 2: 2}) assert_equal(nx.number_of_cliques(G, 2), 2) assert_equal(nx.number_of_cliques(G), {1: 1, 2: 2, 3: 1, 4: 2, 5: 1, 6: 2, 7: 1, 8: 1, 9: 1, 10: 1, 11: 1}) assert_equal(nx.number_of_cliques(G, nodes=G.nodes()), {1: 1, 2: 2, 3: 1, 4: 2, 5: 1, 6: 2, 7: 1, 8: 1, 9: 1, 10: 1, 11: 1}) assert_equal(nx.number_of_cliques(G, nodes=[2, 3, 4]), {2: 2, 3: 1, 4: 2}) assert_equal(nx.number_of_cliques(G, cliques=self.cl), {1: 1, 2: 2, 3: 1, 4: 2, 5: 1, 6: 2, 7: 1, 8: 1, 9: 1, 10: 1, 11: 1}) assert_equal(nx.number_of_cliques(G, G.nodes(), cliques=self.cl), {1: 1, 2: 2, 3: 1, 4: 2, 5: 1, 6: 2, 7: 1, 8: 1, 9: 1, 10: 1, 11: 1}) def test_node_clique_number(self): G = self.G assert_equal(nx.node_clique_number(G, 1), 4) assert_equal(list(nx.node_clique_number(G, [1]).values()), [4]) assert_equal(list(nx.node_clique_number(G, [1, 2]).values()), [4, 4]) assert_equal(nx.node_clique_number(G, [1, 2]), {1: 4, 2: 4}) assert_equal(nx.node_clique_number(G, 1), 4) assert_equal(nx.node_clique_number(G), {1: 4, 2: 4, 3: 4, 4: 3, 5: 3, 6: 4, 7: 3, 8: 2, 9: 2, 10: 2, 11: 2}) 
assert_equal(nx.node_clique_number(G, cliques=self.cl), {1: 4, 2: 4, 3: 4, 4: 3, 5: 3, 6: 4, 7: 3, 8: 2, 9: 2, 10: 2, 11: 2}) def test_cliques_containing_node(self): G = self.G assert_equal(nx.cliques_containing_node(G, 1), [[2, 6, 1, 3]]) assert_equal(list(nx.cliques_containing_node(G, [1]).values()), [[[2, 6, 1, 3]]]) assert_equal(list(nx.cliques_containing_node(G, [1, 2]).values()), [[[2, 6, 1, 3]], [[2, 6, 1, 3], [2, 6, 4]]]) assert_equal(nx.cliques_containing_node(G, [1, 2]), {1: [[2, 6, 1, 3]], 2: [[2, 6, 1, 3], [2, 6, 4]]}) assert_equal(nx.cliques_containing_node(G, 1), [[2, 6, 1, 3]]) assert_equal(nx.cliques_containing_node(G, 2), [[2, 6, 1, 3], [2, 6, 4]]) assert_equal(nx.cliques_containing_node(G, 2, cliques=self.cl), [[2, 6, 1, 3], [2, 6, 4]]) assert_equal(len(nx.cliques_containing_node(G)), 11) def test_make_clique_bipartite(self): G = self.G B = nx.make_clique_bipartite(G) assert_equal(sorted(B.nodes()), [-5, -4, -3, -2, -1, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]) H = nx.project_down(B) assert_equal(H.adj, G.adj) H1 = nx.project_up(B) assert_equal(H1.nodes(), [1, 2, 3, 4, 5]) H2 = nx.make_max_clique_graph(G) assert_equal(H1.adj, H2.adj) @raises(nx.NetworkXNotImplemented) def test_directed(self): cliques = nx.find_cliques(nx.DiGraph()) class TestEnumerateAllCliques: def test_paper_figure_4(self): # Same graph as given in Fig. 4 of paper enumerate_all_cliques is # based on. 
# http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1559964&isnumber=33129 G = nx.Graph() edges_fig_4 = [('a', 'b'), ('a', 'c'), ('a', 'd'), ('a', 'e'), ('b', 'c'), ('b', 'd'), ('b', 'e'), ('c', 'd'), ('c', 'e'), ('d', 'e'), ('f', 'b'), ('f', 'c'), ('f', 'g'), ('g', 'f'), ('g', 'c'), ('g', 'd'), ('g', 'e')] G.add_edges_from(edges_fig_4) cliques = list(nx.enumerate_all_cliques(G)) clique_sizes = list(map(len, cliques)) assert_equal(sorted(clique_sizes), clique_sizes) expected_cliques = [['a'], ['b'], ['c'], ['d'], ['e'], ['f'], ['g'], ['a', 'b'], ['a', 'b', 'd'], ['a', 'b', 'd', 'e'], ['a', 'b', 'e'], ['a', 'c'], ['a', 'c', 'd'], ['a', 'c', 'd', 'e'], ['a', 'c', 'e'], ['a', 'd'], ['a', 'd', 'e'], ['a', 'e'], ['b', 'c'], ['b', 'c', 'd'], ['b', 'c', 'd', 'e'], ['b', 'c', 'e'], ['b', 'c', 'f'], ['b', 'd'], ['b', 'd', 'e'], ['b', 'e'], ['b', 'f'], ['c', 'd'], ['c', 'd', 'e'], ['c', 'd', 'e', 'g'], ['c', 'd', 'g'], ['c', 'e'], ['c', 'e', 'g'], ['c', 'f'], ['c', 'f', 'g'], ['c', 'g'], ['d', 'e'], ['d', 'e', 'g'], ['d', 'g'], ['e', 'g'], ['f', 'g'], ['a', 'b', 'c'], ['a', 'b', 'c', 'd'], ['a', 'b', 'c', 'd', 'e'], ['a', 'b', 'c', 'e']] assert_equal(sorted(map(sorted, cliques)), sorted(map(sorted, expected_cliques))) networkx-1.11/networkx/algorithms/tests/test_hybrid.py0000644000175000017500000000145312637544450023276 0ustar aricaric00000000000000from nose.tools import * import networkx as nx def test_2d_grid_graph(): # FC article claims 2d grid graph of size n is (3,3)-connected # and (5,9)-connected, but I don't think it is (5,9)-connected G=nx.grid_2d_graph(8,8,periodic=True) assert_true(nx.is_kl_connected(G,3,3)) assert_false(nx.is_kl_connected(G,5,9)) (H,graphOK)=nx.kl_connected_subgraph(G,5,9,same_as_graph=True) assert_false(graphOK) def test_small_graph(): G=nx.Graph() G.add_edge(1,2) G.add_edge(1,3) G.add_edge(2,3) assert_true(nx.is_kl_connected(G,2,2)) H=nx.kl_connected_subgraph(G,2,2) (H,graphOK)=nx.kl_connected_subgraph(G,2,2, low_memory=True, 
same_as_graph=True) assert_true(graphOK) networkx-1.11/networkx/algorithms/tests/test_dominance.py0000644000175000017500000001527712637544450023763 0ustar aricaric00000000000000import networkx as nx from nose.tools import * class TestImmediateDominators(object): def test_exceptions(self): G = nx.Graph() G.add_node(0) assert_raises(nx.NetworkXNotImplemented, nx.immediate_dominators, G, 0) G = nx.MultiGraph(G) assert_raises(nx.NetworkXNotImplemented, nx.immediate_dominators, G, 0) G = nx.DiGraph([[0, 0]]) assert_raises(nx.NetworkXError, nx.immediate_dominators, G, 1) def test_singleton(self): G = nx.DiGraph() G.add_node(0) assert_equal(nx.immediate_dominators(G, 0), {0: 0}) G.add_edge(0, 0) assert_equal(nx.immediate_dominators(G, 0), {0: 0}) def test_path(self): n = 5 G = nx.path_graph(n, create_using=nx.DiGraph()) assert_equal(nx.immediate_dominators(G, 0), {i: max(i - 1, 0) for i in range(n)}) def test_cycle(self): n = 5 G = nx.cycle_graph(n, create_using=nx.DiGraph()) assert_equal(nx.immediate_dominators(G, 0), {i: max(i - 1, 0) for i in range(n)}) def test_unreachable(self): n = 5 assert_greater(n, 1) G = nx.path_graph(n, create_using=nx.DiGraph()) assert_equal(nx.immediate_dominators(G, n // 2), {i: max(i - 1, n // 2) for i in range(n // 2, n)}) def test_irreducible1(self): # Graph taken from Figure 2 of # K. D. Cooper, T. J. Harvey, and K. Kennedy. # A simple, fast dominance algorithm. # Software Practice & Experience, 4:110, 2001. edges = [(1, 2), (2, 1), (3, 2), (4, 1), (5, 3), (5, 4)] G = nx.DiGraph(edges) assert_equal(nx.immediate_dominators(G, 5), {i: 5 for i in range(1, 6)}) def test_irreducible2(self): # Graph taken from Figure 4 of # K. D. Cooper, T. J. Harvey, and K. Kennedy. # A simple, fast dominance algorithm. # Software Practice & Experience, 4:110, 2001. 
edges = [(1, 2), (2, 1), (2, 3), (3, 2), (4, 2), (4, 3), (5, 1), (6, 4), (6, 5)] G = nx.DiGraph(edges) assert_equal(nx.immediate_dominators(G, 6), {i: 6 for i in range(1, 7)}) def test_domrel_png(self): # Graph taken from https://commons.wikipedia.org/wiki/File:Domrel.png edges = [(1, 2), (2, 3), (2, 4), (2, 6), (3, 5), (4, 5), (5, 2)] G = nx.DiGraph(edges) assert_equal(nx.immediate_dominators(G, 1), {1: 1, 2: 1, 3: 2, 4: 2, 5: 2, 6: 2}) # Test postdominance. with nx.utils.reversed(G): assert_equal(nx.immediate_dominators(G, 6), {1: 2, 2: 6, 3: 5, 4: 5, 5: 2, 6: 6}) def test_boost_example(self): # Graph taken from Figure 1 of # http://www.boost.org/doc/libs/1_56_0/libs/graph/doc/lengauer_tarjan_dominator.htm edges = [(0, 1), (1, 2), (1, 3), (2, 7), (3, 4), (4, 5), (4, 6), (5, 7), (6, 4)] G = nx.DiGraph(edges) assert_equal(nx.immediate_dominators(G, 0), {0: 0, 1: 0, 2: 1, 3: 1, 4: 3, 5: 4, 6: 4, 7: 1}) # Test postdominance. with nx.utils.reversed(G): assert_equal(nx.immediate_dominators(G, 7), {0: 1, 1: 7, 2: 7, 3: 4, 4: 5, 5: 7, 6: 4, 7: 7}) class TestDominanceFrontiers(object): def test_exceptions(self): G = nx.Graph() G.add_node(0) assert_raises(nx.NetworkXNotImplemented, nx.dominance_frontiers, G, 0) G = nx.MultiGraph(G) assert_raises(nx.NetworkXNotImplemented, nx.dominance_frontiers, G, 0) G = nx.DiGraph([[0, 0]]) assert_raises(nx.NetworkXError, nx.dominance_frontiers, G, 1) def test_singleton(self): G = nx.DiGraph() G.add_node(0) assert_equal(nx.dominance_frontiers(G, 0), {0: []}) G.add_edge(0, 0) assert_equal(nx.dominance_frontiers(G, 0), {0: []}) def test_path(self): n = 5 G = nx.path_graph(n, create_using=nx.DiGraph()) assert_equal(nx.dominance_frontiers(G, 0), {i: [] for i in range(n)}) def test_cycle(self): n = 5 G = nx.cycle_graph(n, create_using=nx.DiGraph()) assert_equal(nx.dominance_frontiers(G, 0), {i: [] for i in range(n)}) def test_unreachable(self): n = 5 assert_greater(n, 1) G = nx.path_graph(n, create_using=nx.DiGraph()) 
assert_equal(nx.dominance_frontiers(G, n // 2), {i: [] for i in range(n // 2, n)}) def test_irreducible1(self): # Graph taken from Figure 2 of # K. D. Cooper, T. J. Harvey, and K. Kennedy. # A simple, fast dominance algorithm. # Software Practice & Experience, 4:110, 2001. edges = [(1, 2), (2, 1), (3, 2), (4, 1), (5, 3), (5, 4)] G = nx.DiGraph(edges) assert_equal({u: sorted(df) for u, df in nx.dominance_frontiers(G, 5).items()}, {1: [2], 2: [1], 3: [2], 4: [1], 5: []}) def test_irreducible2(self): # Graph taken from Figure 4 of # K. D. Cooper, T. J. Harvey, and K. Kennedy. # A simple, fast dominance algorithm. # Software Practice & Experience, 4:110, 2001. edges = [(1, 2), (2, 1), (2, 3), (3, 2), (4, 2), (4, 3), (5, 1), (6, 4), (6, 5)] G = nx.DiGraph(edges) assert_equal(nx.dominance_frontiers(G, 6), {1: [2], 2: [1, 3], 3: [2], 4: [2, 3], 5: [1], 6: []}) def test_domrel_png(self): # Graph taken from https://commons.wikipedia.org/wiki/File:Domrel.png edges = [(1, 2), (2, 3), (2, 4), (2, 6), (3, 5), (4, 5), (5, 2)] G = nx.DiGraph(edges) assert_equal(nx.dominance_frontiers(G, 1), {1: [], 2: [], 3: [5], 4: [5], 5: [2], 6: []}) # Test postdominance. with nx.utils.reversed(G): assert_equal(nx.dominance_frontiers(G, 6), {1: [], 2: [], 3: [2], 4: [2], 5: [2], 6: []}) def test_boost_example(self): # Graph taken from Figure 1 of # http://www.boost.org/doc/libs/1_56_0/libs/graph/doc/lengauer_tarjan_dominator.htm edges = [(0, 1), (1, 2), (1, 3), (2, 7), (3, 4), (4, 5), (4, 6), (5, 7), (6, 4)] G = nx.DiGraph(edges) assert_equal(nx.dominance_frontiers(G, 0), {0: [], 1: [], 2: [7], 3: [7], 4: [7], 5: [7], 6: [4], 7: []}) # Test postdominance. 
with nx.utils.reversed(G): assert_equal(nx.dominance_frontiers(G, 7), {0: [], 1: [], 2: [1], 3: [1], 4: [1], 5: [1], 6: [4], 7: []}) networkx-1.11/networkx/algorithms/tests/test_distance_measures.py0000644000175000017500000000413112637544500025503 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx class TestDistance: def setUp(self): G=networkx.Graph() from networkx import convert_node_labels_to_integers as cnlti G=cnlti(networkx.grid_2d_graph(4,4),first_label=1,ordering="sorted") self.G=G def test_eccentricity(self): assert_equal(networkx.eccentricity(self.G,1),6) e=networkx.eccentricity(self.G) assert_equal(e[1],6) sp=networkx.shortest_path_length(self.G) e=networkx.eccentricity(self.G,sp=sp) assert_equal(e[1],6) e=networkx.eccentricity(self.G,v=1) assert_equal(e,6) e=networkx.eccentricity(self.G,v=[1,1]) #This behavior changed in version 1.8 (ticket #739) assert_equal(e[1],6) e=networkx.eccentricity(self.G,v=[1,2]) assert_equal(e[1],6) # test against graph with one node G=networkx.path_graph(1) e=networkx.eccentricity(G) assert_equal(e[0],0) e=networkx.eccentricity(G,v=0) assert_equal(e,0) assert_raises(networkx.NetworkXError, networkx.eccentricity, G, 1) # test against empty graph G=networkx.empty_graph() e=networkx.eccentricity(G) assert_equal(e,{}) def test_diameter(self): assert_equal(networkx.diameter(self.G),6) def test_radius(self): assert_equal(networkx.radius(self.G),4) def test_periphery(self): assert_equal(set(networkx.periphery(self.G)),set([1, 4, 13, 16])) def test_center(self): assert_equal(set(networkx.center(self.G)),set([6, 7, 10, 11])) def test_radius_exception(self): G=networkx.Graph() G.add_edge(1,2) G.add_edge(3,4) assert_raises(networkx.NetworkXError, networkx.diameter, G) @raises(networkx.NetworkXError) def test_eccentricity_infinite(self): G=networkx.Graph([(1,2),(3,4)]) e = networkx.eccentricity(G) @raises(networkx.NetworkXError) def test_eccentricity_invalid(self): G=networkx.Graph([(1,2),(3,4)]) 
        e = networkx.eccentricity(G, sp=1)

networkx-1.11/networkx/algorithms/tests/test_euler.py

#!/usr/bin/env python
# run with nose: nosetests -v test_euler.py
from nose.tools import *
import networkx as nx
from networkx import is_eulerian, eulerian_circuit


class TestEuler:

    def test_is_eulerian(self):
        assert_true(is_eulerian(nx.complete_graph(5)))
        assert_true(is_eulerian(nx.complete_graph(7)))
        assert_true(is_eulerian(nx.hypercube_graph(4)))
        assert_true(is_eulerian(nx.hypercube_graph(6)))
        assert_false(is_eulerian(nx.complete_graph(4)))
        assert_false(is_eulerian(nx.complete_graph(6)))
        assert_false(is_eulerian(nx.hypercube_graph(3)))
        assert_false(is_eulerian(nx.hypercube_graph(5)))
        assert_false(is_eulerian(nx.petersen_graph()))
        assert_false(is_eulerian(nx.path_graph(4)))

    def test_is_eulerian2(self):
        # not connected
        G = nx.Graph()
        G.add_nodes_from([1, 2, 3])
        assert_false(is_eulerian(G))
        # not strongly connected
        G = nx.DiGraph()
        G.add_nodes_from([1, 2, 3])
        assert_false(is_eulerian(G))
        G = nx.MultiDiGraph()
        G.add_edge(1, 2)
        G.add_edge(2, 3)
        G.add_edge(2, 3)
        G.add_edge(3, 1)
        assert_false(is_eulerian(G))

    def test_eulerian_circuit_cycle(self):
        G = nx.cycle_graph(4)
        edges = list(eulerian_circuit(G, source=0))
        nodes = [u for u, v in edges]
        assert_equal(nodes, [0, 3, 2, 1])
        assert_equal(edges, [(0, 3), (3, 2), (2, 1), (1, 0)])
        edges = list(eulerian_circuit(G, source=1))
        nodes = [u for u, v in edges]
        assert_equal(nodes, [1, 2, 3, 0])
        assert_equal(edges, [(1, 2), (2, 3), (3, 0), (0, 1)])
        G = nx.complete_graph(3)
        edges = list(eulerian_circuit(G, source=0))
        nodes = [u for u, v in edges]
        assert_equal(nodes, [0, 2, 1])
        assert_equal(edges, [(0, 2), (2, 1), (1, 0)])
        edges = list(eulerian_circuit(G, source=1))
        nodes = [u for u, v in edges]
        assert_equal(nodes, [1, 2, 0])
        assert_equal(edges, [(1, 2), (2, 0), (0, 1)])

    def test_eulerian_circuit_digraph(self):
        G = nx.DiGraph()
        G.add_cycle([0, 1, 2, 3])
        edges = list(eulerian_circuit(G, source=0))
        nodes = [u for u, v in edges]
        assert_equal(nodes, [0, 1, 2, 3])
        assert_equal(edges, [(0, 1), (1, 2), (2, 3), (3, 0)])
        edges = list(eulerian_circuit(G, source=1))
        nodes = [u for u, v in edges]
        assert_equal(nodes, [1, 2, 3, 0])
        assert_equal(edges, [(1, 2), (2, 3), (3, 0), (0, 1)])

    def test_eulerian_circuit_multigraph(self):
        G = nx.MultiGraph()
        G.add_cycle([0, 1, 2, 3])
        G.add_edge(1, 2)
        G.add_edge(1, 2)
        edges = list(eulerian_circuit(G, source=0))
        nodes = [u for u, v in edges]
        assert_equal(nodes, [0, 3, 2, 1, 2, 1])
        assert_equal(edges, [(0, 3), (3, 2), (2, 1), (1, 2), (2, 1), (1, 0)])

    @raises(nx.NetworkXError)
    def test_not_eulerian(self):
        f = list(eulerian_circuit(nx.complete_graph(4)))

networkx-1.11/networkx/algorithms/mis.py

# -*- coding: utf-8 -*-
# $Id: maximalIndependentSet.py 576 2011-03-01 05:50:34Z lleeoo $
"""
Algorithm to find a maximal (not maximum) independent set.

"""
# Leo Lopes
# Aric Hagberg
# Dan Schult
# Pieter Swart
# All rights reserved.
# BSD license.
__author__ = "\n".join(["Leo Lopes ",
                        "Loïc Séguin-C. "])

__all__ = ['maximal_independent_set']

import random
import networkx as nx


def maximal_independent_set(G, nodes=None):
    """Return a random maximal independent set guaranteed to contain
    a given set of nodes.

    An independent set is a set of nodes such that the subgraph
    of G induced by these nodes contains no edges. A maximal
    independent set is an independent set such that it is not possible
    to add a new node and still get an independent set.

    Parameters
    ----------
    G : NetworkX graph

    nodes : list or iterable
       Nodes that must be part of the independent set. This set of nodes
       must be independent.

    Returns
    -------
    indep_nodes : list
       List of nodes that are part of a maximal independent set.

    Raises
    ------
    NetworkXUnfeasible
       If the nodes in the provided list are not part of the graph or
       do not form an independent set, an exception is raised.
    Examples
    --------
    >>> G = nx.path_graph(5)
    >>> nx.maximal_independent_set(G) # doctest: +SKIP
    [4, 0, 2]
    >>> nx.maximal_independent_set(G, [1]) # doctest: +SKIP
    [1, 3]

    Notes
    -----
    This algorithm does not solve the maximum independent set problem.

    """
    if not nodes:
        nodes = set([random.choice(G.nodes())])
    else:
        nodes = set(nodes)
    if not nodes.issubset(G):
        raise nx.NetworkXUnfeasible(
            "%s is not a subset of the nodes of G" % nodes)
    neighbors = set.union(*[set(G.neighbors(v)) for v in nodes])
    if set.intersection(neighbors, nodes):
        raise nx.NetworkXUnfeasible(
            "%s is not an independent set of G" % nodes)
    indep_nodes = list(nodes)
    available_nodes = set(G.nodes()).difference(neighbors.union(nodes))
    while available_nodes:
        node = random.choice(list(available_nodes))
        indep_nodes.append(node)
        available_nodes.difference_update(G.neighbors(node) + [node])
    return indep_nodes

networkx-1.11/networkx/algorithms/richclub.py

# -*- coding: utf-8 -*-
import networkx as nx

__author__ = """\n""".join(['Ben Edwards',
                            'Aric Hagberg '])

__all__ = ['rich_club_coefficient']


def rich_club_coefficient(G, normalized=True, Q=100):
    """Return the rich-club coefficient of the graph G.

    The rich-club coefficient is the ratio, for every degree k, of the
    number of actual to the number of potential edges for nodes
    with degree greater than k:

    .. math::

        \\phi(k) = \\frac{2 E_k}{N_k (N_k - 1)}

    where `N_k` is the number of nodes with degree larger than k, and
    `E_k` is the number of edges among those nodes.

    Parameters
    ----------
    G : NetworkX graph

    normalized : bool (optional)
       Normalize using randomized network (see [1]_)

    Q : float (optional, default=100)
       If normalized=True, build a random network by performing
       Q*M double-edge swaps, where M is the number of edges in G,
       to use as a null-model for normalization.

    Returns
    -------
    rc : dictionary
       A dictionary, keyed by degree, with rich club coefficient values.
    Examples
    --------
    >>> G = nx.Graph([(0, 1), (0, 2), (1, 2), (1, 3), (1, 4), (4, 5)])
    >>> rc = nx.rich_club_coefficient(G, normalized=False)
    >>> rc[0] # doctest: +SKIP
    0.4

    Notes
    -----
    The rich club definition and algorithm are found in [1]_.  This
    algorithm ignores any edge weights and is not defined for directed
    graphs or graphs with parallel edges or self loops.

    Estimates for appropriate values of Q are found in [2]_.

    References
    ----------
    .. [1] Julian J. McAuley, Luciano da Fontoura Costa,
       and Tibério S. Caetano,
       "The rich-club phenomenon across complex network hierarchies",
       Applied Physics Letters Vol 91 Issue 8, August 2007.
       http://arxiv.org/abs/physics/0701290
    .. [2] R. Milo, N. Kashtan, S. Itzkovitz, M. E. J. Newman, U. Alon,
       "Uniform generation of random graphs with arbitrary degree
       sequences", 2006. http://arxiv.org/abs/cond-mat/0312028
    """
    if G.is_multigraph() or G.is_directed():
        raise Exception('rich_club_coefficient is not implemented for '
                        'directed or multiedge graphs.')
    if len(G.selfloop_edges()) > 0:
        raise Exception('rich_club_coefficient is not implemented for '
                        'graphs with self loops.')
    rc = _compute_rc(G)
    if normalized:
        # make R a copy of G, randomize with Q*|E| double edge swaps
        # and use rich_club coefficient of R to normalize
        R = G.copy()
        E = R.number_of_edges()
        nx.double_edge_swap(R, Q * E, max_tries=Q * E * 10)
        rcran = _compute_rc(R)
        for d in rc:
            # if rcran[d] > 0:
            rc[d] /= rcran[d]
    return rc


def _compute_rc(G):
    # compute rich club coefficient for all k degrees in G
    deghist = nx.degree_histogram(G)
    total = sum(deghist)
    # number of nodes with degree > k (omit last entry which is zero)
    nks = [total - cs for cs in nx.utils.accumulate(deghist) if total - cs > 1]
    deg = G.degree()
    edge_degrees = sorted(sorted((deg[u], deg[v])) for u, v in G.edges_iter())
    ek = G.number_of_edges()
    k1, k2 = edge_degrees.pop(0)
    rc = {}
    for d, nk in zip(range(len(nks)), nks):
        while k1 <= d:
            if len(edge_degrees) == 0:
                break
            k1, k2 = edge_degrees.pop(0)
            ek -= 1
        rc[d] = 2.0 * ek / (nk * (nk - 1))
    return rc
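As a sanity check on the formula phi(k) = 2*E_k/(N_k*(N_k-1)) in the docstring above, the k = 0 coefficient of the small doctest graph can be computed by hand. This is a dependency-free sketch of the definition, not the NetworkX implementation:

```python
# Verify phi(0) = 2*E0 / (N0*(N0-1)) for the 6-node doctest graph.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (1, 4), (4, 5)]

# Tally node degrees from the edge list.
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1

k = 0
rich = {n for n, d in degree.items() if d > k}  # nodes with degree > k
Ek = sum(1 for u, v in edges if u in rich and v in rich)
Nk = len(rich)
phi = 2.0 * Ek / (Nk * (Nk - 1))
print(phi)  # all 6 nodes and 6 edges qualify: 2*6/(6*5) = 0.4
```

The result matches the `rc[0]` doctest value of 0.4 for `normalized=False`.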
networkx-1.11/networkx/algorithms/cycles.py

"""
========================
Cycle finding algorithms
========================
"""
#    Copyright (C) 2010-2012 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
from collections import defaultdict

import networkx as nx
from networkx.utils import *
from networkx.algorithms.traversal.edgedfs import helper_funcs, edge_dfs

__all__ = [
    'cycle_basis', 'simple_cycles', 'recursive_simple_cycles', 'find_cycle'
]

__author__ = "\n".join(['Jon Olav Vik ',
                        'Dan Schult ',
                        'Aric Hagberg '])


@not_implemented_for('directed')
@not_implemented_for('multigraph')
def cycle_basis(G, root=None):
    """ Returns a list of cycles which form a basis for cycles of G.

    A basis for cycles of a network is a minimal collection of
    cycles such that any cycle in the network can be written
    as a sum of cycles in the basis.  Here summation of cycles
    is defined as "exclusive or" of the edges. Cycle bases are
    useful, e.g. when deriving equations for electric circuits
    using Kirchhoff's Laws.

    Parameters
    ----------
    G : NetworkX Graph
    root : node, optional
       Specify starting node for basis.

    Returns
    -------
    A list of cycle lists.  Each cycle list is a list of nodes
    which forms a cycle (loop) in G.

    Examples
    --------
    >>> G = nx.Graph()
    >>> G.add_cycle([0, 1, 2, 3])
    >>> G.add_cycle([0, 3, 4, 5])
    >>> print(nx.cycle_basis(G, 0))
    [[3, 4, 5, 0], [1, 2, 3, 0]]

    Notes
    -----
    This is adapted from algorithm CACM 491 [1]_.

    References
    ----------
    .. [1] Paton, K. An algorithm for finding a fundamental set of
       cycles of a graph. Comm. ACM 12, 9 (Sept 1969), 514-518.
    See Also
    --------
    simple_cycles
    """
    gnodes = set(G.nodes())
    cycles = []
    while gnodes:  # loop over connected components
        if root is None:
            root = gnodes.pop()
        stack = [root]
        pred = {root: root}
        used = {root: set()}
        while stack:  # walk the spanning tree finding cycles
            z = stack.pop()  # use last-in so cycles easier to find
            zused = used[z]
            for nbr in G[z]:
                if nbr not in used:  # new node
                    pred[nbr] = z
                    stack.append(nbr)
                    used[nbr] = set([z])
                elif nbr == z:  # self loops
                    cycles.append([z])
                elif nbr not in zused:  # found a cycle
                    pn = used[nbr]
                    cycle = [nbr, z]
                    p = pred[z]
                    while p not in pn:
                        cycle.append(p)
                        p = pred[p]
                    cycle.append(p)
                    cycles.append(cycle)
                    used[nbr].add(z)
        gnodes -= set(pred)
        root = None
    return cycles


@not_implemented_for('undirected')
def simple_cycles(G):
    """Find simple cycles (elementary circuits) of a directed graph.

    A simple cycle, or elementary circuit, is a closed path where
    no node appears twice, except that the first and last node are
    the same. Two elementary circuits are distinct if they are not
    cyclic permutations of each other.

    This is a nonrecursive, iterator/generator version of Johnson's
    algorithm [1]_.  There may be better algorithms for some cases [2]_ [3]_.

    Parameters
    ----------
    G : NetworkX DiGraph
       A directed graph

    Returns
    -------
    cycle_generator: generator
       A generator that produces elementary cycles of the graph.
       Each cycle is a list of nodes with the first and last nodes
       being the same.

    Examples
    --------
    >>> G = nx.DiGraph([(0, 0), (0, 1), (0, 2), (1, 2), (2, 0), (2, 1), (2, 2)])
    >>> len(list(nx.simple_cycles(G)))
    5

    To filter the cycles so that they don't include certain nodes or edges,
    copy your graph and eliminate those nodes or edges before calling

    >>> copyG = G.copy()
    >>> copyG.remove_nodes_from([1])
    >>> copyG.remove_edges_from([(0, 1)])
    >>> len(list(nx.simple_cycles(copyG)))
    3

    Notes
    -----
    The implementation follows pp. 79-80 in [1]_.

    The time complexity is `O((n+e)(c+1))` for `n` nodes, `e` edges and `c`
    elementary circuits.

    References
    ----------
    .. [1] Finding all the elementary circuits of a directed graph.
       D. B. Johnson, SIAM Journal on Computing 4, no. 1, 77-84, 1975.
       http://dx.doi.org/10.1137/0204007
    .. [2] Enumerating the cycles of a digraph: a new preprocessing strategy.
       G. Loizou and P. Thanish, Information Sciences, v. 27, 163-182, 1982.
    .. [3] A search strategy for the elementary cycles of a directed graph.
       J.L. Szwarcfiter and P.E. Lauer, BIT NUMERICAL MATHEMATICS,
       v. 16, no. 2, 192-204, 1976.

    See Also
    --------
    cycle_basis
    """
    def _unblock(thisnode, blocked, B):
        stack = set([thisnode])
        while stack:
            node = stack.pop()
            if node in blocked:
                blocked.remove(node)
                stack.update(B[node])
                B[node].clear()

    # Johnson's algorithm requires some ordering of the nodes.
    # We assign the arbitrary ordering given by the strongly connected comps
    # There is no need to track the ordering as each node removed as processed.
    subG = type(G)(G.edges_iter())  # save the actual graph so we can mutate it here
    # We only take the edges because we do not want to
    # copy edge and node attributes here.
    sccs = list(nx.strongly_connected_components(subG))
    while sccs:
        scc = sccs.pop()
        # order of scc determines ordering of nodes
        startnode = scc.pop()
        # Processing node runs "circuit" routine from recursive version
        path = [startnode]
        blocked = set()  # vertex: blocked from search?
        closed = set()   # nodes involved in a cycle
        blocked.add(startnode)
        B = defaultdict(set)  # graph portions that yield no elementary circuit
        stack = [(startnode, list(subG[startnode]))]  # subG gives component nbrs
        while stack:
            thisnode, nbrs = stack[-1]
            if nbrs:
                nextnode = nbrs.pop()
                # print thisnode,nbrs,":",nextnode,blocked,B,path,stack,startnode
                # f=raw_input("pause")
                if nextnode == startnode:
                    yield path[:]
                    closed.update(path)
                    # print "Found a cycle",path,closed
                elif nextnode not in blocked:
                    path.append(nextnode)
                    stack.append((nextnode, list(subG[nextnode])))
                    closed.discard(nextnode)
                    blocked.add(nextnode)
                    continue
            # done with nextnode... look for more neighbors
            if not nbrs:  # no more nbrs
                if thisnode in closed:
                    _unblock(thisnode, blocked, B)
                else:
                    for nbr in subG[thisnode]:
                        if thisnode not in B[nbr]:
                            B[nbr].add(thisnode)
                stack.pop()
                # assert path[-1] == thisnode
                path.pop()
        # done processing this node
        subG.remove_node(startnode)
        H = subG.subgraph(scc)  # make smaller to avoid work in SCC routine
        sccs.extend(list(nx.strongly_connected_components(H)))


@not_implemented_for('undirected')
def recursive_simple_cycles(G):
    """Find simple cycles (elementary circuits) of a directed graph.

    A simple cycle, or elementary circuit, is a closed path where
    no node appears twice, except that the first and last node are
    the same. Two elementary circuits are distinct if they are not
    cyclic permutations of each other.

    This version uses a recursive algorithm to build a list of cycles.
    You should probably use the iterator version called simple_cycles().
    Warning: This recursive version uses lots of RAM!

    Parameters
    ----------
    G : NetworkX DiGraph
       A directed graph

    Returns
    -------
    A list of circuits, where each circuit is a list of nodes, with
    the first and last node being the same.

    Example:

    >>> G = nx.DiGraph([(0, 0), (0, 1), (0, 2), (1, 2), (2, 0), (2, 1), (2, 2)])
    >>> nx.recursive_simple_cycles(G)
    [[0], [0, 1, 2], [0, 2], [1, 2], [2]]

    See Also
    --------
    cycle_basis (for undirected graphs)

    Notes
    -----
    The implementation follows pp. 79-80 in [1]_.

    The time complexity is `O((n+e)(c+1))` for `n` nodes, `e` edges and `c`
    elementary circuits.

    References
    ----------
    .. [1] Finding all the elementary circuits of a directed graph.
       D. B. Johnson, SIAM Journal on Computing 4, no. 1, 77-84, 1975.
       http://dx.doi.org/10.1137/0204007

    See Also
    --------
    simple_cycles, cycle_basis
    """
    # Jon Olav Vik, 2010-08-09
    def _unblock(thisnode):
        """Recursively unblock and remove nodes from B[thisnode]."""
        if blocked[thisnode]:
            blocked[thisnode] = False
            while B[thisnode]:
                _unblock(B[thisnode].pop())

    def circuit(thisnode, startnode, component):
        closed = False  # set to True if elementary path is closed
        path.append(thisnode)
        blocked[thisnode] = True
        for nextnode in component[thisnode]:  # direct successors of thisnode
            if nextnode == startnode:
                result.append(path[:])
                closed = True
            elif not blocked[nextnode]:
                if circuit(nextnode, startnode, component):
                    closed = True
        if closed:
            _unblock(thisnode)
        else:
            for nextnode in component[thisnode]:
                if thisnode not in B[nextnode]:  # TODO: use set for speedup?
                    B[nextnode].append(thisnode)
        path.pop()  # remove thisnode from path
        return closed

    path = []  # stack of nodes in current path
    blocked = defaultdict(bool)  # vertex: blocked from search?
    B = defaultdict(list)  # graph portions that yield no elementary circuit
    result = []  # list to accumulate the circuits found
    # Johnson's algorithm requires some ordering of the nodes.
    # They might not be sortable so we assign an arbitrary ordering.
    ordering = dict(zip(G, range(len(G))))
    for s in ordering:
        # Build the subgraph induced by s and following nodes in the ordering
        subgraph = G.subgraph(node for node in G
                              if ordering[node] >= ordering[s])
        # Find the strongly connected component in the subgraph
        # that contains the least node according to the ordering
        strongcomp = nx.strongly_connected_components(subgraph)
        mincomp = min(strongcomp,
                      key=lambda nodes: min(ordering[n] for n in nodes))
        component = G.subgraph(mincomp)
        if component:
            # smallest node in the component according to the ordering
            startnode = min(component, key=ordering.__getitem__)
            for node in component:
                blocked[node] = False
                B[node][:] = []
            dummy = circuit(startnode, startnode, component)
    return result


def find_cycle(G, source=None, orientation='original'):
    """
    Returns the edges of a cycle found via a directed, depth-first traversal.

    Parameters
    ----------
    G : graph
        A directed/undirected graph/multigraph.

    source : node, list of nodes
        The node from which the traversal begins. If ``None``, then a source
        is chosen arbitrarily and repeatedly until all edges from each node in
        the graph are searched.

    orientation : 'original' | 'reverse' | 'ignore'
        For directed graphs and directed multigraphs, edge traversals need not
        respect the original orientation of the edges. When set to 'reverse',
        then every edge will be traversed in the reverse direction. When set to
        'ignore', then each directed edge is treated as a single undirected
        edge that can be traversed in either direction. For undirected graphs
        and undirected multigraphs, this parameter is meaningless and is not
        consulted by the algorithm.

    Returns
    -------
    edges : directed edges
        A list of directed edges indicating the path taken for the loop. If
        no cycle is found, then a ``NetworkXNoCycle`` exception is raised.
        For graphs, an edge is of the form (u, v) where ``u`` and ``v`` are
        the tail and head of the edge as determined by the traversal.
        For multigraphs, an edge is of the form (u, v, key), where ``key`` is
        the key of the edge. When the graph is directed, then ``u`` and ``v``
        are always in the order of the actual directed edge. If orientation is
        'ignore', then an edge takes the form (u, v, key, direction) where
        direction indicates if the edge was followed in the forward (tail to
        head) or reverse (head to tail) direction. When the direction is
        forward, the value of ``direction`` is 'forward'. When the direction
        is reverse, the value of ``direction`` is 'reverse'.

    Examples
    --------
    In this example, we construct a DAG and find, in the first call, that
    there are no directed cycles, and so an exception is raised. In the
    second call, we ignore edge orientations and find that there is an
    undirected cycle. Note that the second call finds a directed cycle while
    effectively traversing an undirected graph, and so, we found an
    "undirected cycle". This means that this DAG structure does not form a
    directed tree (which is also known as a polytree).

    >>> import networkx as nx
    >>> G = nx.DiGraph([(0, 1), (0, 2), (1, 2)])
    >>> try:
    ...    find_cycle(G, orientation='original')
    ... except:
    ...    pass
    ...
    >>> list(find_cycle(G, orientation='ignore'))
    [(0, 1, 'forward'), (1, 2, 'forward'), (0, 2, 'reverse')]

    """
    out_edge, key, tailhead = helper_funcs(G, orientation)

    explored = set()
    cycle = []
    final_node = None
    for start_node in G.nbunch_iter(source):
        if start_node in explored:
            # No loop is possible.
            continue

        edges = []
        # All nodes seen in this iteration of edge_dfs
        seen = {start_node}
        # Nodes in active path.
        active_nodes = {start_node}
        previous_node = None
        for edge in edge_dfs(G, start_node, orientation):
            # Determine if this edge is a continuation of the active path.
            tail, head = tailhead(edge)
            if previous_node is not None and tail != previous_node:
                # This edge results from backtracking.
                # Pop until we get a node whose head equals the current tail.
                # So for example, we might have:
                #  (0, 1), (1, 2), (2, 3), (1, 4)
                # which must become:
                #  (0, 1), (1, 4)
                while True:
                    try:
                        popped_edge = edges.pop()
                    except IndexError:
                        edges = []
                        active_nodes = {tail}
                        break
                    else:
                        popped_head = tailhead(popped_edge)[1]
                        active_nodes.remove(popped_head)

                    if edges:
                        last_head = tailhead(edges[-1])[1]
                        if tail == last_head:
                            break
            edges.append(edge)

            if head in active_nodes:
                # We have a loop!
                cycle.extend(edges)
                final_node = head
                break
            elif head in explored:
                # Then we've already explored it. No loop is possible.
                break
            else:
                seen.add(head)
                active_nodes.add(head)
                previous_node = head

        if cycle:
            break
        else:
            explored.update(seen)
    else:
        assert(len(cycle) == 0)
        raise nx.exception.NetworkXNoCycle('No cycle found.')

    # We now have a list of edges which ends on a cycle.
    # So we need to remove from the beginning edges that are not relevant.
    for i, edge in enumerate(cycle):
        tail, head = tailhead(edge)
        if tail == final_node:
            break

    return cycle[i:]

networkx-1.11/networkx/algorithms/components/attracting.py

# -*- coding: utf-8 -*-
"""Attracting components."""
#    Copyright (C) 2004-2015 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
import networkx as nx
from networkx.utils.decorators import not_implemented_for

__authors__ = "\n".join(['Christopher Ellison'])

__all__ = ['number_attracting_components',
           'attracting_components',
           'is_attracting_component',
           'attracting_component_subgraphs',
           ]


@not_implemented_for('undirected')
def attracting_components(G):
    """Generates a list of attracting components in `G`.

    An attracting component in a directed graph `G` is a strongly connected
    component with the property that a random walker on the graph will never
    leave the component, once it enters the component.
    The nodes in attracting components can also be thought of as recurrent
    nodes.  If a random walker enters the attractor containing the node, then
    the node will be visited infinitely often.

    Parameters
    ----------
    G : DiGraph, MultiDiGraph
        The graph to be analyzed.

    Returns
    -------
    attractors : generator of sets
        A generator of sets of nodes, one for each attracting component of G.

    See Also
    --------
    number_attracting_components
    is_attracting_component
    attracting_component_subgraphs

    """
    scc = list(nx.strongly_connected_components(G))
    cG = nx.condensation(G, scc)
    for n in cG:
        if cG.out_degree(n) == 0:
            yield scc[n]


@not_implemented_for('undirected')
def number_attracting_components(G):
    """Returns the number of attracting components in `G`.

    Parameters
    ----------
    G : DiGraph, MultiDiGraph
        The graph to be analyzed.

    Returns
    -------
    n : int
        The number of attracting components in G.

    See Also
    --------
    attracting_components
    is_attracting_component
    attracting_component_subgraphs

    """
    n = len(list(attracting_components(G)))
    return n


@not_implemented_for('undirected')
def is_attracting_component(G):
    """Returns True if `G` consists of a single attracting component.

    Parameters
    ----------
    G : DiGraph, MultiDiGraph
        The graph to be analyzed.

    Returns
    -------
    attracting : bool
        True if `G` has a single attracting component. Otherwise, False.

    See Also
    --------
    attracting_components
    number_attracting_components
    attracting_component_subgraphs

    """
    ac = list(attracting_components(G))
    if len(ac[0]) == len(G):
        attracting = True
    else:
        attracting = False
    return attracting


@not_implemented_for('undirected')
def attracting_component_subgraphs(G, copy=True):
    """Generates a list of attracting component subgraphs from `G`.

    Parameters
    ----------
    G : DiGraph, MultiDiGraph
        The graph to be analyzed.

    Returns
    -------
    subgraphs : list
        A list of node-induced subgraphs of the attracting components of `G`.

    copy : bool
        If copy is True, graph, node, and edge attributes are copied to the
        subgraphs.
    See Also
    --------
    attracting_components
    number_attracting_components
    is_attracting_component

    """
    for ac in attracting_components(G):
        if copy:
            yield G.subgraph(ac).copy()
        else:
            yield G.subgraph(ac)

networkx-1.11/networkx/algorithms/components/weakly_connected.py

# -*- coding: utf-8 -*-
"""Weakly connected components."""
#    Copyright (C) 2004-2015 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
import networkx as nx
from networkx.utils.decorators import not_implemented_for

__authors__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)',
                         'Christopher Ellison'])

__all__ = [
    'number_weakly_connected_components',
    'weakly_connected_components',
    'weakly_connected_component_subgraphs',
    'is_weakly_connected',
]


@not_implemented_for('undirected')
def weakly_connected_components(G):
    """Generate weakly connected components of G.

    Parameters
    ----------
    G : NetworkX graph
        A directed graph

    Returns
    -------
    comp : generator of sets
        A generator of sets of nodes, one for each weakly connected
        component of G.

    Examples
    --------
    Generate a sorted list of weakly connected components, largest first.

    >>> G = nx.path_graph(4, create_using=nx.DiGraph())
    >>> G.add_path([10, 11, 12])
    >>> [len(c) for c in sorted(nx.weakly_connected_components(G),
    ...                         key=len, reverse=True)]
    [4, 3]

    If you only want the largest component, it's more efficient to
    use max instead of sort.

    >>> largest_cc = max(nx.weakly_connected_components(G), key=len)

    See Also
    --------
    strongly_connected_components

    Notes
    -----
    For directed graphs only.

    """
    seen = set()
    for v in G:
        if v not in seen:
            c = set(_plain_bfs(G, v))
            yield c
            seen.update(c)


@not_implemented_for('undirected')
def number_weakly_connected_components(G):
    """Return the number of weakly connected components in G.

    Parameters
    ----------
    G : NetworkX graph
        A directed graph.
    Returns
    -------
    n : integer
        Number of weakly connected components

    See Also
    --------
    connected_components

    Notes
    -----
    For directed graphs only.

    """
    return len(list(weakly_connected_components(G)))


@not_implemented_for('undirected')
def weakly_connected_component_subgraphs(G, copy=True):
    """Generate weakly connected components as subgraphs.

    Parameters
    ----------
    G : NetworkX graph
        A directed graph.

    copy: bool (default=True)
        If True make a copy of the graph attributes

    Returns
    -------
    comp : generator
        A generator of graphs, one for each weakly connected component of G.

    Examples
    --------
    Generate a sorted list of weakly connected components, largest first.

    >>> G = nx.path_graph(4, create_using=nx.DiGraph())
    >>> G.add_path([10, 11, 12])
    >>> [len(c) for c in sorted(nx.weakly_connected_component_subgraphs(G),
    ...                         key=len, reverse=True)]
    [4, 3]

    If you only want the largest component, it's more efficient to
    use max instead of sort.

    >>> Gc = max(nx.weakly_connected_component_subgraphs(G), key=len)

    See Also
    --------
    strongly_connected_components
    connected_components

    Notes
    -----
    For directed graphs only.
    Graph, node, and edge attributes are copied to the subgraphs by default.

    """
    for comp in weakly_connected_components(G):
        if copy:
            yield G.subgraph(comp).copy()
        else:
            yield G.subgraph(comp)


@not_implemented_for('undirected')
def is_weakly_connected(G):
    """Test directed graph for weak connectivity.

    A directed graph is weakly connected if, and only if, the graph
    is connected when the direction of the edge between nodes is ignored.

    Parameters
    ----------
    G : NetworkX Graph
        A directed graph.

    Returns
    -------
    connected : bool
        True if the graph is weakly connected, False otherwise.

    See Also
    --------
    is_strongly_connected
    is_semiconnected
    is_connected

    Notes
    -----
    For directed graphs only.
""" if len(G) == 0: raise nx.NetworkXPointlessConcept( """Connectivity is undefined for the null graph.""") return len(list(weakly_connected_components(G))[0]) == len(G) def _plain_bfs(G, source): """A fast BFS node generator The direction of the edge between nodes is ignored. For directed graphs only. """ Gsucc = G.succ Gpred = G.pred seen = set() nextlevel = {source} while nextlevel: thislevel = nextlevel nextlevel = set() for v in thislevel: if v not in seen: yield v seen.add(v) nextlevel.update(Gsucc[v]) nextlevel.update(Gpred[v]) networkx-1.11/networkx/algorithms/components/__init__.py0000644000175000017500000000054112637544450023535 0ustar aricaric00000000000000from networkx.algorithms.components.connected import * from networkx.algorithms.components.strongly_connected import * from networkx.algorithms.components.weakly_connected import * from networkx.algorithms.components.attracting import * from networkx.algorithms.components.biconnected import * from networkx.algorithms.components.semiconnected import * networkx-1.11/networkx/algorithms/components/strongly_connected.py0000644000175000017500000002724112637544500025703 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """Strongly connected components. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.utils.decorators import not_implemented_for __authors__ = "\n".join(['Eben Kenah', 'Aric Hagberg (hagberg@lanl.gov)' 'Christopher Ellison', 'Ben Edwards (bedwards@cs.unm.edu)']) __all__ = ['number_strongly_connected_components', 'strongly_connected_components', 'strongly_connected_component_subgraphs', 'is_strongly_connected', 'strongly_connected_components_recursive', 'kosaraju_strongly_connected_components', 'condensation'] @not_implemented_for('undirected') def strongly_connected_components(G): """Generate nodes in strongly connected components of graph. 
    Parameters
    ----------
    G : NetworkX Graph
        A directed graph.

    Returns
    -------
    comp : generator of sets
        A generator of sets of nodes, one for each strongly connected
        component of G.

    Raises
    ------
    NetworkXNotImplemented:
        If G is undirected.

    Examples
    --------
    Generate a sorted list of strongly connected components, largest first.

    >>> G = nx.cycle_graph(4, create_using=nx.DiGraph())
    >>> G.add_cycle([10, 11, 12])
    >>> [len(c) for c in sorted(nx.strongly_connected_components(G),
    ...                         key=len, reverse=True)]
    [4, 3]

    If you only want the largest component, it's more efficient to
    use max instead of sort.

    >>> largest = max(nx.strongly_connected_components(G), key=len)

    See Also
    --------
    connected_components, weakly_connected_components

    Notes
    -----
    Uses Tarjan's algorithm with Nuutila's modifications.
    Nonrecursive version of algorithm.

    References
    ----------
    .. [1] Depth-first search and linear graph algorithms, R. Tarjan
       SIAM Journal of Computing 1(2):146-160, (1972).

    .. [2] On finding the strongly connected components in a directed graph.
       E. Nuutila and E. Soisalon-Soinen
       Information Processing Letters 49(1): 9-14, (1994).
""" preorder = {} lowlink = {} scc_found = {} scc_queue = [] i = 0 # Preorder counter for source in G: if source not in scc_found: queue = [source] while queue: v = queue[-1] if v not in preorder: i = i + 1 preorder[v] = i done = 1 v_nbrs = G[v] for w in v_nbrs: if w not in preorder: queue.append(w) done = 0 break if done == 1: lowlink[v] = preorder[v] for w in v_nbrs: if w not in scc_found: if preorder[w] > preorder[v]: lowlink[v] = min([lowlink[v], lowlink[w]]) else: lowlink[v] = min([lowlink[v], preorder[w]]) queue.pop() if lowlink[v] == preorder[v]: scc_found[v] = True scc = {v} while scc_queue and preorder[scc_queue[-1]] > preorder[v]: k = scc_queue.pop() scc_found[k] = True scc.add(k) yield scc else: scc_queue.append(v) @not_implemented_for('undirected') def kosaraju_strongly_connected_components(G, source=None): """Generate nodes in strongly connected components of graph. Parameters ---------- G : NetworkX Graph An directed graph. Returns ------- comp : generator of sets A genrator of sets of nodes, one for each strongly connected component of G. Raises ------ NetworkXNotImplemented: If G is undirected. Examples -------- Generate a sorted list of strongly connected components, largest first. >>> G = nx.cycle_graph(4, create_using=nx.DiGraph()) >>> G.add_cycle([10, 11, 12]) >>> [len(c) for c in sorted(nx.kosaraju_strongly_connected_components(G), ... key=len, reverse=True)] [4, 3] If you only want the largest component, it's more efficient to use max instead of sort. >>> largest = max(nx.kosaraju_strongly_connected_components(G), key=len) See Also -------- connected_components weakly_connected_components Notes ----- Uses Kosaraju's algorithm. 
    """
    with nx.utils.reversed(G):
        post = list(nx.dfs_postorder_nodes(G, source=source))

    seen = set()
    while post:
        r = post.pop()
        if r in seen:
            continue
        c = nx.dfs_preorder_nodes(G, r)
        new = {v for v in c if v not in seen}
        yield new
        seen.update(new)


@not_implemented_for('undirected')
def strongly_connected_components_recursive(G):
    """Generate nodes in strongly connected components of graph.

    Recursive version of algorithm.

    Parameters
    ----------
    G : NetworkX Graph
        A directed graph.

    Returns
    -------
    comp : generator of sets
        A generator of sets of nodes, one for each strongly connected
        component of G.

    Raises
    ------
    NetworkXNotImplemented :
        If G is undirected.

    Examples
    --------
    Generate a sorted list of strongly connected components, largest first.

    >>> G = nx.cycle_graph(4, create_using=nx.DiGraph())
    >>> G.add_cycle([10, 11, 12])
    >>> [len(c) for c in sorted(nx.strongly_connected_components_recursive(G),
    ...                         key=len, reverse=True)]
    [4, 3]

    If you only want the largest component, it's more efficient to
    use max instead of sort.

    >>> largest = max(nx.strongly_connected_components_recursive(G), key=len)

    See Also
    --------
    connected_components

    Notes
    -----
    Uses Tarjan's algorithm with Nuutila's modifications.

    References
    ----------
    .. [1] Depth-first search and linear graph algorithms, R. Tarjan
       SIAM Journal of Computing 1(2):146-160, (1972).
    .. [2] On finding the strongly connected components in a directed graph.
       E. Nuutila and E. Soisalon-Soinen
       Information Processing Letters 49(1): 9-14, (1994).
""" def visit(v, cnt): root[v] = cnt visited[v] = cnt cnt += 1 stack.append(v) for w in G[v]: if w not in visited: for c in visit(w, cnt): yield c if w not in component: root[v] = min(root[v], root[w]) if root[v] == visited[v]: component[v] = root[v] tmpc = {v} # hold nodes in this component while stack[-1] != v: w = stack.pop() component[w] = root[v] tmpc.add(w) stack.remove(v) yield tmpc visited = {} component = {} root = {} cnt = 0 stack = [] for source in G: if source not in visited: for c in visit(source, cnt): yield c @not_implemented_for('undirected') def strongly_connected_component_subgraphs(G, copy=True): """Generate strongly connected components as subgraphs. Parameters ---------- G : NetworkX Graph A directed graph. copy : boolean, optional if copy is True, Graph, node, and edge attributes are copied to the subgraphs. Returns ------- comp : generator of graphs A generator of graphs, one for each strongly connected component of G. Examples -------- Generate a sorted list of strongly connected components, largest first. >>> G = nx.cycle_graph(4, create_using=nx.DiGraph()) >>> G.add_cycle([10, 11, 12]) >>> [len(Gc) for Gc in sorted(nx.strongly_connected_component_subgraphs(G), ... key=len, reverse=True)] [4, 3] If you only want the largest component, it's more efficient to use max instead of sort. >>> Gc = max(nx.strongly_connected_component_subgraphs(G), key=len) See Also -------- connected_component_subgraphs weakly_connected_component_subgraphs """ for comp in strongly_connected_components(G): if copy: yield G.subgraph(comp).copy() else: yield G.subgraph(comp) @not_implemented_for('undirected') def number_strongly_connected_components(G): """Return number of strongly connected components in graph. Parameters ---------- G : NetworkX graph A directed graph. Returns ------- n : integer Number of strongly connected components See Also -------- connected_components Notes ----- For directed graphs only. 
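    A quick usage sketch of the component counter described above (the
    small two-cycle digraph here is illustrative only, not from the source):

    ```python
    import networkx as nx

    # Two 3-cycles joined by a single one-way edge: each cycle is its own
    # strongly connected component, so the count is 2.
    G = nx.DiGraph()
    G.add_edges_from([(1, 2), (2, 3), (3, 1),   # first cycle
                      (4, 5), (5, 6), (6, 4),   # second cycle
                      (3, 4)])                  # one-way bridge between them
    print(nx.number_strongly_connected_components(G))  # prints 2
    ```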
""" return len(list(strongly_connected_components(G))) @not_implemented_for('undirected') def is_strongly_connected(G): """Test directed graph for strong connectivity. Parameters ---------- G : NetworkX Graph A directed graph. Returns ------- connected : bool True if the graph is strongly connected, False otherwise. See Also -------- strongly_connected_components Notes ----- For directed graphs only. """ if len(G) == 0: raise nx.NetworkXPointlessConcept( """Connectivity is undefined for the null graph.""") return len(list(strongly_connected_components(G))[0]) == len(G) @not_implemented_for('undirected') def condensation(G, scc=None): """Returns the condensation of G. The condensation of G is the graph with each of the strongly connected components contracted into a single node. Parameters ---------- G : NetworkX DiGraph A directed graph. scc: list or generator (optional, default=None) Strongly connected components. If provided, the elements in `scc` must partition the nodes in `G`. If not provided, it will be calculated as scc=nx.strongly_connected_components(G). Returns ------- C : NetworkX DiGraph The condensation graph C of G. The node labels are integers corresponding to the index of the component in the list of strongly connected components of G. C has a graph attribute named 'mapping' with a dictionary mapping the original nodes to the nodes in C to which they belong. Each node in C also has a node attribute 'members' with the set of original nodes in G that form the SCC that the node in C represents. Raises ------ NetworkXNotImplemented: If G is not directed Notes ----- After contracting all strongly connected components to a single node, the resulting graph is a directed acyclic graph. 
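    A minimal sketch of the contraction (the five-node digraph below is
    illustrative only, not from the source):

    ```python
    import networkx as nx

    # Two 2-cycles {1, 2} and {3, 4} linked by the one-way edge 2 -> 3.
    # Condensing contracts each cycle to a single integer-labeled node.
    G = nx.DiGraph([(1, 2), (2, 1), (2, 3), (3, 4), (4, 3)])
    C = nx.condensation(G)
    print(sorted(C.nodes()))                 # [0, 1]
    mapping = C.graph['mapping']             # original node -> condensation node
    print(mapping[1] == mapping[2])          # True: 1 and 2 share an SCC
    print(nx.is_directed_acyclic_graph(C))   # True: a condensation is always a DAG
    ```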
    """
    if scc is None:
        scc = nx.strongly_connected_components(G)
    mapping = {}
    members = {}
    C = nx.DiGraph()
    for i, component in enumerate(scc):
        members[i] = component
        mapping.update((n, i) for n in component)
    number_of_components = i + 1
    C.add_nodes_from(range(number_of_components))
    C.add_edges_from((mapping[u], mapping[v]) for u, v in G.edges_iter()
                     if mapping[u] != mapping[v])
    # Add a list of members (i.e., original nodes) to each node (i.e., SCC) in C.
    nx.set_node_attributes(C, 'members', members)
    # Add mapping dict as graph attribute
    C.graph['mapping'] = mapping
    return C

networkx-1.11/networkx/algorithms/components/tests/test_biconnected.py

#!/usr/bin/env python
from nose.tools import *
import networkx as nx
from networkx.algorithms.components import biconnected
from networkx import NetworkXNotImplemented


def assert_components_edges_equal(x, y):
    sx = {frozenset([frozenset(e) for e in c]) for c in x}
    sy = {frozenset([frozenset(e) for e in c]) for c in y}
    assert_equal(sx, sy)


def assert_components_equal(x, y):
    sx = {frozenset(c) for c in x}
    sy = {frozenset(c) for c in y}
    assert_equal(sx, sy)


def test_barbell():
    G = nx.barbell_graph(8, 4)
    G.add_path([7, 20, 21, 22])
    G.add_cycle([22, 23, 24, 25])
    pts = set(nx.articulation_points(G))
    assert_equal(pts, {7, 8, 9, 10, 11, 12, 20, 21, 22})
    answer = [
        {12, 13, 14, 15, 16, 17, 18, 19},
        {0, 1, 2, 3, 4, 5, 6, 7},
        {22, 23, 24, 25},
        {11, 12},
        {10, 11},
        {9, 10},
        {8, 9},
        {7, 8},
        {21, 22},
        {20, 21},
        {7, 20},
    ]
    assert_components_equal(list(nx.biconnected_components(G)), answer)
    G.add_edge(2, 17)
    pts = set(nx.articulation_points(G))
    assert_equal(pts, {7, 20, 21, 22})


def test_articulation_points_cycle():
    G = nx.cycle_graph(3)
    G.add_cycle([1, 3, 4])
    pts = set(nx.articulation_points(G))
    assert_equal(pts, {1})


def test_is_biconnected():
G=nx.cycle_graph(3) assert_true(nx.is_biconnected(G)) G.add_cycle([1, 3, 4]) assert_false(nx.is_biconnected(G)) def test_empty_is_biconnected(): G=nx.empty_graph(5) assert_false(nx.is_biconnected(G)) G.add_edge(0, 1) assert_false(nx.is_biconnected(G)) def test_biconnected_components_cycle(): G=nx.cycle_graph(3) G.add_cycle([1, 3, 4]) answer = [{0, 1, 2}, {1, 3, 4}] assert_components_equal(list(nx.biconnected_components(G)), answer) def test_biconnected_component_subgraphs_cycle(): G=nx.cycle_graph(3) G.add_cycle([1, 3, 4, 5]) Gc = set(nx.biconnected_component_subgraphs(G)) assert_equal(len(Gc), 2) g1, g2=Gc if 0 in g1: assert_true(nx.is_isomorphic(g1, nx.Graph([(0,1),(0,2),(1,2)]))) assert_true(nx.is_isomorphic(g2, nx.Graph([(1,3),(1,5),(3,4),(4,5)]))) else: assert_true(nx.is_isomorphic(g1, nx.Graph([(1,3),(1,5),(3,4),(4,5)]))) assert_true(nx.is_isomorphic(g2, nx.Graph([(0,1),(0,2),(1,2)]))) def test_biconnected_components1(): # graph example from # http://www.ibluemojo.com/school/articul_algorithm.html edges=[ (0, 1), (0, 5), (0, 6), (0, 14), (1, 5), (1, 6), (1, 14), (2, 4), (2, 10), (3, 4), (3, 15), (4, 6), (4, 7), (4, 10), (5, 14), (6, 14), (7, 9), (8, 9), (8, 12), (8, 13), (10, 15), (11, 12), (11, 13), (12, 13) ] G=nx.Graph(edges) pts = set(nx.articulation_points(G)) assert_equal(pts, {4, 6, 7, 8, 9}) comps = list(nx.biconnected_component_edges(G)) answer = [ [(3, 4), (15, 3), (10, 15), (10, 4), (2, 10), (4, 2)], [(13, 12), (13, 8), (11, 13), (12, 11), (8, 12)], [(9, 8)], [(7, 9)], [(4, 7)], [(6, 4)], [(14, 0), (5, 1), (5, 0), (14, 5), (14, 1), (6, 14), (6, 0), (1, 6), (0, 1)], ] assert_components_edges_equal(comps, answer) def test_biconnected_components2(): G=nx.Graph() G.add_cycle('ABC') G.add_cycle('CDE') G.add_cycle('FIJHG') G.add_cycle('GIJ') G.add_edge('E','G') comps = list(nx.biconnected_component_edges(G)) answer = [ [tuple('GF'), tuple('FI'), tuple('IG'), tuple('IJ'), tuple('JG'), tuple('JH'), tuple('HG')], [tuple('EG')], [tuple('CD'), tuple('DE'), 
tuple('CE')], [tuple('AB'), tuple('BC'), tuple('AC')] ] assert_components_edges_equal(comps, answer) def test_biconnected_davis(): D = nx.davis_southern_women_graph() bcc = list(nx.biconnected_components(D))[0] assert_true(set(D) == bcc) # All nodes in a giant bicomponent # So no articulation points assert_equal(len(list(nx.articulation_points(D))), 0) def test_biconnected_karate(): K = nx.karate_club_graph() answer = [{0, 1, 2, 3, 7, 8, 9, 12, 13, 14, 15, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33}, {0, 4, 5, 6, 10, 16}, {0, 11}] bcc = list(nx.biconnected_components(K)) assert_components_equal(bcc, answer) assert_equal(set(nx.articulation_points(K)), {0}) def test_biconnected_eppstein(): # tests from http://www.ics.uci.edu/~eppstein/PADS/Biconnectivity.py G1 = nx.Graph({ 0: [1, 2, 5], 1: [0, 5], 2: [0, 3, 4], 3: [2, 4, 5, 6], 4: [2, 3, 5, 6], 5: [0, 1, 3, 4], 6: [3, 4], }) G2 = nx.Graph({ 0: [2, 5], 1: [3, 8], 2: [0, 3, 5], 3: [1, 2, 6, 8], 4: [7], 5: [0, 2], 6: [3, 8], 7: [4], 8: [1, 3, 6], }) assert_true(nx.is_biconnected(G1)) assert_false(nx.is_biconnected(G2)) answer_G2 = [{1, 3, 6, 8}, {0, 2, 5}, {2, 3}, {4, 7}] bcc = list(nx.biconnected_components(G2)) assert_components_equal(bcc, answer_G2) def test_connected_raise(): DG = nx.DiGraph() assert_raises(NetworkXNotImplemented, nx.biconnected_components, DG) assert_raises(NetworkXNotImplemented, nx.biconnected_component_subgraphs, DG) assert_raises(NetworkXNotImplemented, nx.biconnected_component_edges, DG) assert_raises(NetworkXNotImplemented, nx.articulation_points, DG) assert_raises(NetworkXNotImplemented, nx.is_biconnected, DG) networkx-1.11/networkx/algorithms/components/tests/test_weakly_connected.py0000644000175000017500000000470412637544450027522 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from networkx import NetworkXNotImplemented class TestWeaklyConnected: def setUp(self): self.gc = [] G = nx.DiGraph() G.add_edges_from([(1, 
2), (2, 3), (2, 8), (3, 4), (3, 7), (4, 5), (5, 3), (5, 6), (7, 4), (7, 6), (8, 1), (8, 7)]) C = [[3, 4, 5, 7], [1, 2, 8], [6]] self.gc.append((G, C)) G = nx.DiGraph() G.add_edges_from([(1, 2), (1, 3), (1, 4), (4, 2), (3, 4), (2, 3)]) C = [[2, 3, 4],[1]] self.gc.append((G, C)) G = nx.DiGraph() G.add_edges_from([(1, 2), (2, 3), (3, 2), (2, 1)]) C = [[1, 2, 3]] self.gc.append((G,C)) # Eppstein's tests G = nx.DiGraph({0:[1], 1:[2, 3], 2:[4, 5], 3:[4, 5], 4:[6], 5:[], 6:[]}) C = [[0], [1], [2],[ 3], [4], [5], [6]] self.gc.append((G,C)) G = nx.DiGraph({0:[1], 1:[2, 3, 4], 2:[0, 3], 3:[4], 4:[3]}) C = [[0, 1, 2], [3, 4]] self.gc.append((G, C)) def test_weakly_connected_components(self): for G, C in self.gc: U = G.to_undirected() w = {frozenset(g) for g in nx.weakly_connected_components(G)} c = {frozenset(g) for g in nx.connected_components(U)} assert_equal(w, c) def test_number_weakly_connected_components(self): for G, C in self.gc: U = G.to_undirected() w = nx.number_weakly_connected_components(G) c = nx.number_connected_components(U) assert_equal(w, c) def test_weakly_connected_component_subgraphs(self): wcc = nx.weakly_connected_component_subgraphs cc = nx.connected_component_subgraphs for G, C in self.gc: U = G.to_undirected() w = {frozenset(g) for g in wcc(G)} c = {frozenset(g) for g in cc(U)} assert_equal(w, c) def test_is_weakly_connected(self): for G, C in self.gc: U = G.to_undirected() assert_equal(nx.is_weakly_connected(G), nx.is_connected(U)) def test_connected_raise(self): G=nx.Graph() assert_raises(NetworkXNotImplemented,nx.weakly_connected_components, G) assert_raises(NetworkXNotImplemented,nx.number_weakly_connected_components, G) assert_raises(NetworkXNotImplemented,nx.weakly_connected_component_subgraphs, G) assert_raises(NetworkXNotImplemented,nx.is_weakly_connected, G) networkx-1.11/networkx/algorithms/components/tests/test_semiconnected.py0000644000175000017500000000355512637544450027027 0ustar aricaric00000000000000from itertools import chain import 
networkx as nx from nose.tools import * class TestIsSemiconnected(object): def test_undirected(self): assert_raises(nx.NetworkXNotImplemented, nx.is_semiconnected, nx.Graph()) assert_raises(nx.NetworkXNotImplemented, nx.is_semiconnected, nx.MultiGraph()) def test_empty(self): assert_raises(nx.NetworkXPointlessConcept, nx.is_semiconnected, nx.DiGraph()) assert_raises(nx.NetworkXPointlessConcept, nx.is_semiconnected, nx.MultiDiGraph()) def test_single_node_graph(self): G = nx.DiGraph() G.add_node(0) ok_(nx.is_semiconnected(G)) def test_path(self): G = nx.path_graph(100, create_using=nx.DiGraph()) ok_(nx.is_semiconnected(G)) G.add_edge(100, 99) ok_(not nx.is_semiconnected(G)) def test_cycle(self): G = nx.cycle_graph(100, create_using=nx.DiGraph()) ok_(nx.is_semiconnected(G)) G = nx.path_graph(100, create_using=nx.DiGraph()) G.add_edge(0, 99) ok_(nx.is_semiconnected(G)) def test_tree(self): G = nx.DiGraph() G.add_edges_from(chain.from_iterable([(i, 2 * i + 1), (i, 2 * i + 2)] for i in range(100))) ok_(not nx.is_semiconnected(G)) def test_dumbbell(self): G = nx.cycle_graph(100, create_using=nx.DiGraph()) G.add_edges_from((i + 100, (i + 1) % 100 + 100) for i in range(100)) ok_(not nx.is_semiconnected(G)) # G is disconnected. 
G.add_edge(100, 99) ok_(nx.is_semiconnected(G)) def test_alternating_path(self): G = nx.DiGraph(chain.from_iterable([(i, i - 1), (i, i + 1)] for i in range(0, 100, 2))) ok_(not nx.is_semiconnected(G)) networkx-1.11/networkx/algorithms/components/tests/test_subgraph_copies.py0000644000175000017500000000665112637544450027364 0ustar aricaric00000000000000""" Tests for subgraphs attributes """ from copy import deepcopy from nose.tools import assert_equal import networkx as nx class TestSubgraphAttributesDicts: def setUp(self): self.undirected = [ nx.connected_component_subgraphs, nx.biconnected_component_subgraphs, ] self.directed = [ nx.weakly_connected_component_subgraphs, nx.strongly_connected_component_subgraphs, nx.attracting_component_subgraphs, ] self.subgraph_funcs = self.undirected + self.directed self.D = nx.DiGraph() self.D.add_edge(1, 2, eattr='red') self.D.add_edge(2, 1, eattr='red') self.D.node[1]['nattr'] = 'blue' self.D.graph['gattr'] = 'green' self.G = nx.Graph() self.G.add_edge(1, 2, eattr='red') self.G.node[1]['nattr'] = 'blue' self.G.graph['gattr'] = 'green' def test_subgraphs_default_copy_behavior(self): # Test the default behavior of subgraph functions # For the moment (1.10) the default is to copy for subgraph_func in self.subgraph_funcs: G = deepcopy(self.G if subgraph_func in self.undirected else self.D) SG = list(subgraph_func(G))[0] assert_equal(SG[1][2]['eattr'], 'red') assert_equal(SG.node[1]['nattr'], 'blue') assert_equal(SG.graph['gattr'], 'green') SG[1][2]['eattr'] = 'foo' assert_equal(G[1][2]['eattr'], 'red') assert_equal(SG[1][2]['eattr'], 'foo') SG.node[1]['nattr'] = 'bar' assert_equal(G.node[1]['nattr'], 'blue') assert_equal(SG.node[1]['nattr'], 'bar') SG.graph['gattr'] = 'baz' assert_equal(G.graph['gattr'], 'green') assert_equal(SG.graph['gattr'], 'baz') def test_subgraphs_copy(self): for subgraph_func in self.subgraph_funcs: test_graph = self.G if subgraph_func in self.undirected else self.D G = deepcopy(test_graph) SG = 
list(subgraph_func(G, copy=True))[0] assert_equal(SG[1][2]['eattr'], 'red') assert_equal(SG.node[1]['nattr'], 'blue') assert_equal(SG.graph['gattr'], 'green') SG[1][2]['eattr'] = 'foo' assert_equal(G[1][2]['eattr'], 'red') assert_equal(SG[1][2]['eattr'], 'foo') SG.node[1]['nattr'] = 'bar' assert_equal(G.node[1]['nattr'], 'blue') assert_equal(SG.node[1]['nattr'], 'bar') SG.graph['gattr'] = 'baz' assert_equal(G.graph['gattr'], 'green') assert_equal(SG.graph['gattr'], 'baz') def test_subgraphs_no_copy(self): for subgraph_func in self.subgraph_funcs: G = deepcopy(self.G if subgraph_func in self.undirected else self.D) SG = list(subgraph_func(G, copy=False))[0] assert_equal(SG[1][2]['eattr'], 'red') assert_equal(SG.node[1]['nattr'], 'blue') assert_equal(SG.graph['gattr'], 'green') SG[1][2]['eattr'] = 'foo' assert_equal(G[1][2]['eattr'], 'foo') assert_equal(SG[1][2]['eattr'], 'foo') SG.node[1]['nattr'] = 'bar' assert_equal(G.node[1]['nattr'], 'bar') assert_equal(SG.node[1]['nattr'], 'bar') SG.graph['gattr'] = 'baz' assert_equal(G.graph['gattr'], 'baz') assert_equal(SG.graph['gattr'], 'baz') networkx-1.11/networkx/algorithms/components/tests/test_strongly_connected.py0000644000175000017500000001232112637544500030075 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from networkx import NetworkXNotImplemented class TestStronglyConnected: def setUp(self): self.gc = [] G = nx.DiGraph() G.add_edges_from([(1, 2), (2, 3), (2, 8), (3, 4), (3, 7), (4, 5), (5, 3), (5, 6), (7, 4), (7, 6), (8, 1), (8, 7)]) C = {frozenset([3, 4, 5, 7]), frozenset([1, 2, 8]), frozenset([6])} self.gc.append((G, C)) G = nx.DiGraph() G.add_edges_from([(1, 2), (1, 3), (1, 4), (4, 2), (3, 4), (2, 3)]) C = {frozenset([2, 3, 4]), frozenset([1])} self.gc.append((G, C)) G = nx.DiGraph() G.add_edges_from([(1, 2), (2, 3), (3, 2), (2, 1)]) C = {frozenset([1, 2, 3])} self.gc.append((G, C)) # Eppstein's tests G = nx.DiGraph({0: [1], 1:[2, 3], 2:[4, 5], 3:[4, 5], 
4:[6], 5:[], 6:[]}) C = { frozenset([0]), frozenset([1]), frozenset([2]), frozenset([3]), frozenset([4]), frozenset([5]), frozenset([6]), } self.gc.append((G, C)) G = nx.DiGraph({0:[1], 1:[2, 3, 4], 2:[0, 3], 3:[4], 4:[3]}) C = {frozenset([0, 1, 2]), frozenset([3, 4])} self.gc.append((G, C)) def test_tarjan(self): scc = nx.strongly_connected_components for G, C in self.gc: assert_equal({frozenset(g) for g in scc(G)}, C) def test_tarjan_recursive(self): scc=nx.strongly_connected_components_recursive for G, C in self.gc: assert_equal({frozenset(g) for g in scc(G)}, C) def test_kosaraju(self): scc = nx.kosaraju_strongly_connected_components for G, C in self.gc: assert_equal({frozenset(g) for g in scc(G)}, C) def test_number_strongly_connected_components(self): ncc = nx.number_strongly_connected_components for G, C in self.gc: assert_equal(ncc(G), len(C)) def test_is_strongly_connected(self): for G, C in self.gc: if len(C) == 1: assert_true(nx.is_strongly_connected(G)) else: assert_false(nx.is_strongly_connected(G)) def test_strongly_connected_component_subgraphs(self): scc = nx.strongly_connected_component_subgraphs for G, C in self.gc: assert_equal({frozenset(g) for g in scc(G)}, C) def test_contract_scc1(self): G = nx.DiGraph() G.add_edges_from([ (1, 2), (2, 3), (2, 11), (2, 12), (3, 4), (4, 3), (4, 5), (5, 6), (6, 5), (6, 7), (7, 8), (7, 9), (7, 10), (8, 9), (9, 7), (10, 6), (11, 2), (11, 4), (11, 6), (12, 6), (12, 11), ]) scc = list(nx.strongly_connected_components(G)) cG = nx.condensation(G, scc) # DAG assert_true(nx.is_directed_acyclic_graph(cG)) # nodes assert_equal(sorted(cG.nodes()), [0, 1, 2, 3]) # edges mapping={} for i, component in enumerate(scc): for n in component: mapping[n] = i edge = (mapping[2], mapping[3]) assert_true(cG.has_edge(*edge)) edge = (mapping[2], mapping[5]) assert_true(cG.has_edge(*edge)) edge = (mapping[3], mapping[5]) assert_true(cG.has_edge(*edge)) def test_contract_scc_isolate(self): # Bug found and fixed in [1687]. 
G = nx.DiGraph() G.add_edge(1, 2) G.add_edge(2, 1) scc = list(nx.strongly_connected_components(G)) cG = nx.condensation(G, scc) assert_equal(list(cG.nodes()), [0]) assert_equal(list(cG.edges()), []) def test_contract_scc_edge(self): G = nx.DiGraph() G.add_edge(1, 2) G.add_edge(2, 1) G.add_edge(2, 3) G.add_edge(3, 4) G.add_edge(4, 3) scc = list(nx.strongly_connected_components(G)) cG = nx.condensation(G, scc) assert_equal(cG.nodes(), [0, 1]) if 1 in scc[0]: edge = (0, 1) else: edge = (1, 0) assert_equal(list(cG.edges()), [edge]) def test_condensation_mapping_and_members(self): G, C = self.gc[1] C = sorted(C, key=len, reverse=True) cG = nx.condensation(G) mapping = cG.graph['mapping'] assert_true(all(n in G for n in mapping)) assert_true(all(0 == cN for n, cN in mapping.items() if n in C[0])) assert_true(all(1 == cN for n, cN in mapping.items() if n in C[1])) for n, d in cG.nodes(data=True): assert_equal(set(C[n]), cG.node[n]['members']) def test_connected_raise(self): G=nx.Graph() assert_raises(NetworkXNotImplemented, nx.strongly_connected_components, G) assert_raises(NetworkXNotImplemented, nx.kosaraju_strongly_connected_components, G) assert_raises(NetworkXNotImplemented, nx.strongly_connected_components_recursive, G) assert_raises(NetworkXNotImplemented, nx.strongly_connected_component_subgraphs, G) assert_raises(NetworkXNotImplemented, nx.is_strongly_connected, G) assert_raises(nx.NetworkXPointlessConcept, nx.is_strongly_connected, nx.DiGraph()) assert_raises(NetworkXNotImplemented, nx.condensation, G) networkx-1.11/networkx/algorithms/components/tests/test_connected.py0000644000175000017500000000703112637544450026142 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from networkx import convert_node_labels_to_integers as cnlti from networkx import NetworkXError,NetworkXNotImplemented class TestConnected: def setUp(self): G1 = cnlti(nx.grid_2d_graph(2, 2), first_label=0, ordering="sorted") G2 = 
cnlti(nx.lollipop_graph(3, 3), first_label=4, ordering="sorted") G3 = cnlti(nx.house_graph(), first_label=10, ordering="sorted") self.G = nx.union(G1, G2) self.G = nx.union(self.G, G3) self.DG = nx.DiGraph([(1, 2), (1, 3), (2, 3)]) self.grid = cnlti(nx.grid_2d_graph(4, 4), first_label=1) self.gc = [] G = nx.DiGraph() G.add_edges_from([(1, 2), (2, 3), (2, 8), (3, 4), (3, 7), (4, 5), (5, 3), (5, 6), (7, 4), (7, 6), (8, 1), (8, 7)]) C = [[3, 4, 5, 7], [1, 2, 8], [6]] self.gc.append((G, C)) G = nx.DiGraph() G.add_edges_from([(1, 2), (1, 3), (1, 4), (4, 2), (3, 4), (2, 3)]) C = [[2, 3, 4],[1]] self.gc.append((G, C)) G = nx.DiGraph() G.add_edges_from([(1, 2), (2, 3), (3, 2), (2, 1)]) C = [[1, 2, 3]] self.gc.append((G,C)) # Eppstein's tests G = nx.DiGraph({0:[1], 1:[2, 3], 2:[4, 5], 3:[4, 5], 4:[6], 5:[], 6:[]}) C = [[0], [1], [2],[ 3], [4], [5], [6]] self.gc.append((G,C)) G = nx.DiGraph({0:[1], 1:[2, 3, 4], 2:[0, 3], 3:[4], 4:[3]}) C = [[0, 1, 2], [3, 4]] self.gc.append((G, C)) def test_connected_components(self): cc = nx.connected_components G = self.G C = { frozenset([0, 1, 2, 3]), frozenset([4, 5, 6, 7, 8, 9]), frozenset([10, 11, 12, 13, 14]) } assert_equal({frozenset(g) for g in cc(G)}, C) def test_number_connected_components(self): ncc = nx.number_connected_components assert_equal(ncc(self.G), 3) def test_number_connected_components2(self): ncc = nx.number_connected_components assert_equal(ncc(self.grid), 1) def test_connected_components2(self): cc = nx.connected_components G = self.grid C = {frozenset([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16])} assert_equal({frozenset(g) for g in cc(G)}, C) def test_node_connected_components(self): ncc = nx.node_connected_component G = self.grid C = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16} assert_equal(ncc(G, 1), C) def test_connected_component_subgraphs(self): wcc = nx.weakly_connected_component_subgraphs cc = nx.connected_component_subgraphs for G, C in self.gc: U = G.to_undirected() w = {frozenset(g) 
for g in wcc(G)}
            c = {frozenset(g) for g in cc(U)}
            assert_equal(w, c)

    def test_is_connected(self):
        assert_true(nx.is_connected(self.grid))
        G = nx.Graph()
        G.add_nodes_from([1, 2])
        assert_false(nx.is_connected(G))

    def test_connected_raise(self):
        assert_raises(NetworkXNotImplemented, nx.connected_components, self.DG)
        assert_raises(NetworkXNotImplemented, nx.number_connected_components, self.DG)
        assert_raises(NetworkXNotImplemented, nx.connected_component_subgraphs, self.DG)
        assert_raises(NetworkXNotImplemented, nx.node_connected_component, self.DG, 1)
        assert_raises(NetworkXNotImplemented, nx.is_connected, self.DG)
        assert_raises(nx.NetworkXPointlessConcept, nx.is_connected, nx.Graph())

networkx-1.11/networkx/algorithms/components/tests/test_attracting.py

#!/usr/bin/env python
from nose.tools import *
import networkx as nx
from networkx import NetworkXNotImplemented


class TestAttractingComponents(object):
    def setUp(self):
        self.G1 = nx.DiGraph()
        self.G1.add_edges_from([(5, 11), (11, 2), (11, 9), (11, 10),
                                (7, 11), (7, 8), (8, 9), (3, 8), (3, 10)])
        self.G2 = nx.DiGraph()
        self.G2.add_edges_from([(0, 1), (0, 2), (1, 1), (1, 2), (2, 1)])
        self.G3 = nx.DiGraph()
        self.G3.add_edges_from([(0, 1), (1, 2), (2, 1), (0, 3), (3, 4), (4, 3)])

    def test_attracting_components(self):
        ac = list(nx.attracting_components(self.G1))
        assert_true({2} in ac)
        assert_true({9} in ac)
        assert_true({10} in ac)

        ac = list(nx.attracting_components(self.G2))
        ac = [tuple(sorted(x)) for x in ac]
        assert_true(ac == [(1, 2)])

        ac = list(nx.attracting_components(self.G3))
        ac = [tuple(sorted(x)) for x in ac]
        assert_true((1, 2) in ac)
        assert_true((3, 4) in ac)
        assert_equal(len(ac), 2)

    def test_number_attracting_components(self):
        assert_equal(nx.number_attracting_components(self.G1), 3)
        assert_equal(nx.number_attracting_components(self.G2), 1)
        assert_equal(nx.number_attracting_components(self.G3), 2)

    def test_is_attracting_component(self):
assert_false(nx.is_attracting_component(self.G1)) assert_false(nx.is_attracting_component(self.G2)) assert_false(nx.is_attracting_component(self.G3)) g2 = self.G3.subgraph([1, 2]) assert_true(nx.is_attracting_component(g2)) def test_connected_raise(self): G=nx.Graph() assert_raises(NetworkXNotImplemented, nx.attracting_components, G) assert_raises(NetworkXNotImplemented, nx.number_attracting_components, G) assert_raises(NetworkXNotImplemented, nx.is_attracting_component, G) assert_raises(NetworkXNotImplemented, nx.attracting_component_subgraphs, G) networkx-1.11/networkx/algorithms/components/semiconnected.py0000644000175000017500000000301612637544500024612 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Semiconnectedness. """ __author__ = """ysitu """ # Copyright (C) 2014 ysitu # All rights reserved. # BSD license. import networkx as nx from networkx.utils import not_implemented_for __all__ = ['is_semiconnected'] @not_implemented_for('undirected') def is_semiconnected(G): """Return True if the graph is semiconnected, False otherwise. A graph is semiconnected if, and only if, for any pair of nodes, either one is reachable from the other, or they are mutually reachable. Parameters ---------- G : NetworkX graph A directed graph. Returns ------- semiconnected : bool True if the graph is semiconnected, False otherwise. Raises ------ NetworkXNotImplemented : If the input graph is not directed. NetworkXPointlessConcept : If the graph is empty. 
Examples -------- >>> G=nx.path_graph(4,create_using=nx.DiGraph()) >>> print(nx.is_semiconnected(G)) True >>> G=nx.DiGraph([(1, 2), (3, 2)]) >>> print(nx.is_semiconnected(G)) False See Also -------- is_strongly_connected, is_weakly_connected """ if len(G) == 0: raise nx.NetworkXPointlessConcept( 'Connectivity is undefined for the null graph.') if not nx.is_weakly_connected(G): return False G = nx.condensation(G) path = nx.topological_sort(G) return all(G.has_edge(u, v) for u, v in zip(path[:-1], path[1:])) networkx-1.11/networkx/algorithms/components/biconnected.py0000644000175000017500000003613212637544450024260 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Biconnected components and articulation points. """ # Copyright (C) 2011-2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from itertools import chain import networkx as nx from networkx.utils.decorators import not_implemented_for __author__ = '\n'.join(['Jordi Torrents ', 'Dan Schult ', 'Aric Hagberg ']) __all__ = [ 'biconnected_components', 'biconnected_component_edges', 'biconnected_component_subgraphs', 'is_biconnected', 'articulation_points', ] @not_implemented_for('directed') def is_biconnected(G): """Return True if the graph is biconnected, False otherwise. A graph is biconnected if, and only if, it cannot be disconnected by removing only one node (and all edges incident on that node). If removing a node increases the number of disconnected components in the graph, that node is called an articulation point, or cut vertex. A biconnected graph has no articulation points. Parameters ---------- G : NetworkX Graph An undirected graph. Returns ------- biconnected : bool True if the graph is biconnected, False otherwise. Raises ------ NetworkXNotImplemented : If the input graph is not undirected. 
Examples -------- >>> G = nx.path_graph(4) >>> print(nx.is_biconnected(G)) False >>> G.add_edge(0, 3) >>> print(nx.is_biconnected(G)) True See Also -------- biconnected_components articulation_points biconnected_component_edges biconnected_component_subgraphs Notes ----- The algorithm to find articulation points and biconnected components is implemented using a non-recursive depth-first-search (DFS) that keeps track of the highest level that back edges reach in the DFS tree. A node `n` is an articulation point if, and only if, there exists a subtree rooted at `n` such that there is no back edge from any successor of `n` that links to a predecessor of `n` in the DFS tree. By keeping track of all the edges traversed by the DFS we can obtain the biconnected components because all edges of a bicomponent will be traversed consecutively between articulation points. References ---------- .. [1] Hopcroft, J.; Tarjan, R. (1973). "Efficient algorithms for graph manipulation". Communications of the ACM 16: 372–378. doi:10.1145/362248.362272 """ bcc = list(biconnected_components(G)) if not bcc: # No bicomponents (it could be an empty graph) return False return len(bcc[0]) == len(G) @not_implemented_for('directed') def biconnected_component_edges(G): """Return a generator of lists of edges, one list for each biconnected component of the input graph. Biconnected components are maximal subgraphs such that the removal of a node (and all edges incident on that node) will not disconnect the subgraph. Note that nodes may be part of more than one biconnected component. Those nodes are articulation points, or cut vertices. However, each edge belongs to one, and only one, biconnected component. Notice that by convention a dyad is considered a biconnected component. Parameters ---------- G : NetworkX Graph An undirected graph. Returns ------- edges : generator of lists Generator of lists of edges, one list for each bicomponent. 
Raises ------ NetworkXNotImplemented : If the input graph is not undirected. Examples -------- >>> G = nx.barbell_graph(4, 2) >>> print(nx.is_biconnected(G)) False >>> bicomponents_edges = list(nx.biconnected_component_edges(G)) >>> len(bicomponents_edges) 5 >>> G.add_edge(2, 8) >>> print(nx.is_biconnected(G)) True >>> bicomponents_edges = list(nx.biconnected_component_edges(G)) >>> len(bicomponents_edges) 1 See Also -------- is_biconnected, biconnected_components, articulation_points, biconnected_component_subgraphs Notes ----- The algorithm to find articulation points and biconnected components is implemented using a non-recursive depth-first-search (DFS) that keeps track of the highest level that back edges reach in the DFS tree. A node `n` is an articulation point if, and only if, there exists a subtree rooted at `n` such that there is no back edge from any successor of `n` that links to a predecessor of `n` in the DFS tree. By keeping track of all the edges traversed by the DFS we can obtain the biconnected components because all edges of a bicomponent will be traversed consecutively between articulation points. References ---------- .. [1] Hopcroft, J.; Tarjan, R. (1973). "Efficient algorithms for graph manipulation". Communications of the ACM 16: 372–378. doi:10.1145/362248.362272 """ for comp in _biconnected_dfs(G, components=True): yield comp @not_implemented_for('directed') def biconnected_components(G): """Return a generator of sets of nodes, one set for each biconnected component of the graph Biconnected components are maximal subgraphs such that the removal of a node (and all edges incident on that node) will not disconnect the subgraph. Note that nodes may be part of more than one biconnected component. Those nodes are articulation points, or cut vertices. The removal of articulation points will increase the number of connected components of the graph. Notice that by convention a dyad is considered a biconnected component. 
Parameters ---------- G : NetworkX Graph An undirected graph. Returns ------- nodes : generator Generator of sets of nodes, one set for each biconnected component. Raises ------ NetworkXNotImplemented : If the input graph is not undirected. Examples -------- >>> G = nx.lollipop_graph(5, 1) >>> print(nx.is_biconnected(G)) False >>> bicomponents = list(nx.biconnected_components(G)) >>> len(bicomponents) 2 >>> G.add_edge(0, 5) >>> print(nx.is_biconnected(G)) True >>> bicomponents = list(nx.biconnected_components(G)) >>> len(bicomponents) 1 You can generate a sorted list of biconnected components, largest first, using sort. >>> G.remove_edge(0, 5) >>> [len(c) for c in sorted(nx.biconnected_components(G), key=len, reverse=True)] [5, 2] If you only want the largest connected component, it's more efficient to use max instead of sort. >>> Gc = max(nx.biconnected_components(G), key=len) See Also -------- is_biconnected, articulation_points, biconnected_component_edges, biconnected_component_subgraphs Notes ----- The algorithm to find articulation points and biconnected components is implemented using a non-recursive depth-first-search (DFS) that keeps track of the highest level that back edges reach in the DFS tree. A node `n` is an articulation point if, and only if, there exists a subtree rooted at `n` such that there is no back edge from any successor of `n` that links to a predecessor of `n` in the DFS tree. By keeping track of all the edges traversed by the DFS we can obtain the biconnected components because all edges of a bicomponent will be traversed consecutively between articulation points. References ---------- .. [1] Hopcroft, J.; Tarjan, R. (1973). "Efficient algorithms for graph manipulation". Communications of the ACM 16: 372–378. 
doi:10.1145/362248.362272 """ for comp in _biconnected_dfs(G, components=True): yield set(chain.from_iterable(comp)) @not_implemented_for('directed') def biconnected_component_subgraphs(G, copy=True): """Return a generator of graphs, one graph for each biconnected component of the input graph. Biconnected components are maximal subgraphs such that the removal of a node (and all edges incident on that node) will not disconnect the subgraph. Note that nodes may be part of more than one biconnected component. Those nodes are articulation points, or cut vertices. The removal of articulation points will increase the number of connected components of the graph. Notice that by convention a dyad is considered a biconnected component. Parameters ---------- G : NetworkX Graph An undirected graph. Returns ------- graphs : generator Generator of graphs, one graph for each biconnected component. Raises ------ NetworkXNotImplemented : If the input graph is not undirected. Examples -------- >>> G = nx.lollipop_graph(5, 1) >>> print(nx.is_biconnected(G)) False >>> bicomponents = list(nx.biconnected_component_subgraphs(G)) >>> len(bicomponents) 2 >>> G.add_edge(0, 5) >>> print(nx.is_biconnected(G)) True >>> bicomponents = list(nx.biconnected_component_subgraphs(G)) >>> len(bicomponents) 1 You can generate a sorted list of biconnected components, largest first, using sort. >>> G.remove_edge(0, 5) >>> [len(c) for c in sorted(nx.biconnected_component_subgraphs(G), ... key=len, reverse=True)] [5, 2] If you only want the largest connected component, it's more efficient to use max instead of sort. >>> Gc = max(nx.biconnected_component_subgraphs(G), key=len) See Also -------- is_biconnected, articulation_points, biconnected_component_edges, biconnected_components Notes ----- The algorithm to find articulation points and biconnected components is implemented using a non-recursive depth-first-search (DFS) that keeps track of the highest level that back edges reach in the DFS tree. 
A node `n` is an articulation point if, and only if, there exists a subtree rooted at `n` such that there is no back edge from any successor of `n` that links to a predecessor of `n` in the DFS tree. By keeping track of all the edges traversed by the DFS we can obtain the biconnected components because all edges of a bicomponent will be traversed consecutively between articulation points. Graph, node, and edge attributes are copied to the subgraphs. References ---------- .. [1] Hopcroft, J.; Tarjan, R. (1973). "Efficient algorithms for graph manipulation". Communications of the ACM 16: 372–378. doi:10.1145/362248.362272 """ for comp_nodes in biconnected_components(G): if copy: yield G.subgraph(comp_nodes).copy() else: yield G.subgraph(comp_nodes) @not_implemented_for('directed') def articulation_points(G): """Return a generator of articulation points, or cut vertices, of a graph. An articulation point or cut vertex is any node whose removal (along with all its incident edges) increases the number of connected components of a graph. An undirected connected graph without articulation points is biconnected. Articulation points belong to more than one biconnected component of a graph. Notice that by convention a dyad is considered a biconnected component. Parameters ---------- G : NetworkX Graph An undirected graph. Returns ------- articulation points : generator generator of nodes Raises ------ NetworkXNotImplemented : If the input graph is not undirected. 
Examples -------- >>> G = nx.barbell_graph(4, 2) >>> print(nx.is_biconnected(G)) False >>> len(list(nx.articulation_points(G))) 4 >>> G.add_edge(2, 8) >>> print(nx.is_biconnected(G)) True >>> len(list(nx.articulation_points(G))) 0 See Also -------- is_biconnected, biconnected_components, biconnected_component_edges, biconnected_component_subgraphs Notes ----- The algorithm to find articulation points and biconnected components is implemented using a non-recursive depth-first-search (DFS) that keeps track of the highest level that back edges reach in the DFS tree. A node `n` is an articulation point if, and only if, there exists a subtree rooted at `n` such that there is no back edge from any successor of `n` that links to a predecessor of `n` in the DFS tree. By keeping track of all the edges traversed by the DFS we can obtain the biconnected components because all edges of a bicomponent will be traversed consecutively between articulation points. References ---------- .. [1] Hopcroft, J.; Tarjan, R. (1973). "Efficient algorithms for graph manipulation". Communications of the ACM 16: 372–378. 
          doi:10.1145/362248.362272
    """
    return _biconnected_dfs(G, components=False)


@not_implemented_for('directed')
def _biconnected_dfs(G, components=True):
    # Non-recursive depth-first search to generate articulation points
    # and biconnected components.
    visited = set()
    for start in G:
        if start in visited:
            continue
        discovery = {start: 0}  # "time" of first discovery of node during search
        low = {start: 0}
        root_children = 0
        visited.add(start)
        edge_stack = []
        stack = [(start, start, iter(G[start]))]
        while stack:
            grandparent, parent, children = stack[-1]
            try:
                child = next(children)
                if grandparent == child:
                    continue
                if child in visited:
                    if discovery[child] <= discovery[parent]:  # back edge
                        low[parent] = min(low[parent], discovery[child])
                        if components:
                            edge_stack.append((parent, child))
                else:
                    low[child] = discovery[child] = len(discovery)
                    visited.add(child)
                    stack.append((parent, child, iter(G[child])))
                    if components:
                        edge_stack.append((parent, child))
            except StopIteration:
                stack.pop()
                if len(stack) > 1:
                    if low[parent] >= discovery[grandparent]:
                        if components:
                            ind = edge_stack.index((grandparent, parent))
                            yield edge_stack[ind:]
                            edge_stack = edge_stack[:ind]
                        else:
                            yield grandparent
                    low[grandparent] = min(low[parent], low[grandparent])
                elif stack:  # length 1, so grandparent is root
                    root_children += 1
                    if components:
                        ind = edge_stack.index((grandparent, parent))
                        yield edge_stack[ind:]
        if not components:
            # root node is an articulation point if it has more than 1 child
            if root_children > 1:
                yield start
networkx-1.11/networkx/algorithms/components/connected.py0000644000175000017500000001107712637544500023742 0ustar aricaric00000000000000# -*- coding: utf-8 -*-
"""
Connected components.
"""
# Copyright (C) 2004-2013 by
#   Aric Hagberg
#   Dan Schult
#   Pieter Swart
#   All rights reserved.
#   BSD license.
import networkx as nx from networkx.utils.decorators import not_implemented_for __authors__ = "\n".join(['Eben Kenah', 'Aric Hagberg ' 'Christopher Ellison']) __all__ = [ 'number_connected_components', 'connected_components', 'connected_component_subgraphs', 'is_connected', 'node_connected_component', ] @not_implemented_for('directed') def connected_components(G): """Generate connected components. Parameters ---------- G : NetworkX graph An undirected graph Returns ------- comp : generator of sets A generator of sets of nodes, one for each component of G. Examples -------- Generate a sorted list of connected components, largest first. >>> G = nx.path_graph(4) >>> G.add_path([10, 11, 12]) >>> [len(c) for c in sorted(nx.connected_components(G), key=len, reverse=True)] [4, 3] If you only want the largest connected component, it's more efficient to use max instead of sort. >>> largest_cc = max(nx.connected_components(G), key=len) See Also -------- strongly_connected_components Notes ----- For undirected graphs only. """ seen = set() for v in G: if v not in seen: c = set(_plain_bfs(G, v)) yield c seen.update(c) @not_implemented_for('directed') def connected_component_subgraphs(G, copy=True): """Generate connected components as subgraphs. Parameters ---------- G : NetworkX graph An undirected graph. copy: bool (default=True) If True make a copy of the graph attributes Returns ------- comp : generator A generator of graphs, one for each connected component of G. Examples -------- >>> G = nx.path_graph(4) >>> G.add_edge(5,6) >>> graphs = list(nx.connected_component_subgraphs(G)) If you only want the largest connected component, it's more efficient to use max than sort. >>> Gc = max(nx.connected_component_subgraphs(G), key=len) See Also -------- connected_components Notes ----- For undirected graphs only. Graph, node, and edge attributes are copied to the subgraphs by default. 
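The seen-set pattern used by `connected_components` above amounts to a flood fill from each unvisited node. A NetworkX-free sketch (the name `components` and the adjacency-dict input are illustrative assumptions, not the library API) also shows that counting with a generator avoids materializing every component, unlike `len(list(...))`:

```python
def components(adj):
    """Yield the connected components of an undirected graph given as a
    dict mapping each node to a list of neighbours."""
    seen = set()
    for v in adj:
        if v not in seen:
            comp, frontier = {v}, {v}
            while frontier:
                # expand one BFS level, keeping only unvisited nodes
                frontier = {w for u in frontier for w in adj[u]} - comp
                comp |= frontier
            seen |= comp
            yield comp

adj = {0: [1], 1: [0], 2: [3], 3: [2], 4: []}
assert sorted(map(sorted, components(adj))) == [[0, 1], [2, 3], [4]]
# counting lazily instead of len(list(components(adj)))
assert sum(1 for _ in components(adj)) == 3
```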
""" for c in connected_components(G): if copy: yield G.subgraph(c).copy() else: yield G.subgraph(c) def number_connected_components(G): """Return the number of connected components. Parameters ---------- G : NetworkX graph An undirected graph. Returns ------- n : integer Number of connected components See Also -------- connected_components Notes ----- For undirected graphs only. """ return len(list(connected_components(G))) @not_implemented_for('directed') def is_connected(G): """Return True if the graph is connected, false otherwise. Parameters ---------- G : NetworkX Graph An undirected graph. Returns ------- connected : bool True if the graph is connected, false otherwise. Examples -------- >>> G = nx.path_graph(4) >>> print(nx.is_connected(G)) True See Also -------- connected_components Notes ----- For undirected graphs only. """ if len(G) == 0: raise nx.NetworkXPointlessConcept('Connectivity is undefined ', 'for the null graph.') return len(set(_plain_bfs(G, next(G.nodes_iter())))) == len(G) @not_implemented_for('directed') def node_connected_component(G, n): """Return the nodes in the component of graph containing node n. Parameters ---------- G : NetworkX Graph An undirected graph. n : node label A node in G Returns ------- comp : set A set of nodes in the component of G containing node n. See Also -------- connected_components Notes ----- For undirected graphs only. """ return set(_plain_bfs(G, n)) def _plain_bfs(G, source): """A fast BFS node generator""" seen = set() nextlevel = {source} while nextlevel: thislevel = nextlevel nextlevel = set() for v in thislevel: if v not in seen: yield v seen.add(v) nextlevel.update(G[v]) networkx-1.11/networkx/algorithms/operators/0000755000175000017500000000000012653231454021250 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/operators/product.py0000644000175000017500000003230612637544500023307 0ustar aricaric00000000000000""" Graph products. 
""" # Copyright (C) 2011 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from itertools import product __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)' 'Ben Edwards(bedwards@cs.unm.edu)']) __all__ = ['tensor_product', 'cartesian_product', 'lexicographic_product', 'strong_product', 'power'] def _dict_product(d1, d2): return dict((k, (d1.get(k), d2.get(k))) for k in set(d1) | set(d2)) # Generators for producting graph products def _node_product(G, H): for u, v in product(G, H): yield ((u, v), _dict_product(G.node[u], H.node[v])) def _directed_edges_cross_edges(G, H): if not G.is_multigraph() and not H.is_multigraph(): for u, v, c in G.edges_iter(data=True): for x, y, d in H.edges_iter(data=True): yield (u, x), (v, y), _dict_product(c, d) if not G.is_multigraph() and H.is_multigraph(): for u, v, c in G.edges_iter(data=True): for x, y, k, d in H.edges_iter(data=True, keys=True): yield (u, x), (v, y), k, _dict_product(c, d) if G.is_multigraph() and not H.is_multigraph(): for u, v, k, c in G.edges_iter(data=True, keys=True): for x, y, d in H.edges_iter(data=True): yield (u, x), (v, y), k, _dict_product(c, d) if G.is_multigraph() and H.is_multigraph(): for u, v, j, c in G.edges_iter(data=True, keys=True): for x, y, k, d in H.edges_iter(data=True, keys=True): yield (u, x), (v, y), (j, k), _dict_product(c, d) def _undirected_edges_cross_edges(G, H): if not G.is_multigraph() and not H.is_multigraph(): for u, v, c in G.edges_iter(data=True): for x, y, d in H.edges_iter(data=True): yield (v, x), (u, y), _dict_product(c, d) if not G.is_multigraph() and H.is_multigraph(): for u, v, c in G.edges_iter(data=True): for x, y, k, d in H.edges_iter(data=True, keys=True): yield (v, x), (u, y), k, _dict_product(c, d) if G.is_multigraph() and not H.is_multigraph(): for u, v, k, c in G.edges_iter(data=True, keys=True): for x, y, d in 
H.edges_iter(data=True): yield (v, x), (u, y), k, _dict_product(c, d) if G.is_multigraph() and H.is_multigraph(): for u, v, j, c in G.edges_iter(data=True, keys=True): for x, y, k, d in H.edges_iter(data=True, keys=True): yield (v, x), (u, y), (j, k), _dict_product(c, d) def _edges_cross_nodes(G, H): if G.is_multigraph(): for u, v, k, d in G.edges_iter(data=True, keys=True): for x in H: yield (u, x), (v, x), k, d else: for u, v, d in G.edges_iter(data=True): for x in H: if H.is_multigraph(): yield (u, x), (v, x), None, d else: yield (u, x), (v, x), d def _nodes_cross_edges(G, H): if H.is_multigraph(): for x in G: for u, v, k, d in H.edges_iter(data=True, keys=True): yield (x, u), (x, v), k, d else: for x in G: for u, v, d in H.edges_iter(data=True): if G.is_multigraph(): yield (x, u), (x, v), None, d else: yield (x, u), (x, v), d def _edges_cross_nodes_and_nodes(G, H): if G.is_multigraph(): for u, v, k, d in G.edges_iter(data=True, keys=True): for x in H: for y in H: yield (u, x), (v, y), k, d else: for u, v, d in G.edges_iter(data=True): for x in H: for y in H: if H.is_multigraph(): yield (u, x), (v, y), None, d else: yield (u, x), (v, y), d def _init_product_graph(G, H): if not G.is_directed() == H.is_directed(): raise nx.NetworkXError("G and H must be both directed or", "both undirected") if G.is_multigraph() or H.is_multigraph(): GH = nx.MultiGraph() else: GH = nx.Graph() if G.is_directed(): GH = GH.to_directed() return GH def tensor_product(G, H): r"""Return the tensor product of G and H. The tensor product P of the graphs G and H has a node set that is the Cartesian product of the node sets, :math:`V(P)=V(G) \times V(H)`. P has an edge ((u,v),(x,y)) if and only if (u,x) is an edge in G and (v,y) is an edge in H. Tensor product is sometimes also referred to as the categorical product, direct product, cardinal product or conjunction. Parameters ---------- G, H: graphs Networkx graphs. Returns ------- P: NetworkX graph The tensor product of G and H. 
P will be a multi-graph if either G or H is a multi-graph, will be a directed if G and H are directed, and undirected if G and H are undirected. Raises ------ NetworkXError If G and H are not both directed or both undirected. Notes ----- Node attributes in P are two-tuple of the G and H node attributes. Missing attributes are assigned None. Examples -------- >>> G = nx.Graph() >>> H = nx.Graph() >>> G.add_node(0,a1=True) >>> H.add_node('a',a2='Spam') >>> P = nx.tensor_product(G,H) >>> P.nodes() [(0, 'a')] Edge attributes and edge keys (for multigraphs) are also copied to the new product graph """ GH = _init_product_graph(G, H) GH.add_nodes_from(_node_product(G, H)) GH.add_edges_from(_directed_edges_cross_edges(G, H)) if not GH.is_directed(): GH.add_edges_from(_undirected_edges_cross_edges(G, H)) GH.name = "Tensor product(" + G.name + "," + H.name + ")" return GH def cartesian_product(G, H): """Return the Cartesian product of G and H. The Cartesian product P of the graphs G and H has a node set that is the Cartesian product of the node sets, :math:`V(P)=V(G) \times V(H)`. P has an edge ((u,v),(x,y)) if and only if either u is equal to x and v & y are adjacent in H or if v is equal to y and u & x are adjacent in G. Parameters ---------- G, H: graphs Networkx graphs. Returns ------- P: NetworkX graph The Cartesian product of G and H. P will be a multi-graph if either G or H is a multi-graph. Will be a directed if G and H are directed, and undirected if G and H are undirected. Raises ------ NetworkXError If G and H are not both directed or both undirected. Notes ----- Node attributes in P are two-tuple of the G and H node attributes. Missing attributes are assigned None. 
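The attribute rule stated in the Notes (node attributes become two-tuples, with None for a key missing on one side) is what `_dict_product` at the top of this file implements. A standalone restatement, under the illustrative name `dict_product`:

```python
def dict_product(d1, d2):
    # every key present in either dict maps to a (d1 value, d2 value)
    # pair; a side missing the key contributes None
    return {k: (d1.get(k), d2.get(k)) for k in set(d1) | set(d2)}

assert dict_product({'a': 1}, {'a': 2, 'b': 3}) == {'a': (1, 2), 'b': (None, 3)}
```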
Examples -------- >>> G = nx.Graph() >>> H = nx.Graph() >>> G.add_node(0,a1=True) >>> H.add_node('a',a2='Spam') >>> P = nx.cartesian_product(G,H) >>> P.nodes() [(0, 'a')] Edge attributes and edge keys (for multigraphs) are also copied to the new product graph """ if not G.is_directed() == H.is_directed(): raise nx.NetworkXError("G and H must be both directed or", "both undirected") GH = _init_product_graph(G, H) GH.add_nodes_from(_node_product(G, H)) GH.add_edges_from(_edges_cross_nodes(G, H)) GH.add_edges_from(_nodes_cross_edges(G, H)) GH.name = "Cartesian product(" + G.name + "," + H.name + ")" return GH def lexicographic_product(G, H): """Return the lexicographic product of G and H. The lexicographical product P of the graphs G and H has a node set that is the Cartesian product of the node sets, $V(P)=V(G) \times V(H)$. P has an edge ((u,v),(x,y)) if and only if (u,v) is an edge in G or u==v and (x,y) is an edge in H. Parameters ---------- G, H: graphs Networkx graphs. Returns ------- P: NetworkX graph The Cartesian product of G and H. P will be a multi-graph if either G or H is a multi-graph. Will be a directed if G and H are directed, and undirected if G and H are undirected. Raises ------ NetworkXError If G and H are not both directed or both undirected. Notes ----- Node attributes in P are two-tuple of the G and H node attributes. Missing attributes are assigned None. 
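The edge rules of the tensor, Cartesian, and lexicographic products described in the docstrings above can be checked by brute force on tiny undirected simple graphs. This sketch (`product_edges` is an illustrative helper, not library code) builds the product edge set directly from the definitions:

```python
from itertools import product

def product_edges(Gn, Ge, Hn, He, rule):
    """Edge set of the graph product of (Gn, Ge) and (Hn, He) under `rule`,
    for undirected simple graphs given as node lists and edge-tuple sets."""
    has = lambda E, a, b: (a, b) in E or (b, a) in E
    edges = set()
    for (u, x), (v, y) in product(product(Gn, Hn), product(Gn, Hn)):
        if (u, x) == (v, y):
            continue
        if rule == 'tensor':          # edge in G AND edge in H
            ok = has(Ge, u, v) and has(He, x, y)
        elif rule == 'cartesian':     # step in exactly one factor
            ok = (u == v and has(He, x, y)) or (x == y and has(Ge, u, v))
        else:                         # 'lexicographic': edge in G, or same G-node
            ok = has(Ge, u, v) or (u == v and has(He, x, y))
        if ok:
            edges.add(frozenset([(u, x), (v, y)]))
    return edges

# K2 with itself: tensor -> perfect matching (2 edges),
# cartesian -> 4-cycle (4 edges), lexicographic -> K4 (6 edges)
Gn, Ge = [0, 1], {(0, 1)}
assert len(product_edges(Gn, Ge, Gn, Ge, 'tensor')) == 2
assert len(product_edges(Gn, Ge, Gn, Ge, 'cartesian')) == 4
assert len(product_edges(Gn, Ge, Gn, Ge, 'lexicographic')) == 6
```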
Examples -------- >>> G = nx.Graph() >>> H = nx.Graph() >>> G.add_node(0,a1=True) >>> H.add_node('a',a2='Spam') >>> P = nx.lexicographic_product(G,H) >>> P.nodes() [(0, 'a')] Edge attributes and edge keys (for multigraphs) are also copied to the new product graph """ GH = _init_product_graph(G, H) GH.add_nodes_from(_node_product(G, H)) # Edges in G regardless of H designation GH.add_edges_from(_edges_cross_nodes_and_nodes(G, H)) # For each x in G, only if there is an edge in H GH.add_edges_from(_nodes_cross_edges(G, H)) GH.name = "Lexicographic product(" + G.name + "," + H.name + ")" return GH def strong_product(G, H): """Return the strong product of G and H. The strong product P of the graphs G and H has a node set that is the Cartesian product of the node sets, $V(P)=V(G) \times V(H)$. P has an edge ((u,v),(x,y)) if and only if u==v and (x,y) is an edge in H, or x==y and (u,v) is an edge in G, or (u,v) is an edge in G and (x,y) is an edge in H. Parameters ---------- G, H: graphs Networkx graphs. Returns ------- P: NetworkX graph The Cartesian product of G and H. P will be a multi-graph if either G or H is a multi-graph. Will be a directed if G and H are directed, and undirected if G and H are undirected. Raises ------ NetworkXError If G and H are not both directed or both undirected. Notes ----- Node attributes in P are two-tuple of the G and H node attributes. Missing attributes are assigned None. 
    Examples
    --------
    >>> G = nx.Graph()
    >>> H = nx.Graph()
    >>> G.add_node(0, a1=True)
    >>> H.add_node('a', a2='Spam')
    >>> P = nx.strong_product(G, H)
    >>> P.nodes()
    [(0, 'a')]

    Edge attributes and edge keys (for multigraphs) are also copied to the
    new product graph.
    """
    GH = _init_product_graph(G, H)
    GH.add_nodes_from(_node_product(G, H))
    GH.add_edges_from(_nodes_cross_edges(G, H))
    GH.add_edges_from(_edges_cross_nodes(G, H))
    GH.add_edges_from(_directed_edges_cross_edges(G, H))
    if not GH.is_directed():
        GH.add_edges_from(_undirected_edges_cross_edges(G, H))
    GH.name = "Strong product(" + G.name + "," + H.name + ")"
    return GH


def power(G, k):
    """Returns the specified power of a graph.

    The `k`-th power of a simple graph `G = (V, E)` is the graph `G^k`
    whose vertex set is `V` and in which two distinct vertices `u, v` are
    adjacent if and only if the shortest path distance between `u` and `v`
    in `G` is at most `k`.

    Parameters
    ----------
    G : graph
        A NetworkX simple graph object.

    k : positive integer
        The power to which to raise the graph `G`.

    Returns
    -------
    NetworkX simple graph
        `G` to the `k`-th power.

    Raises
    ------
    :exc:`ValueError`
        If the exponent `k` is not positive.

    NetworkXError
        If G is not a simple graph.

    Examples
    --------
    >>> G = nx.path_graph(4)
    >>> nx.power(G, 2).edges()
    [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
    >>> nx.power(G, 3).edges()
    [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]

    A complete graph of order n is returned if *k* is greater than or equal
    to n/2 for a cycle graph of even order n, and if *k* is greater than or
    equal to (n-1)/2 for a cycle graph of odd order n.

    >>> G = nx.cycle_graph(5)
    >>> nx.power(G, 2).edges() == nx.complete_graph(5).edges()
    True
    >>> G = nx.cycle_graph(8)
    >>> nx.power(G, 4).edges() == nx.complete_graph(8).edges()
    True

    References
    ----------
    .. [1] J. A. Bondy, U. S. R. Murty, Graph Theory. Springer, 2008.

    Notes
    -----
    Exercise 3.1.6 of *Graph Theory* by J. A. Bondy and U. S. R. Murty [1]_.
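The `power` docstring above defines `G^k` by shortest-path distance at most `k`; the same construction can be sketched standalone as a BFS with a depth cutoff from every node. `graph_power` and the plain adjacency-dict input are illustrative assumptions, not the library API:

```python
def graph_power(adj, k):
    """k-th power of an undirected graph given as a dict mapping each
    node to a list of neighbours; returns an adjacency dict of sets."""
    if k <= 0:
        raise ValueError('k must be a positive integer')
    power_adj = {n: set() for n in adj}
    for n in adj:
        seen = {}  # node -> hop count when first reached
        level, frontier = 1, set(adj[n])
        while frontier and level <= k:
            for v in frontier:
                if v != n and v not in seen:  # ignore self loops and revisits
                    seen[v] = level
            frontier = {w for v in frontier for w in adj[v]} - set(seen) - {n}
            level += 1
        for v in seen:  # connect n to everything within k hops
            power_adj[n].add(v)
            power_adj[v].add(n)
    return power_adj

# P4 = 0-1-2-3; squaring adds the two-hop edges (0,2) and (1,3)
p4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
sq = graph_power(p4, 2)
assert sq[0] == {1, 2} and sq[1] == {0, 2, 3}
```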
""" if k <= 0: raise ValueError('k must be a positive integer') if G.is_multigraph() or G.is_directed(): raise nx.NetworkXError("G should be a simple graph") H = nx.Graph() # update BFS code to ignore self loops. for n in G: seen = {} # level (number of hops) when seen in BFS level = 1 # the current level nextlevel = G[n] while nextlevel: thislevel = nextlevel # advance to next level nextlevel = {} # and start a new list (fringe) for v in thislevel: if v == n: # avoid self loop continue if v not in seen: seen[v] = level # set the level of vertex v nextlevel.update(G[v]) # add neighbors of v if (k <= level): break level = level + 1 H.add_edges_from((n, nbr) for nbr in seen) return H networkx-1.11/networkx/algorithms/operators/__init__.py0000644000175000017500000000031112637544450023361 0ustar aricaric00000000000000from networkx.algorithms.operators.all import * from networkx.algorithms.operators.binary import * from networkx.algorithms.operators.product import * from networkx.algorithms.operators.unary import * networkx-1.11/networkx/algorithms/operators/tests/0000755000175000017500000000000012653231454022412 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/operators/tests/test_unary.py0000644000175000017500000000246012637544450025170 0ustar aricaric00000000000000from nose.tools import * import networkx as nx from networkx import * def test_complement(): null=null_graph() empty1=empty_graph(1) empty10=empty_graph(10) K3=complete_graph(3) K5=complete_graph(5) K10=complete_graph(10) P2=path_graph(2) P3=path_graph(3) P5=path_graph(5) P10=path_graph(10) #complement of the complete graph is empty G=complement(K3) assert_true(is_isomorphic(G,empty_graph(3))) G=complement(K5) assert_true(is_isomorphic(G,empty_graph(5))) # for any G, G=complement(complement(G)) P3cc=complement(complement(P3)) assert_true(is_isomorphic(P3,P3cc)) nullcc=complement(complement(null)) assert_true(is_isomorphic(null,nullcc)) b=bull_graph() bcc=complement(complement(b)) 
assert_true(is_isomorphic(b,bcc)) def test_complement_2(): G1=nx.DiGraph() G1.add_edge('A','B') G1.add_edge('A','C') G1.add_edge('A','D') G1C=complement(G1) assert_equal(sorted(G1C.edges()), [('B', 'A'), ('B', 'C'), ('B', 'D'), ('C', 'A'), ('C', 'B'), ('C', 'D'), ('D', 'A'), ('D', 'B'), ('D', 'C')]) def test_reverse1(): # Other tests for reverse are done by the DiGraph and MultiDigraph. G1=nx.Graph() assert_raises(nx.NetworkXError, nx.reverse, G1) networkx-1.11/networkx/algorithms/operators/tests/test_all.py0000644000175000017500000001231012637544500024571 0ustar aricaric00000000000000from nose.tools import * import networkx as nx from networkx.testing import * def test_union_all_attributes(): g = nx.Graph() g.add_node(0, x=4) g.add_node(1, x=5) g.add_edge(0, 1, size=5) g.graph['name'] = 'g' h = g.copy() h.graph['name'] = 'h' h.graph['attr'] = 'attr' h.node[0]['x'] = 7 j = g.copy() j.graph['name'] = 'j' j.graph['attr'] = 'attr' j.node[0]['x'] = 7 ghj = nx.union_all([g, h, j], rename=('g', 'h', 'j')) assert_equal( set(ghj.nodes()) , set(['h0', 'h1', 'g0', 'g1', 'j0', 'j1']) ) for n in ghj: graph, node = n assert_equal( ghj.node[n], eval(graph).node[int(node)] ) assert_equal(ghj.graph['attr'],'attr') assert_equal(ghj.graph['name'],'j') # j graph attributes take precendent def test_intersection_all(): G=nx.Graph() H=nx.Graph() R=nx.Graph() G.add_nodes_from([1,2,3,4]) G.add_edge(1,2) G.add_edge(2,3) H.add_nodes_from([1,2,3,4]) H.add_edge(2,3) H.add_edge(3,4) R.add_nodes_from([1,2,3,4]) R.add_edge(2,3) R.add_edge(4,1) I=nx.intersection_all([G,H,R]) assert_equal( set(I.nodes()) , set([1,2,3,4]) ) assert_equal( sorted(I.edges()) , [(2,3)] ) def test_intersection_all_attributes(): g = nx.Graph() g.add_node(0, x=4) g.add_node(1, x=5) g.add_edge(0, 1, size=5) g.graph['name'] = 'g' h = g.copy() h.graph['name'] = 'h' h.graph['attr'] = 'attr' h.node[0]['x'] = 7 gh = nx.intersection_all([g, h]) assert_equal( set(gh.nodes()) , set(g.nodes()) ) assert_equal( set(gh.nodes()) , 
set(h.nodes()) ) assert_equal( sorted(gh.edges()) , sorted(g.edges()) ) h.remove_node(0) assert_raises(nx.NetworkXError, nx.intersection, g, h) def test_intersection_all_multigraph_attributes(): g = nx.MultiGraph() g.add_edge(0, 1, key=0) g.add_edge(0, 1, key=1) g.add_edge(0, 1, key=2) h = nx.MultiGraph() h.add_edge(0, 1, key=0) h.add_edge(0, 1, key=3) gh = nx.intersection_all([g, h]) assert_equal( set(gh.nodes()) , set(g.nodes()) ) assert_equal( set(gh.nodes()) , set(h.nodes()) ) assert_equal( sorted(gh.edges()) , [(0,1)] ) assert_equal( sorted(gh.edges(keys=True)) , [(0,1,0)] ) def test_union_all_and_compose_all(): K3=nx.complete_graph(3) P3=nx.path_graph(3) G1=nx.DiGraph() G1.add_edge('A','B') G1.add_edge('A','C') G1.add_edge('A','D') G2=nx.DiGraph() G2.add_edge('1','2') G2.add_edge('1','3') G2.add_edge('1','4') G=nx.union_all([G1,G2]) H=nx.compose_all([G1,G2]) assert_edges_equal(G.edges(),H.edges()) assert_false(G.has_edge('A','1')) assert_raises(nx.NetworkXError, nx.union, K3, P3) H1=nx.union_all([H,G1],rename=('H','G1')) assert_equal(sorted(H1.nodes()), ['G1A', 'G1B', 'G1C', 'G1D', 'H1', 'H2', 'H3', 'H4', 'HA', 'HB', 'HC', 'HD']) H2=nx.union_all([H,G2],rename=("H","")) assert_equal(sorted(H2.nodes()), ['1', '2', '3', '4', 'H1', 'H2', 'H3', 'H4', 'HA', 'HB', 'HC', 'HD']) assert_false(H1.has_edge('NB','NA')) G=nx.compose_all([G,G]) assert_edges_equal(G.edges(),H.edges()) G2=nx.union_all([G2,G2],rename=('','copy')) assert_equal(sorted(G2.nodes()), ['1', '2', '3', '4', 'copy1', 'copy2', 'copy3', 'copy4']) assert_equal(G2.neighbors('copy4'),[]) assert_equal(sorted(G2.neighbors('copy1')),['copy2', 'copy3', 'copy4']) assert_equal(len(G),8) assert_equal(nx.number_of_edges(G),6) E=nx.disjoint_union_all([G,G]) assert_equal(len(E),16) assert_equal(nx.number_of_edges(E),12) E=nx.disjoint_union_all([G1,G2]) assert_equal(sorted(E.nodes()),[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]) G1=nx.DiGraph() G1.add_edge('A','B') G2=nx.DiGraph() G2.add_edge(1,2) G3=nx.DiGraph() 
G3.add_edge(11,22) G4=nx.union_all([G1,G2,G3],rename=("G1","G2","G3")) assert_equal(sorted(G4.nodes()), ['G1A', 'G1B', 'G21', 'G22', 'G311', 'G322']) def test_union_all_multigraph(): G=nx.MultiGraph() G.add_edge(1,2,key=0) G.add_edge(1,2,key=1) H=nx.MultiGraph() H.add_edge(3,4,key=0) H.add_edge(3,4,key=1) GH=nx.union_all([G,H]) assert_equal( set(GH) , set(G)|set(H)) assert_equal( set(GH.edges(keys=True)) , set(G.edges(keys=True))|set(H.edges(keys=True))) def test_input_output(): l = [nx.Graph([(1,2)]),nx.Graph([(3,4)])] U = nx.disjoint_union_all(l) assert_equal(len(l),2) C = nx.compose_all(l) assert_equal(len(l),2) l = [nx.Graph([(1,2)]),nx.Graph([(1,2)])] R = nx.intersection_all(l) assert_equal(len(l),2) @raises(nx.NetworkXError) def test_mixed_type_union(): G = nx.Graph() H = nx.MultiGraph() I = nx.Graph() U = nx.union_all([G,H,I]) @raises(nx.NetworkXError) def test_mixed_type_disjoint_union(): G = nx.Graph() H = nx.MultiGraph() I = nx.Graph() U = nx.disjoint_union_all([G,H,I]) @raises(nx.NetworkXError) def test_mixed_type_intersection(): G = nx.Graph() H = nx.MultiGraph() I = nx.Graph() U = nx.intersection_all([G,H,I]) @raises(nx.NetworkXError) def test_mixed_type_compose(): G = nx.Graph() H = nx.MultiGraph() I = nx.Graph() U = nx.compose_all([G,H,I]) networkx-1.11/networkx/algorithms/operators/tests/test_product.py0000644000175000017500000003066612637544500025517 0ustar aricaric00000000000000import networkx as nx from networkx import tensor_product, cartesian_product, lexicographic_product, strong_product from nose.tools import assert_raises, assert_true, assert_equal, raises @raises(nx.NetworkXError) def test_tensor_product_raises(): P = tensor_product(nx.DiGraph(), nx.Graph()) def test_tensor_product_null(): null = nx.null_graph() empty10 = nx.empty_graph(10) K3 = nx.complete_graph(3) K10 = nx.complete_graph(10) P3 = nx.path_graph(3) P10 = nx.path_graph(10) # null graph G = tensor_product(null, null) assert_true(nx.is_isomorphic(G, null)) # null_graph X 
anything = null_graph and v.v. G = tensor_product(null, empty10) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(null, K3) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(null, K10) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(null, P3) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(null, P10) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(empty10, null) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(K3, null) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(K10, null) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(P3, null) assert_true(nx.is_isomorphic(G, null)) G = tensor_product(P10, null) assert_true(nx.is_isomorphic(G, null)) def test_tensor_product_size(): P5 = nx.path_graph(5) K3 = nx.complete_graph(3) K5 = nx.complete_graph(5) G = tensor_product(P5, K3) assert_equal(nx.number_of_nodes(G), 5 * 3) G = tensor_product(K3, K5) assert_equal(nx.number_of_nodes(G), 3 * 5) def test_tensor_product_combinations(): # basic smoke test, more realistic tests would be usefule P5 = nx.path_graph(5) K3 = nx.complete_graph(3) G = tensor_product(P5, K3) assert_equal(nx.number_of_nodes(G), 5 * 3) G = tensor_product(P5, nx.MultiGraph(K3)) assert_equal(nx.number_of_nodes(G), 5 * 3) G = tensor_product(nx.MultiGraph(P5), K3) assert_equal(nx.number_of_nodes(G), 5 * 3) G = tensor_product(nx.MultiGraph(P5), nx.MultiGraph(K3)) assert_equal(nx.number_of_nodes(G), 5 * 3) G = tensor_product(nx.DiGraph(P5), nx.DiGraph(K3)) assert_equal(nx.number_of_nodes(G), 5 * 3) def test_tensor_product_classic_result(): K2 = nx.complete_graph(2) G = nx.petersen_graph() G = tensor_product(G, K2) assert_true(nx.is_isomorphic(G, nx.desargues_graph())) G = nx.cycle_graph(5) G = tensor_product(G, K2) assert_true(nx.is_isomorphic(G, nx.cycle_graph(10))) G = nx.tetrahedral_graph() G = tensor_product(G, K2) assert_true(nx.is_isomorphic(G, nx.cubical_graph())) def test_tensor_product_random(): G = nx.erdos_renyi_graph(10, 2 / 
10.) H = nx.erdos_renyi_graph(10, 2 / 10.) GH = tensor_product(G, H) for (u_G, u_H) in GH.nodes_iter(): for (v_G, v_H) in GH.nodes_iter(): if H.has_edge(u_H, v_H) and G.has_edge(u_G, v_G): assert_true(GH.has_edge((u_G, u_H), (v_G, v_H))) else: assert_true(not GH.has_edge((u_G, u_H), (v_G, v_H))) def test_cartesian_product_multigraph(): G = nx.MultiGraph() G.add_edge(1, 2, key=0) G.add_edge(1, 2, key=1) H = nx.MultiGraph() H.add_edge(3, 4, key=0) H.add_edge(3, 4, key=1) GH = cartesian_product(G, H) assert_equal(set(GH), {(1, 3), (2, 3), (2, 4), (1, 4)}) assert_equal({(frozenset([u, v]), k) for u, v, k in GH.edges(keys=True)}, {(frozenset([u, v]), k) for u, v, k in [((1, 3), (2, 3), 0), ((1, 3), (2, 3), 1), ((1, 3), (1, 4), 0), ((1, 3), (1, 4), 1), ((2, 3), (2, 4), 0), ((2, 3), (2, 4), 1), ((2, 4), (1, 4), 0), ((2, 4), (1, 4), 1)]}) @raises(nx.NetworkXError) def test_cartesian_product_raises(): P = cartesian_product(nx.DiGraph(), nx.Graph()) def test_cartesian_product_null(): null = nx.null_graph() empty10 = nx.empty_graph(10) K3 = nx.complete_graph(3) K10 = nx.complete_graph(10) P3 = nx.path_graph(3) P10 = nx.path_graph(10) # null graph G = cartesian_product(null, null) assert_true(nx.is_isomorphic(G, null)) # null_graph X anything = null_graph and v.v. 
G = cartesian_product(null, empty10) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(null, K3) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(null, K10) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(null, P3) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(null, P10) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(empty10, null) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(K3, null) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(K10, null) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(P3, null) assert_true(nx.is_isomorphic(G, null)) G = cartesian_product(P10, null) assert_true(nx.is_isomorphic(G, null)) def test_cartesian_product_size(): # order(GXH)=order(G)*order(H) K5 = nx.complete_graph(5) P5 = nx.path_graph(5) K3 = nx.complete_graph(3) G = cartesian_product(P5, K3) assert_equal(nx.number_of_nodes(G), 5 * 3) assert_equal(nx.number_of_edges(G), nx.number_of_edges(P5) * nx.number_of_nodes(K3) + nx.number_of_edges(K3) * nx.number_of_nodes(P5)) G = cartesian_product(K3, K5) assert_equal(nx.number_of_nodes(G), 3 * 5) assert_equal(nx.number_of_edges(G), nx.number_of_edges(K5) * nx.number_of_nodes(K3) + nx.number_of_edges(K3) * nx.number_of_nodes(K5)) def test_cartesian_product_classic(): # test some classic product graphs P2 = nx.path_graph(2) P3 = nx.path_graph(3) # cube = 2-path X 2-path G = cartesian_product(P2, P2) G = cartesian_product(P2, G) assert_true(nx.is_isomorphic(G, nx.cubical_graph())) # 3x3 grid G = cartesian_product(P3, P3) assert_true(nx.is_isomorphic(G, nx.grid_2d_graph(3, 3))) def test_cartesian_product_random(): G = nx.erdos_renyi_graph(10, 2 / 10.) H = nx.erdos_renyi_graph(10, 2 / 10.) 
GH = cartesian_product(G, H) for (u_G, u_H) in GH.nodes_iter(): for (v_G, v_H) in GH.nodes_iter(): if (u_G == v_G and H.has_edge(u_H, v_H)) or \ (u_H == v_H and G.has_edge(u_G, v_G)): assert_true(GH.has_edge((u_G, u_H), (v_G, v_H))) else: assert_true(not GH.has_edge((u_G, u_H), (v_G, v_H))) @raises(nx.NetworkXError) def test_lexicographic_product_raises(): P = lexicographic_product(nx.DiGraph(), nx.Graph()) def test_lexicographic_product_null(): null = nx.null_graph() empty10 = nx.empty_graph(10) K3 = nx.complete_graph(3) K10 = nx.complete_graph(10) P3 = nx.path_graph(3) P10 = nx.path_graph(10) # null graph G = lexicographic_product(null, null) assert_true(nx.is_isomorphic(G, null)) # null_graph X anything = null_graph and v.v. G = lexicographic_product(null, empty10) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(null, K3) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(null, K10) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(null, P3) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(null, P10) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(empty10, null) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(K3, null) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(K10, null) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(P3, null) assert_true(nx.is_isomorphic(G, null)) G = lexicographic_product(P10, null) assert_true(nx.is_isomorphic(G, null)) def test_lexicographic_product_size(): K5 = nx.complete_graph(5) P5 = nx.path_graph(5) K3 = nx.complete_graph(3) G = lexicographic_product(P5, K3) assert_equal(nx.number_of_nodes(G), 5 * 3) G = lexicographic_product(K3, K5) assert_equal(nx.number_of_nodes(G), 3 * 5) def test_lexicographic_product_combinations(): P5 = nx.path_graph(5) K3 = nx.complete_graph(3) G = lexicographic_product(P5, K3) assert_equal(nx.number_of_nodes(G), 5 * 3) G = lexicographic_product(nx.MultiGraph(P5), K3) 
assert_equal(nx.number_of_nodes(G), 5 * 3) G = lexicographic_product(P5, nx.MultiGraph(K3)) assert_equal(nx.number_of_nodes(G), 5 * 3) G = lexicographic_product(nx.MultiGraph(P5), nx.MultiGraph(K3)) assert_equal(nx.number_of_nodes(G), 5 * 3) # No classic easily found classic results for lexicographic product def test_lexicographic_product_random(): G = nx.erdos_renyi_graph(10, 2 / 10.) H = nx.erdos_renyi_graph(10, 2 / 10.) GH = lexicographic_product(G, H) for (u_G, u_H) in GH.nodes_iter(): for (v_G, v_H) in GH.nodes_iter(): if G.has_edge(u_G, v_G) or (u_G == v_G and H.has_edge(u_H, v_H)): assert_true(GH.has_edge((u_G, u_H), (v_G, v_H))) else: assert_true(not GH.has_edge((u_G, u_H), (v_G, v_H))) @raises(nx.NetworkXError) def test_strong_product_raises(): P = strong_product(nx.DiGraph(), nx.Graph()) def test_strong_product_null(): null = nx.null_graph() empty10 = nx.empty_graph(10) K3 = nx.complete_graph(3) K10 = nx.complete_graph(10) P3 = nx.path_graph(3) P10 = nx.path_graph(10) # null graph G = strong_product(null, null) assert_true(nx.is_isomorphic(G, null)) # null_graph X anything = null_graph and v.v. 
G = strong_product(null, empty10) assert_true(nx.is_isomorphic(G, null)) G = strong_product(null, K3) assert_true(nx.is_isomorphic(G, null)) G = strong_product(null, K10) assert_true(nx.is_isomorphic(G, null)) G = strong_product(null, P3) assert_true(nx.is_isomorphic(G, null)) G = strong_product(null, P10) assert_true(nx.is_isomorphic(G, null)) G = strong_product(empty10, null) assert_true(nx.is_isomorphic(G, null)) G = strong_product(K3, null) assert_true(nx.is_isomorphic(G, null)) G = strong_product(K10, null) assert_true(nx.is_isomorphic(G, null)) G = strong_product(P3, null) assert_true(nx.is_isomorphic(G, null)) G = strong_product(P10, null) assert_true(nx.is_isomorphic(G, null)) def test_strong_product_size(): K5 = nx.complete_graph(5) P5 = nx.path_graph(5) K3 = nx.complete_graph(3) G = strong_product(P5, K3) assert_equal(nx.number_of_nodes(G), 5 * 3) G = strong_product(K3, K5) assert_equal(nx.number_of_nodes(G), 3 * 5) def test_strong_product_combinations(): P5 = nx.path_graph(5) K3 = nx.complete_graph(3) G = strong_product(P5, K3) assert_equal(nx.number_of_nodes(G), 5 * 3) G = strong_product(nx.MultiGraph(P5), K3) assert_equal(nx.number_of_nodes(G), 5 * 3) G = strong_product(P5, nx.MultiGraph(K3)) assert_equal(nx.number_of_nodes(G), 5 * 3) G = strong_product(nx.MultiGraph(P5), nx.MultiGraph(K3)) assert_equal(nx.number_of_nodes(G), 5 * 3) # No classic easily found classic results for strong product def test_strong_product_random(): G = nx.erdos_renyi_graph(10, 2 / 10.) H = nx.erdos_renyi_graph(10, 2 / 10.) 
GH = strong_product(G, H) for (u_G, u_H) in GH.nodes_iter(): for (v_G, v_H) in GH.nodes_iter(): if (u_G == v_G and H.has_edge(u_H, v_H)) or \ (u_H == v_H and G.has_edge(u_G, v_G)) or \ (G.has_edge(u_G, v_G) and H.has_edge(u_H, v_H)): assert_true(GH.has_edge((u_G, u_H), (v_G, v_H))) else: assert_true(not GH.has_edge((u_G, u_H), (v_G, v_H))) @raises(nx.NetworkXError) def test_graph_power_raises(): nx.power(nx.MultiDiGraph(), 2) def test_graph_power(): # wikipedia example for graph power G = nx.cycle_graph(7) G.add_edge(6, 7) G.add_edge(7, 8) G.add_edge(8, 9) G.add_edge(9, 2) H = nx.power(G, 2) assert_equal(H.edges(), [(0, 1), (0, 2), (0, 5), (0, 6), (0, 7), (1, 9), (1, 2), (1, 3), (1, 6), (2, 3), (2, 4), (2, 8), (2, 9), (3, 4), (3, 5), (3, 9), (4, 5), (4, 6), (5, 6), (5, 7), (6, 7), (6, 8), (7, 8), (7, 9), (8, 9)]) assert_raises(ValueError, nx.power, G, -1) networkx-1.11/networkx/algorithms/operators/tests/test_binary.py0000644000175000017500000002070612637544500025315 0ustar aricaric00000000000000from nose.tools import * import networkx as nx from networkx import * from networkx.testing import * def test_union_attributes(): g = nx.Graph() g.add_node(0, x=4) g.add_node(1, x=5) g.add_edge(0, 1, size=5) g.graph['name'] = 'g' h = g.copy() h.graph['name'] = 'h' h.graph['attr'] = 'attr' h.node[0]['x'] = 7 gh = nx.union(g, h, rename=('g', 'h')) assert_equal( set(gh.nodes()) , set(['h0', 'h1', 'g0', 'g1']) ) for n in gh: graph, node = n assert_equal( gh.node[n], eval(graph).node[int(node)] ) assert_equal(gh.graph['attr'],'attr') assert_equal(gh.graph['name'],'h') # h graph attributes take precendent def test_intersection(): G=nx.Graph() H=nx.Graph() G.add_nodes_from([1,2,3,4]) G.add_edge(1,2) G.add_edge(2,3) H.add_nodes_from([1,2,3,4]) H.add_edge(2,3) H.add_edge(3,4) I=nx.intersection(G,H) assert_equal( set(I.nodes()) , set([1,2,3,4]) ) assert_equal( sorted(I.edges()) , [(2,3)] ) def test_intersection_attributes(): g = nx.Graph() g.add_node(0, x=4) g.add_node(1, x=5) 
g.add_edge(0, 1, size=5) g.graph['name'] = 'g' h = g.copy() h.graph['name'] = 'h' h.graph['attr'] = 'attr' h.node[0]['x'] = 7 gh = nx.intersection(g, h) assert_equal( set(gh.nodes()) , set(g.nodes()) ) assert_equal( set(gh.nodes()) , set(h.nodes()) ) assert_equal( sorted(gh.edges()) , sorted(g.edges()) ) h.remove_node(0) assert_raises(nx.NetworkXError, nx.intersection, g, h) def test_intersection_multigraph_attributes(): g = nx.MultiGraph() g.add_edge(0, 1, key=0) g.add_edge(0, 1, key=1) g.add_edge(0, 1, key=2) h = nx.MultiGraph() h.add_edge(0, 1, key=0) h.add_edge(0, 1, key=3) gh = nx.intersection(g, h) assert_equal( set(gh.nodes()) , set(g.nodes()) ) assert_equal( set(gh.nodes()) , set(h.nodes()) ) assert_equal( sorted(gh.edges()) , [(0,1)] ) assert_equal( sorted(gh.edges(keys=True)) , [(0,1,0)] ) def test_difference(): G=nx.Graph() H=nx.Graph() G.add_nodes_from([1,2,3,4]) G.add_edge(1,2) G.add_edge(2,3) H.add_nodes_from([1,2,3,4]) H.add_edge(2,3) H.add_edge(3,4) D=nx.difference(G,H) assert_equal( set(D.nodes()) , set([1,2,3,4]) ) assert_equal( sorted(D.edges()) , [(1,2)] ) D=nx.difference(H,G) assert_equal( set(D.nodes()) , set([1,2,3,4]) ) assert_equal( sorted(D.edges()) , [(3,4)] ) D=nx.symmetric_difference(G,H) assert_equal( set(D.nodes()) , set([1,2,3,4]) ) assert_equal( sorted(D.edges()) , [(1,2),(3,4)] ) def test_difference2(): G=nx.Graph() H=nx.Graph() G.add_nodes_from([1,2,3,4]) H.add_nodes_from([1,2,3,4]) G.add_edge(1,2) H.add_edge(1,2) G.add_edge(2,3) D=nx.difference(G,H) assert_equal( set(D.nodes()) , set([1,2,3,4]) ) assert_equal( sorted(D.edges()) , [(2,3)] ) D=nx.difference(H,G) assert_equal( set(D.nodes()) , set([1,2,3,4]) ) assert_equal( sorted(D.edges()) , [] ) H.add_edge(3,4) D=nx.difference(H,G) assert_equal( set(D.nodes()) , set([1,2,3,4]) ) assert_equal( sorted(D.edges()) , [(3,4)] ) def test_difference_attributes(): g = nx.Graph() g.add_node(0, x=4) g.add_node(1, x=5) g.add_edge(0, 1, size=5) g.graph['name'] = 'g' h = g.copy() 
h.graph['name'] = 'h' h.graph['attr'] = 'attr' h.node[0]['x'] = 7 gh = nx.difference(g, h) assert_equal( set(gh.nodes()) , set(g.nodes()) ) assert_equal( set(gh.nodes()) , set(h.nodes()) ) assert_equal( sorted(gh.edges()) , []) h.remove_node(0) assert_raises(nx.NetworkXError, nx.intersection, g, h) def test_difference_multigraph_attributes(): g = nx.MultiGraph() g.add_edge(0, 1, key=0) g.add_edge(0, 1, key=1) g.add_edge(0, 1, key=2) h = nx.MultiGraph() h.add_edge(0, 1, key=0) h.add_edge(0, 1, key=3) gh = nx.difference(g, h) assert_equal( set(gh.nodes()) , set(g.nodes()) ) assert_equal( set(gh.nodes()) , set(h.nodes()) ) assert_equal( sorted(gh.edges()) , [(0,1),(0,1)] ) assert_equal( sorted(gh.edges(keys=True)) , [(0,1,1),(0,1,2)] ) @raises(nx.NetworkXError) def test_difference_raise(): G = nx.path_graph(4) H = nx.path_graph(3) GH = nx.difference(G, H) def test_symmetric_difference_multigraph(): g = nx.MultiGraph() g.add_edge(0, 1, key=0) g.add_edge(0, 1, key=1) g.add_edge(0, 1, key=2) h = nx.MultiGraph() h.add_edge(0, 1, key=0) h.add_edge(0, 1, key=3) gh = nx.symmetric_difference(g, h) assert_equal( set(gh.nodes()) , set(g.nodes()) ) assert_equal( set(gh.nodes()) , set(h.nodes()) ) assert_equal( sorted(gh.edges()) , 3*[(0,1)] ) assert_equal( sorted(sorted(e) for e in gh.edges(keys=True)), [[0,1,1],[0,1,2],[0,1,3]] ) @raises(nx.NetworkXError) def test_symmetric_difference_raise(): G = nx.path_graph(4) H = nx.path_graph(3) GH = nx.symmetric_difference(G, H) def test_union_and_compose(): K3=complete_graph(3) P3=path_graph(3) G1=nx.DiGraph() G1.add_edge('A','B') G1.add_edge('A','C') G1.add_edge('A','D') G2=nx.DiGraph() G2.add_edge('1','2') G2.add_edge('1','3') G2.add_edge('1','4') G=union(G1,G2) H=compose(G1,G2) assert_edges_equal(G.edges(),H.edges()) assert_false(G.has_edge('A',1)) assert_raises(nx.NetworkXError, nx.union, K3, P3) H1=union(H,G1,rename=('H','G1')) assert_equal(sorted(H1.nodes()), ['G1A', 'G1B', 'G1C', 'G1D', 'H1', 'H2', 'H3', 'H4', 'HA', 'HB', 'HC', 
'HD']) H2=union(H,G2,rename=("H","")) assert_equal(sorted(H2.nodes()), ['1', '2', '3', '4', 'H1', 'H2', 'H3', 'H4', 'HA', 'HB', 'HC', 'HD']) assert_false(H1.has_edge('NB','NA')) G=compose(G,G) assert_edges_equal(G.edges(),H.edges()) G2=union(G2,G2,rename=('','copy')) assert_equal(sorted(G2.nodes()), ['1', '2', '3', '4', 'copy1', 'copy2', 'copy3', 'copy4']) assert_equal(G2.neighbors('copy4'),[]) assert_equal(sorted(G2.neighbors('copy1')),['copy2', 'copy3', 'copy4']) assert_equal(len(G),8) assert_equal(number_of_edges(G),6) E=disjoint_union(G,G) assert_equal(len(E),16) assert_equal(number_of_edges(E),12) E=disjoint_union(G1,G2) assert_equal(sorted(E.nodes()),[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]) def test_union_multigraph(): G=nx.MultiGraph() G.add_edge(1,2,key=0) G.add_edge(1,2,key=1) H=nx.MultiGraph() H.add_edge(3,4,key=0) H.add_edge(3,4,key=1) GH=nx.union(G,H) assert_equal( set(GH) , set(G)|set(H)) assert_equal( set(GH.edges(keys=True)) , set(G.edges(keys=True))|set(H.edges(keys=True))) def test_disjoint_union_multigraph(): G=nx.MultiGraph() G.add_edge(0,1,key=0) G.add_edge(0,1,key=1) H=nx.MultiGraph() H.add_edge(2,3,key=0) H.add_edge(2,3,key=1) GH=nx.disjoint_union(G,H) assert_equal( set(GH) , set(G)|set(H)) assert_equal( set(GH.edges(keys=True)) , set(G.edges(keys=True))|set(H.edges(keys=True))) def test_compose_multigraph(): G=nx.MultiGraph() G.add_edge(1,2,key=0) G.add_edge(1,2,key=1) H=nx.MultiGraph() H.add_edge(3,4,key=0) H.add_edge(3,4,key=1) GH=nx.compose(G,H) assert_equal( set(GH) , set(G)|set(H)) assert_equal( set(GH.edges(keys=True)) , set(G.edges(keys=True))|set(H.edges(keys=True))) H.add_edge(1,2,key=2) GH=nx.compose(G,H) assert_equal( set(GH) , set(G)|set(H)) assert_equal( set(GH.edges(keys=True)) , set(G.edges(keys=True))|set(H.edges(keys=True))) @raises(nx.NetworkXError) def test_mixed_type_union(): G = nx.Graph() H = nx.MultiGraph() U = nx.union(G,H) @raises(nx.NetworkXError) def test_mixed_type_disjoint_union(): G = nx.Graph() H = 
nx.MultiGraph() U = nx.disjoint_union(G,H) @raises(nx.NetworkXError) def test_mixed_type_intersection(): G = nx.Graph() H = nx.MultiGraph() U = nx.intersection(G,H) @raises(nx.NetworkXError) def test_mixed_type_difference(): G = nx.Graph() H = nx.MultiGraph() U = nx.difference(G,H) @raises(nx.NetworkXError) def test_mixed_type_symmetric_difference(): G = nx.Graph() H = nx.MultiGraph() U = nx.symmetric_difference(G,H) @raises(nx.NetworkXError) def test_mixed_type_compose(): G = nx.Graph() H = nx.MultiGraph() U = nx.compose(G,H) networkx-1.11/networkx/algorithms/operators/unary.py0000644000175000017500000000332312637544500022762 0ustar aricaric00000000000000"""Unary operations on graphs""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx __author__ = """\n""".join(['Aric Hagberg ', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) __all__ = ['complement', 'reverse'] def complement(G, name=None): """Return the graph complement of G. Parameters ---------- G : graph A NetworkX graph name : string Specify name for new graph Returns ------- GC : A new graph. Notes ------ Note that complement() does not create self-loops and also does not produce parallel edges for MultiGraphs. Graph, node, and edge data are not propagated to the new graph. """ if name is None: name = "complement(%s)" % (G.name) R = G.__class__() R.name = name R.add_nodes_from(G) R.add_edges_from(((n, n2) for n, nbrs in G.adjacency_iter() for n2 in G if n2 not in nbrs if n != n2)) return R def reverse(G, copy=True): """Return the reverse directed graph of G. Parameters ---------- G : directed graph A NetworkX directed graph copy : bool If True, then a new graph is returned. If False, then the graph is reversed in place. Returns ------- H : directed graph The reversed G. 
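As an added illustration of the reversal described here (a sketch, not part of the original docstring):

```python
import networkx as nx

# Reversing flips the direction of every edge; the node set is unchanged.
G = nx.DiGraph([(1, 2), (2, 3)])
H = nx.reverse(G)
print(sorted(H.edges()))  # [(2, 1), (3, 2)]
```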
""" if not G.is_directed(): raise nx.NetworkXError("Cannot reverse an undirected graph.") else: return G.reverse(copy=copy) networkx-1.11/networkx/algorithms/operators/all.py0000644000175000017500000001017412637544450022402 0ustar aricaric00000000000000"""Operations on many graphs. """ # Copyright (C) 2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. try: from itertools import izip_longest as zip_longest except ImportError: # Python3 has zip_longest from itertools import zip_longest import networkx as nx # from networkx.utils import is_string_like __author__ = """\n""".join([ 'Robert King ', 'Aric Hagberg ']) __all__ = ['union_all', 'compose_all', 'disjoint_union_all', 'intersection_all'] def union_all(graphs, rename=(None,), name=None): """Return the union of all graphs. The graphs must be disjoint, otherwise an exception is raised. Parameters ---------- graphs : list of graphs List of NetworkX graphs rename : bool , default=(None, None) Node names of G and H can be changed by specifying the tuple rename=('G-','H-') (for example). Node "u" in G is then renamed "G-u" and "v" in H is renamed "H-v". name : string Specify the name for the union graph@not_implemnted_for('direct Returns ------- U : a graph with the same type as the first graph in list Notes ----- To force a disjoint union with node relabeling, use disjoint_union_all(G,H) or convert_node_labels_to integers(). Graph, edge, and node attributes are propagated to the union graph. If a graph attribute is present in multiple graphs, then the value from the last graph in the list with that attribute is used. See Also -------- union disjoint_union_all """ graphs_names = zip_longest(graphs, rename) U, gname = next(graphs_names) for H, hname in graphs_names: U = nx.union(U, H, (gname, hname), name=name) gname = None return U def disjoint_union_all(graphs): """Return the disjoint union of all graphs. 
This operation forces distinct integer node labels starting with 0 for the first graph in the list and numbering consecutively. Parameters ---------- graphs : list List of NetworkX graphs Returns ------- U : A graph with the same type as the first graph in list Notes ----- It is recommended that the graphs be either all directed or all undirected. Graph, edge, and node attributes are propagated to the union graph. If a graph attribute is present in multiple graphs, then the value from the last graph in the list with that attribute is used. """ graphs = iter(graphs) U = next(graphs) for H in graphs: U = nx.disjoint_union(U, H) return U def compose_all(graphs, name=None): """Return the composition of all graphs. Composition is the simple union of the node sets and edge sets. The node sets of the supplied graphs need not be disjoint. Parameters ---------- graphs : list List of NetworkX graphs name : string Specify name for new graph Returns ------- C : A graph with the same type as the first graph in list Notes ----- It is recommended that the supplied graphs be either all directed or all undirected. Graph, edge, and node attributes are propagated to the union graph. If a graph attribute is present in multiple graphs, then the value from the last graph in the list with that attribute is used. """ graphs = iter(graphs) C = next(graphs) for H in graphs: C = nx.compose(C, H, name=name) return C def intersection_all(graphs): """Return a new graph that contains only the edges that exist in all graphs. All supplied graphs must have the same node set. Parameters ---------- graphs : list List of NetworkX graphs Returns ------- R : A new graph with the same type as the first graph in list Notes ----- Attributes from the graph, nodes, and edges are not copied to the new graph.
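The behaviour described above can be sketched with two small graphs on the same node set (an added illustration with made-up values):

```python
import networkx as nx

G = nx.Graph([(1, 2), (2, 3)])  # nodes {1, 2, 3}
H = nx.Graph([(1, 2), (1, 3)])  # same nodes, different edges

# Only the edge present in every input graph survives.
R = nx.intersection_all([G, H])
print(sorted(R.edges()))  # [(1, 2)]
```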
""" graphs = iter(graphs) R = next(graphs) for H in graphs: R = nx.intersection(R, H) return R networkx-1.11/networkx/algorithms/operators/binary.py0000644000175000017500000002333012637544500023110 0ustar aricaric00000000000000""" Operations on graphs including union, intersection, difference. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.utils import is_string_like __author__ = """\n""".join(['Aric Hagberg ', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) __all__ = ['union', 'compose', 'disjoint_union', 'intersection', 'difference', 'symmetric_difference'] def union(G, H, rename=(None, None), name=None): """ Return the union of graphs G and H. Graphs G and H must be disjoint, otherwise an exception is raised. Parameters ---------- G,H : graph A NetworkX graph create_using : NetworkX graph Use specified graph for result. Otherwise rename : bool , default=(None, None) Node names of G and H can be changed by specifying the tuple rename=('G-','H-') (for example). Node "u" in G is then renamed "G-u" and "v" in H is renamed "H-v". name : string Specify the name for the union graph Returns ------- U : A union graph with the same type as G. Notes ----- To force a disjoint union with node relabeling, use disjoint_union(G,H) or convert_node_labels_to integers(). Graph, edge, and node attributes are propagated from G and H to the union graph. If a graph attribute is present in both G and H the value from H is used. 
See Also -------- disjoint_union """ if not G.is_multigraph() == H.is_multigraph(): raise nx.NetworkXError('G and H must both be graphs or multigraphs.') # Union is the same type as G R = G.__class__() if name is None: name = "union( %s, %s )" % (G.name, H.name) R.name = name # rename graph to obtain disjoint node labels def add_prefix(graph, prefix): if prefix is None: return graph def label(x): if is_string_like(x): name = prefix + x else: name = prefix + repr(x) return name return nx.relabel_nodes(graph, label) G = add_prefix(G, rename[0]) H = add_prefix(H, rename[1]) if set(G) & set(H): raise nx.NetworkXError('The node sets of G and H are not disjoint.', 'Use appropriate rename=(Gprefix,Hprefix)' 'or use disjoint_union(G,H).') if G.is_multigraph(): G_edges = G.edges_iter(keys=True, data=True) else: G_edges = G.edges_iter(data=True) if H.is_multigraph(): H_edges = H.edges_iter(keys=True, data=True) else: H_edges = H.edges_iter(data=True) # add nodes R.add_nodes_from(G) R.add_edges_from(G_edges) # add edges R.add_nodes_from(H) R.add_edges_from(H_edges) # add node attributes R.node.update(G.node) R.node.update(H.node) # add graph attributes, H attributes take precedent over G attributes R.graph.update(G.graph) R.graph.update(H.graph) return R def disjoint_union(G, H): """ Return the disjoint union of graphs G and H. This algorithm forces distinct integer node labels. Parameters ---------- G,H : graph A NetworkX graph Returns ------- U : A union graph with the same type as G. Notes ----- A new graph is created, of the same class as G. It is recommended that G and H be either both directed or both undirected. The nodes of G are relabeled 0 to len(G)-1, and the nodes of H are relabeled len(G) to len(G)+len(H)-1. Graph, edge, and node attributes are propagated from G and H to the union graph. If a graph attribute is present in both G and H the value from H is used. 
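The relabeling described in these notes, sketched on two small path graphs (an added illustration):

```python
import networkx as nx

G = nx.path_graph(2)  # nodes 0, 1
H = nx.path_graph(3)  # nodes 0, 1, 2

# G keeps labels 0..1; H is shifted to 2..4, so the union is disjoint.
U = nx.disjoint_union(G, H)
print(sorted(U.nodes()))    # [0, 1, 2, 3, 4]
print(U.number_of_edges())  # 3
```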
""" R1 = nx.convert_node_labels_to_integers(G) R2 = nx.convert_node_labels_to_integers(H, first_label=len(R1)) R = union(R1, R2) R.name = "disjoint_union( %s, %s )" % (G.name, H.name) R.graph.update(G.graph) R.graph.update(H.graph) return R def intersection(G, H): """Return a new graph that contains only the edges that exist in both G and H. The node sets of H and G must be the same. Parameters ---------- G,H : graph A NetworkX graph. G and H must have the same node sets. Returns ------- GH : A new graph with the same type as G. Notes ----- Attributes from the graph, nodes, and edges are not copied to the new graph. If you want a new graph of the intersection of G and H with the attributes (including edge data) from G use remove_nodes_from() as follows >>> G=nx.path_graph(3) >>> H=nx.path_graph(5) >>> R=G.copy() >>> R.remove_nodes_from(n for n in G if n not in H) """ # create new graph R = nx.create_empty_copy(G) R.name = "Intersection of (%s and %s)" % (G.name, H.name) if not G.is_multigraph() == H.is_multigraph(): raise nx.NetworkXError('G and H must both be graphs or multigraphs.') if set(G) != set(H): raise nx.NetworkXError("Node sets of graphs are not equal") if G.number_of_edges() <= H.number_of_edges(): if G.is_multigraph(): edges = G.edges_iter(keys=True) else: edges = G.edges_iter() for e in edges: if H.has_edge(*e): R.add_edge(*e) else: if H.is_multigraph(): edges = H.edges_iter(keys=True) else: edges = H.edges_iter() for e in edges: if G.has_edge(*e): R.add_edge(*e) return R def difference(G, H): """Return a new graph that contains the edges that exist in G but not in H. The node sets of H and G must be the same. Parameters ---------- G,H : graph A NetworkX graph. G and H must have the same node sets. Returns ------- D : A new graph with the same type as G. Notes ----- Attributes from the graph, nodes, and edges are not copied to the new graph. 
If you want a new graph of the difference of G and H with the attributes (including edge data) from G use remove_nodes_from() as follows: >>> G = nx.path_graph(3) >>> H = nx.path_graph(5) >>> R = G.copy() >>> R.remove_nodes_from(n for n in G if n in H) """ # create new graph if not G.is_multigraph() == H.is_multigraph(): raise nx.NetworkXError('G and H must both be graphs or multigraphs.') R = nx.create_empty_copy(G) R.name = "Difference of (%s and %s)" % (G.name, H.name) if set(G) != set(H): raise nx.NetworkXError("Node sets of graphs not equal") if G.is_multigraph(): edges = G.edges_iter(keys=True) else: edges = G.edges_iter() for e in edges: if not H.has_edge(*e): R.add_edge(*e) return R def symmetric_difference(G, H): """Return new graph with edges that exist in either G or H but not both. The node sets of H and G must be the same. Parameters ---------- G,H : graph A NetworkX graph. G and H must have the same node sets. Returns ------- D : A new graph with the same type as G. Notes ----- Attributes from the graph, nodes, and edges are not copied to the new graph. """ # create new graph if not G.is_multigraph() == H.is_multigraph(): raise nx.NetworkXError('G and H must both be graphs or multigraphs.') R = nx.create_empty_copy(G) R.name = "Symmetric difference of (%s and %s)" % (G.name, H.name) if set(G) != set(H): raise nx.NetworkXError("Node sets of graphs not equal") gnodes = set(G) # set of nodes in G hnodes = set(H) # set of nodes in H nodes = gnodes.symmetric_difference(hnodes) R.add_nodes_from(nodes) if G.is_multigraph(): edges = G.edges_iter(keys=True) else: edges = G.edges_iter() # we could copy the data here but then this function doesn't # match intersection and difference for e in edges: if not H.has_edge(*e): R.add_edge(*e) if H.is_multigraph(): edges = H.edges_iter(keys=True) else: edges = H.edges_iter() for e in edges: if not G.has_edge(*e): R.add_edge(*e) return R def compose(G, H, name=None): """Return a new graph of G composed with H.
Composition is the simple union of the node sets and edge sets. The node sets of G and H do not need to be disjoint. Parameters ---------- G,H : graph A NetworkX graph name : string Specify name for new graph Returns ------- C: A new graph with the same type as G Notes ----- It is recommended that G and H be either both directed or both undirected. Attributes from H take precedence over attributes from G. """ if not G.is_multigraph() == H.is_multigraph(): raise nx.NetworkXError('G and H must both be graphs or multigraphs.') if name is None: name = "compose( %s, %s )" % (G.name, H.name) R = G.__class__() R.name = name R.add_nodes_from(H.nodes()) R.add_nodes_from(G.nodes()) if G.is_multigraph(): R.add_edges_from(G.edges_iter(keys=True, data=True)) else: R.add_edges_from(G.edges_iter(data=True)) if H.is_multigraph(): R.add_edges_from(H.edges_iter(keys=True, data=True)) else: R.add_edges_from(H.edges_iter(data=True)) # add node attributes, H attributes take precedence over G attributes R.node.update(G.node) R.node.update(H.node) # add graph attributes, H attributes take precedence over G attributes R.graph.update(G.graph) R.graph.update(H.graph) return R networkx-1.11/networkx/algorithms/shortest_paths/0000755000175000017500000000000012653231454022304 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/shortest_paths/generic.py0000644000175000017500000002665412637544500024300 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Compute the shortest paths and path lengths between nodes in the graph. These algorithms work with undirected and directed graphs. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license.
import networkx as nx __author__ = """\n""".join(['Aric Hagberg ', 'Sérgio Nery Simões ']) __all__ = ['shortest_path', 'all_shortest_paths', 'shortest_path_length', 'average_shortest_path_length', 'has_path'] def has_path(G, source, target): """Return True if G has a path from source to target, False otherwise. Parameters ---------- G : NetworkX graph source : node Starting node for path target : node Ending node for path """ try: sp = nx.shortest_path(G,source, target) except nx.NetworkXNoPath: return False return True def shortest_path(G, source=None, target=None, weight=None): """Compute shortest paths in the graph. Parameters ---------- G : NetworkX graph source : node, optional Starting node for path. If not specified, compute shortest paths using all nodes as source nodes. target : node, optional Ending node for path. If not specified, compute shortest paths using all nodes as target nodes. weight : None or string, optional (default = None) If None, every edge has weight/distance/cost 1. If a string, use this edge attribute as the edge weight. Any edge attribute not present defaults to 1. Returns ------- path: list or dictionary All returned paths include both the source and target in the path. If the source and target are both specified, return a single list of nodes in a shortest path from the source to the target. If only the source is specified, return a dictionary keyed by targets with a list of nodes in a shortest path from the source to one of the targets. If only the target is specified, return a dictionary keyed by sources with a list of nodes in a shortest path from one of the sources to the target. If neither the source nor target are specified return a dictionary of dictionaries with path[source][target]=[list of nodes in path]. 
Examples -------- >>> G=nx.path_graph(5) >>> print(nx.shortest_path(G,source=0,target=4)) [0, 1, 2, 3, 4] >>> p=nx.shortest_path(G,source=0) # target not specified >>> p[4] [0, 1, 2, 3, 4] >>> p=nx.shortest_path(G,target=4) # source not specified >>> p[0] [0, 1, 2, 3, 4] >>> p=nx.shortest_path(G) # source,target not specified >>> p[0][4] [0, 1, 2, 3, 4] Notes ----- There may be more than one shortest path between a source and target. This returns only one of them. See Also -------- all_pairs_shortest_path() all_pairs_dijkstra_path() single_source_shortest_path() single_source_dijkstra_path() """ if source is None: if target is None: ## Find paths between all pairs. if weight is None: paths=nx.all_pairs_shortest_path(G) else: paths=nx.all_pairs_dijkstra_path(G,weight=weight) else: ## Find paths from all nodes co-accessible to the target. with nx.utils.reversed(G): if weight is None: paths=nx.single_source_shortest_path(G, target) else: paths=nx.single_source_dijkstra_path(G, target, weight=weight) # Now flip the paths so they go from a source to the target. for target in paths: paths[target] = list(reversed(paths[target])) else: if target is None: ## Find paths to all nodes accessible from the source. if weight is None: paths=nx.single_source_shortest_path(G,source) else: paths=nx.single_source_dijkstra_path(G,source,weight=weight) else: ## Find shortest source-target path. if weight is None: paths=nx.bidirectional_shortest_path(G,source,target) else: paths=nx.dijkstra_path(G,source,target,weight) return paths def shortest_path_length(G, source=None, target=None, weight=None): """Compute shortest path lengths in the graph. Parameters ---------- G : NetworkX graph source : node, optional Starting node for path. If not specified, compute shortest path lengths using all nodes as source nodes. target : node, optional Ending node for path. If not specified, compute shortest path lengths using all nodes as target nodes. 
weight : None or string, optional (default = None) If None, every edge has weight/distance/cost 1. If a string, use this edge attribute as the edge weight. Any edge attribute not present defaults to 1. Returns ------- length: int or dictionary If the source and target are both specified, return the length of the shortest path from the source to the target. If only the source is specified, return a dictionary keyed by targets whose values are the lengths of the shortest path from the source to one of the targets. If only the target is specified, return a dictionary keyed by sources whose values are the lengths of the shortest path from one of the sources to the target. If neither the source nor target are specified return a dictionary of dictionaries with path[source][target]=L, where L is the length of the shortest path from source to target. Raises ------ NetworkXNoPath If no path exists between source and target. Examples -------- >>> G=nx.path_graph(5) >>> print(nx.shortest_path_length(G,source=0,target=4)) 4 >>> p=nx.shortest_path_length(G,source=0) # target not specified >>> p[4] 4 >>> p=nx.shortest_path_length(G,target=4) # source not specified >>> p[0] 4 >>> p=nx.shortest_path_length(G) # source,target not specified >>> p[0][4] 4 Notes ----- The length of the path is always 1 less than the number of nodes involved in the path since the length measures the number of edges followed. For digraphs this returns the shortest directed path length. To find path lengths in the reverse direction use G.reverse(copy=False) first to flip the edge orientation. See Also -------- all_pairs_shortest_path_length() all_pairs_dijkstra_path_length() single_source_shortest_path_length() single_source_dijkstra_path_length() """ if source is None: if target is None: ## Find paths between all pairs. 
if weight is None: paths=nx.all_pairs_shortest_path_length(G) else: paths=nx.all_pairs_dijkstra_path_length(G, weight=weight) else: ## Find paths from all nodes co-accessible to the target. with nx.utils.reversed(G): if weight is None: paths=nx.single_source_shortest_path_length(G, target) else: paths=nx.single_source_dijkstra_path_length(G, target, weight=weight) else: if target is None: ## Find paths to all nodes accessible from the source. if weight is None: paths=nx.single_source_shortest_path_length(G,source) else: paths=nx.single_source_dijkstra_path_length(G,source,weight=weight) else: ## Find shortest source-target path. if weight is None: p=nx.bidirectional_shortest_path(G,source,target) paths=len(p)-1 else: paths=nx.dijkstra_path_length(G,source,target,weight) return paths def average_shortest_path_length(G, weight=None): r"""Return the average shortest path length. The average shortest path length is .. math:: a =\sum_{s,t \in V} \frac{d(s, t)}{n(n-1)} where `V` is the set of nodes in `G`, `d(s, t)` is the shortest path from `s` to `t`, and `n` is the number of nodes in `G`. Parameters ---------- G : NetworkX graph weight : None or string, optional (default = None) If None, every edge has weight/distance/cost 1. If a string, use this edge attribute as the edge weight. Any edge attribute not present defaults to 1. Raises ------ NetworkXError: if the graph is not connected. Examples -------- >>> G=nx.path_graph(5) >>> print(nx.average_shortest_path_length(G)) 2.0 For disconnected graphs you can compute the average shortest path length for each component: >>> G=nx.Graph([(1,2),(3,4)]) >>> for g in nx.connected_component_subgraphs(G): ... 
print(nx.average_shortest_path_length(g)) 1.0 1.0 """ if G.is_directed(): if not nx.is_weakly_connected(G): raise nx.NetworkXError("Graph is not connected.") else: if not nx.is_connected(G): raise nx.NetworkXError("Graph is not connected.") avg=0.0 if weight is None: for node in G: path_length=nx.single_source_shortest_path_length(G, node) avg += sum(path_length.values()) else: for node in G: path_length=nx.single_source_dijkstra_path_length(G, node, weight=weight) avg += sum(path_length.values()) n=len(G) return avg/(n*(n-1)) def all_shortest_paths(G, source, target, weight=None): """Compute all shortest paths in the graph. Parameters ---------- G : NetworkX graph source : node Starting node for path. target : node Ending node for path. weight : None or string, optional (default = None) If None, every edge has weight/distance/cost 1. If a string, use this edge attribute as the edge weight. Any edge attribute not present defaults to 1. Returns ------- paths: generator of lists A generator of all paths between source and target. Examples -------- >>> G=nx.Graph() >>> G.add_path([0,1,2]) >>> G.add_path([0,10,2]) >>> print([p for p in nx.all_shortest_paths(G,source=0,target=2)]) [[0, 1, 2], [0, 10, 2]] Notes ----- There may be many shortest paths between the source and target. 
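The unweighted branch of average_shortest_path_length() above is just a BFS from every node, with the summed distances divided by n(n-1), the number of ordered node pairs. A self-contained restatement over a plain adjacency dict (it assumes a connected graph, as the library function requires; the helper names are illustrative):

```python
from collections import deque

def bfs_lengths(adj, source):
    """Hop counts from source to every reachable node."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def average_path_length(adj):
    """Sum d(s, t) over all ordered pairs, divided by n*(n-1)."""
    n = len(adj)
    total = sum(sum(bfs_lengths(adj, u).values()) for u in adj)
    return total / (n * (n - 1))

# Path graph 0-1-2-3-4, undirected adjacency.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(average_path_length(path))  # 2.0, matching the docstring example
```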
See Also -------- shortest_path() single_source_shortest_path() all_pairs_shortest_path() """ if weight is not None: pred,dist = nx.dijkstra_predecessor_and_distance(G,source,weight=weight) else: pred = nx.predecessor(G,source) if target not in pred: raise nx.NetworkXNoPath() stack = [[target,0]] top = 0 while top >= 0: node,i = stack[top] if node == source: yield [p for p,n in reversed(stack[:top+1])] if len(pred[node]) > i: top += 1 if top == len(stack): stack.append([pred[node][i],0]) else: stack[top] = [pred[node][i],0] else: stack[top-1][1] += 1 top -= 1 networkx-1.11/networkx/algorithms/shortest_paths/unweighted.py0000644000175000017500000002264112637544500025027 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Shortest path algorithms for unweighted graphs. """ __author__ = """Aric Hagberg (hagberg@lanl.gov)""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['bidirectional_shortest_path', 'single_source_shortest_path', 'single_source_shortest_path_length', 'all_pairs_shortest_path', 'all_pairs_shortest_path_length', 'predecessor'] import networkx as nx def single_source_shortest_path_length(G,source,cutoff=None): """Compute the shortest path lengths from source to all reachable nodes. Parameters ---------- G : NetworkX graph source : node Starting node for path cutoff : integer, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- lengths : dictionary Dictionary of shortest path lengths keyed by target. 
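The stack walk in all_shortest_paths() above enumerates every shortest path by backtracking through a predecessor dict (as produced by nx.predecessor). The same idea reads more directly as a recursive generator; this standalone sketch assumes a BFS predecessor dict, which is acyclic by construction:

```python
def all_paths(pred, source, target):
    """Yield every path from source to target implied by a BFS
    predecessor dict {node: [predecessors]}."""
    if target == source:
        yield [source]
        return
    for p in pred[target]:
        for rest in all_paths(pred, source, p):
            yield rest + [target]

# Predecessors from node 0 in a 4-cycle: node 2 is reached two ways.
pred = {0: [], 1: [0], 3: [0], 2: [1, 3]}
print(sorted(all_paths(pred, 0, 2)))  # [[0, 1, 2], [0, 3, 2]]
```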
Examples -------- >>> G=nx.path_graph(5) >>> length=nx.single_source_shortest_path_length(G,0) >>> length[4] 4 >>> print(length) {0: 0, 1: 1, 2: 2, 3: 3, 4: 4} See Also -------- shortest_path_length """ seen={} # level (number of hops) when seen in BFS level=0 # the current level nextlevel={source:1} # dict of nodes to check at next level while nextlevel: thislevel=nextlevel # advance to next level nextlevel={} # and start a new list (fringe) for v in thislevel: if v not in seen: seen[v]=level # set the level of vertex v nextlevel.update(G[v]) # add neighbors of v if (cutoff is not None and cutoff <= level): break level=level+1 return seen # return all path lengths as dictionary def all_pairs_shortest_path_length(G, cutoff=None): """Computes the shortest path lengths between all nodes in ``G``. Parameters ---------- G : NetworkX graph cutoff : integer, optional Depth at which to stop the search. Only paths of length at most ``cutoff`` are returned. Returns ------- lengths : dictionary Dictionary of shortest path lengths keyed by source and target. Notes ----- The dictionary returned only has keys for reachable node pairs. Examples -------- >>> G = nx.path_graph(5) >>> length = nx.all_pairs_shortest_path_length(G) >>> print(length[1][4]) 3 >>> length[1] {0: 1, 1: 0, 2: 1, 3: 2, 4: 3} """ length = single_source_shortest_path_length # TODO This can be trivially parallelized. return {n: length(G, n, cutoff=cutoff) for n in G} def bidirectional_shortest_path(G,source,target): """Return a list of nodes in a shortest path between source and target. Parameters ---------- G : NetworkX graph source : node label starting node for path target : node label ending node for path Returns ------- path: list List of nodes in a path from source to target. Raises ------ NetworkXNoPath If no path exists between source and target. See Also -------- shortest_path Notes ----- This algorithm is used by shortest_path(G,source,target). 
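The fringe loop in single_source_shortest_path_length() above advances one full BFS level per pass, so a node's recorded level is exactly its hop count from the source, and the cutoff test can sit at the level boundary. Restated for a plain adjacency dict (the names here are illustrative, not NetworkX API):

```python
def bfs_level_lengths(adj, source, cutoff=None):
    """Shortest path lengths (hop counts) from source, expanding the
    BFS frontier one whole level at a time, as the library loop does."""
    seen = {}
    level = 0
    nextlevel = {source}
    while nextlevel:
        thislevel, nextlevel = nextlevel, set()
        for v in thislevel:
            if v not in seen:
                seen[v] = level              # first visit fixes the level
                nextlevel.update(adj.get(v, ()))
        if cutoff is not None and cutoff <= level:
            break
        level += 1
    return seen

path = {0: [1], 1: [2], 2: [3], 3: [4], 4: []}
print(bfs_level_lengths(path, 0))  # {0: 0, 1: 1, 2: 2, 3: 3, 4: 4}
```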
""" # call helper to do the real work results=_bidirectional_pred_succ(G,source,target) pred,succ,w=results # build path from pred+w+succ path=[] # from source to w while w is not None: path.append(w) w=pred[w] path.reverse() # from w to target w=succ[path[-1]] while w is not None: path.append(w) w=succ[w] return path def _bidirectional_pred_succ(G, source, target): """Bidirectional shortest path helper. Returns (pred,succ,w) where pred is a dictionary of predecessors from w to the source, and succ is a dictionary of successors from w to the target. """ # does BFS from both source and target and meets in the middle if target == source: return ({target:None},{source:None},source) # handle either directed or undirected if G.is_directed(): Gpred=G.predecessors_iter Gsucc=G.successors_iter else: Gpred=G.neighbors_iter Gsucc=G.neighbors_iter # predecesssor and successors in search pred={source:None} succ={target:None} # initialize fringes, start with forward forward_fringe=[source] reverse_fringe=[target] while forward_fringe and reverse_fringe: if len(forward_fringe) <= len(reverse_fringe): this_level=forward_fringe forward_fringe=[] for v in this_level: for w in Gsucc(v): if w not in pred: forward_fringe.append(w) pred[w]=v if w in succ: return pred,succ,w # found path else: this_level=reverse_fringe reverse_fringe=[] for v in this_level: for w in Gpred(v): if w not in succ: succ[w]=v reverse_fringe.append(w) if w in pred: return pred,succ,w # found path raise nx.NetworkXNoPath("No path between %s and %s." % (source, target)) def single_source_shortest_path(G,source,cutoff=None): """Compute shortest path between source and all other nodes reachable from source. Parameters ---------- G : NetworkX graph source : node label Starting node for path cutoff : integer, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- lengths : dictionary Dictionary, keyed by target, of shortest paths. 
Examples -------- >>> G=nx.path_graph(5) >>> path=nx.single_source_shortest_path(G,0) >>> path[4] [0, 1, 2, 3, 4] Notes ----- The shortest path is not necessarily unique. So there can be multiple paths between the source and each target node, all of which have the same 'shortest' length. For each target node, this function returns only one of those paths. See Also -------- shortest_path """ level=0 # the current level nextlevel={source:1} # list of nodes to check at next level paths={source:[source]} # paths dictionary (paths to key from source) if cutoff==0: return paths while nextlevel: thislevel=nextlevel nextlevel={} for v in thislevel: for w in G[v]: if w not in paths: paths[w]=paths[v]+[w] nextlevel[w]=1 level=level+1 if (cutoff is not None and cutoff <= level): break return paths def all_pairs_shortest_path(G, cutoff=None): """Compute shortest paths between all nodes. Parameters ---------- G : NetworkX graph cutoff : integer, optional Depth at which to stop the search. Only paths of length at most ``cutoff`` are returned. Returns ------- lengths : dictionary Dictionary, keyed by source and target, of shortest paths. Examples -------- >>> G = nx.path_graph(5) >>> path = nx.all_pairs_shortest_path(G) >>> print(path[0][4]) [0, 1, 2, 3, 4] See Also -------- floyd_warshall() """ # TODO This can be trivially parallelized. return {n: single_source_shortest_path(G, n, cutoff=cutoff) for n in G} def predecessor(G,source,target=None,cutoff=None,return_seen=None): """ Returns dictionary of predecessors for the path from source to all nodes in G. Parameters ---------- G : NetworkX graph source : node label Starting node for path target : node label, optional Ending node for path. If provided only predecessors between source and target are returned cutoff : integer, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- pred : dictionary Dictionary, keyed by node, of predecessors in the shortest path. 
    Examples
    --------
    >>> G=nx.path_graph(4)
    >>> print(G.nodes())
    [0, 1, 2, 3]
    >>> nx.predecessor(G,0)
    {0: [], 1: [0], 2: [1], 3: [2]}
    """
    level = 0                  # the current level
    nextlevel = [source]       # list of nodes to check at next level
    seen = {source: level}     # level (number of hops) when seen in BFS
    pred = {source: []}        # predecessor dictionary
    while nextlevel:
        level = level + 1
        thislevel = nextlevel
        nextlevel = []
        for v in thislevel:
            for w in G[v]:
                if w not in seen:
                    pred[w] = [v]
                    seen[w] = level
                    nextlevel.append(w)
                elif (seen[w] == level):
                    # add v to the predecessor list if it is at the
                    # correct level
                    pred[w].append(v)
        if (cutoff and cutoff <= level):
            break

    if target is not None:
        if return_seen:
            if target not in pred:
                return ([], -1)  # No predecessor
            return (pred[target], seen[target])
        else:
            if target not in pred:
                return []  # No predecessor
            return pred[target]
    else:
        if return_seen:
            return (pred, seen)
        else:
            return pred

networkx-1.11/networkx/algorithms/shortest_paths/__init__.py

from networkx.algorithms.shortest_paths.generic import *
from networkx.algorithms.shortest_paths.unweighted import *
from networkx.algorithms.shortest_paths.weighted import *
from networkx.algorithms.shortest_paths.astar import *
from networkx.algorithms.shortest_paths.dense import *

networkx-1.11/networkx/algorithms/shortest_paths/weighted.py

# -*- coding: utf-8 -*-
"""
Shortest path algorithms for weighted graphs.
"""
__author__ = """\n""".join(['Aric Hagberg ',
                            'Loïc Séguin-C. ',
                            'Dan Schult '])
# Copyright (C) 2004-2015 by
#     Aric Hagberg
#     Dan Schult
#     Pieter Swart
# All rights reserved.
# BSD license.
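The key detail in predecessor() above is the `elif seen[w] == level` branch: a node reached at the same BFS level from several neighbors collects all of them, which is what later lets all_shortest_paths() enumerate every shortest path. A standalone restatement over a plain adjacency dict (names are illustrative, not NetworkX API):

```python
def bfs_predecessors(adj, source):
    """BFS predecessor dict {node: [predecessors]}; a node keeps every
    neighbor that reaches it at its first (shortest) level."""
    level = 0
    nextlevel = [source]
    seen = {source: 0}
    pred = {source: []}
    while nextlevel:
        level += 1
        thislevel, nextlevel = nextlevel, []
        for v in thislevel:
            for w in adj[v]:
                if w not in seen:
                    pred[w] = [v]
                    seen[w] = level
                    nextlevel.append(w)
                elif seen[w] == level:   # another shortest route to w
                    pred[w].append(v)
    return pred

# A 4-cycle: node 2 sits at distance 2 from node 0 via both 1 and 3.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(bfs_predecessors(cycle, 0))  # {0: [], 1: [0], 3: [0], 2: [1, 3]}
```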
__all__ = ['dijkstra_path', 'dijkstra_path_length', 'bidirectional_dijkstra', 'single_source_dijkstra', 'single_source_dijkstra_path', 'single_source_dijkstra_path_length', 'all_pairs_dijkstra_path', 'all_pairs_dijkstra_path_length', 'dijkstra_predecessor_and_distance', 'bellman_ford', 'negative_edge_cycle', 'goldberg_radzik', 'johnson'] from collections import deque from heapq import heappush, heappop from itertools import count import networkx as nx from networkx.utils import generate_unique_node def dijkstra_path(G, source, target, weight='weight'): """Returns the shortest path from source to target in a weighted graph G. Parameters ---------- G : NetworkX graph source : node Starting node target : node Ending node weight: string, optional (default='weight') Edge data key corresponding to the edge weight Returns ------- path : list List of nodes in a shortest path. Raises ------ NetworkXNoPath If no path exists between source and target. Examples -------- >>> G=nx.path_graph(5) >>> print(nx.dijkstra_path(G,0,4)) [0, 1, 2, 3, 4] Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. See Also -------- bidirectional_dijkstra() """ (length, path) = single_source_dijkstra(G, source, target=target, weight=weight) try: return path[target] except KeyError: raise nx.NetworkXNoPath( "node %s not reachable from %s" % (source, target)) def dijkstra_path_length(G, source, target, weight='weight'): """Returns the shortest path length from source to target in a weighted graph. Parameters ---------- G : NetworkX graph source : node label starting node for path target : node label ending node for path weight: string, optional (default='weight') Edge data key corresponding to the edge weight Returns ------- length : number Shortest path length. Raises ------ NetworkXNoPath If no path exists between source and target. 
Examples -------- >>> G=nx.path_graph(5) >>> print(nx.dijkstra_path_length(G,0,4)) 4 Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. See Also -------- bidirectional_dijkstra() """ length = single_source_dijkstra_path_length(G, source, weight=weight) try: return length[target] except KeyError: raise nx.NetworkXNoPath( "node %s not reachable from %s" % (source, target)) def single_source_dijkstra_path(G, source, cutoff=None, weight='weight'): """Compute shortest path between source and all other reachable nodes for a weighted graph. Parameters ---------- G : NetworkX graph source : node Starting node for path. weight: string, optional (default='weight') Edge data key corresponding to the edge weight cutoff : integer or float, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- paths : dictionary Dictionary of shortest path lengths keyed by target. Examples -------- >>> G=nx.path_graph(5) >>> path=nx.single_source_dijkstra_path(G,0) >>> path[4] [0, 1, 2, 3, 4] Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. See Also -------- single_source_dijkstra() """ (length, path) = single_source_dijkstra( G, source, cutoff=cutoff, weight=weight) return path def single_source_dijkstra_path_length(G, source, cutoff=None, weight='weight'): """Compute the shortest path length between source and all other reachable nodes for a weighted graph. Parameters ---------- G : NetworkX graph source : node label Starting node for path weight: string, optional (default='weight') Edge data key corresponding to the edge weight. cutoff : integer or float, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- length : dictionary Dictionary of shortest lengths keyed by target. 
Examples -------- >>> G=nx.path_graph(5) >>> length=nx.single_source_dijkstra_path_length(G,0) >>> length[4] 4 >>> print(length) {0: 0, 1: 1, 2: 2, 3: 3, 4: 4} Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. See Also -------- single_source_dijkstra() """ if G.is_multigraph(): get_weight = lambda u, v, data: min( eattr.get(weight, 1) for eattr in data.values()) else: get_weight = lambda u, v, data: data.get(weight, 1) return _dijkstra(G, source, get_weight, cutoff=cutoff) def single_source_dijkstra(G, source, target=None, cutoff=None, weight='weight'): """Compute shortest paths and lengths in a weighted graph G. Uses Dijkstra's algorithm for shortest paths. Parameters ---------- G : NetworkX graph source : node label Starting node for path target : node label, optional Ending node for path cutoff : integer or float, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- distance,path : dictionaries Returns a tuple of two dictionaries keyed by node. The first dictionary stores distance from the source. The second stores the path from the source to that node. Examples -------- >>> G=nx.path_graph(5) >>> length,path=nx.single_source_dijkstra(G,0) >>> print(length[4]) 4 >>> print(length) {0: 0, 1: 1, 2: 2, 3: 3, 4: 4} >>> path[4] [0, 1, 2, 3, 4] Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. Based on the Python cookbook recipe (119466) at http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/119466 This algorithm is not guaranteed to work if edge weights are negative or are floating point numbers (overflows and roundoff errors can cause problems). 
See Also -------- single_source_dijkstra_path() single_source_dijkstra_path_length() """ if source == target: return ({source: 0}, {source: [source]}) if G.is_multigraph(): get_weight = lambda u, v, data: min( eattr.get(weight, 1) for eattr in data.values()) else: get_weight = lambda u, v, data: data.get(weight, 1) paths = {source: [source]} # dictionary of paths return _dijkstra(G, source, get_weight, paths=paths, cutoff=cutoff, target=target) def _dijkstra(G, source, get_weight, pred=None, paths=None, cutoff=None, target=None): """Implementation of Dijkstra's algorithm Parameters ---------- G : NetworkX graph source : node label Starting node for path get_weight: function Function for getting edge weight pred: list, optional(default=None) List of predecessors of a node paths: dict, optional (default=None) Path from the source to a target node. target : node label, optional Ending node for path cutoff : integer or float, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- distance,path : dictionaries Returns a tuple of two dictionaries keyed by node. The first dictionary stores distance from the source. The second stores the path from the source to that node. pred,distance : dictionaries Returns two dictionaries representing a list of predecessors of a node and the distance to each node. distance : dictionary Dictionary of shortest lengths keyed by target. """ G_succ = G.succ if G.is_directed() else G.adj push = heappush pop = heappop dist = {} # dictionary of final distances seen = {source: 0} c = count() fringe = [] # use heapq with (distance,label) tuples push(fringe, (0, next(c), source)) while fringe: (d, _, v) = pop(fringe) if v in dist: continue # already searched this node. 
            dist[v] = d
            if v == target:
                break
            for u, e in G_succ[v].items():
                cost = get_weight(v, u, e)
                if cost is None:
                    continue
                vu_dist = dist[v] + cost
                if cutoff is not None:
                    if vu_dist > cutoff:
                        continue
                if u in dist:
                    if vu_dist < dist[u]:
                        raise ValueError('Contradictory paths found:',
                                         'negative weights?')
                elif u not in seen or vu_dist < seen[u]:
                    seen[u] = vu_dist
                    push(fringe, (vu_dist, next(c), u))
                    if paths is not None:
                        paths[u] = paths[v] + [u]
                    if pred is not None:
                        pred[u] = [v]
                elif vu_dist == seen[u]:
                    if pred is not None:
                        pred[u].append(v)

        if paths is not None:
            return (dist, paths)
        if pred is not None:
            return (pred, dist)
        return dist


def dijkstra_predecessor_and_distance(G, source, cutoff=None, weight='weight'):
    """Compute shortest path length and predecessors on shortest paths
    in weighted graphs.

    Parameters
    ----------
    G : NetworkX graph

    source : node label
       Starting node for path

    weight: string, optional (default='weight')
       Edge data key corresponding to the edge weight

    cutoff : integer or float, optional
       Depth to stop the search. Only paths of length <= cutoff are returned.

    Returns
    -------
    pred, distance : dictionaries
       Returns two dictionaries representing a list of predecessors of a
       node and the distance to each node.

    Notes
    -----
    Edge weight attributes must be numerical.
    Distances are calculated as sums of weighted edges traversed.

    The list of predecessors contains more than one element only when
    there is more than one shortest path to the key node.
    """
    if G.is_multigraph():
        get_weight = lambda u, v, data: min(
            eattr.get(weight, 1) for eattr in data.values())
    else:
        get_weight = lambda u, v, data: data.get(weight, 1)

    pred = {source: []}  # dictionary of predecessors
    return _dijkstra(G, source, get_weight, pred=pred, cutoff=cutoff)


def all_pairs_dijkstra_path_length(G, cutoff=None, weight='weight'):
    """Compute shortest path lengths between all nodes in a weighted graph.
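The core of the _dijkstra() loop above fits in a few lines once the weight lookup and the paths/pred bookkeeping are stripped away. This standalone sketch works on a plain `{u: {v: weight}}` adjacency dict; note the `itertools.count()` tiebreaker, which keeps heap entries comparable even when nodes themselves are not orderable:

```python
from heapq import heappush, heappop
from itertools import count

def dijkstra_lengths(adj, source):
    """Shortest weighted distances from source, lazy-deletion style:
    stale heap entries are skipped once a node is finalized."""
    dist = {}            # final distances
    seen = {source: 0}   # best tentative distance seen so far
    c = count()          # tiebreaker for non-comparable nodes
    fringe = [(0, next(c), source)]
    while fringe:
        d, _, v = heappop(fringe)
        if v in dist:
            continue     # already finalized with a shorter distance
        dist[v] = d
        for u, w in adj.get(v, {}).items():
            vu_dist = d + w
            if u not in seen or vu_dist < seen[u]:
                seen[u] = vu_dist
                heappush(fringe, (vu_dist, next(c), u))
    return dist

adj = {'a': {'b': 1, 'c': 4}, 'b': {'c': 2}, 'c': {}}
print(dijkstra_lengths(adj, 'a'))  # {'a': 0, 'b': 1, 'c': 3}
```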
Parameters ---------- G : NetworkX graph weight: string, optional (default='weight') Edge data key corresponding to the edge weight cutoff : integer or float, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- distance : dictionary Dictionary, keyed by source and target, of shortest path lengths. Examples -------- >>> G=nx.path_graph(5) >>> length=nx.all_pairs_dijkstra_path_length(G) >>> print(length[1][4]) 3 >>> length[1] {0: 1, 1: 0, 2: 1, 3: 2, 4: 3} Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. The dictionary returned only has keys for reachable node pairs. """ length = single_source_dijkstra_path_length # TODO This can be trivially parallelized. return {n: length(G, n, cutoff=cutoff, weight=weight) for n in G} def all_pairs_dijkstra_path(G, cutoff=None, weight='weight'): """ Compute shortest paths between all nodes in a weighted graph. Parameters ---------- G : NetworkX graph weight: string, optional (default='weight') Edge data key corresponding to the edge weight cutoff : integer or float, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- distance : dictionary Dictionary, keyed by source and target, of shortest paths. Examples -------- >>> G=nx.path_graph(5) >>> path=nx.all_pairs_dijkstra_path(G) >>> print(path[0][4]) [0, 1, 2, 3, 4] Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. See Also -------- floyd_warshall() """ path = single_source_dijkstra_path # TODO This can be trivially parallelized. return {n: path(G, n, cutoff=cutoff, weight=weight) for n in G} def bellman_ford(G, source, weight='weight'): """Compute shortest path lengths and predecessors on shortest paths in weighted graphs. The algorithm has a running time of O(mn) where n is the number of nodes and m is the number of edges. 
It is slower than Dijkstra but can handle negative edge weights. Parameters ---------- G : NetworkX graph The algorithm works for all types of graphs, including directed graphs and multigraphs. source: node label Starting node for path weight: string, optional (default='weight') Edge data key corresponding to the edge weight Returns ------- pred, dist : dictionaries Returns two dictionaries keyed by node to predecessor in the path and to the distance from the source respectively. Raises ------ NetworkXUnbounded If the (di)graph contains a negative cost (di)cycle, the algorithm raises an exception to indicate the presence of the negative cost (di)cycle. Note: any negative weight edge in an undirected graph is a negative cost cycle. Examples -------- >>> import networkx as nx >>> G = nx.path_graph(5, create_using = nx.DiGraph()) >>> pred, dist = nx.bellman_ford(G, 0) >>> sorted(pred.items()) [(0, None), (1, 0), (2, 1), (3, 2), (4, 3)] >>> sorted(dist.items()) [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)] >>> from nose.tools import assert_raises >>> G = nx.cycle_graph(5, create_using = nx.DiGraph()) >>> G[1][2]['weight'] = -7 >>> assert_raises(nx.NetworkXUnbounded, nx.bellman_ford, G, 0) Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. The dictionaries returned only have keys for nodes reachable from the source. In the case where the (di)graph is not connected, if a component not containing the source contains a negative cost (di)cycle, it will not be detected. 
""" if source not in G: raise KeyError("Node %s is not found in the graph" % source) for u, v, attr in G.selfloop_edges(data=True): if attr.get(weight, 1) < 0: raise nx.NetworkXUnbounded("Negative cost cycle detected.") dist = {source: 0} pred = {source: None} if len(G) == 1: return pred, dist return _bellman_ford_relaxation(G, pred, dist, [source], weight) def _bellman_ford_relaxation(G, pred, dist, source, weight): """Relaxation loop for Bellman–Ford algorithm Parameters ---------- G : NetworkX graph pred: dict Keyed by node to predecessor in the path dist: dict Keyed by node to the distance from the source source: list List of source nodes weight: string Edge data key corresponding to the edge weight Returns ------- Returns two dictionaries keyed by node to predecessor in the path and to the distance from the source respectively. Raises ------ NetworkXUnbounded If the (di)graph contains a negative cost (di)cycle, the algorithm raises an exception to indicate the presence of the negative cost (di)cycle. Note: any negative weight edge in an undirected graph is a negative cost cycle """ if G.is_multigraph(): def get_weight(edge_dict): return min(eattr.get(weight, 1) for eattr in edge_dict.values()) else: def get_weight(edge_dict): return edge_dict.get(weight, 1) G_succ = G.succ if G.is_directed() else G.adj inf = float('inf') n = len(G) count = {} q = deque(source) in_q = set(source) while q: u = q.popleft() in_q.remove(u) # Skip relaxations if the predecessor of u is in the queue. if pred[u] not in in_q: dist_u = dist[u] for v, e in G_succ[u].items(): dist_v = dist_u + get_weight(e) if dist_v < dist.get(v, inf): if v not in in_q: q.append(v) in_q.add(v) count_v = count.get(v, 0) + 1 if count_v == n: raise nx.NetworkXUnbounded( "Negative cost cycle detected.") count[v] = count_v dist[v] = dist_v pred[v] = u return pred, dist def goldberg_radzik(G, source, weight='weight'): """Compute shortest path lengths and predecessors on shortest paths in weighted graphs. 
The algorithm has a running time of O(mn) where n is the number of nodes and m is the number of edges. It is slower than Dijkstra but can handle negative edge weights. Parameters ---------- G : NetworkX graph The algorithm works for all types of graphs, including directed graphs and multigraphs. source: node label Starting node for path weight: string, optional (default='weight') Edge data key corresponding to the edge weight Returns ------- pred, dist : dictionaries Returns two dictionaries keyed by node to predecessor in the path and to the distance from the source respectively. Raises ------ NetworkXUnbounded If the (di)graph contains a negative cost (di)cycle, the algorithm raises an exception to indicate the presence of the negative cost (di)cycle. Note: any negative weight edge in an undirected graph is a negative cost cycle. Examples -------- >>> import networkx as nx >>> G = nx.path_graph(5, create_using = nx.DiGraph()) >>> pred, dist = nx.goldberg_radzik(G, 0) >>> sorted(pred.items()) [(0, None), (1, 0), (2, 1), (3, 2), (4, 3)] >>> sorted(dist.items()) [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)] >>> from nose.tools import assert_raises >>> G = nx.cycle_graph(5, create_using = nx.DiGraph()) >>> G[1][2]['weight'] = -7 >>> assert_raises(nx.NetworkXUnbounded, nx.goldberg_radzik, G, 0) Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. The dictionaries returned only have keys for nodes reachable from the source. In the case where the (di)graph is not connected, if a component not containing the source contains a negative cost (di)cycle, it will not be detected. 
""" if source not in G: raise KeyError("Node %s is not found in the graph" % source) for u, v, attr in G.selfloop_edges(data=True): if attr.get(weight, 1) < 0: raise nx.NetworkXUnbounded("Negative cost cycle detected.") if len(G) == 1: return {source: None}, {source: 0} if G.is_multigraph(): def get_weight(edge_dict): return min(attr.get(weight, 1) for attr in edge_dict.values()) else: def get_weight(edge_dict): return edge_dict.get(weight, 1) if G.is_directed(): G_succ = G.succ else: G_succ = G.adj inf = float('inf') d = dict((u, inf) for u in G) d[source] = 0 pred = {source: None} def topo_sort(relabeled): """Topologically sort nodes relabeled in the previous round and detect negative cycles. """ # List of nodes to scan in this round. Denoted by A in Goldberg and # Radzik's paper. to_scan = [] # In the DFS in the loop below, neg_count records for each node the # number of edges of negative reduced costs on the path from a DFS root # to the node in the DFS forest. The reduced cost of an edge (u, v) is # defined as d[u] + weight[u][v] - d[v]. # # neg_count also doubles as the DFS visit marker array. neg_count = {} for u in relabeled: # Skip visited nodes. if u in neg_count: continue d_u = d[u] # Skip nodes without out-edges of negative reduced costs. if all(d_u + get_weight(e) >= d[v] for v, e in G_succ[u].items()): continue # Nonrecursive DFS that inserts nodes reachable from u via edges of # nonpositive reduced costs into to_scan in (reverse) topological # order. 
            stack = [(u, iter(G_succ[u].items()))]
            in_stack = set([u])
            neg_count[u] = 0
            while stack:
                u, it = stack[-1]
                try:
                    v, e = next(it)
                except StopIteration:
                    to_scan.append(u)
                    stack.pop()
                    in_stack.remove(u)
                    continue
                t = d[u] + get_weight(e)
                d_v = d[v]
                if t <= d_v:
                    is_neg = t < d_v
                    d[v] = t
                    pred[v] = u
                    if v not in neg_count:
                        neg_count[v] = neg_count[u] + int(is_neg)
                        stack.append((v, iter(G_succ[v].items())))
                        in_stack.add(v)
                    elif (v in in_stack and
                          neg_count[u] + int(is_neg) > neg_count[v]):
                        # (u, v) is a back edge, and the cycle formed by
                        # the path v to u and (u, v) contains at least
                        # one edge of negative reduced cost.  The cycle
                        # must be of negative cost.
                        raise nx.NetworkXUnbounded(
                            'Negative cost cycle detected.')
        to_scan.reverse()
        return to_scan

    def relax(to_scan):
        """Relax out-edges of relabeled nodes.
        """
        relabeled = set()
        # Scan nodes in to_scan in topological order and relax incident
        # out-edges.  Add the relabeled nodes to the relabeled set.
        for u in to_scan:
            d_u = d[u]
            for v, e in G_succ[u].items():
                w_e = get_weight(e)
                if d_u + w_e < d[v]:
                    d[v] = d_u + w_e
                    pred[v] = u
                    relabeled.add(v)
        return relabeled

    # Set of nodes relabeled in the last round of scan operations.
    # Denoted by B in Goldberg and Radzik's paper.
    relabeled = set([source])

    while relabeled:
        to_scan = topo_sort(relabeled)
        relabeled = relax(to_scan)

    d = dict((u, d[u]) for u in pred)

    return pred, d


def negative_edge_cycle(G, weight='weight'):
    """Return True if there exists a negative edge cycle anywhere in G.

    Parameters
    ----------
    G : NetworkX graph

    weight: string, optional (default='weight')
        Edge data key corresponding to the edge weight

    Returns
    -------
    negative_cycle : bool
        True if a negative edge cycle exists, otherwise False.

    Examples
    --------
    >>> import networkx as nx
    >>> G = nx.cycle_graph(5, create_using=nx.DiGraph())
    >>> print(nx.negative_edge_cycle(G))
    False
    >>> G[1][2]['weight'] = -7
    >>> print(nx.negative_edge_cycle(G))
    True

    Notes
    -----
    Edge weight attributes must be numerical.
    Distances are calculated as sums of weighted edges traversed.

    This algorithm uses bellman_ford() but finds negative cycles on any
    component by first adding a new node connected to every node, and
    starting bellman_ford on that node.  It then removes that extra
    node.
    """
    newnode = generate_unique_node()
    G.add_edges_from([(newnode, n) for n in G])

    try:
        bellman_ford(G, newnode, weight)
    except nx.NetworkXUnbounded:
        return True
    finally:
        G.remove_node(newnode)
    return False


def bidirectional_dijkstra(G, source, target, weight='weight'):
    """Dijkstra's algorithm for shortest paths using bidirectional search.

    Parameters
    ----------
    G : NetworkX graph

    source : node
        Starting node.

    target : node
        Ending node.

    weight: string, optional (default='weight')
        Edge data key corresponding to the edge weight

    Returns
    -------
    length, path : number and list
        length is the shortest path length, and path is a list of the
        nodes on the shortest path from source to target.

    Raises
    ------
    NetworkXNoPath
        If no path exists between source and target.

    Examples
    --------
    >>> G = nx.path_graph(5)
    >>> length, path = nx.bidirectional_dijkstra(G, 0, 4)
    >>> print(length)
    4
    >>> print(path)
    [0, 1, 2, 3, 4]

    Notes
    -----
    Edge weight attributes must be numerical.
    Distances are calculated as sums of weighted edges traversed.

    In practice bidirectional Dijkstra is much more than twice as fast
    as ordinary Dijkstra.

    Ordinary Dijkstra expands nodes in a sphere-like manner from the
    source.  The radius of this sphere will eventually be the length of
    the shortest path.  Bidirectional Dijkstra will expand nodes from
    both the source and the target, making two spheres of half this
    radius.  The volume of the first sphere is pi*r*r while the other
    two are 2*pi*(r/2)*(r/2), making up half the volume.

    This algorithm is not guaranteed to work if edge weights are
    negative or are floating point numbers (overflows and roundoff
    errors can cause problems).
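    The half-radius argument above can be checked with a quick
    back-of-envelope count on a 2-D grid (an illustrative stand-in, with
    Manhattan balls playing the role of the spheres):

    ```python
    # Count lattice points (x, y) with |x| + |y| <= r: the number of
    # nodes a breadth-first expansion to radius r would touch on a grid.
    def ball_size(r):
        return sum(2 * (r - abs(x)) + 1 for x in range(-r, r + 1))

    one_big = ball_size(20)        # single search out to the full radius
    two_small = 2 * ball_size(10)  # forward and backward half-radius searches
    ```

    Here the two half-radius searches together visit roughly half as many
    nodes as the single full-radius search, which is where the speedup
    comes from.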

    See Also
    --------
    shortest_path
    shortest_path_length
    """
    if source == target:
        return (0, [source])
    push = heappush
    pop = heappop
    # Init:  Forward               Backward
    dists = [{}, {}]   # dictionary of final distances
    paths = [{source: [source]}, {target: [target]}]  # dictionary of paths
    fringe = [[], []]  # heap of (distance, node) tuples for
                       # extracting next node to expand
    seen = [{source: 0}, {target: 0}]  # dictionary of distances to
                                       # nodes seen
    c = count()
    # initialize fringe heap
    push(fringe[0], (0, next(c), source))
    push(fringe[1], (0, next(c), target))
    # neighs for extracting correct neighbor information
    if G.is_directed():
        neighs = [G.successors_iter, G.predecessors_iter]
    else:
        neighs = [G.neighbors_iter, G.neighbors_iter]
    # variables to hold shortest discovered path
    # finaldist = 1e30000
    finalpath = []
    dir = 1
    while fringe[0] and fringe[1]:
        # choose direction
        # dir == 0 is forward direction and dir == 1 is back
        dir = 1 - dir
        # extract closest to expand
        (dist, _, v) = pop(fringe[dir])
        if v in dists[dir]:
            # Shortest path to v has already been found
            continue
        # update distance
        dists[dir][v] = dist  # equal to seen[dir][v]
        if v in dists[1 - dir]:
            # if we have scanned v in both directions we are done
            # we have now discovered the shortest path
            return (finaldist, finalpath)
        for w in neighs[dir](v):
            if dir == 0:  # forward
                if G.is_multigraph():
                    minweight = min((dd.get(weight, 1)
                                     for k, dd in G[v][w].items()))
                else:
                    minweight = G[v][w].get(weight, 1)
                vwLength = dists[dir][v] + minweight
            else:  # back, must remember to change v,w->w,v
                if G.is_multigraph():
                    minweight = min((dd.get(weight, 1)
                                     for k, dd in G[w][v].items()))
                else:
                    minweight = G[w][v].get(weight, 1)
                vwLength = dists[dir][v] + minweight
            if w in dists[dir]:
                if vwLength < dists[dir][w]:
                    raise ValueError(
                        "Contradictory paths found: negative weights?")
            elif w not in seen[dir] or vwLength < seen[dir][w]:
                # relaxing
                seen[dir][w] = vwLength
                push(fringe[dir], (vwLength, next(c), w))
                paths[dir][w] = paths[dir][v] + [w]
                if w in seen[0] and w in seen[1]:
                    # see if this path is better than the already
                    # discovered shortest path
                    totaldist = seen[0][w] + seen[1][w]
                    if finalpath == [] or finaldist > totaldist:
                        finaldist = totaldist
                        revpath = paths[1][w][:]
                        revpath.reverse()
                        finalpath = paths[0][w] + revpath[1:]
    raise nx.NetworkXNoPath("No path between %s and %s." % (source, target))


def johnson(G, weight='weight'):
    """Compute shortest paths between all nodes in a weighted graph
    using Johnson's algorithm.

    Parameters
    ----------
    G : NetworkX graph

    weight: string, optional (default='weight')
        Edge data key corresponding to the edge weight.

    Returns
    -------
    distance : dictionary
        Dictionary, keyed by source and target, of shortest paths.

    Raises
    ------
    NetworkXError
        If given graph is not weighted.

    Examples
    --------
    >>> import networkx as nx
    >>> graph = nx.DiGraph()
    >>> graph.add_weighted_edges_from([('0', '3', 3), ('0', '1', -5),
    ...                                ('0', '2', 2), ('1', '2', 4),
    ...                                ('2', '3', 1)])
    >>> paths = nx.johnson(graph, weight='weight')
    >>> paths['0']['2']
    ['0', '1', '2']

    Notes
    -----
    Johnson's algorithm is suitable even for graphs with negative
    weights.  It works by using the Bellman–Ford algorithm to compute a
    transformation of the input graph that removes all negative
    weights, allowing Dijkstra's algorithm to be used on the
    transformed graph.

    It may be faster than the Floyd–Warshall algorithm in sparse
    graphs.
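    The reweighting step described above can be sketched in a few lines
    (a standalone toy example using the same edge list as the doctest;
    `h` plays the role of the Bellman–Ford potentials, and the virtual
    source connected to every node by a zero-weight edge is implicit in
    starting every `h` at 0):

    ```python
    # Johnson's reweighting: w'(u, v) = w(u, v) + h(u) - h(v) is
    # nonnegative once h holds shortest distances from a virtual source.
    edges = {('0', '3'): 3, ('0', '1'): -5, ('0', '2'): 2,
             ('1', '2'): 4, ('2', '3'): 1}
    nodes = {'0', '1', '2', '3'}

    # Bellman-Ford from the virtual source: h starts at 0 everywhere.
    h = {v: 0 for v in nodes}
    for _ in range(len(nodes)):
        for (u, v), w in edges.items():
            if h[u] + w < h[v]:
                h[v] = h[u] + w

    reweighted = {(u, v): w + h[u] - h[v] for (u, v), w in edges.items()}
    assert all(w >= 0 for w in reweighted.values())
    ```

    Because every s-t path changes in length by the same constant
    h(s) - h(t), Dijkstra run on the reweighted graph returns shortest
    paths of the original graph.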

    Algorithm complexity: O(V^2 * logV + V * E)

    See Also
    --------
    floyd_warshall_predecessor_and_distance
    floyd_warshall_numpy
    all_pairs_shortest_path
    all_pairs_shortest_path_length
    all_pairs_dijkstra_path
    bellman_ford
    """
    if not nx.is_weighted(G, weight=weight):
        raise nx.NetworkXError('Graph is not weighted.')

    dist = {v: 0 for v in G}
    pred = {v: None for v in G}

    # Calculate distance of shortest paths
    dist_bellman = _bellman_ford_relaxation(G, pred, dist, G.nodes(),
                                            weight)[1]

    if G.is_multigraph():
        get_weight = lambda u, v, data: (
            min(eattr.get(weight, 1) for eattr in data.values()) +
            dist_bellman[u] - dist_bellman[v])
    else:
        get_weight = lambda u, v, data: (data.get(weight, 1) +
                                         dist_bellman[u] - dist_bellman[v])

    all_pairs = {v: _dijkstra(G, v, get_weight, paths={v: [v]})[1]
                 for v in G}
    return all_pairs
networkx-1.11/networkx/algorithms/shortest_paths/tests/0000755000175000017500000000000012653231454023446 5ustar aricaric00000000000000
networkx-1.11/networkx/algorithms/shortest_paths/tests/test_generic.py0000644000175000017500000001433612637544500026503 0ustar aricaric00000000000000
#!/usr/bin/env python
from nose.tools import *
import networkx as nx


def validate_grid_path(r, c, s, t, p):
    ok_(isinstance(p, list))
    assert_equal(p[0], s)
    assert_equal(p[-1], t)
    s = ((s - 1) // c, (s - 1) % c)
    t = ((t - 1) // c, (t - 1) % c)
    assert_equal(len(p), abs(t[0] - s[0]) + abs(t[1] - s[1]) + 1)
    p = [((u - 1) // c, (u - 1) % c) for u in p]
    for u in p:
        ok_(0 <= u[0] < r)
        ok_(0 <= u[1] < c)
    for u, v in zip(p[:-1], p[1:]):
        ok_((abs(v[0] - u[0]), abs(v[1] - u[1])) in [(0, 1), (1, 0)])


class TestGenericPath:

    def setUp(self):
        from networkx import convert_node_labels_to_integers as cnlti
        self.grid = cnlti(nx.grid_2d_graph(4, 4), first_label=1,
                          ordering="sorted")
        self.cycle = nx.cycle_graph(7)
        self.directed_cycle = nx.cycle_graph(7, create_using=nx.DiGraph())

    def test_shortest_path(self):
        assert_equal(nx.shortest_path(self.cycle, 0, 3), [0, 1, 2, 3])
        assert_equal(nx.shortest_path(self.cycle, 0, 4), [0, 6, 5,
4]) validate_grid_path(4, 4, 1, 12, nx.shortest_path(self.grid,1,12)) assert_equal(nx.shortest_path(self.directed_cycle,0,3),[0, 1, 2, 3]) # now with weights assert_equal(nx.shortest_path(self.cycle,0,3,weight='weight'),[0, 1, 2, 3]) assert_equal(nx.shortest_path(self.cycle,0,4,weight='weight'),[0, 6, 5, 4]) validate_grid_path(4, 4, 1, 12, nx.shortest_path(self.grid,1,12,weight='weight')) assert_equal(nx.shortest_path(self.directed_cycle,0,3,weight='weight'), [0, 1, 2, 3]) def test_shortest_path_target(self): sp = nx.shortest_path(nx.path_graph(3), target=1) assert_equal(sp, {0: [0, 1], 1: [1], 2: [2, 1]}) def test_shortest_path_length(self): assert_equal(nx.shortest_path_length(self.cycle,0,3),3) assert_equal(nx.shortest_path_length(self.grid,1,12),5) assert_equal(nx.shortest_path_length(self.directed_cycle,0,4),4) # now with weights assert_equal(nx.shortest_path_length(self.cycle,0,3,weight='weight'),3) assert_equal(nx.shortest_path_length(self.grid,1,12,weight='weight'),5) assert_equal(nx.shortest_path_length(self.directed_cycle,0,4,weight='weight'),4) def test_shortest_path_length_target(self): sp = nx.shortest_path_length(nx.path_graph(3), target=1) assert_equal(sp[0], 1) assert_equal(sp[1], 0) assert_equal(sp[2], 1) def test_single_source_shortest_path(self): p=nx.shortest_path(self.cycle,0) assert_equal(p[3],[0,1,2,3]) assert_equal(p,nx.single_source_shortest_path(self.cycle,0)) p=nx.shortest_path(self.grid,1) validate_grid_path(4, 4, 1, 12, p[12]) # now with weights p=nx.shortest_path(self.cycle,0,weight='weight') assert_equal(p[3],[0,1,2,3]) assert_equal(p,nx.single_source_dijkstra_path(self.cycle,0)) p=nx.shortest_path(self.grid,1,weight='weight') validate_grid_path(4, 4, 1, 12, p[12]) def test_single_source_shortest_path_length(self): l=nx.shortest_path_length(self.cycle,0) assert_equal(l,{0:0,1:1,2:2,3:3,4:3,5:2,6:1}) assert_equal(l,nx.single_source_shortest_path_length(self.cycle,0)) l=nx.shortest_path_length(self.grid,1) assert_equal(l[16],6) # now 
with weights l=nx.shortest_path_length(self.cycle,0,weight='weight') assert_equal(l,{0:0,1:1,2:2,3:3,4:3,5:2,6:1}) assert_equal(l,nx.single_source_dijkstra_path_length(self.cycle,0)) l=nx.shortest_path_length(self.grid,1,weight='weight') assert_equal(l[16],6) def test_all_pairs_shortest_path(self): p=nx.shortest_path(self.cycle) assert_equal(p[0][3],[0,1,2,3]) assert_equal(p,nx.all_pairs_shortest_path(self.cycle)) p=nx.shortest_path(self.grid) validate_grid_path(4, 4, 1, 12, p[1][12]) # now with weights p=nx.shortest_path(self.cycle,weight='weight') assert_equal(p[0][3],[0,1,2,3]) assert_equal(p,nx.all_pairs_dijkstra_path(self.cycle)) p=nx.shortest_path(self.grid,weight='weight') validate_grid_path(4, 4, 1, 12, p[1][12]) def test_all_pairs_shortest_path_length(self): l=nx.shortest_path_length(self.cycle) assert_equal(l[0],{0:0,1:1,2:2,3:3,4:3,5:2,6:1}) assert_equal(l,nx.all_pairs_shortest_path_length(self.cycle)) l=nx.shortest_path_length(self.grid) assert_equal(l[1][16],6) # now with weights l=nx.shortest_path_length(self.cycle,weight='weight') assert_equal(l[0],{0:0,1:1,2:2,3:3,4:3,5:2,6:1}) assert_equal(l,nx.all_pairs_dijkstra_path_length(self.cycle)) l=nx.shortest_path_length(self.grid,weight='weight') assert_equal(l[1][16],6) def test_average_shortest_path(self): l=nx.average_shortest_path_length(self.cycle) assert_almost_equal(l,2) l=nx.average_shortest_path_length(nx.path_graph(5)) assert_almost_equal(l,2) def test_weighted_average_shortest_path(self): G=nx.Graph() G.add_cycle(range(7),weight=2) l=nx.average_shortest_path_length(G,weight='weight') assert_almost_equal(l,4) G=nx.Graph() G.add_path(range(5),weight=2) l=nx.average_shortest_path_length(G,weight='weight') assert_almost_equal(l,4) def test_average_shortest_disconnected(self): g = nx.Graph() g.add_nodes_from(range(3)) g.add_edge(0, 1) assert_raises(nx.NetworkXError,nx.average_shortest_path_length,g) g = g.to_directed() assert_raises(nx.NetworkXError,nx.average_shortest_path_length,g) def 
test_has_path(self): G = nx.Graph() G.add_path(range(3)) G.add_path(range(3,5)) assert_true(nx.has_path(G,0,2)) assert_false(nx.has_path(G,0,4)) def test_all_shortest_paths(self): G = nx.Graph() G.add_path([0,1,2,3]) G.add_path([0,10,20,3]) assert_equal([[0,1,2,3],[0,10,20,3]], sorted(nx.all_shortest_paths(G,0,3))) @raises(nx.NetworkXNoPath) def test_all_shortest_paths_raise(self): G = nx.Graph() G.add_path([0,1,2,3]) G.add_node(4) paths = list(nx.all_shortest_paths(G,0,4)) networkx-1.11/networkx/algorithms/shortest_paths/tests/test_dense.py0000644000175000017500000001123112637544450026160 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx as nx class TestFloyd: def setUp(self): pass def test_floyd_warshall_predecessor_and_distance(self): XG=nx.DiGraph() XG.add_weighted_edges_from([('s','u',10) ,('s','x',5) , ('u','v',1) ,('u','x',2) , ('v','y',1) ,('x','u',3) , ('x','v',5) ,('x','y',2) , ('y','s',7) ,('y','v',6)]) path, dist =nx.floyd_warshall_predecessor_and_distance(XG) assert_equal(dist['s']['v'],9) assert_equal(path['s']['v'],'u') assert_equal(dist, {'y': {'y': 0, 'x': 12, 's': 7, 'u': 15, 'v': 6}, 'x': {'y': 2, 'x': 0, 's': 9, 'u': 3, 'v': 4}, 's': {'y': 7, 'x': 5, 's': 0, 'u': 8, 'v': 9}, 'u': {'y': 2, 'x': 2, 's': 9, 'u': 0, 'v': 1}, 'v': {'y': 1, 'x': 13, 's': 8, 'u': 16, 'v': 0}}) GG=XG.to_undirected() # make sure we get lower weight # to_undirected might choose either edge with weight 2 or weight 3 GG['u']['x']['weight']=2 path, dist = nx.floyd_warshall_predecessor_and_distance(GG) assert_equal(dist['s']['v'],8) # skip this test, could be alternate path s-u-v # assert_equal(path['s']['v'],'y') G=nx.DiGraph() # no weights G.add_edges_from([('s','u'), ('s','x'), ('u','v'), ('u','x'), ('v','y'), ('x','u'), ('x','v'), ('x','y'), ('y','s'), ('y','v')]) path, dist = nx.floyd_warshall_predecessor_and_distance(G) assert_equal(dist['s']['v'],2) # skip this test, could be alternate path s-u-v # 
assert_equal(path['s']['v'],'x') # alternate interface dist = nx.floyd_warshall(G) assert_equal(dist['s']['v'],2) def test_cycle(self): path, dist = nx.floyd_warshall_predecessor_and_distance(nx.cycle_graph(7)) assert_equal(dist[0][3],3) assert_equal(path[0][3],2) assert_equal(dist[0][4],3) def test_weighted(self): XG3=nx.Graph() XG3.add_weighted_edges_from([ [0,1,2],[1,2,12],[2,3,1], [3,4,5],[4,5,1],[5,0,10] ]) path, dist = nx.floyd_warshall_predecessor_and_distance(XG3) assert_equal(dist[0][3],15) assert_equal(path[0][3],2) def test_weighted2(self): XG4=nx.Graph() XG4.add_weighted_edges_from([ [0,1,2],[1,2,2],[2,3,1], [3,4,1],[4,5,1],[5,6,1], [6,7,1],[7,0,1] ]) path, dist = nx.floyd_warshall_predecessor_and_distance(XG4) assert_equal(dist[0][2],4) assert_equal(path[0][2],1) def test_weight_parameter(self): XG4 = nx.Graph() XG4.add_edges_from([ (0, 1, {'heavy': 2}), (1, 2, {'heavy': 2}), (2, 3, {'heavy': 1}), (3, 4, {'heavy': 1}), (4, 5, {'heavy': 1}), (5, 6, {'heavy': 1}), (6, 7, {'heavy': 1}), (7, 0, {'heavy': 1}) ]) path, dist = nx.floyd_warshall_predecessor_and_distance(XG4, weight='heavy') assert_equal(dist[0][2], 4) assert_equal(path[0][2], 1) def test_zero_distance(self): XG=nx.DiGraph() XG.add_weighted_edges_from([('s','u',10) ,('s','x',5) , ('u','v',1) ,('u','x',2) , ('v','y',1) ,('x','u',3) , ('x','v',5) ,('x','y',2) , ('y','s',7) ,('y','v',6)]) path, dist =nx.floyd_warshall_predecessor_and_distance(XG) for u in XG: assert_equal(dist[u][u], 0) GG=XG.to_undirected() # make sure we get lower weight # to_undirected might choose either edge with weight 2 or weight 3 GG['u']['x']['weight']=2 path, dist = nx.floyd_warshall_predecessor_and_distance(GG) for u in GG: dist[u][u] = 0 def test_zero_weight(self): G = nx.DiGraph() edges = [(1,2,-2), (2,3,-4), (1,5,1), (5,4,0), (4,3,-5), (2,5,-7)] G.add_weighted_edges_from(edges) dist = nx.floyd_warshall(G) assert_equal(dist[1][3], -14) G = nx.MultiDiGraph() edges.append( (2,5,-7) ) G.add_weighted_edges_from(edges) 
dist = nx.floyd_warshall(G) assert_equal(dist[1][3], -14) networkx-1.11/networkx/algorithms/shortest_paths/tests/test_weighted.py0000644000175000017500000003601712637544500026667 0ustar aricaric00000000000000from nose.tools import * import networkx as nx def _setUp(self): cnlti = nx.convert_node_labels_to_integers self.grid = cnlti(nx.grid_2d_graph(4, 4), first_label=1, ordering="sorted") self.cycle = nx.cycle_graph(7) self.directed_cycle = nx.cycle_graph(7, create_using=nx.DiGraph()) self.XG = nx.DiGraph() self.XG.add_weighted_edges_from([('s', 'u', 10), ('s', 'x', 5), ('u', 'v', 1), ('u', 'x', 2), ('v', 'y', 1), ('x', 'u', 3), ('x', 'v', 5), ('x', 'y', 2), ('y', 's', 7), ('y', 'v', 6)]) self.MXG = nx.MultiDiGraph(self.XG) self.MXG.add_edge('s', 'u', weight=15) self.XG2 = nx.DiGraph() self.XG2.add_weighted_edges_from([[1, 4, 1], [4, 5, 1], [5, 6, 1], [6, 3, 1], [1, 3, 50], [1, 2, 100], [2, 3, 100]]) self.XG3 = nx.Graph() self.XG3.add_weighted_edges_from([[0, 1, 2], [1, 2, 12], [2, 3, 1], [3, 4, 5], [4, 5, 1], [5, 0, 10]]) self.XG4 = nx.Graph() self.XG4.add_weighted_edges_from([[0, 1, 2], [1, 2, 2], [2, 3, 1], [3, 4, 1], [4, 5, 1], [5, 6, 1], [6, 7, 1], [7, 0, 1]]) self.MXG4 = nx.MultiGraph(self.XG4) self.MXG4.add_edge(0, 1, weight=3) self.G = nx.DiGraph() # no weights self.G.add_edges_from([('s', 'u'), ('s', 'x'), ('u', 'v'), ('u', 'x'), ('v', 'y'), ('x', 'u'), ('x', 'v'), ('x', 'y'), ('y', 's'), ('y', 'v')]) def validate_path(G, s, t, soln_len, path): assert_equal(path[0], s) assert_equal(path[-1], t) if not G.is_multigraph(): assert_equal( soln_len, sum(G[u][v].get('weight', 1) for u, v in zip(path[:-1], path[1:]))) else: assert_equal( soln_len, sum(min(e.get('weight', 1) for e in G[u][v].values()) for u, v in zip(path[:-1], path[1:]))) def validate_length_path(G, s, t, soln_len, length, path): assert_equal(soln_len, length) validate_path(G, s, t, length, path) class TestWeightedPath: setUp = _setUp def test_dijkstra(self): (D, P) = 
nx.single_source_dijkstra(self.XG, 's') validate_path(self.XG, 's', 'v', 9, P['v']) assert_equal(D['v'], 9) validate_path( self.XG, 's', 'v', 9, nx.single_source_dijkstra_path(self.XG, 's')['v']) assert_equal( nx.single_source_dijkstra_path_length(self.XG, 's')['v'], 9) validate_path( self.XG, 's', 'v', 9, nx.single_source_dijkstra(self.XG, 's')[1]['v']) validate_path( self.MXG, 's', 'v', 9, nx.single_source_dijkstra_path(self.MXG, 's')['v']) GG = self.XG.to_undirected() # make sure we get lower weight # to_undirected might choose either edge with weight 2 or weight 3 GG['u']['x']['weight'] = 2 (D, P) = nx.single_source_dijkstra(GG, 's') validate_path(GG, 's', 'v', 8, P['v']) assert_equal(D['v'], 8) # uses lower weight of 2 on u<->x edge validate_path(GG, 's', 'v', 8, nx.dijkstra_path(GG, 's', 'v')) assert_equal(nx.dijkstra_path_length(GG, 's', 'v'), 8) validate_path(self.XG2, 1, 3, 4, nx.dijkstra_path(self.XG2, 1, 3)) validate_path(self.XG3, 0, 3, 15, nx.dijkstra_path(self.XG3, 0, 3)) assert_equal(nx.dijkstra_path_length(self.XG3, 0, 3), 15) validate_path(self.XG4, 0, 2, 4, nx.dijkstra_path(self.XG4, 0, 2)) assert_equal(nx.dijkstra_path_length(self.XG4, 0, 2), 4) validate_path(self.MXG4, 0, 2, 4, nx.dijkstra_path(self.MXG4, 0, 2)) validate_path( self.G, 's', 'v', 2, nx.single_source_dijkstra(self.G, 's', 'v')[1]['v']) validate_path( self.G, 's', 'v', 2, nx.single_source_dijkstra(self.G, 's')[1]['v']) validate_path(self.G, 's', 'v', 2, nx.dijkstra_path(self.G, 's', 'v')) assert_equal(nx.dijkstra_path_length(self.G, 's', 'v'), 2) # NetworkXError: node s not reachable from moon assert_raises(nx.NetworkXNoPath, nx.dijkstra_path, self.G, 's', 'moon') assert_raises( nx.NetworkXNoPath, nx.dijkstra_path_length, self.G, 's', 'moon') validate_path(self.cycle, 0, 3, 3, nx.dijkstra_path(self.cycle, 0, 3)) validate_path(self.cycle, 0, 4, 3, nx.dijkstra_path(self.cycle, 0, 4)) assert_equal( nx.single_source_dijkstra(self.cycle, 0, 0), ({0: 0}, {0: [0]})) def 
test_bidirectional_dijkstra(self): validate_length_path( self.XG, 's', 'v', 9, *nx.bidirectional_dijkstra(self.XG, 's', 'v')) validate_length_path( self.G, 's', 'v', 2, *nx.bidirectional_dijkstra(self.G, 's', 'v')) validate_length_path( self.cycle, 0, 3, 3, *nx.bidirectional_dijkstra(self.cycle, 0, 3)) validate_length_path( self.cycle, 0, 4, 3, *nx.bidirectional_dijkstra(self.cycle, 0, 4)) validate_length_path( self.XG3, 0, 3, 15, *nx.bidirectional_dijkstra(self.XG3, 0, 3)) validate_length_path( self.XG4, 0, 2, 4, *nx.bidirectional_dijkstra(self.XG4, 0, 2)) # need more tests here P = nx.single_source_dijkstra_path(self.XG, 's')['v'] validate_path(self.XG, 's', 'v', sum(self.XG[u][v]['weight'] for u, v in zip( P[:-1], P[1:])), nx.dijkstra_path(self.XG, 's', 'v')) @raises(nx.NetworkXNoPath) def test_bidirectional_dijkstra_no_path(self): G = nx.Graph() G.add_path([1, 2, 3]) G.add_path([4, 5, 6]) path = nx.bidirectional_dijkstra(G, 1, 6) def test_dijkstra_predecessor(self): G = nx.path_graph(4) assert_equal(nx.dijkstra_predecessor_and_distance(G, 0), ({0: [], 1: [0], 2: [1], 3: [2]}, {0: 0, 1: 1, 2: 2, 3: 3})) G = nx.grid_2d_graph(2, 2) pred, dist = nx.dijkstra_predecessor_and_distance(G, (0, 0)) assert_equal(sorted(pred.items()), [((0, 0), []), ((0, 1), [(0, 0)]), ((1, 0), [(0, 0)]), ((1, 1), [(0, 1), (1, 0)])]) assert_equal(sorted(dist.items()), [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 2)]) XG = nx.DiGraph() XG.add_weighted_edges_from([('s', 'u', 10), ('s', 'x', 5), ('u', 'v', 1), ('u', 'x', 2), ('v', 'y', 1), ('x', 'u', 3), ('x', 'v', 5), ('x', 'y', 2), ('y', 's', 7), ('y', 'v', 6)]) (P, D) = nx.dijkstra_predecessor_and_distance(XG, 's') assert_equal(P['v'], ['u']) assert_equal(D['v'], 9) (P, D) = nx.dijkstra_predecessor_and_distance(XG, 's', cutoff=8) assert_false('v' in D) def test_single_source_dijkstra_path_length(self): pl = nx.single_source_dijkstra_path_length assert_equal(pl(self.MXG4, 0)[2], 4) spl = pl(self.MXG4, 0, cutoff=2) assert_false(2 in spl) 
def test_bidirectional_dijkstra_multigraph(self): G = nx.MultiGraph() G.add_edge('a', 'b', weight=10) G.add_edge('a', 'b', weight=100) dp = nx.bidirectional_dijkstra(G, 'a', 'b') assert_equal(dp, (10, ['a', 'b'])) def test_dijkstra_pred_distance_multigraph(self): G = nx.MultiGraph() G.add_edge('a', 'b', key='short', foo=5, weight=100) G.add_edge('a', 'b', key='long', bar=1, weight=110) p, d = nx.dijkstra_predecessor_and_distance(G, 'a') assert_equal(p, {'a': [], 'b': ['a']}) assert_equal(d, {'a': 0, 'b': 100}) def test_negative_edge_cycle(self): G = nx.cycle_graph(5, create_using=nx.DiGraph()) assert_equal(nx.negative_edge_cycle(G), False) G.add_edge(8, 9, weight=-7) G.add_edge(9, 8, weight=3) graph_size = len(G) assert_equal(nx.negative_edge_cycle(G), True) assert_equal(graph_size, len(G)) assert_raises(ValueError, nx.single_source_dijkstra_path_length, G, 8) assert_raises(ValueError, nx.single_source_dijkstra, G, 8) assert_raises(ValueError, nx.dijkstra_predecessor_and_distance, G, 8) G.add_edge(9, 10) assert_raises(ValueError, nx.bidirectional_dijkstra, G, 8, 10) class TestBellmanFordAndGoldbergRadizk: setUp = _setUp def test_single_node_graph(self): G = nx.DiGraph() G.add_node(0) assert_equal(nx.bellman_ford(G, 0), ({0: None}, {0: 0})) assert_equal(nx.goldberg_radzik(G, 0), ({0: None}, {0: 0})) assert_raises(KeyError, nx.bellman_ford, G, 1) assert_raises(KeyError, nx.goldberg_radzik, G, 1) def test_negative_weight_cycle(self): G = nx.cycle_graph(5, create_using=nx.DiGraph()) G.add_edge(1, 2, weight=-7) for i in range(5): assert_raises(nx.NetworkXUnbounded, nx.bellman_ford, G, i) assert_raises(nx.NetworkXUnbounded, nx.goldberg_radzik, G, i) G = nx.cycle_graph(5) # undirected Graph G.add_edge(1, 2, weight=-3) for i in range(5): assert_raises(nx.NetworkXUnbounded, nx.bellman_ford, G, i) assert_raises(nx.NetworkXUnbounded, nx.goldberg_radzik, G, i) G = nx.DiGraph([(1, 1, {'weight': -1})]) assert_raises(nx.NetworkXUnbounded, nx.bellman_ford, G, 1) 
assert_raises(nx.NetworkXUnbounded, nx.goldberg_radzik, G, 1) # no negative cycle but negative weight G = nx.cycle_graph(5, create_using=nx.DiGraph()) G.add_edge(1, 2, weight=-3) assert_equal(nx.bellman_ford(G, 0), ({0: None, 1: 0, 2: 1, 3: 2, 4: 3}, {0: 0, 1: 1, 2: -2, 3: -1, 4: 0})) assert_equal(nx.goldberg_radzik(G, 0), ({0: None, 1: 0, 2: 1, 3: 2, 4: 3}, {0: 0, 1: 1, 2: -2, 3: -1, 4: 0})) def test_not_connected(self): G = nx.complete_graph(6) G.add_edge(10, 11) G.add_edge(10, 12) assert_equal(nx.bellman_ford(G, 0), ({0: None, 1: 0, 2: 0, 3: 0, 4: 0, 5: 0}, {0: 0, 1: 1, 2: 1, 3: 1, 4: 1, 5: 1})) assert_equal(nx.goldberg_radzik(G, 0), ({0: None, 1: 0, 2: 0, 3: 0, 4: 0, 5: 0}, {0: 0, 1: 1, 2: 1, 3: 1, 4: 1, 5: 1})) # not connected, with a component not containing the source that # contains a negative cost cycle. G = nx.complete_graph(6) G.add_edges_from([('A', 'B', {'load': 3}), ('B', 'C', {'load': -10}), ('C', 'A', {'load': 2})]) assert_equal(nx.bellman_ford(G, 0, weight='load'), ({0: None, 1: 0, 2: 0, 3: 0, 4: 0, 5: 0}, {0: 0, 1: 1, 2: 1, 3: 1, 4: 1, 5: 1})) assert_equal(nx.goldberg_radzik(G, 0, weight='load'), ({0: None, 1: 0, 2: 0, 3: 0, 4: 0, 5: 0}, {0: 0, 1: 1, 2: 1, 3: 1, 4: 1, 5: 1})) def test_multigraph(self): P, D = nx.bellman_ford(self.MXG, 's') assert_equal(P['v'], 'u') assert_equal(D['v'], 9) P, D = nx.goldberg_radzik(self.MXG, 's') assert_equal(P['v'], 'u') assert_equal(D['v'], 9) P, D = nx.bellman_ford(self.MXG4, 0) assert_equal(P[2], 1) assert_equal(D[2], 4) P, D = nx.goldberg_radzik(self.MXG4, 0) assert_equal(P[2], 1) assert_equal(D[2], 4) def test_others(self): (P, D) = nx.bellman_ford(self.XG, 's') assert_equal(P['v'], 'u') assert_equal(D['v'], 9) (P, D) = nx.goldberg_radzik(self.XG, 's') assert_equal(P['v'], 'u') assert_equal(D['v'], 9) G = nx.path_graph(4) assert_equal(nx.bellman_ford(G, 0), ({0: None, 1: 0, 2: 1, 3: 2}, {0: 0, 1: 1, 2: 2, 3: 3})) assert_equal(nx.goldberg_radzik(G, 0), ({0: None, 1: 0, 2: 1, 3: 2}, {0: 0, 1: 1, 2: 2, 3: 3})) 
assert_equal(nx.bellman_ford(G, 3), ({0: 1, 1: 2, 2: 3, 3: None}, {0: 3, 1: 2, 2: 1, 3: 0})) assert_equal(nx.goldberg_radzik(G, 3), ({0: 1, 1: 2, 2: 3, 3: None}, {0: 3, 1: 2, 2: 1, 3: 0})) G = nx.grid_2d_graph(2, 2) pred, dist = nx.bellman_ford(G, (0, 0)) assert_equal(sorted(pred.items()), [((0, 0), None), ((0, 1), (0, 0)), ((1, 0), (0, 0)), ((1, 1), (0, 1))]) assert_equal(sorted(dist.items()), [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 2)]) pred, dist = nx.goldberg_radzik(G, (0, 0)) assert_equal(sorted(pred.items()), [((0, 0), None), ((0, 1), (0, 0)), ((1, 0), (0, 0)), ((1, 1), (0, 1))]) assert_equal(sorted(dist.items()), [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 2)]) class TestJohnsonAlgorithm: setUp = _setUp @raises(nx.NetworkXError) def test_single_node_graph(self): G = nx.DiGraph() G.add_node(0) nx.johnson(G) def test_negative_cycle(self): G = nx.DiGraph() G.add_weighted_edges_from([('0', '3', 3), ('0', '1', -5), ('1', '0', -5), ('0', '2', 2), ('1', '2', 4), ('2', '3', 1)]) assert_raises(nx.NetworkXUnbounded, nx.johnson, G) G = nx.Graph() G.add_weighted_edges_from([('0', '3', 3), ('0', '1', -5), ('1', '0', -5), ('0', '2', 2), ('1', '2', 4), ('2', '3', 1)]) assert_raises(nx.NetworkXUnbounded, nx.johnson, G) def test_negative_weights(self): G = nx.DiGraph() G.add_weighted_edges_from([('0', '3', 3), ('0', '1', -5), ('0', '2', 2), ('1', '2', 4), ('2', '3', 1)]) paths = nx.johnson(G) assert_equal(paths, {'1': {'1': ['1'], '3': ['1', '2', '3'], '2': ['1', '2']}, '0': {'1': ['0', '1'], '0': ['0'], '3': ['0', '1', '2', '3'], '2': ['0', '1', '2']}, '3': {'3': ['3']}, '2': {'3': ['2', '3'], '2': ['2']}}) @raises(nx.NetworkXError) def test_unweighted_graph(self): G = nx.path_graph(5) nx.johnson(G) def test_graphs(self): validate_path(self.XG, 's', 'v', 9, nx.johnson(self.XG)['s']['v']) validate_path(self.MXG, 's', 'v', 9, nx.johnson(self.MXG)['s']['v']) validate_path(self.XG2, 1, 3, 4, nx.johnson(self.XG2)[1][3]) validate_path(self.XG3, 0, 3, 15, 
nx.johnson(self.XG3)[0][3]) validate_path(self.XG4, 0, 2, 4, nx.johnson(self.XG4)[0][2]) validate_path(self.MXG4, 0, 2, 4, nx.johnson(self.MXG4)[0][2]) networkx-1.11/networkx/algorithms/shortest_paths/tests/test_unweighted.py0000644000175000017500000000715212637544500027230 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx def validate_grid_path(r, c, s, t, p): ok_(isinstance(p, list)) assert_equal(p[0], s) assert_equal(p[-1], t) s = ((s - 1) // c, (s - 1) % c) t = ((t - 1) // c, (t - 1) % c) assert_equal(len(p), abs(t[0] - s[0]) + abs(t[1] - s[1]) + 1) p = [((u - 1) // c, (u - 1) % c) for u in p] for u in p: ok_(0 <= u[0] < r) ok_(0 <= u[1] < c) for u, v in zip(p[:-1], p[1:]): ok_((abs(v[0] - u[0]), abs(v[1] - u[1])) in [(0, 1), (1, 0)]) class TestUnweightedPath: def setUp(self): from networkx import convert_node_labels_to_integers as cnlti self.grid=cnlti(nx.grid_2d_graph(4,4),first_label=1,ordering="sorted") self.cycle=nx.cycle_graph(7) self.directed_cycle=nx.cycle_graph(7,create_using=nx.DiGraph()) def test_bidirectional_shortest_path(self): assert_equal(nx.bidirectional_shortest_path(self.cycle,0,3), [0, 1, 2, 3]) assert_equal(nx.bidirectional_shortest_path(self.cycle,0,4), [0, 6, 5, 4]) validate_grid_path(4, 4, 1, 12, nx.bidirectional_shortest_path(self.grid,1,12)) assert_equal(nx.bidirectional_shortest_path(self.directed_cycle,0,3), [0, 1, 2, 3]) def test_shortest_path_length(self): assert_equal(nx.shortest_path_length(self.cycle,0,3),3) assert_equal(nx.shortest_path_length(self.grid,1,12),5) assert_equal(nx.shortest_path_length(self.directed_cycle,0,4),4) # now with weights assert_equal(nx.shortest_path_length(self.cycle,0,3,weight=True),3) assert_equal(nx.shortest_path_length(self.grid,1,12,weight=True),5) assert_equal(nx.shortest_path_length(self.directed_cycle,0,4,weight=True),4) def test_single_source_shortest_path(self): p=nx.single_source_shortest_path(self.cycle,0) assert_equal(p[3],[0,1,2,3]) 
p=nx.single_source_shortest_path(self.cycle,0, cutoff=0) assert_equal(p,{0 : [0]}) def test_single_source_shortest_path_length(self): assert_equal(nx.single_source_shortest_path_length(self.cycle,0), {0:0,1:1,2:2,3:3,4:3,5:2,6:1}) def test_all_pairs_shortest_path(self): p=nx.all_pairs_shortest_path(self.cycle) assert_equal(p[0][3],[0,1,2,3]) p=nx.all_pairs_shortest_path(self.grid) validate_grid_path(4, 4, 1, 12, p[1][12]) def test_all_pairs_shortest_path_length(self): l=nx.all_pairs_shortest_path_length(self.cycle) assert_equal(l[0],{0:0,1:1,2:2,3:3,4:3,5:2,6:1}) l=nx.all_pairs_shortest_path_length(self.grid) assert_equal(l[1][16],6) def test_predecessor(self): G=nx.path_graph(4) assert_equal(nx.predecessor(G,0),{0: [], 1: [0], 2: [1], 3: [2]}) assert_equal(nx.predecessor(G,0,3),[2]) G=nx.grid_2d_graph(2,2) assert_equal(sorted(nx.predecessor(G,(0,0)).items()), [((0, 0), []), ((0, 1), [(0, 0)]), ((1, 0), [(0, 0)]), ((1, 1), [(0, 1), (1, 0)])]) def test_predecessor_cutoff(self): G=nx.path_graph(4) p = nx.predecessor(G,0,3) assert_false(4 in p) def test_predecessor_target(self): G=nx.path_graph(4) p = nx.predecessor(G,0,3) assert_equal(p,[2]) p = nx.predecessor(G,0,3,cutoff=2) assert_equal(p,[]) p,s = nx.predecessor(G,0,3,return_seen=True) assert_equal(p,[2]) assert_equal(s,3) p,s = nx.predecessor(G,0,3,cutoff=2,return_seen=True) assert_equal(p,[]) assert_equal(s,-1) networkx-1.11/networkx/algorithms/shortest_paths/tests/test_astar.py0000644000175000017500000001147712637544450026210 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx from random import random, choice class TestAStar: def setUp(self): self.XG=nx.DiGraph() self.XG.add_edges_from([('s','u',{'weight':10}), ('s','x',{'weight':5}), ('u','v',{'weight':1}), ('u','x',{'weight':2}), ('v','y',{'weight':1}), ('x','u',{'weight':3}), ('x','v',{'weight':5}), ('x','y',{'weight':2}), ('y','s',{'weight':7}), ('y','v',{'weight':6})]) def test_random_graph(self): def dist(a, 
b): (x1, y1) = a (x2, y2) = b return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 G = nx.Graph() points = [(random(), random()) for _ in range(100)] # Build a path from points[0] to points[-1] to be sure it exists for p1, p2 in zip(points[:-1], points[1:]): G.add_edge(p1, p2, weight=dist(p1, p2)) # Add other random edges for _ in range(100): p1, p2 = choice(points), choice(points) G.add_edge(p1, p2, weight=dist(p1, p2)) path = nx.astar_path(G, points[0], points[-1], dist) assert path == nx.dijkstra_path(G, points[0], points[-1]) def test_astar_directed(self): assert nx.astar_path(self.XG,'s','v')==['s', 'x', 'u', 'v'] assert nx.astar_path_length(self.XG,'s','v')==9 def test_astar_multigraph(self): G=nx.MultiDiGraph(self.XG) assert_raises((TypeError,nx.NetworkXError), nx.astar_path, [G,'s','v']) assert_raises((TypeError,nx.NetworkXError), nx.astar_path_length, [G,'s','v']) def test_astar_undirected(self): GG=self.XG.to_undirected() # make sure we get lower weight # to_undirected might choose either edge with weight 2 or weight 3 GG['u']['x']['weight']=2 GG['y']['v']['weight'] = 2 assert_equal(nx.astar_path(GG,'s','v'),['s', 'x', 'u', 'v']) assert_equal(nx.astar_path_length(GG,'s','v'),8) def test_astar_directed2(self): XG2=nx.DiGraph() XG2.add_edges_from([[1,4,{'weight':1}], [4,5,{'weight':1}], [5,6,{'weight':1}], [6,3,{'weight':1}], [1,3,{'weight':50}], [1,2,{'weight':100}], [2,3,{'weight':100}]]) assert nx.astar_path(XG2,1,3)==[1, 4, 5, 6, 3] def test_astar_undirected2(self): XG3=nx.Graph() XG3.add_edges_from([ [0,1,{'weight':2}], [1,2,{'weight':12}], [2,3,{'weight':1}], [3,4,{'weight':5}], [4,5,{'weight':1}], [5,0,{'weight':10}] ]) assert nx.astar_path(XG3,0,3)==[0, 1, 2, 3] assert nx.astar_path_length(XG3,0,3)==15 def test_astar_undirected3(self): XG4=nx.Graph() XG4.add_edges_from([ [0,1,{'weight':2}], [1,2,{'weight':2}], [2,3,{'weight':1}], [3,4,{'weight':1}], [4,5,{'weight':1}], [5,6,{'weight':1}], [6,7,{'weight':1}], [7,0,{'weight':1}] ]) assert 
nx.astar_path(XG4,0,2)==[0, 1, 2] assert nx.astar_path_length(XG4,0,2)==4 # >>> MXG4=NX.MultiGraph(XG4) # >>> MXG4.add_edge(0,1,3) # >>> NX.dijkstra_path(MXG4,0,2) # [0, 1, 2] def test_astar_w1(self): G=nx.DiGraph() G.add_edges_from([('s','u'), ('s','x'), ('u','v'), ('u','x'), ('v','y'), ('x','u'), ('x','w'), ('w', 'v'), ('x','y'), ('y','s'), ('y','v')]) assert nx.astar_path(G,'s','v')==['s', 'u', 'v'] assert nx.astar_path_length(G,'s','v')== 2 @raises(nx.NetworkXNoPath) def test_astar_nopath(self): p = nx.astar_path(self.XG,'s','moon') def test_cycle(self): C=nx.cycle_graph(7) assert nx.astar_path(C,0,3)==[0, 1, 2, 3] assert nx.dijkstra_path(C,0,4)==[0, 6, 5, 4] def test_orderable(self): class UnorderableClass: pass node_1 = UnorderableClass() node_2 = UnorderableClass() node_3 = UnorderableClass() node_4 = UnorderableClass() G = nx.Graph() G.add_edge(node_1, node_2) G.add_edge(node_1, node_3) G.add_edge(node_2, node_4) G.add_edge(node_3, node_4) path=nx.algorithms.shortest_paths.astar.astar_path(G, node_1, node_4) networkx-1.11/networkx/algorithms/shortest_paths/tests/test_dense_numpy.py0000644000175000017500000000461612637544450027421 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx as nx class TestFloydNumpy(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global numpy global assert_equal global assert_almost_equal try: import numpy from numpy.testing import assert_equal,assert_almost_equal except ImportError: raise SkipTest('NumPy not available.') def test_cycle_numpy(self): dist = nx.floyd_warshall_numpy(nx.cycle_graph(7)) assert_equal(dist[0,3],3) assert_equal(dist[0,4],3) def test_weighted_numpy(self): XG3=nx.Graph() XG3.add_weighted_edges_from([ [0,1,2],[1,2,12],[2,3,1], [3,4,5],[4,5,1],[5,0,10] ]) dist = nx.floyd_warshall_numpy(XG3) assert_equal(dist[0,3],15) def test_weighted_numpy(self): XG4=nx.Graph() 
XG4.add_weighted_edges_from([ [0,1,2],[1,2,2],[2,3,1], [3,4,1],[4,5,1],[5,6,1], [6,7,1],[7,0,1] ]) dist = nx.floyd_warshall_numpy(XG4) assert_equal(dist[0,2],4) def test_weight_parameter_numpy(self): XG4 = nx.Graph() XG4.add_edges_from([ (0, 1, {'heavy': 2}), (1, 2, {'heavy': 2}), (2, 3, {'heavy': 1}), (3, 4, {'heavy': 1}), (4, 5, {'heavy': 1}), (5, 6, {'heavy': 1}), (6, 7, {'heavy': 1}), (7, 0, {'heavy': 1}) ]) dist = nx.floyd_warshall_numpy(XG4, weight='heavy') assert_equal(dist[0, 2], 4) def test_directed_cycle_numpy(self): G = nx.DiGraph() G.add_cycle([0,1,2,3]) pred,dist = nx.floyd_warshall_predecessor_and_distance(G) D = nx.utils.dict_to_numpy_array(dist) assert_equal(nx.floyd_warshall_numpy(G),D) def test_zero_weight(self): G = nx.DiGraph() edges = [(1,2,-2), (2,3,-4), (1,5,1), (5,4,0), (4,3,-5), (2,5,-7)] G.add_weighted_edges_from(edges) dist = nx.floyd_warshall_numpy(G) assert_equal(int(numpy.min(dist)), -14) G = nx.MultiDiGraph() edges.append( (2,5,-7) ) G.add_weighted_edges_from(edges) dist = nx.floyd_warshall_numpy(G) assert_equal(int(numpy.min(dist)), -14) networkx-1.11/networkx/algorithms/shortest_paths/dense.py0000644000175000017500000001175612637544450023773 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """Floyd-Warshall algorithm for shortest paths. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx __author__ = """Aric Hagberg """ __all__ = ['floyd_warshall', 'floyd_warshall_predecessor_and_distance', 'floyd_warshall_numpy'] def floyd_warshall_numpy(G, nodelist=None, weight='weight'): """Find all-pairs shortest path lengths using Floyd's algorithm. Parameters ---------- G : NetworkX graph nodelist : list, optional The rows and columns are ordered by the nodes in nodelist. If nodelist is None then the ordering is produced by G.nodes(). weight: string, optional (default= 'weight') Edge data key corresponding to the edge weight. 
Returns ------- distance : NumPy matrix A matrix of shortest path distances between nodes. If there is no path between two nodes the corresponding matrix entry will be Inf. Notes ----- Floyd's algorithm is appropriate for finding shortest paths in dense graphs or graphs with negative weights when Dijkstra's algorithm fails. This algorithm can still fail if there are negative cycles. It has running time O(n^3) with running space of O(n^2). """ try: import numpy as np except ImportError: raise ImportError(\ "floyd_warshall_numpy() requires numpy: http://scipy.org/ ") # To handle cases when an edge has weight=0, we must make sure that # nonedges are not given the value 0 as well. A = nx.to_numpy_matrix(G, nodelist=nodelist, multigraph_weight=min, weight=weight, nonedge=np.inf) n,m = A.shape I = np.identity(n) A[I==1] = 0 # diagonal elements should be zero for i in range(n): A = np.minimum(A, A[i,:] + A[:,i]) return A def floyd_warshall_predecessor_and_distance(G, weight='weight'): """Find all-pairs shortest path lengths using Floyd's algorithm. Parameters ---------- G : NetworkX graph weight: string, optional (default= 'weight') Edge data key corresponding to the edge weight. Returns ------- predecessor,distance : dictionaries Dictionaries, keyed by source and target, of predecessors and distances in the shortest path. Notes ----- Floyd's algorithm is appropriate for finding shortest paths in dense graphs or graphs with negative weights when Dijkstra's algorithm fails. This algorithm can still fail if there are negative cycles. It has running time O(n^3) with running space of O(n^2).
See Also -------- floyd_warshall floyd_warshall_numpy all_pairs_shortest_path all_pairs_shortest_path_length """ from collections import defaultdict # dictionary-of-dictionaries representation for dist and pred # use some defaultdict magic here # for dist the default is the floating point inf value dist = defaultdict(lambda : defaultdict(lambda: float('inf'))) for u in G: dist[u][u] = 0 pred = defaultdict(dict) # initialize path distance dictionary to be the adjacency matrix # also set the distance to self to 0 (zero diagonal) undirected = not G.is_directed() for u,v,d in G.edges(data=True): e_weight = d.get(weight, 1.0) dist[u][v] = min(e_weight, dist[u][v]) pred[u][v] = u if undirected: dist[v][u] = min(e_weight, dist[v][u]) pred[v][u] = v for w in G: for u in G: for v in G: if dist[u][v] > dist[u][w] + dist[w][v]: dist[u][v] = dist[u][w] + dist[w][v] pred[u][v] = pred[w][v] return dict(pred),dict(dist) def floyd_warshall(G, weight='weight'): """Find all-pairs shortest path lengths using Floyd's algorithm. Parameters ---------- G : NetworkX graph weight: string, optional (default= 'weight') Edge data key corresponding to the edge weight. Returns ------- distance : dict A dictionary, keyed by source and target, of shortest path distances between nodes. Notes ----- Floyd's algorithm is appropriate for finding shortest paths in dense graphs or graphs with negative weights when Dijkstra's algorithm fails. This algorithm can still fail if there are negative cycles. It has running time O(n^3) with running space of O(n^2).
See Also -------- floyd_warshall_predecessor_and_distance floyd_warshall_numpy all_pairs_shortest_path all_pairs_shortest_path_length """ # could make this its own function to reduce memory costs return floyd_warshall_predecessor_and_distance(G, weight=weight)[1] # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except ImportError: raise SkipTest("NumPy not available") networkx-1.11/networkx/algorithms/shortest_paths/astar.py0000644000175000017500000001152512637544450024001 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """Shortest paths and path lengths using A* ("A star") algorithm. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from heapq import heappush, heappop from itertools import count from networkx import NetworkXError import networkx as nx __author__ = "\n".join(["Salim Fadhley ", "Matteo Dell'Amico "]) __all__ = ['astar_path', 'astar_path_length'] def astar_path(G, source, target, heuristic=None, weight='weight'): """Return a list of nodes in a shortest path between source and target using the A* ("A-star") algorithm. There may be more than one shortest path. This returns only one. Parameters ---------- G : NetworkX graph source : node Starting node for path target : node Ending node for path heuristic : function A function to evaluate the estimate of the distance from a node to the target. The function takes two node arguments and must return a number. weight: string, optional (default='weight') Edge data key corresponding to the edge weight. Raises ------ NetworkXNoPath If no path exists between source and target. Examples -------- >>> G=nx.path_graph(5) >>> print(nx.astar_path(G,0,4)) [0, 1, 2, 3, 4] >>> G=nx.grid_graph(dim=[3,3]) # nodes are two-tuples (x,y) >>> def dist(a, b): ... (x1, y1) = a ... (x2, y2) = b ... 
return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 >>> print(nx.astar_path(G,(0,0),(2,2),dist)) [(0, 0), (0, 1), (1, 1), (1, 2), (2, 2)] See Also -------- shortest_path, dijkstra_path """ if G.is_multigraph(): raise NetworkXError("astar_path() not implemented for Multi(Di)Graphs") if heuristic is None: # The default heuristic is h=0 - same as Dijkstra's algorithm def heuristic(u, v): return 0 push = heappush pop = heappop # The queue stores priority, node, cost to reach, and parent. # Uses Python heapq to keep in priority order. # Add a counter to the queue to prevent the underlying heap from # attempting to compare the nodes themselves. The counter breaks ties in # the priority and is guaranteed unique for all nodes in the graph. c = count() queue = [(0, next(c), source, 0, None)] # Maps enqueued nodes to distance of discovered paths and the # computed heuristics to target. We avoid computing the heuristics # more than once and inserting the node into the queue too many times. enqueued = {} # Maps explored nodes to parent closest to the source. explored = {} while queue: # Pop the smallest item from queue. _, __, curnode, dist, parent = pop(queue) if curnode == target: path = [curnode] node = parent while node is not None: path.append(node) node = explored[node] path.reverse() return path if curnode in explored: continue explored[curnode] = parent for neighbor, w in G[curnode].items(): if neighbor in explored: continue ncost = dist + w.get(weight, 1) if neighbor in enqueued: qcost, h = enqueued[neighbor] # if qcost < ncost, a longer path to neighbor remains # enqueued. Removing it would need to filter the whole # queue, it's better just to leave it there and ignore # it when we visit the node a second time.
if qcost <= ncost: continue else: h = heuristic(neighbor, target) enqueued[neighbor] = ncost, h push(queue, (ncost + h, next(c), neighbor, ncost, curnode)) raise nx.NetworkXNoPath("Node %s not reachable from %s" % (target, source)) def astar_path_length(G, source, target, heuristic=None, weight='weight'): """Return the length of the shortest path between source and target using the A* ("A-star") algorithm. Parameters ---------- G : NetworkX graph source : node Starting node for path target : node Ending node for path heuristic : function A function to evaluate the estimate of the distance from a node to the target. The function takes two node arguments and must return a number. Raises ------ NetworkXNoPath If no path exists between source and target. See Also -------- astar_path """ path = astar_path(G, source, target, heuristic, weight) return sum(G[u][v].get(weight, 1) for u, v in zip(path[:-1], path[1:])) networkx-1.11/networkx/algorithms/link_analysis/0000755000175000017500000000000012653231454022072 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/link_analysis/hits_alg.py0000644000175000017500000002233512637544500024244 0ustar aricaric00000000000000"""Hubs and authorities analysis of graph structure. """ # Copyright (C) 2008-2012 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. # NetworkX:http://networkx.github.io/ import networkx as nx from networkx.exception import NetworkXError __author__ = """Aric Hagberg (hagberg@lanl.gov)""" __all__ = ['hits','hits_numpy','hits_scipy','authority_matrix','hub_matrix'] def hits(G,max_iter=100,tol=1.0e-8,nstart=None,normalized=True): """Return HITS hubs and authorities values for nodes. The HITS algorithm computes two numbers for a node. Authorities estimates the node value based on the incoming links. Hubs estimates the node value based on outgoing links.
Parameters ---------- G : graph A NetworkX graph max_iter : integer, optional Maximum number of iterations in power method. tol : float, optional Error tolerance used to check convergence in power method iteration. nstart : dictionary, optional Starting value of each node for power method iteration. normalized : bool (default=True) Normalize results by the sum of all of the values. Returns ------- (hubs,authorities) : two-tuple of dictionaries Two dictionaries keyed by node containing the hub and authority values. Examples -------- >>> G=nx.path_graph(4) >>> h,a=nx.hits(G) Notes ----- The eigenvector calculation is done by the power iteration method and has no guarantee of convergence. The iteration will stop after max_iter iterations or an error tolerance of number_of_nodes(G)*tol has been reached. The HITS algorithm was designed for directed graphs but this algorithm does not check if the input graph is directed and will execute on undirected graphs. References ---------- .. [1] A. Langville and C. Meyer, "A survey of eigenvector methods of web information retrieval." http://citeseer.ist.psu.edu/713792.html .. [2] Jon Kleinberg, Authoritative sources in a hyperlinked environment Journal of the ACM 46 (5): 604-632, 1999. doi:10.1145/324133.324140. http://www.cs.cornell.edu/home/kleinber/auth.pdf.
""" if type(G) == nx.MultiGraph or type(G) == nx.MultiDiGraph: raise Exception("hits() not defined for graphs with multiedges.") if len(G) == 0: return {},{} # choose fixed starting vector if not given if nstart is None: h=dict.fromkeys(G,1.0/G.number_of_nodes()) else: h=nstart # normalize starting vector s=1.0/sum(h.values()) for k in h: h[k]*=s i=0 while True: # power iteration: make up to max_iter iterations hlast=h h=dict.fromkeys(hlast.keys(),0) a=dict.fromkeys(hlast.keys(),0) # this "matrix multiply" looks odd because it is # doing a left multiply a^T=hlast^T*G for n in h: for nbr in G[n]: a[nbr]+=hlast[n]*G[n][nbr].get('weight',1) # now multiply h=Ga for n in h: for nbr in G[n]: h[n]+=a[nbr]*G[n][nbr].get('weight',1) # normalize vector s=1.0/max(h.values()) for n in h: h[n]*=s # normalize vector s=1.0/max(a.values()) for n in a: a[n]*=s # check convergence, l1 norm err=sum([abs(h[n]-hlast[n]) for n in h]) if err < tol: break if i>max_iter: raise NetworkXError(\ "HITS: power iteration failed to converge in %d iterations."%(i+1)) i+=1 if normalized: s = 1.0/sum(a.values()) for n in a: a[n] *= s s = 1.0/sum(h.values()) for n in h: h[n] *= s return h,a def authority_matrix(G,nodelist=None): """Return the HITS authority matrix.""" M=nx.to_numpy_matrix(G,nodelist=nodelist) return M.T*M def hub_matrix(G,nodelist=None): """Return the HITS hub matrix.""" M=nx.to_numpy_matrix(G,nodelist=nodelist) return M*M.T def hits_numpy(G,normalized=True): """Return HITS hubs and authorities values for nodes. The HITS algorithm computes two numbers for a node. Authorities estimates the node value based on the incoming links. Hubs estimates the node value based on outgoing links. Parameters ---------- G : graph A NetworkX graph normalized : bool (default=True) Normalize results by the sum of all of the values. Returns ------- (hubs,authorities) : two-tuple of dictionaries Two dictionaries keyed by node containing the hub and authority values. 
Examples -------- >>> G=nx.path_graph(4) >>> h,a=nx.hits_numpy(G) Notes ----- The eigenvector calculation uses NumPy's interface to LAPACK. The HITS algorithm was designed for directed graphs but this algorithm does not check if the input graph is directed and will execute on undirected graphs. References ---------- .. [1] A. Langville and C. Meyer, "A survey of eigenvector methods of web information retrieval." http://citeseer.ist.psu.edu/713792.html .. [2] Jon Kleinberg, Authoritative sources in a hyperlinked environment Journal of the ACM 46 (5): 604-632, 1999. doi:10.1145/324133.324140. http://www.cs.cornell.edu/home/kleinber/auth.pdf. """ try: import numpy as np except ImportError: raise ImportError(\ "hits_numpy() requires NumPy: http://scipy.org/") if len(G) == 0: return {},{} H=nx.hub_matrix(G,G.nodes()) e,ev=np.linalg.eig(H) m=e.argsort()[-1] # index of maximum eigenvalue h=np.array(ev[:,m]).flatten() A=nx.authority_matrix(G,G.nodes()) e,ev=np.linalg.eig(A) m=e.argsort()[-1] # index of maximum eigenvalue a=np.array(ev[:,m]).flatten() if normalized: h = h/h.sum() a = a/a.sum() else: h = h/h.max() a = a/a.max() hubs=dict(zip(G.nodes(),map(float,h))) authorities=dict(zip(G.nodes(),map(float,a))) return hubs,authorities def hits_scipy(G,max_iter=100,tol=1.0e-6,normalized=True): """Return HITS hubs and authorities values for nodes. The HITS algorithm computes two numbers for a node. Authorities estimates the node value based on the incoming links. Hubs estimates the node value based on outgoing links. Parameters ---------- G : graph A NetworkX graph max_iter : integer, optional Maximum number of iterations in power method. tol : float, optional Error tolerance used to check convergence in power method iteration. normalized : bool (default=True) Normalize results by the sum of all of the values.
Returns ------- (hubs,authorities) : two-tuple of dictionaries Two dictionaries keyed by node containing the hub and authority values. Examples -------- >>> G=nx.path_graph(4) >>> h,a=nx.hits(G) Notes ----- This implementation uses SciPy sparse matrices. The eigenvector calculation is done by the power iteration method and has no guarantee of convergence. The iteration will stop after max_iter iterations or an error tolerance of number_of_nodes(G)*tol has been reached. The HITS algorithm was designed for directed graphs but this algorithm does not check if the input graph is directed and will execute on undirected graphs. References ---------- .. [1] A. Langville and C. Meyer, "A survey of eigenvector methods of web information retrieval." http://citeseer.ist.psu.edu/713792.html .. [2] Jon Kleinberg, Authoritative sources in a hyperlinked environment Journal of the ACM 46 (5): 604-632, 1999. doi:10.1145/324133.324140. http://www.cs.cornell.edu/home/kleinber/auth.pdf. """ try: import scipy.sparse import numpy as np except ImportError: raise ImportError(\ "hits_scipy() requires SciPy: http://scipy.org/") if len(G) == 0: return {},{} M=nx.to_scipy_sparse_matrix(G,nodelist=G.nodes()) (n,m)=M.shape # should be square A=M.T*M # authority matrix x=scipy.ones((n,1))/n # initial guess # power iteration on authority matrix i=0 while True: xlast=x x=A*x x=x/x.max() # check convergence, l1 norm err=scipy.absolute(x-xlast).sum() if err < tol: break if i>max_iter: raise NetworkXError(\ "HITS: power iteration failed to converge in %d iterations."%(i+1)) i+=1 a=np.asarray(x).flatten() # h=M*a h=np.asarray(M*a).flatten() if normalized: h = h/h.sum() a = a/a.sum() hubs=dict(zip(G.nodes(),map(float,h))) authorities=dict(zip(G.nodes(),map(float,a))) return hubs,authorities # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except: raise SkipTest("NumPy not available") try: import scipy except: raise SkipTest("SciPy not available") 
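The mutual hub/authority update that `hits()` performs above can be sketched without NetworkX at all. The following standalone version (the name `hits_sketch` and the example graph are illustrative, not part of the library) mirrors the same loop structure: authorities accumulate hub scores over in-links, hubs accumulate authority scores over out-links, each vector is max-normalized per pass, and convergence is checked with the l1 norm before a final sum-normalization, as `hits(..., normalized=True)` does.

```python
# Standalone sketch of the power iteration in hits(), on a plain adjacency
# dict instead of a NetworkX graph.  Illustrative only: the library version
# also handles edge weights, nstart, and the normalized=False case.
def hits_sketch(adj, max_iter=100, tol=1e-8):
    nodes = list(adj)
    h = dict.fromkeys(nodes, 1.0 / len(nodes))
    for _ in range(max_iter):
        hlast = h
        # a^T = h^T * A: each authority sums the hub scores of its in-links
        a = dict.fromkeys(nodes, 0.0)
        for n in nodes:
            for nbr in adj[n]:
                a[nbr] += hlast[n]
        # h = A * a: each hub sums the authority scores of its out-links
        h = dict.fromkeys(nodes, 0.0)
        for n in nodes:
            for nbr in adj[n]:
                h[n] += a[nbr]
        # max-normalize both vectors, as in the library loop
        s = 1.0 / max(h.values())
        h = {n: v * s for n, v in h.items()}
        s = 1.0 / max(a.values())
        a = {n: v * s for n, v in a.items()}
        # convergence check, l1 norm
        if sum(abs(h[n] - hlast[n]) for n in nodes) < tol:
            break
    # final sum-normalization (the normalized=True behavior)
    sh, sa = sum(h.values()), sum(a.values())
    return ({n: v / sh for n, v in h.items()},
            {n: v / sa for n, v in a.items()})

# Node 2 links to both other nodes, so it is the strongest hub; nothing
# links to node 2, so its authority score is zero.
hubs, auths = hits_sketch({1: [3], 2: [1, 3], 3: [1]})
```

On this three-node graph the iteration reaches its fixed point after two passes, giving hubs {1: 0.25, 2: 0.5, 3: 0.25} and authorities {1: 0.5, 2: 0.0, 3: 0.5}.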
networkx-1.11/networkx/algorithms/link_analysis/__init__.py0000644000175000017500000000016612637544450024213 0ustar aricaric00000000000000from networkx.algorithms.link_analysis.pagerank_alg import * from networkx.algorithms.link_analysis.hits_alg import * networkx-1.11/networkx/algorithms/link_analysis/tests/0000755000175000017500000000000012653231454023234 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/link_analysis/tests/test_pagerank.py0000644000175000017500000001323312637544450026444 0ustar aricaric00000000000000#!/usr/bin/env python import random import networkx from nose.tools import * from nose import SkipTest from nose.plugins.attrib import attr # Example from # A. Langville and C. Meyer, "A survey of eigenvector methods of web # information retrieval." http://citeseer.ist.psu.edu/713792.html class TestPageRank(object): @classmethod def setupClass(cls): global numpy try: import numpy except ImportError: raise SkipTest('NumPy not available.') def setUp(self): G = networkx.DiGraph() edges = [(1, 2), (1, 3), # 2 is a dangling node (3, 1), (3, 2), (3, 5), (4, 5), (4, 6), (5, 4), (5, 6), (6, 4)] G.add_edges_from(edges) self.G = G self.G.pagerank = dict(zip(G, [0.03721197, 0.05395735, 0.04150565, 0.37508082, 0.20599833, 0.28624589])) self.dangling_node_index = 1 self.dangling_edges = {1: 2, 2: 3, 3: 0, 4: 0, 5: 0, 6: 0} self.G.dangling_pagerank = dict(zip(G, [0.10844518, 0.18618601, 0.0710892, 0.2683668, 0.15919783, 0.20671497])) def test_pagerank(self): G = self.G p = networkx.pagerank(G, alpha=0.9, tol=1.e-08) for n in G: assert_almost_equal(p[n], G.pagerank[n], places=4) nstart = dict((n, random.random()) for n in G) p = networkx.pagerank(G, alpha=0.9, tol=1.e-08, nstart=nstart) for n in G: assert_almost_equal(p[n], G.pagerank[n], places=4) assert_raises(networkx.NetworkXError, networkx.pagerank, G, max_iter=0) def test_numpy_pagerank(self): G = self.G p = networkx.pagerank_numpy(G, alpha=0.9) for n in G: assert_almost_equal(p[n], 
G.pagerank[n], places=4) personalize = dict((n, random.random()) for n in G) p = networkx.pagerank_numpy(G, alpha=0.9, personalization=personalize) def test_google_matrix(self): G = self.G M = networkx.google_matrix(G, alpha=0.9) e, ev = numpy.linalg.eig(M.T) p = numpy.array(ev[:, 0] / ev[:, 0].sum())[:, 0] for (a, b) in zip(p, self.G.pagerank.values()): assert_almost_equal(a, b) personalize = dict((n, random.random()) for n in G) M = networkx.google_matrix(G, alpha=0.9, personalization=personalize) personalize.pop(1) assert_raises(networkx.NetworkXError, networkx.google_matrix, G, personalization=personalize) def test_personalization(self): G = networkx.complete_graph(4) personalize = {0: 1, 1: 1, 2: 4, 3: 4} answer = {0: 0.1, 1: 0.1, 2: 0.4, 3: 0.4} p = networkx.pagerank(G, alpha=0.0, personalization=personalize) for n in G: assert_almost_equal(p[n], answer[n], places=4) personalize.pop(0) assert_raises(networkx.NetworkXError, networkx.pagerank, G, personalization=personalize) def test_dangling_matrix(self): """ Tests that the google_matrix doesn't change except for the dangling nodes. 
""" G = self.G dangling = self.dangling_edges dangling_sum = float(sum(dangling.values())) M1 = networkx.google_matrix(G, personalization=dangling) M2 = networkx.google_matrix(G, personalization=dangling, dangling=dangling) for i in range(len(G)): for j in range(len(G)): if i == self.dangling_node_index and (j + 1) in dangling: assert_almost_equal(M2[i, j], dangling[j + 1] / dangling_sum, places=4) else: assert_almost_equal(M2[i, j], M1[i, j], places=4) def test_dangling_pagerank(self): pr = networkx.pagerank(self.G, dangling=self.dangling_edges) for n in self.G: assert_almost_equal(pr[n], self.G.dangling_pagerank[n], places=4) def test_dangling_numpy_pagerank(self): pr = networkx.pagerank_numpy(self.G, dangling=self.dangling_edges) for n in self.G: assert_almost_equal(pr[n], self.G.dangling_pagerank[n], places=4) def test_empty(self): G = networkx.Graph() assert_equal(networkx.pagerank(G), {}) assert_equal(networkx.pagerank_numpy(G), {}) assert_equal(networkx.google_matrix(G).shape, (0, 0)) class TestPageRankScipy(TestPageRank): @classmethod def setupClass(cls): global scipy try: import scipy except ImportError: raise SkipTest('SciPy not available.') def test_scipy_pagerank(self): G = self.G p = networkx.pagerank_scipy(G, alpha=0.9, tol=1.e-08) for n in G: assert_almost_equal(p[n], G.pagerank[n], places=4) personalize = dict((n, random.random()) for n in G) p = networkx.pagerank_scipy(G, alpha=0.9, tol=1.e-08, personalization=personalize) assert_raises(networkx.NetworkXError, networkx.pagerank_scipy, G, max_iter=0) def test_dangling_scipy_pagerank(self): pr = networkx.pagerank_scipy(self.G, dangling=self.dangling_edges) for n in self.G: assert_almost_equal(pr[n], self.G.dangling_pagerank[n], places=4) def test_empty_scipy(self): G = networkx.Graph() assert_equal(networkx.pagerank_scipy(G), {}) networkx-1.11/networkx/algorithms/link_analysis/tests/test_hits.py0000644000175000017500000000476012637544450025630 0ustar aricaric00000000000000#!/usr/bin/env python from 
nose.tools import * from nose import SkipTest from nose.plugins.attrib import attr import networkx # Example from # A. Langville and C. Meyer, "A survey of eigenvector methods of web # information retrieval." http://citeseer.ist.psu.edu/713792.html class TestHITS: def setUp(self): G=networkx.DiGraph() edges=[(1,3),(1,5),\ (2,1),\ (3,5),\ (5,4),(5,3),\ (6,5)] G.add_edges_from(edges,weight=1) self.G=G self.G.a=dict(zip(G,[0.000000, 0.000000, 0.366025, 0.133975, 0.500000, 0.000000])) self.G.h=dict(zip(G,[ 0.366025, 0.000000, 0.211325, 0.000000, 0.211325, 0.211325])) def test_hits(self): G=self.G h,a=networkx.hits(G,tol=1.e-08) for n in G: assert_almost_equal(h[n],G.h[n],places=4) for n in G: assert_almost_equal(a[n],G.a[n],places=4) def test_hits_nstart(self): G = self.G nstart = dict([(i, 1./2) for i in G]) h, a = networkx.hits(G, nstart = nstart) @attr('numpy') def test_hits_numpy(self): try: import numpy as np except ImportError: raise SkipTest('NumPy not available.') G=self.G h,a=networkx.hits_numpy(G) for n in G: assert_almost_equal(h[n],G.h[n],places=4) for n in G: assert_almost_equal(a[n],G.a[n],places=4) def test_hits_scipy(self): try: import scipy as sp except ImportError: raise SkipTest('SciPy not available.') G=self.G h,a=networkx.hits_scipy(G,tol=1.e-08) for n in G: assert_almost_equal(h[n],G.h[n],places=4) for n in G: assert_almost_equal(a[n],G.a[n],places=4) @attr('numpy') def test_empty(self): try: import numpy except ImportError: raise SkipTest('numpy not available.') G=networkx.Graph() assert_equal(networkx.hits(G),({},{})) assert_equal(networkx.hits_numpy(G),({},{})) assert_equal(networkx.authority_matrix(G).shape,(0,0)) assert_equal(networkx.hub_matrix(G).shape,(0,0)) def test_empty_scipy(self): try: import scipy except ImportError: raise SkipTest('scipy not available.') G=networkx.Graph() assert_equal(networkx.hits_scipy(G),({},{})) networkx-1.11/networkx/algorithms/link_analysis/pagerank_alg.py0000644000175000017500000004303612637544500025066 
0ustar aricaric00000000000000"""PageRank analysis of graph structure. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. # NetworkX:http://networkx.github.io/ import networkx as nx from networkx.exception import NetworkXError from networkx.utils import not_implemented_for __author__ = """\n""".join(["Aric Hagberg ", "Brandon Liu "]) __all__ = ['pagerank', 'pagerank_numpy', 'pagerank_scipy', 'google_matrix'] @not_implemented_for('multigraph') def pagerank(G, alpha=0.85, personalization=None, max_iter=100, tol=1.0e-6, nstart=None, weight='weight', dangling=None): """Return the PageRank of the nodes in the graph. PageRank computes a ranking of the nodes in the graph G based on the structure of the incoming links. It was originally designed as an algorithm to rank web pages. Parameters ---------- G : graph A NetworkX graph. Undirected graphs will be converted to a directed graph with two directed edges for each undirected edge. alpha : float, optional Damping parameter for PageRank, default=0.85. personalization: dict, optional The "personalization vector" consisting of a dictionary with a key for every graph node and nonzero personalization value for each node. By default, a uniform distribution is used. max_iter : integer, optional Maximum number of iterations in power method eigenvalue solver. tol : float, optional Error tolerance used to check convergence in power method solver. nstart : dictionary, optional Starting value of PageRank iteration for each node. weight : key, optional Edge data key to use as weight. If None weights are set to 1. dangling: dict, optional The outedges to be assigned to any "dangling" nodes, i.e., nodes without any outedges. The dict key is the node the outedge points to and the dict value is the weight of that outedge. By default, dangling nodes are given outedges according to the personalization vector (uniform if not specified). Returns ------- pagerank : dictionary Dictionary of nodes with PageRank as value Examples -------- >>> G = nx.DiGraph(nx.path_graph(4)) >>> pr = nx.pagerank(G, alpha=0.9) Notes ----- The eigenvector calculation is done by the power iteration method and has no guarantee of convergence. The iteration will stop after max_iter iterations or an error tolerance of number_of_nodes(G)*tol has been reached. The PageRank algorithm was designed for directed graphs but this algorithm does not check if the input graph is directed and will execute on undirected graphs by converting each edge in the directed graph to two edges. See Also -------- pagerank_numpy, pagerank_scipy, google_matrix References ---------- .. [1] A. Langville and C. Meyer, "A survey of eigenvector methods of web information retrieval." http://citeseer.ist.psu.edu/713792.html .. [2] Page, Lawrence; Brin, Sergey; Motwani, Rajeev and Winograd, Terry, The PageRank citation ranking: Bringing order to the Web.
1999 http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf """ if len(G) == 0: return {} if not G.is_directed(): D = G.to_directed() else: D = G # Create a copy in (right) stochastic form W = nx.stochastic_graph(D, weight=weight) N = W.number_of_nodes() # Choose fixed starting vector if not given if nstart is None: x = dict.fromkeys(W, 1.0 / N) else: # Normalized nstart vector s = float(sum(nstart.values())) x = dict((k, v / s) for k, v in nstart.items()) if personalization is None: # Assign uniform personalization vector if not given p = dict.fromkeys(W, 1.0 / N) else: missing = set(G) - set(personalization) if missing: raise NetworkXError('Personalization dictionary ' 'must have a value for every node. ' 'Missing nodes %s' % missing) s = float(sum(personalization.values())) p = dict((k, v / s) for k, v in personalization.items()) if dangling is None: # Use personalization vector if dangling vector not specified dangling_weights = p else: missing = set(G) - set(dangling) if missing: raise NetworkXError('Dangling node dictionary ' 'must have a value for every node. ' 'Missing nodes %s' % missing) s = float(sum(dangling.values())) dangling_weights = dict((k, v/s) for k, v in dangling.items()) dangling_nodes = [n for n in W if W.out_degree(n, weight=weight) == 0.0] # power iteration: make up to max_iter iterations for _ in range(max_iter): xlast = x x = dict.fromkeys(xlast.keys(), 0) danglesum = alpha * sum(xlast[n] for n in dangling_nodes) for n in x: # this matrix multiply looks odd because it is # doing a left multiply x^T=xlast^T*W for nbr in W[n]: x[nbr] += alpha * xlast[n] * W[n][nbr][weight] x[n] += danglesum * dangling_weights[n] + (1.0 - alpha) * p[n] # check convergence, l1 norm err = sum([abs(x[n] - xlast[n]) for n in x]) if err < N*tol: return x raise NetworkXError('pagerank: power iteration failed to converge ' 'in %d iterations.' 
% max_iter) def google_matrix(G, alpha=0.85, personalization=None, nodelist=None, weight='weight', dangling=None): """Return the Google matrix of the graph. Parameters ---------- G : graph A NetworkX graph. Undirected graphs will be converted to a directed graph with two directed edges for each undirected edge. alpha : float The damping factor. personalization: dict, optional The "personalization vector" consisting of a dictionary with a key for every graph node and nonzero personalization value for each node. By default, a uniform distribution is used. nodelist : list, optional The rows and columns are ordered according to the nodes in nodelist. If nodelist is None, then the ordering is produced by G.nodes(). weight : key, optional Edge data key to use as weight. If None weights are set to 1. dangling: dict, optional The outedges to be assigned to any "dangling" nodes, i.e., nodes without any outedges. The dict key is the node the outedge points to and the dict value is the weight of that outedge. By default, dangling nodes are given outedges according to the personalization vector (uniform if not specified) This must be selected to result in an irreducible transition matrix (see notes below). It may be common to have the dangling dict to be the same as the personalization dict. Returns ------- A : NumPy matrix Google matrix of the graph Notes ----- The matrix returned represents the transition matrix that describes the Markov chain used in PageRank. For PageRank to converge to a unique solution (i.e., a unique stationary distribution in a Markov chain), the transition matrix must be irreducible. In other words, it must be that there exists a path between every pair of nodes in the graph, or else there is the potential of "rank sinks." This implementation works with Multi(Di)Graphs. For multigraphs the weight between two nodes is set to be the sum of all edge weights between those nodes. 
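The transition-matrix construction described in these notes can be sketched in plain NumPy. The 3-node chain below is a hypothetical example: the dangling row is replaced by the (uniform) personalization vector, rows are normalized to be stochastic, and damping mixes in the personalization vector.

```python
import numpy as np

# Tiny directed chain 0 -> 1 -> 2; node 2 is dangling (no out-edges).
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
alpha, N = 0.85, 3
p = np.repeat(1.0 / N, N)            # uniform personalization vector

M = A.copy()
M[M.sum(axis=1) == 0] = p            # dangling rows get the dangling/personalization weights
M /= M.sum(axis=1, keepdims=True)    # make M row-stochastic
Gm = alpha * M + (1 - alpha) * p     # Google matrix: every row is a probability vector
```

Every row of `Gm` sums to 1, which is exactly the irreducibility/stochasticity requirement the notes describe.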
See Also -------- pagerank, pagerank_numpy, pagerank_scipy """ import numpy as np if nodelist is None: nodelist = G.nodes() M = nx.to_numpy_matrix(G, nodelist=nodelist, weight=weight) N = len(G) if N == 0: return M # Personalization vector if personalization is None: p = np.repeat(1.0 / N, N) else: missing = set(nodelist) - set(personalization) if missing: raise NetworkXError('Personalization vector dictionary ' 'must have a value for every node. ' 'Missing nodes %s' % missing) p = np.array([personalization[n] for n in nodelist], dtype=float) p /= p.sum() # Dangling nodes if dangling is None: dangling_weights = p else: missing = set(nodelist) - set(dangling) if missing: raise NetworkXError('Dangling node dictionary ' 'must have a value for every node. ' 'Missing nodes %s' % missing) # Convert the dangling dictionary into an array in nodelist order dangling_weights = np.array([dangling[n] for n in nodelist], dtype=float) dangling_weights /= dangling_weights.sum() dangling_nodes = np.where(M.sum(axis=1) == 0)[0] # Assign dangling_weights to any dangling nodes (nodes with no out links) for node in dangling_nodes: M[node] = dangling_weights M /= M.sum(axis=1) # Normalize rows to sum to 1 return alpha * M + (1 - alpha) * p def pagerank_numpy(G, alpha=0.85, personalization=None, weight='weight', dangling=None): """Return the PageRank of the nodes in the graph. PageRank computes a ranking of the nodes in the graph G based on the structure of the incoming links. It was originally designed as an algorithm to rank web pages. Parameters ---------- G : graph A NetworkX graph. Undirected graphs will be converted to a directed graph with two directed edges for each undirected edge. alpha : float, optional Damping parameter for PageRank, default=0.85. personalization: dict, optional The "personalization vector" consisting of a dictionary with a key for every graph node and nonzero personalization value for each node. By default, a uniform distribution is used. 
weight : key, optional Edge data key to use as weight. If None weights are set to 1. dangling: dict, optional The outedges to be assigned to any "dangling" nodes, i.e., nodes without any outedges. The dict key is the node the outedge points to and the dict value is the weight of that outedge. By default, dangling nodes are given outedges according to the personalization vector (uniform if not specified) This must be selected to result in an irreducible transition matrix (see notes under google_matrix). It may be common to have the dangling dict to be the same as the personalization dict. Returns ------- pagerank : dictionary Dictionary of nodes with PageRank as value. Examples -------- >>> G = nx.DiGraph(nx.path_graph(4)) >>> pr = nx.pagerank_numpy(G, alpha=0.9) Notes ----- The eigenvector calculation uses NumPy's interface to the LAPACK eigenvalue solvers. This will be the fastest and most accurate for small graphs. This implementation works with Multi(Di)Graphs. For multigraphs the weight between two nodes is set to be the sum of all edge weights between those nodes. See Also -------- pagerank, pagerank_scipy, google_matrix References ---------- .. [1] A. Langville and C. Meyer, "A survey of eigenvector methods of web information retrieval." http://citeseer.ist.psu.edu/713792.html .. [2] Page, Lawrence; Brin, Sergey; Motwani, Rajeev and Winograd, Terry, The PageRank citation ranking: Bringing order to the Web. 
1999 http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf """ import numpy as np if len(G) == 0: return {} M = google_matrix(G, alpha, personalization=personalization, weight=weight, dangling=dangling) # use numpy LAPACK solver eigenvalues, eigenvectors = np.linalg.eig(M.T) ind = eigenvalues.argsort() # eigenvector of largest eigenvalue at ind[-1], normalized largest = np.array(eigenvectors[:, ind[-1]]).flatten().real norm = float(largest.sum()) return dict(zip(G, map(float, largest / norm))) def pagerank_scipy(G, alpha=0.85, personalization=None, max_iter=100, tol=1.0e-6, weight='weight', dangling=None): """Return the PageRank of the nodes in the graph. PageRank computes a ranking of the nodes in the graph G based on the structure of the incoming links. It was originally designed as an algorithm to rank web pages. Parameters ---------- G : graph A NetworkX graph. Undirected graphs will be converted to a directed graph with two directed edges for each undirected edge. alpha : float, optional Damping parameter for PageRank, default=0.85. personalization: dict, optional The "personalization vector" consisting of a dictionary with a key for every graph node and nonzero personalization value for each node. By default, a uniform distribution is used. max_iter : integer, optional Maximum number of iterations in power method eigenvalue solver. tol : float, optional Error tolerance used to check convergence in power method solver. weight : key, optional Edge data key to use as weight. If None weights are set to 1. dangling: dict, optional The outedges to be assigned to any "dangling" nodes, i.e., nodes without any outedges. The dict key is the node the outedge points to and the dict value is the weight of that outedge. By default, dangling nodes are given outedges according to the personalization vector (uniform if not specified) This must be selected to result in an irreducible transition matrix (see notes under google_matrix). 
It may be common to have the dangling dict to be the same as the personalization dict. Returns ------- pagerank : dictionary Dictionary of nodes with PageRank as value Examples -------- >>> G = nx.DiGraph(nx.path_graph(4)) >>> pr = nx.pagerank_scipy(G, alpha=0.9) Notes ----- The eigenvector calculation uses power iteration with a SciPy sparse matrix representation. This implementation works with Multi(Di)Graphs. For multigraphs the weight between two nodes is set to be the sum of all edge weights between those nodes. See Also -------- pagerank, pagerank_numpy, google_matrix References ---------- .. [1] A. Langville and C. Meyer, "A survey of eigenvector methods of web information retrieval." http://citeseer.ist.psu.edu/713792.html .. [2] Page, Lawrence; Brin, Sergey; Motwani, Rajeev and Winograd, Terry, The PageRank citation ranking: Bringing order to the Web. 1999 http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf """ import scipy.sparse N = len(G) if N == 0: return {} nodelist = G.nodes() M = nx.to_scipy_sparse_matrix(G, nodelist=nodelist, weight=weight, dtype=float) S = scipy.array(M.sum(axis=1)).flatten() S[S != 0] = 1.0 / S[S != 0] Q = scipy.sparse.spdiags(S.T, 0, *M.shape, format='csr') M = Q * M # initial vector x = scipy.repeat(1.0 / N, N) # Personalization vector if personalization is None: p = scipy.repeat(1.0 / N, N) else: missing = set(nodelist) - set(personalization) if missing: raise NetworkXError('Personalization vector dictionary ' 'must have a value for every node. ' 'Missing nodes %s' % missing) p = scipy.array([personalization[n] for n in nodelist], dtype=float) p = p / p.sum() # Dangling nodes if dangling is None: dangling_weights = p else: missing = set(nodelist) - set(dangling) if missing: raise NetworkXError('Dangling node dictionary ' 'must have a value for every node. 
' 'Missing nodes %s' % missing) # Convert the dangling dictionary into an array in nodelist order dangling_weights = scipy.array([dangling[n] for n in nodelist], dtype=float) dangling_weights /= dangling_weights.sum() is_dangling = scipy.where(S == 0)[0] # power iteration: make up to max_iter iterations for _ in range(max_iter): xlast = x x = alpha * (x * M + sum(x[is_dangling]) * dangling_weights) + \ (1 - alpha) * p # check convergence, l1 norm err = scipy.absolute(x - xlast).sum() if err < N * tol: return dict(zip(nodelist, map(float, x))) raise NetworkXError('pagerank_scipy: power iteration failed to converge ' 'in %d iterations.' % max_iter) # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except: raise SkipTest("NumPy not available") try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/simple_paths.py0000644000175000017500000004735212637544500022310 0ustar aricaric00000000000000# -*- coding: utf-8 -*- # Copyright (C) 2012 by # Sergio Nery Simoes # All rights reserved. # BSD license. from heapq import heappush, heappop from itertools import count import networkx as nx from networkx.utils import not_implemented_for __author__ = """\n""".join(['Sérgio Nery Simões ', 'Aric Hagberg ', 'Andrey Paramonov', 'Jordi Torrents ']) __all__ = [ 'all_simple_paths', 'shortest_simple_paths', ] def all_simple_paths(G, source, target, cutoff=None): """Generate all simple paths in the graph G from source to target. A simple path is a path with no repeated nodes. Parameters ---------- G : NetworkX graph source : node Starting node for path target : node Ending node for path cutoff : integer, optional Depth to stop the search. Only paths of length <= cutoff are returned. Returns ------- path_generator: generator A generator that produces lists of simple paths. If there are no paths between the source and target within the given cutoff the generator produces no output. 
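The depth-first path generator described above can be sketched recursively; the library's own implementation is iterative and stack-based, and `simple_paths_sketch` plus its adjacency-dict input are illustrative assumptions, not the library API.

```python
def simple_paths_sketch(adj, source, target, cutoff=None):
    """Yield every simple path from source to target with at most `cutoff` edges."""
    if cutoff is None:
        cutoff = len(adj) - 1
    path = [source]
    def dfs(u):
        if u == target:
            yield list(path)
            return
        if len(path) > cutoff:      # path already has `cutoff` edges; stop extending
            return
        for v in adj.get(u, ()):
            if v not in path:       # keep the path simple (no repeated nodes)
                path.append(v)
                yield from dfs(v)
                path.pop()
    yield from dfs(source)

K4 = {u: [v for v in range(4) if v != u] for u in range(4)}  # complete graph on 4 nodes
paths = sorted(simple_paths_sketch(K4, 0, 3))
# paths == [[0, 1, 2, 3], [0, 1, 3], [0, 2, 1, 3], [0, 2, 3], [0, 3]]
```

With `cutoff=2` the same call yields only the three paths of length at most two edges, mirroring the docstring's example.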
Examples -------- >>> G = nx.complete_graph(4) >>> for path in nx.all_simple_paths(G, source=0, target=3): ... print(path) ... [0, 1, 2, 3] [0, 1, 3] [0, 2, 1, 3] [0, 2, 3] [0, 3] >>> paths = nx.all_simple_paths(G, source=0, target=3, cutoff=2) >>> print(list(paths)) [[0, 1, 3], [0, 2, 3], [0, 3]] Notes ----- This algorithm uses a modified depth-first search to generate the paths [1]_. A single path can be found in `O(V+E)` time but the number of simple paths in a graph can be very large, e.g. `O(n!)` in the complete graph of order n. References ---------- .. [1] R. Sedgewick, "Algorithms in C, Part 5: Graph Algorithms", Addison Wesley Professional, 3rd ed., 2001. See Also -------- all_shortest_paths, shortest_path """ if source not in G: raise nx.NetworkXError('source node %s not in graph'%source) if target not in G: raise nx.NetworkXError('target node %s not in graph'%target) if cutoff is None: cutoff = len(G)-1 if G.is_multigraph(): return _all_simple_paths_multigraph(G, source, target, cutoff=cutoff) else: return _all_simple_paths_graph(G, source, target, cutoff=cutoff) def _all_simple_paths_graph(G, source, target, cutoff=None): if cutoff < 1: return visited = [source] stack = [iter(G[source])] while stack: children = stack[-1] child = next(children, None) if child is None: stack.pop() visited.pop() elif len(visited) < cutoff: if child == target: yield visited + [target] elif child not in visited: visited.append(child) stack.append(iter(G[child])) else: #len(visited) == cutoff: if child == target or target in children: yield visited + [target] stack.pop() visited.pop() def _all_simple_paths_multigraph(G, source, target, cutoff=None): if cutoff < 1: return visited = [source] stack = [(v for u,v in G.edges(source))] while stack: children = stack[-1] child = next(children, None) if child is None: stack.pop() visited.pop() elif len(visited) < cutoff: if child == target: yield visited + [target] elif child not in visited: visited.append(child) stack.append((v for 
u,v in G.edges(child))) else: #len(visited) == cutoff: count = ([child]+list(children)).count(target) for i in range(count): yield visited + [target] stack.pop() visited.pop() @not_implemented_for('multigraph') def shortest_simple_paths(G, source, target, weight=None): """Generate all simple paths in the graph G from source to target, starting from shortest ones. A simple path is a path with no repeated nodes. If a weighted shortest path search is to be used, no negative weights are allowed. Parameters ---------- G : NetworkX graph source : node Starting node for path target : node Ending node for path weight : string Name of the edge attribute to be used as a weight. If None all edges are considered to have unit weight. Default value None. Returns ------- path_generator: generator A generator that produces lists of simple paths, in order from shortest to longest. Raises ------ NetworkXNoPath If no path exists between source and target. NetworkXError If source or target nodes are not in the input graph. NetworkXNotImplemented If the input graph is a Multi[Di]Graph. Examples -------- >>> G = nx.cycle_graph(7) >>> paths = list(nx.shortest_simple_paths(G, 0, 3)) >>> print(paths) [[0, 1, 2, 3], [0, 6, 5, 4, 3]] You can use this function to efficiently compute the k shortest/best paths between two nodes. >>> from itertools import islice >>> def k_shortest_paths(G, source, target, k, weight=None): ... return list(islice(nx.shortest_simple_paths(G, source, target, weight=weight), k)) >>> for path in k_shortest_paths(G, 0, 3, 2): ... print(path) [0, 1, 2, 3] [0, 6, 5, 4, 3] Notes ----- This procedure is based on the algorithm by Jin Y. Yen [1]_. Finding the first K paths requires O(KN^3) operations. See Also -------- all_shortest_paths shortest_path all_simple_paths References ---------- .. [1] Jin Y. Yen, "Finding the K Shortest Loopless Paths in a Network", Management Science, Vol. 17, No. 11, Theory Series (Jul., 1971), pp. 712-716.
""" if source not in G: raise nx.NetworkXError('source node %s not in graph' % source) if target not in G: raise nx.NetworkXError('target node %s not in graph' % target) if weight is None: length_func = len shortest_path_func = _bidirectional_shortest_path else: def length_func(path): return sum(G.edge[u][v][weight] for (u, v) in zip(path, path[1:])) shortest_path_func = _bidirectional_dijkstra listA = list() listB = PathBuffer() prev_path = None while True: if not prev_path: length, path = shortest_path_func(G, source, target, weight=weight) listB.push(length, path) else: ignore_nodes = set() ignore_edges = set() for i in range(1, len(prev_path)): root = prev_path[:i] root_length = length_func(root) for path in listA: if path[:i] == root: ignore_edges.add((path[i-1], path[i])) ignore_nodes.add(root[-1]) try: length, spur = shortest_path_func(G, root[-1], target, ignore_nodes=ignore_nodes, ignore_edges=ignore_edges, weight=weight) path = root[:-1] + spur listB.push(root_length + length, path) except nx.NetworkXNoPath: pass if listB: path = listB.pop() yield path listA.append(path) prev_path = path else: break class PathBuffer(object): def __init__(self): self.paths = set() self.sortedpaths = list() self.counter = count() def __len__(self): return len(self.sortedpaths) def push(self, cost, path): hashable_path = tuple(path) if hashable_path not in self.paths: heappush(self.sortedpaths, (cost, next(self.counter), path)) self.paths.add(hashable_path) def pop(self): (cost, num, path) = heappop(self.sortedpaths) hashable_path = tuple(path) self.paths.remove(hashable_path) return path def _bidirectional_shortest_path(G, source, target, ignore_nodes=None, ignore_edges=None, weight=None): """Return the shortest path between source and target ignoring nodes and edges in the containers ignore_nodes and ignore_edges. 
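Both shortest-path helpers take `ignore_nodes`/`ignore_edges` and apply them by wrapping the graph's neighbor iterators in filtering closures. The idea in isolation, with hypothetical names and a plain adjacency dict:

```python
def make_filtered_neighbors(adj, ignore_nodes=frozenset(), ignore_edges=frozenset()):
    """Wrap a neighbor lookup so ignored nodes and edges vanish from the graph's view."""
    def neighbors(v):
        for w in adj.get(v, ()):
            if w in ignore_nodes:
                continue
            if (v, w) in ignore_edges or (w, v) in ignore_edges:
                continue    # undirected convention: drop the edge in both orientations
            yield w
    return neighbors

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
nbrs = make_filtered_neighbors(adj, ignore_nodes={1}, ignore_edges={(2, 3)})
# list(nbrs(0)) == [2]; list(nbrs(2)) == [0]
```

Because the search algorithm only ever calls the wrapped iterator, no copy of the graph is needed to "remove" the spur-path nodes and edges on each Yen iteration.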
This is a custom modification of the standard bidirectional shortest path implementation at networkx.algorithms.unweighted Parameters ---------- G : NetworkX graph source : node starting node for path target : node ending node for path ignore_nodes : container of nodes nodes to ignore, optional ignore_edges : container of edges edges to ignore, optional weight : None This function accepts a weight argument for convenience of the shortest_simple_paths function. It will be ignored. Returns ------- length : int Number of nodes in the path. path : list List of nodes in a path from source to target. Raises ------ NetworkXNoPath If no path exists between source and target. See Also -------- shortest_path """ # call helper to do the real work results=_bidirectional_pred_succ(G,source,target,ignore_nodes,ignore_edges) pred,succ,w=results # build path from pred+w+succ path=[] # from w to target while w is not None: path.append(w) w=succ[w] # from source to w w=pred[path[0]] while w is not None: path.insert(0,w) w=pred[w] return len(path), path def _bidirectional_pred_succ(G, source, target, ignore_nodes=None, ignore_edges=None): """Bidirectional shortest path helper. Returns (pred,succ,w) where pred is a dictionary of predecessors from w to the source, and succ is a dictionary of successors from w to the target.
""" # does BFS from both source and target and meets in the middle if target == source: return ({target:None},{source:None},source) # handle either directed or undirected if G.is_directed(): Gpred=G.predecessors_iter Gsucc=G.successors_iter else: Gpred=G.neighbors_iter Gsucc=G.neighbors_iter # support optional nodes filter if ignore_nodes: def filter_iter(nodes_iter): def iterate(v): for w in nodes_iter(v): if w not in ignore_nodes: yield w return iterate Gpred=filter_iter(Gpred) Gsucc=filter_iter(Gsucc) # support optional edges filter if ignore_edges: if G.is_directed(): def filter_pred_iter(pred_iter): def iterate(v): for w in pred_iter(v): if (w, v) not in ignore_edges: yield w return iterate def filter_succ_iter(succ_iter): def iterate(v): for w in succ_iter(v): if (v, w) not in ignore_edges: yield w return iterate Gpred=filter_pred_iter(Gpred) Gsucc=filter_succ_iter(Gsucc) else: def filter_iter(nodes_iter): def iterate(v): for w in nodes_iter(v): if (v, w) not in ignore_edges \ and (w, v) not in ignore_edges: yield w return iterate Gpred=filter_iter(Gpred) Gsucc=filter_iter(Gsucc) # predecessors and successors in search pred={source:None} succ={target:None} # initialize fringes, start with forward forward_fringe=[source] reverse_fringe=[target] while forward_fringe and reverse_fringe: if len(forward_fringe) <= len(reverse_fringe): this_level=forward_fringe forward_fringe=[] for v in this_level: for w in Gsucc(v): if w not in pred: forward_fringe.append(w) pred[w]=v if w in succ: # found path return pred,succ,w else: this_level=reverse_fringe reverse_fringe=[] for v in this_level: for w in Gpred(v): if w not in succ: succ[w]=v reverse_fringe.append(w) if w in pred: # found path return pred,succ,w raise nx.NetworkXNoPath("No path between %s and %s." % (source, target)) def _bidirectional_dijkstra(G, source, target, weight='weight', ignore_nodes=None, ignore_edges=None): """Dijkstra's algorithm for shortest paths using bidirectional search.
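The meet-in-the-middle BFS of `_bidirectional_pred_succ` above alternates levels from both endpoints, always expanding the smaller frontier. A miniature undirected version (hypothetical names, no ignore filters):

```python
def bidir_bfs_path(adj, source, target):
    """Shortest unweighted path; expands the smaller frontier on each round."""
    if source == target:
        return [source]
    pred, succ = {source: None}, {target: None}
    forward, backward = [source], [target]
    while forward and backward:
        if len(forward) <= len(backward):
            level, forward = forward, []
            for v in level:
                for w in adj[v]:
                    if w not in pred:
                        pred[w] = v
                        forward.append(w)
                        if w in succ:          # frontiers met at w
                            return _stitch(pred, succ, w)
        else:
            level, backward = backward, []
            for v in level:
                for w in adj[v]:
                    if w not in succ:
                        succ[w] = v
                        backward.append(w)
                        if w in pred:
                            return _stitch(pred, succ, w)
    raise ValueError("no path between %s and %s" % (source, target))

def _stitch(pred, succ, w):
    path = [w]
    v = succ[w]
    while v is not None:    # walk forward from the meeting node to the target
        path.append(v)
        v = succ[v]
    v = pred[w]
    while v is not None:    # and backward to the source
        path.insert(0, v)
        v = pred[v]
    return path

cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}   # 6-cycle
path = bidir_bfs_path(cycle, 0, 3)
```

On the 6-cycle both directions around are equally short, so either four-node path is a valid answer; which one comes back depends on neighbor iteration order.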
This function returns the shortest path between source and target ignoring nodes and edges in the containers ignore_nodes and ignore_edges. This is a custom modification of the standard Dijkstra bidirectional shortest path implementation at networkx.algorithms.weighted Parameters ---------- G : NetworkX graph source : node Starting node. target : node Ending node. weight: string, optional (default='weight') Edge data key corresponding to the edge weight ignore_nodes : container of nodes nodes to ignore, optional ignore_edges : container of edges edges to ignore, optional Returns ------- length : number Shortest path length. Returns a tuple of two dictionaries keyed by node. The first dictionary stores distance from the source. The second stores the path from the source to that node. Raises ------ NetworkXNoPath If no path exists between source and target. Notes ----- Edge weight attributes must be numerical. Distances are calculated as sums of weighted edges traversed. In practice bidirectional Dijkstra is much more than twice as fast as ordinary Dijkstra. Ordinary Dijkstra expands nodes in a sphere-like manner from the source. The radius of this sphere will eventually be the length of the shortest path. Bidirectional Dijkstra will expand nodes from both the source and the target, making two spheres of half this radius. Volume of the first sphere is pi*r*r while the others are 2*pi*r/2*r/2, making up half the volume. This algorithm is not guaranteed to work if edge weights are negative or are floating point numbers (overflows and roundoff errors can cause problems). 
See Also -------- shortest_path shortest_path_length """ if source == target: return (0, [source]) # handle either directed or undirected if G.is_directed(): Gpred=G.predecessors_iter Gsucc=G.successors_iter else: Gpred=G.neighbors_iter Gsucc=G.neighbors_iter # support optional nodes filter if ignore_nodes: def filter_iter(nodes_iter): def iterate(v): for w in nodes_iter(v): if w not in ignore_nodes: yield w return iterate Gpred=filter_iter(Gpred) Gsucc=filter_iter(Gsucc) # support optional edges filter if ignore_edges: if G.is_directed(): def filter_pred_iter(pred_iter): def iterate(v): for w in pred_iter(v): if (w, v) not in ignore_edges: yield w return iterate def filter_succ_iter(succ_iter): def iterate(v): for w in succ_iter(v): if (v, w) not in ignore_edges: yield w return iterate Gpred=filter_pred_iter(Gpred) Gsucc=filter_succ_iter(Gsucc) else: def filter_iter(nodes_iter): def iterate(v): for w in nodes_iter(v): if (v, w) not in ignore_edges \ and (w, v) not in ignore_edges: yield w return iterate Gpred=filter_iter(Gpred) Gsucc=filter_iter(Gsucc) push = heappush pop = heappop # Init: Forward Backward dists = [{}, {}] # dictionary of final distances paths = [{source: [source]}, {target: [target]}] # dictionary of paths fringe = [[], []] # heap of (distance, node) tuples for # extracting next node to expand seen = [{source: 0}, {target: 0}] # dictionary of distances to # nodes seen c = count() # initialize fringe heap push(fringe[0], (0, next(c), source)) push(fringe[1], (0, next(c), target)) # neighs for extracting correct neighbor information neighs = [Gsucc, Gpred] # variables to hold shortest discovered path #finaldist = 1e30000 finalpath = [] dir = 1 while fringe[0] and fringe[1]: # choose direction # dir == 0 is forward direction and dir == 1 is back dir = 1 - dir # extract closest to expand (dist, _, v) = pop(fringe[dir]) if v in dists[dir]: # Shortest path to v has already been found continue # update distance dists[dir][v] = dist # equal to 
seen[dir][v] if v in dists[1 - dir]: # if we have scanned v in both directions we are done # we have now discovered the shortest path return (finaldist, finalpath) for w in neighs[dir](v): if(dir == 0): # forward if G.is_multigraph(): minweight = min((dd.get(weight, 1) for k, dd in G[v][w].items())) else: minweight = G[v][w].get(weight, 1) vwLength = dists[dir][v] + minweight # G[v][w].get(weight,1) else: # back, must remember to change v,w->w,v if G.is_multigraph(): minweight = min((dd.get(weight, 1) for k, dd in G[w][v].items())) else: minweight = G[w][v].get(weight, 1) vwLength = dists[dir][v] + minweight # G[w][v].get(weight,1) if w in dists[dir]: if vwLength < dists[dir][w]: raise ValueError( "Contradictory paths found: negative weights?") elif w not in seen[dir] or vwLength < seen[dir][w]: # relaxing seen[dir][w] = vwLength push(fringe[dir], (vwLength, next(c), w)) paths[dir][w] = paths[dir][v] + [w] if w in seen[0] and w in seen[1]: # see if this path is better than the already # discovered shortest path totaldist = seen[0][w] + seen[1][w] if finalpath == [] or finaldist > totaldist: finaldist = totaldist revpath = paths[1][w][:] revpath.reverse() finalpath = paths[0][w] + revpath[1:] raise nx.NetworkXNoPath("No path between %s and %s." % (source, target)) networkx-1.11/networkx/algorithms/centrality/0000755000175000017500000000000012653231454021410 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/centrality/current_flow_betweenness_subset.py0000644000175000017500000002251112637544450030470 0ustar aricaric00000000000000""" Current-flow betweenness centrality measures for subsets of nodes. """ # Copyright (C) 2010-2011 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license.
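The dists/seen/fringe bookkeeping in `_bidirectional_dijkstra` above is the standard heap-based Dijkstra pattern; for comparison, a one-directional sketch with hypothetical names (it omits the negative-weight sanity check of the original):

```python
from heapq import heappush, heappop
from itertools import count

def dijkstra_sketch(adj, source, target):
    """adj: node -> {neighbor: weight}.  Returns (distance, path)."""
    c = count()                       # tie-breaker: the heap never compares nodes
    fringe = [(0, next(c), source)]
    dists = {}                        # final (settled) distances
    seen = {source: 0}                # best tentative distances
    paths = {source: [source]}
    while fringe:
        d, _, v = heappop(fringe)
        if v in dists:
            continue                  # stale heap entry; v was already settled
        dists[v] = d
        if v == target:
            return d, paths[v]
        for w, wt in adj[v].items():
            vw = d + wt
            if w not in seen or vw < seen[w]:   # relax edge (v, w)
                seen[w] = vw
                paths[w] = paths[v] + [w]
                heappush(fringe, (vw, next(c), w))
    raise ValueError("no path from %s to %s" % (source, target))

adj = {'a': {'b': 1, 'c': 4}, 'b': {'c': 2, 'd': 5}, 'c': {'d': 1}, 'd': {}}
dist, route = dijkstra_sketch(adj, 'a', 'd')    # -> (4, ['a', 'b', 'c', 'd'])
```

The bidirectional version runs two of these searches at once and stops when a node has been settled from both sides, keeping the best stitched path seen so far.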
__author__ = """Aric Hagberg (hagberg@lanl.gov)""" __all__ = ['current_flow_betweenness_centrality_subset', 'edge_current_flow_betweenness_centrality_subset'] import itertools import networkx as nx from networkx.algorithms.centrality.flow_matrix import * def current_flow_betweenness_centrality_subset(G,sources,targets, normalized=True, weight='weight', dtype=float, solver='lu'): r"""Compute current-flow betweenness centrality for subsets of nodes. Current-flow betweenness centrality uses an electrical current model for information spreading in contrast to betweenness centrality which uses shortest paths. Current-flow betweenness centrality is also known as random-walk betweenness centrality [2]_. Parameters ---------- G : graph A NetworkX graph sources: list of nodes Nodes to use as sources for current targets: list of nodes Nodes to use as sinks for current normalized : bool, optional (default=True) If True the betweenness values are normalized by b=b/(n-1)(n-2) where n is the number of nodes in G. weight : string or None, optional (default='weight') Key for edge data used as the edge weight. If None, then use 1 as each edge weight. dtype: data type (float) Default data type for internal matrices. Set to np.float32 for lower memory consumption. solver: string (default='lu') Type of linear solver to use for computing the flow matrix. Options are "full" (uses most memory), "lu" (recommended), and "cg" (uses least memory). Returns ------- nodes : dictionary Dictionary of nodes with betweenness centrality as the value. See Also -------- approximate_current_flow_betweenness_centrality betweenness_centrality edge_betweenness_centrality edge_current_flow_betweenness_centrality Notes ----- Current-flow betweenness can be computed in `O(I(n-1)+mn \log n)` time [1]_, where `I(n-1)` is the time needed to compute the inverse Laplacian. 
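The linear algebra behind this complexity note: injecting a unit current at a source node and removing it at a target is one solve against the graph Laplacian, after which edge currents are just weighted potential differences. A hypothetical 4-cycle example using the pseudoinverse (the Laplacian itself is singular):

```python
import numpy as np

# 4-cycle 0-1-2-3-0 with unit edge weights.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A      # graph Laplacian

b = np.zeros(4)
b[0], b[2] = 1.0, -1.0              # unit current in at node 0, out at node 2
v = np.linalg.pinv(L) @ b           # node potentials

i01 = (v[0] - v[1]) * A[0, 1]       # current on edge (0, 1)
i03 = (v[0] - v[3]) * A[0, 3]       # current on edge (0, 3)
# By symmetry the unit current splits evenly over the two 0 -> 2 paths: 0.5 each.
```

Current-flow betweenness aggregates such per-pair edge currents over all source/target pairs, which is why the cost is dominated by the (pseudo)inverse Laplacian.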
For a full matrix this is `O(n^3)` but using sparse methods you can achieve `O(nm{\sqrt k})` where `k` is the Laplacian matrix condition number. The space required is `O(nw) where `w` is the width of the sparse Laplacian matrix. Worse case is `w=n` for `O(n^2)`. If the edges have a 'weight' attribute they will be used as weights in this algorithm. Unspecified weights are set to 1. References ---------- .. [1] Centrality Measures Based on Current Flow. Ulrik Brandes and Daniel Fleischer, Proc. 22nd Symp. Theoretical Aspects of Computer Science (STACS '05). LNCS 3404, pp. 533-544. Springer-Verlag, 2005. http://www.inf.uni-konstanz.de/algo/publications/bf-cmbcf-05.pdf .. [2] A measure of betweenness centrality based on random walks, M. E. J. Newman, Social Networks 27, 39-54 (2005). """ from networkx.utils import reverse_cuthill_mckee_ordering try: import numpy as np except ImportError: raise ImportError('current_flow_betweenness_centrality requires NumPy ', 'http://scipy.org/') try: import scipy except ImportError: raise ImportError('current_flow_betweenness_centrality requires SciPy ', 'http://scipy.org/') if G.is_directed(): raise nx.NetworkXError('current_flow_betweenness_centrality() ', 'not defined for digraphs.') if not nx.is_connected(G): raise nx.NetworkXError("Graph not connected.") n = G.number_of_nodes() ordering = list(reverse_cuthill_mckee_ordering(G)) # make a copy with integer labels according to rcm ordering # this could be done without a copy if we really wanted to mapping=dict(zip(ordering,range(n))) H = nx.relabel_nodes(G,mapping) betweenness = dict.fromkeys(H,0.0) # b[v]=0 for v in H for row,(s,t) in flow_matrix_row(H, weight=weight, dtype=dtype, solver=solver): for ss in sources: i=mapping[ss] for tt in targets: j=mapping[tt] betweenness[s]+=0.5*np.abs(row[i]-row[j]) betweenness[t]+=0.5*np.abs(row[i]-row[j]) if normalized: nb=(n-1.0)*(n-2.0) # normalization factor else: nb=2.0 for v in H: betweenness[v]=betweenness[v]/nb+1.0/(2-n) return 
dict((ordering[k],v) for k,v in betweenness.items()) def edge_current_flow_betweenness_centrality_subset(G, sources, targets, normalized=True, weight='weight', dtype=float, solver='lu'): """Compute current-flow betweenness centrality for edges using subsets of nodes. Current-flow betweenness centrality uses an electrical current model for information spreading in contrast to betweenness centrality which uses shortest paths. Current-flow betweenness centrality is also known as random-walk betweenness centrality [2]_. Parameters ---------- G : graph A NetworkX graph sources: list of nodes Nodes to use as sources for current targets: list of nodes Nodes to use as sinks for current normalized : bool, optional (default=True) If True the betweenness values are normalized by b=b/(n-1)(n-2) where n is the number of nodes in G. weight : string or None, optional (default='weight') Key for edge data used as the edge weight. If None, then use 1 as each edge weight. dtype: data type (float) Default data type for internal matrices. Set to np.float32 for lower memory consumption. solver: string (default='lu') Type of linear solver to use for computing the flow matrix. Options are "full" (uses most memory), "lu" (recommended), and "cg" (uses least memory). Returns ------- nodes : dictionary Dictionary of edge tuples with betweenness centrality as the value. See Also -------- betweenness_centrality edge_betweenness_centrality current_flow_betweenness_centrality Notes ----- Current-flow betweenness can be computed in `O(I(n-1)+mn \log n)` time [1]_, where `I(n-1)` is the time needed to compute the inverse Laplacian. For a full matrix this is `O(n^3)` but using sparse methods you can achieve `O(nm{\sqrt k})` where `k` is the Laplacian matrix condition number. The space required is `O(nw) where `w` is the width of the sparse Laplacian matrix. Worse case is `w=n` for `O(n^2)`. If the edges have a 'weight' attribute they will be used as weights in this algorithm. 
Unspecified weights are set to 1. References ---------- .. [1] Centrality Measures Based on Current Flow. Ulrik Brandes and Daniel Fleischer, Proc. 22nd Symp. Theoretical Aspects of Computer Science (STACS '05). LNCS 3404, pp. 533-544. Springer-Verlag, 2005. http://www.inf.uni-konstanz.de/algo/publications/bf-cmbcf-05.pdf .. [2] A measure of betweenness centrality based on random walks, M. E. J. Newman, Social Networks 27, 39-54 (2005). """ from networkx.utils import reverse_cuthill_mckee_ordering try: import numpy as np except ImportError: raise ImportError('current_flow_betweenness_centrality requires NumPy ', 'http://scipy.org/') try: import scipy except ImportError: raise ImportError('current_flow_betweenness_centrality requires SciPy ', 'http://scipy.org/') if G.is_directed(): raise nx.NetworkXError('edge_current_flow_betweenness_centrality ', 'not defined for digraphs.') if not nx.is_connected(G): raise nx.NetworkXError("Graph not connected.") n = G.number_of_nodes() ordering = list(reverse_cuthill_mckee_ordering(G)) # make a copy with integer labels according to rcm ordering # this could be done without a copy if we really wanted to mapping=dict(zip(ordering,range(n))) H = nx.relabel_nodes(G,mapping) betweenness=(dict.fromkeys(H.edges(),0.0)) if normalized: nb=(n-1.0)*(n-2.0) # normalization factor else: nb=2.0 for row,(e) in flow_matrix_row(H, weight=weight, dtype=dtype, solver=solver): for ss in sources: i=mapping[ss] for tt in targets: j=mapping[tt] betweenness[e]+=0.5*np.abs(row[i]-row[j]) betweenness[e]/=nb return dict(((ordering[s],ordering[t]),v) for (s,t),v in betweenness.items()) # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy import scipy except: raise SkipTest("NumPy not available") networkx-1.11/networkx/algorithms/centrality/communicability_alg.py0000644000175000017500000003357512637544500026013 0ustar aricaric00000000000000""" Communicability and centrality measures. 
""" # Copyright (C) 2011 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.utils import * __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Franck Kalala (franckkalala@yahoo.fr']) __all__ = ['communicability_centrality_exp', 'communicability_centrality', 'communicability_betweenness_centrality', 'communicability', 'communicability_exp', 'estrada_index', ] @not_implemented_for('directed') @not_implemented_for('multigraph') def communicability_centrality_exp(G): r"""Return the communicability centrality for each node of G Communicability centrality, also called subgraph centrality, of a node `n` is the sum of closed walks of all lengths starting and ending at node `n`. Parameters ---------- G: graph Returns ------- nodes:dictionary Dictionary of nodes with communicability centrality as the value. Raises ------ NetworkXError If the graph is not undirected and simple. See Also -------- communicability: Communicability between all pairs of nodes in G. communicability_centrality: Communicability centrality for each node of G. Notes ----- This version of the algorithm exponentiates the adjacency matrix. The communicability centrality of a node `u` in G can be found using the matrix exponential of the adjacency matrix of G [1]_ [2]_, .. math:: SC(u)=(e^A)_{uu} . References ---------- .. [1] Ernesto Estrada, Juan A. Rodriguez-Velazquez, "Subgraph centrality in complex networks", Physical Review E 71, 056103 (2005). http://arxiv.org/abs/cond-mat/0504730 .. [2] Ernesto Estrada, Naomichi Hatano, "Communicability in complex networks", Phys. Rev. E 77, 036111 (2008). 
http://arxiv.org/abs/0707.0756 Examples -------- >>> G = nx.Graph([(0,1),(1,2),(1,5),(5,4),(2,4),(2,3),(4,3),(3,6)]) >>> sc = nx.communicability_centrality_exp(G) """ # alternative implementation that calculates the matrix exponential import scipy.linalg nodelist = G.nodes() # ordering of nodes in matrix A = nx.to_numpy_matrix(G,nodelist) # convert to 0-1 matrix A[A!=0.0] = 1 expA = scipy.linalg.expm(A) # convert diagonal to dictionary keyed by node sc = dict(zip(nodelist,map(float,expA.diagonal()))) return sc @not_implemented_for('directed') @not_implemented_for('multigraph') def communicability_centrality(G): r"""Return communicability centrality for each node in G. Communicability centrality, also called subgraph centrality, of a node `n` is the sum of closed walks of all lengths starting and ending at node `n`. Parameters ---------- G: graph Returns ------- nodes: dictionary Dictionary of nodes with communicability centrality as the value. Raises ------ NetworkXError If the graph is not undirected and simple. See Also -------- communicability: Communicability between all pairs of nodes in G. communicability_centrality_exp: Communicability centrality for each node of G using matrix exponential. Notes ----- This version of the algorithm computes eigenvalues and eigenvectors of the adjacency matrix. Communicability centrality of a node `u` in G can be found using a spectral decomposition of the adjacency matrix [1]_ [2]_, .. math:: SC(u)=\sum_{j=1}^{N}(v_{j}^{u})^2 e^{\lambda_{j}}, where `v_j` is an eigenvector of the adjacency matrix `A` of G corresponding to the eigenvalue `\lambda_j`. Examples -------- >>> G = nx.Graph([(0,1),(1,2),(1,5),(5,4),(2,4),(2,3),(4,3),(3,6)]) >>> sc = nx.communicability_centrality(G) References ---------- .. [1] Ernesto Estrada, Juan A. Rodriguez-Velazquez, "Subgraph centrality in complex networks", Physical Review E 71, 056103 (2005). http://arxiv.org/abs/cond-mat/0504730 ..
[2] Ernesto Estrada, Naomichi Hatano, "Communicability in complex networks", Phys. Rev. E 77, 036111 (2008). http://arxiv.org/abs/0707.0756 """ import numpy import numpy.linalg nodelist = G.nodes() # ordering of nodes in matrix A = nx.to_numpy_matrix(G,nodelist) # convert to 0-1 matrix A[A!=0.0] = 1 w,v = numpy.linalg.eigh(A) vsquare = numpy.array(v)**2 expw = numpy.exp(w) xg = numpy.dot(vsquare,expw) # convert vector to dictionary keyed by node sc = dict(zip(nodelist,map(float,xg))) return sc @not_implemented_for('directed') @not_implemented_for('multigraph') def communicability_betweenness_centrality(G, normalized=True): r"""Return communicability betweenness for all pairs of nodes in G. The communicability betweenness measure makes use of the number of walks connecting every pair of nodes as the basis of a betweenness centrality measure. Parameters ---------- G: graph Returns ------- nodes:dictionary Dictionary of nodes with communicability betweenness as the value. Raises ------ NetworkXError If the graph is not undirected and simple. See Also -------- communicability: Communicability between all pairs of nodes in G. communicability_centrality_exp: Communicability centrality for each node of G using matrix exponential. communicability_centrality: Communicability centrality for each node in G using spectral decomposition. Notes ----- Let `G=(V,E)` be a simple undirected graph with `n` nodes and `m` edges, and `A` denote the adjacency matrix of `G`. Let `G(r)=(V,E(r))` be the graph resulting from removing all edges connected to node `r` but not the node itself. The adjacency matrix for `G(r)` is `A+E(r)`, where `E(r)` has nonzeros only in row and column `r`. The communicability betweenness of a node `r` is [1]_ ..
math:: \omega_{r} = \frac{1}{C}\sum_{p}\sum_{q}\frac{G_{prq}}{G_{pq}}, p\neq q, q\neq r, where `G_{prq}=(e^{A})_{pq} - (e^{A+E(r)})_{pq}` is the number of walks involving node r, `G_{pq}=(e^{A})_{pq}` is the number of walks starting at node `p` and ending at node `q`, and `C=(n-1)^{2}-(n-1)` is a normalization factor equal to the number of terms in the sum. The resulting `\omega_{r}` takes values between zero and one. The lower bound cannot be attained for a connected graph, and the upper bound is attained in the star graph. References ---------- .. [1] Ernesto Estrada, Desmond J. Higham, Naomichi Hatano, "Communicability Betweenness in Complex Networks" Physica A 388 (2009) 764-774. http://arxiv.org/abs/0905.4102 Examples -------- >>> G = nx.Graph([(0,1),(1,2),(1,5),(5,4),(2,4),(2,3),(4,3),(3,6)]) >>> cbc = nx.communicability_betweenness_centrality(G) """ import scipy import scipy.linalg nodelist = G.nodes() # ordering of nodes in matrix n = len(nodelist) A = nx.to_numpy_matrix(G,nodelist) # convert to 0-1 matrix A[A!=0.0] = 1 expA = scipy.linalg.expm(A) mapping = dict(zip(nodelist,range(n))) sc = {} for v in G: # remove row and col of node v i = mapping[v] row = A[i,:].copy() col = A[:,i].copy() A[i,:] = 0 A[:,i] = 0 B = (expA - scipy.linalg.expm(A)) / expA # sum with row/col of node v and diag set to zero B[i,:] = 0 B[:,i] = 0 B -= scipy.diag(scipy.diag(B)) sc[v] = float(B.sum()) # put row and col back A[i,:] = row A[:,i] = col # rescaling sc = _rescale(sc,normalized=normalized) return sc def _rescale(sc,normalized): # helper to rescale betweenness centrality if normalized is True: order=len(sc) if order <=2: scale=None else: scale=1.0/((order-1.0)**2-(order-1.0)) if scale is not None: for v in sc: sc[v] *= scale return sc @not_implemented_for('directed') @not_implemented_for('multigraph') def communicability(G): r"""Return communicability between all pairs of nodes in G.
The communicability between pairs of nodes in G is the sum of closed walks of different lengths starting at node u and ending at node v. Parameters ---------- G: graph Returns ------- comm: dictionary of dictionaries Dictionary of dictionaries keyed by nodes with communicability as the value. Raises ------ NetworkXError If the graph is not undirected and simple. See Also -------- communicability_centrality_exp: Communicability centrality for each node of G using matrix exponential. communicability_centrality: Communicability centrality for each node in G using spectral decomposition. communicability: Communicability between pairs of nodes in G. Notes ----- This algorithm uses a spectral decomposition of the adjacency matrix. Let G=(V,E) be a simple undirected graph. Using the connection between the powers of the adjacency matrix and the number of walks in the graph, the communicability between nodes `u` and `v` based on the graph spectrum is [1]_ .. math:: C(u,v)=\sum_{j=1}^{n}\phi_{j}(u)\phi_{j}(v)e^{\lambda_{j}}, where `\phi_{j}(u)` is the `u\rm{th}` element of the `j\rm{th}` orthonormal eigenvector of the adjacency matrix associated with the eigenvalue `\lambda_{j}`. References ---------- .. [1] Ernesto Estrada, Naomichi Hatano, "Communicability in complex networks", Phys. Rev. E 77, 036111 (2008). 
http://arxiv.org/abs/0707.0756 Examples -------- >>> G = nx.Graph([(0,1),(1,2),(1,5),(5,4),(2,4),(2,3),(4,3),(3,6)]) >>> c = nx.communicability(G) """ import numpy import scipy.linalg nodelist = G.nodes() # ordering of nodes in matrix A = nx.to_numpy_matrix(G,nodelist) # convert to 0-1 matrix A[A!=0.0] = 1 w,vec = numpy.linalg.eigh(A) expw = numpy.exp(w) mapping = dict(zip(nodelist,range(len(nodelist)))) sc={} # computing communicabilities for u in G: sc[u]={} for v in G: s = 0 p = mapping[u] q = mapping[v] for j in range(len(nodelist)): s += vec[:,j][p,0]*vec[:,j][q,0]*expw[j] sc[u][v] = float(s) return sc @not_implemented_for('directed') @not_implemented_for('multigraph') def communicability_exp(G): r"""Return communicability between all pairs of nodes in G. The communicability between a pair of nodes (u, v) in G is the sum of walks of different lengths starting at node u and ending at node v. Parameters ---------- G: graph Returns ------- comm: dictionary of dictionaries Dictionary of dictionaries keyed by nodes with communicability as the value. Raises ------ NetworkXError If the graph is not undirected and simple. See Also -------- communicability_centrality_exp: Communicability centrality for each node of G using matrix exponential. communicability_centrality: Communicability centrality for each node in G using spectral decomposition. communicability: Communicability between all pairs of nodes in G using spectral decomposition. Notes ----- This algorithm uses matrix exponentiation of the adjacency matrix. Let G=(V,E) be a simple undirected graph. Using the connection between the powers of the adjacency matrix and the number of walks in the graph, the communicability between nodes u and v is [1]_, .. math:: C(u,v) = (e^A)_{uv}, where `A` is the adjacency matrix of G. References ---------- .. [1] Ernesto Estrada, Naomichi Hatano, "Communicability in complex networks", Phys. Rev. E 77, 036111 (2008).
http://arxiv.org/abs/0707.0756 Examples -------- >>> G = nx.Graph([(0,1),(1,2),(1,5),(5,4),(2,4),(2,3),(4,3),(3,6)]) >>> c = nx.communicability_exp(G) """ import scipy.linalg nodelist = G.nodes() # ordering of nodes in matrix A = nx.to_numpy_matrix(G,nodelist) # convert to 0-1 matrix A[A!=0.0] = 1 # communicability matrix expA = scipy.linalg.expm(A) mapping = dict(zip(nodelist,range(len(nodelist)))) sc = {} for u in G: sc[u]={} for v in G: sc[u][v] = float(expA[mapping[u],mapping[v]]) return sc def estrada_index(G): r"""Return the Estrada index of the graph G. Parameters ---------- G: graph Returns ------- estrada index: float Raises ------ NetworkXError If the graph is not undirected and simple. See also -------- estrada_index_exp Notes ----- Let `G=(V,E)` be a simple undirected graph with `n` nodes and let `\lambda_{1}\leq\lambda_{2}\leq\cdots\leq\lambda_{n}` be a non-decreasing ordering of the eigenvalues of its adjacency matrix `A`. The Estrada index is .. math:: EE(G)=\sum_{j=1}^n e^{\lambda_j}. References ---------- .. [1] E. Estrada, Characterization of 3D molecular structure, Chem. Phys. Lett. 319, 713 (2000). Examples -------- >>> G=nx.Graph([(0,1),(1,2),(1,5),(5,4),(2,4),(2,3),(4,3),(3,6)]) >>> ei=nx.estrada_index(G) """ return sum(communicability_centrality(G).values()) # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except ImportError: raise SkipTest("NumPy not available") try: import scipy except ImportError: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/centrality/eigenvector.py0000644000175000017500000001546312637544500024306 0ustar aricaric00000000000000# coding=utf8 """ Eigenvector centrality. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license.
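The identities used throughout communicability_alg.py (subgraph centrality as the diagonal of `e^A`, the Estrada index as its trace, and the spectral form `SC(u)=\sum_j (v_j^u)^2 e^{\lambda_j}`) can be checked directly with NumPy on the docstring example graph. This is an illustrative sketch only, independent of the NetworkX API; the truncated Taylor series simply stands in for `scipy.linalg.expm`:

```python
import numpy as np

# adjacency matrix of the docstring example graph (nodes 0..6)
edges = [(0, 1), (1, 2), (1, 5), (5, 4), (2, 4), (2, 3), (4, 3), (3, 6)]
A = np.zeros((7, 7))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0

# matrix exponential via a truncated Taylor series e^A = sum_k A^k / k!
expA = np.zeros_like(A)
term = np.eye(7)
for k in range(1, 40):
    expA += term
    term = term @ A / k

# spectral route: SC(u) = sum_j (v_j^u)^2 e^{lambda_j}
w, V = np.linalg.eigh(A)
sc_spectral = (V ** 2) @ np.exp(w)

# SC(u) = (e^A)_{uu}: both routes agree elementwise
assert np.allclose(np.diag(expA), sc_spectral)
# Estrada index EE(G) = trace(e^A) = sum_j e^{lambda_j}
assert np.isclose(np.trace(expA), np.exp(w).sum())
```

The same agreement is what lets `estrada_index` above be implemented as the sum of `communicability_centrality` values.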
import networkx as nx __author__ = "\n".join(['Aric Hagberg (aric.hagberg@gmail.com)', 'Pieter Swart (swart@lanl.gov)', 'Sasha Gutfraind (ag362@cornell.edu)']) __all__ = ['eigenvector_centrality', 'eigenvector_centrality_numpy'] def eigenvector_centrality(G, max_iter=100, tol=1.0e-6, nstart=None, weight='weight'): """Compute the eigenvector centrality for the graph G. Eigenvector centrality computes the centrality for a node based on the centrality of its neighbors. The eigenvector centrality for node `i` is .. math:: \mathbf{Ax} = \lambda \mathbf{x} where `A` is the adjacency matrix of the graph G with eigenvalue `\lambda`. By virtue of the Perron–Frobenius theorem, there is a unique and positive solution if `\lambda` is the largest eigenvalue associated with the eigenvector of the adjacency matrix `A` ([2]_). Parameters ---------- G : graph A networkx graph max_iter : integer, optional Maximum number of iterations in power method. tol : float, optional Error tolerance used to check convergence in power method iteration. nstart : dictionary, optional Starting value of eigenvector iteration for each node. weight : None or string, optional If None, all edge weights are considered equal. Otherwise holds the name of the edge attribute used as weight. Returns ------- nodes : dictionary Dictionary of nodes with eigenvector centrality as the value. Examples -------- >>> G = nx.path_graph(4) >>> centrality = nx.eigenvector_centrality(G) >>> print(['%s %0.2f'%(node,centrality[node]) for node in centrality]) ['0 0.37', '1 0.60', '2 0.60', '3 0.37'] See Also -------- eigenvector_centrality_numpy pagerank hits Notes ----- The measure was introduced by [1]_. The eigenvector calculation is done by the power iteration method and has no guarantee of convergence. The iteration will stop after ``max_iter`` iterations or an error tolerance of ``number_of_nodes(G)*tol`` has been reached. 
For directed graphs this is "left" eigenvector centrality which corresponds to the in-edges in the graph. For out-edges eigenvector centrality, first reverse the graph with ``G.reverse()``. References ---------- .. [1] Phillip Bonacich: Power and Centrality: A Family of Measures. American Journal of Sociology 92(5):1170–1182, 1986 http://www.leonidzhukov.net/hse/2014/socialnetworks/papers/Bonacich-Centrality.pdf .. [2] Mark E. J. Newman: Networks: An Introduction. Oxford University Press, USA, 2010, pp. 169. """ from math import sqrt if type(G) == nx.MultiGraph or type(G) == nx.MultiDiGraph: raise nx.NetworkXException("Not defined for multigraphs.") if len(G) == 0: raise nx.NetworkXException("Empty graph.") if nstart is None: # choose starting vector with entries of 1/len(G) x = dict([(n,1.0/len(G)) for n in G]) else: x = nstart # normalize starting vector s = 1.0/sum(x.values()) for k in x: x[k] *= s nnodes = G.number_of_nodes() # make up to max_iter iterations for i in range(max_iter): xlast = x x = dict.fromkeys(xlast, 0) # do the multiplication y^T = x^T A for n in x: for nbr in G[n]: x[nbr] += xlast[n] * G[n][nbr].get(weight, 1) # normalize vector try: s = 1.0/sqrt(sum(v**2 for v in x.values())) # this should never be zero? except ZeroDivisionError: s = 1.0 for n in x: x[n] *= s # check convergence err = sum([abs(x[n]-xlast[n]) for n in x]) if err < nnodes*tol: return x raise nx.NetworkXError("eigenvector_centrality(): power iteration failed to converge in %d iterations." % max_iter) def eigenvector_centrality_numpy(G, weight='weight'): """Compute the eigenvector centrality for the graph G. Eigenvector centrality computes the centrality for a node based on the centrality of its neighbors. The eigenvector centrality for node `i` is .. math:: \mathbf{Ax} = \lambda \mathbf{x} where `A` is the adjacency matrix of the graph G with eigenvalue `\lambda`.
By virtue of the Perron–Frobenius theorem, there is a unique and positive solution if `\lambda` is the largest eigenvalue associated with the eigenvector of the adjacency matrix `A` ([2]_). Parameters ---------- G : graph A networkx graph weight : None or string, optional The name of the edge attribute used as weight. If None, all edge weights are considered equal. Returns ------- nodes : dictionary Dictionary of nodes with eigenvector centrality as the value. Examples -------- >>> G = nx.path_graph(4) >>> centrality = nx.eigenvector_centrality_numpy(G) >>> print(['%s %0.2f'%(node,centrality[node]) for node in centrality]) ['0 0.37', '1 0.60', '2 0.60', '3 0.37'] See Also -------- eigenvector_centrality pagerank hits Notes ----- The measure was introduced by [1]_. This algorithm uses the SciPy sparse eigenvalue solver (ARPACK) to find the largest eigenvalue/eigenvector pair. For directed graphs this is "left" eigenvector centrality which corresponds to the in-edges in the graph. For out-edges eigenvector centrality first reverse the graph with G.reverse(). References ---------- .. [1] Phillip Bonacich: Power and Centrality: A Family of Measures. American Journal of Sociology 92(5):1170–1182, 1986 http://www.leonidzhukov.net/hse/2014/socialnetworks/papers/Bonacich-Centrality.pdf .. [2] Mark E. J. Newman: Networks: An Introduction. Oxford University Press, USA, 2010, pp. 169. 
""" import scipy as sp from scipy.sparse import linalg if len(G) == 0: raise nx.NetworkXException('Empty graph.') M = nx.to_scipy_sparse_matrix(G, nodelist=G.nodes(), weight=weight, dtype=float) eigenvalue, eigenvector = linalg.eigs(M.T, k=1, which='LR') largest = eigenvector.flatten().real norm = sp.sign(largest.sum())*sp.linalg.norm(largest) centrality = dict(zip(G,map(float,largest/norm))) return centrality # fixture for nose tests def setup_module(module): from nose import SkipTest try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/centrality/dispersion.py0000644000175000017500000000700712637544450024152 0ustar aricaric00000000000000from itertools import combinations __author__ = "\n".join(['Ben Edwards (bedwards@cs.unm.edu)', 'Huston Hedinger (h@graphalchemist.com)', 'Dan Schult (dschult@colgate.edu)']) __all__ = ['dispersion'] def dispersion(G, u=None, v=None, normalized=True, alpha=1.0, b=0.0, c=0.0): r"""Calculate dispersion between `u` and `v` in `G`. A link between two actors (`u` and `v`) has a high dispersion when their mutual ties (`s` and `t`) are not well connected with each other. Parameters ---------- G : graph A NetworkX graph. u : node, optional The source for the dispersion score (e.g. ego node of the network). v : node, optional The target of the dispersion score if specified. normalized : bool If True (default) normalize by the embededness of the nodes (u and v). Returns ------- nodes : dictionary If u (v) is specified, returns a dictionary of nodes with dispersion score for all "target" ("source") nodes. If neither u nor v is specified, returns a dictionary of dictionaries for all nodes 'u' in the graph with a dispersion score for each node 'v'. Notes ----- This implementation follows Lars Backstrom and Jon Kleinberg [1]_. Typical usage would be to run dispersion on the ego network :math:`G_u` if `u` were specified. 
Running :func:`dispersion` with neither `u` nor `v` specified can take some time to complete. References ---------- .. [1] Romantic Partnerships and the Dispersion of Social Ties: A Network Analysis of Relationship Status on Facebook. Lars Backstrom, Jon Kleinberg. http://arxiv.org/pdf/1310.6753v1.pdf """ def _dispersion(G_u, u, v): """dispersion for all nodes 'v' in a ego network G_u of node 'u'""" u_nbrs = set(G_u[u]) ST = set(n for n in G_u[v] if n in u_nbrs) set_uv=set([u,v]) #all possible ties of connections that u and b share possib = combinations(ST, 2) total = 0 for (s,t) in possib: #neighbors of s that are in G_u, not including u and v nbrs_s = u_nbrs.intersection(G_u[s]) - set_uv #s and t are not directly connected if not t in nbrs_s: #s and t do not share a connection if nbrs_s.isdisjoint(G_u[t]): #tick for disp(u, v) total += 1 #neighbors that u and v share embededness = len(ST) if normalized: if embededness + c != 0: norm_disp = ((total + b)**alpha)/(embededness + c) else: norm_disp = (total+b)**alpha dispersion = norm_disp else: dispersion = total return dispersion if u is None: # v and u are not specified if v is None: results = dict((n,{}) for n in G) for u in G: for v in G[u]: results[u][v] = _dispersion(G, u, v) # u is not specified, but v is else: results = dict.fromkeys(G[v], {}) for u in G[v]: results[u] = _dispersion(G, v, u) else: # u is specified with no target v if v is None: results = dict.fromkeys(G[u], {}) for v in G[u]: results[v] = _dispersion(G, u, v) # both u and v are specified else: results = _dispersion(G, u, v) return results networkx-1.11/networkx/algorithms/centrality/load.py0000644000175000017500000001473112637544500022710 0ustar aricaric00000000000000# coding=utf8 """ Load centrality. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
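The inner loop of `dispersion` above counts pairs of common neighbors of `u` and `v` that are neither adjacent nor linked through another common neighbor in the ego network. That counting rule can be sketched on plain adjacency sets; `raw_dispersion` is a hypothetical helper for illustration only, not part of NetworkX:

```python
from itertools import combinations

def raw_dispersion(adj, u, v):
    """Unnormalized dispersion: pairs of common neighbors of u and v that
    are not adjacent and share no common neighbor besides u and v."""
    common = adj[u] & adj[v]
    total = 0
    for s, t in combinations(common, 2):
        # neighbors of s inside the ego network of u, excluding u and v
        nbrs_s = (adj[u] & adj[s]) - {u, v}
        # s-t not directly connected, and their ego neighborhoods disjoint
        if t not in nbrs_s and nbrs_s.isdisjoint(adj[t]):
            total += 1
    return total

# v's two mutual ties with u (nodes 2 and 3) are unconnected: dispersion 1
adj = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1}, 3: {0, 1}}
assert raw_dispersion(adj, 0, 1) == 1
```

Adding the edge 2-3 to this toy graph drops the count to zero, which matches the intuition that well-connected mutual ties signal low dispersion.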
__author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Sasha Gutfraind (ag362@cornell.edu)']) __all__ = ['load_centrality', 'edge_load'] import networkx as nx def newman_betweenness_centrality(G,v=None,cutoff=None, normalized=True, weight=None): """Compute load centrality for nodes. The load centrality of a node is the fraction of all shortest paths that pass through that node. Parameters ---------- G : graph A networkx graph normalized : bool, optional If True the betweenness values are normalized by b=b/(n-1)(n-2) where n is the number of nodes in G. weight : None or string, optional If None, edge weights are ignored. Otherwise holds the name of the edge attribute used as weight. cutoff : bool, optional If specified, only consider paths of length <= cutoff. Returns ------- nodes : dictionary Dictionary of nodes with centrality as the value. See Also -------- betweenness_centrality() Notes ----- Load centrality is slightly different than betweenness. It was originally introduced by [2]_. For this load algorithm see [1]_. References ---------- .. [1] Mark E. J. Newman: Scientific collaboration networks. II. Shortest paths, weighted networks, and centrality. Physical Review E 64, 016132, 2001. http://journals.aps.org/pre/abstract/10.1103/PhysRevE.64.016132 .. [2] Kwang-Il Goh, Byungnam Kahng and Doochul Kim Universal behavior of Load Distribution in Scale-Free Networks. Physical Review Letters 87(27):1–4, 2001. 
http://phya.snu.ac.kr/~dkim/PRL87278701.pdf """ if v is not None: # only one node betweenness=0.0 for source in G: ubetween = _node_betweenness(G, source, cutoff, False, weight) betweenness += ubetween[v] if v in ubetween else 0 if normalized: order = G.order() if order <= 2: return betweenness # no normalization b=0 for all nodes betweenness *= 1.0 / ((order-1) * (order-2)) return betweenness else: betweenness = {}.fromkeys(G,0.0) for source in betweenness: ubetween = _node_betweenness(G, source, cutoff, False, weight) for vk in ubetween: betweenness[vk] += ubetween[vk] if normalized: order = G.order() if order <= 2: return betweenness # no normalization b=0 for all nodes scale = 1.0 / ((order-1) * (order-2)) for v in betweenness: betweenness[v] *= scale return betweenness # all nodes def _node_betweenness(G,source,cutoff=False,normalized=True,weight=None): """Node betweenness helper: see betweenness_centrality for what you probably want. This actually computes "load" and not betweenness. See https://networkx.lanl.gov/ticket/103 This calculates the load of each node for paths from a single source. (The fraction of number of shortests paths from source that go through each node.) To get the load for a node you need to do all-pairs shortest paths. If weight is not None then use Dijkstra for finding shortest paths. In this case a cutoff is not implemented and so is ignored. """ # get the predecessor and path length data if weight is None: (pred,length)=nx.predecessor(G,source,cutoff=cutoff,return_seen=True) else: (pred,length)=nx.dijkstra_predecessor_and_distance(G,source,weight=weight) # order the nodes by path length onodes = [ (l,vert) for (vert,l) in length.items() ] onodes.sort() onodes[:] = [vert for (l,vert) in onodes if l>0] # intialize betweenness between={}.fromkeys(length,1.0) while onodes: v=onodes.pop() if v in pred: num_paths=len(pred[v]) # Discount betweenness if more than for x in pred[v]: # one shortest path. 
if x==source: # stop if hit source because all remaining v break # also have pred[v]==[source] between[x]+=between[v]/float(num_paths) # remove source for v in between: between[v]-=1 # rescale to be between 0 and 1 if normalized: l=len(between) if l > 2: scale=1.0/float((l-1)*(l-2)) # 1/the number of possible paths for v in between: between[v] *= scale return between load_centrality=newman_betweenness_centrality def edge_load(G,nodes=None,cutoff=False): """Compute edge load. WARNING: This module is for demonstration and testing purposes. """ betweenness={} if not nodes: # find betweenness for every node in graph nodes=G.nodes() # that probably is what you want... for source in nodes: ubetween=_edge_betweenness(G,source,nodes,cutoff=cutoff) for v in ubetween.keys(): b=betweenness.setdefault(v,0) # get or set default betweenness[v]=ubetween[v]+b # cumulative total return betweenness def _edge_betweenness(G,source,nodes,cutoff=False): """ Edge betweenness helper. """ between={} # get the predecessor data #(pred,length)=_fast_predecessor(G,source,cutoff=cutoff) (pred,length)=nx.predecessor(G,source,cutoff=cutoff,return_seen=True) # order the nodes by path length onodes = [ nn for dd,nn in sorted( (dist,n) for n,dist in length.items() )] # intialize betweenness, doesn't account for any edge weights for u,v in G.edges(nodes): between[(u,v)]=1.0 between[(v,u)]=1.0 while onodes: # work through all paths v=onodes.pop() if v in pred: num_paths=len(pred[v]) # Discount betweenness if more than for w in pred[v]: # one shortest path. if w in pred: num_paths=len(pred[w]) # Discount betweenness, mult path for x in pred[w]: between[(w,x)]+=between[(v,w)]/num_paths between[(x,w)]+=between[(w,v)]/num_paths return between networkx-1.11/networkx/algorithms/centrality/harmonic.py0000644000175000017500000000441012637544500023562 0ustar aricaric00000000000000""" Harmonic centrality measure. """ # Copyright (C) 2015 by # Alessandro Luongo # BSD license. 
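The load normalization above divides each node's accumulated count by `(n-1)(n-2)`, the number of ordered source/target pairs excluding the node itself. For intuition, a brute-force count over BFS shortest paths on a small unweighted graph reproduces the same values on a tree (where load and betweenness coincide); `shortest_paths` and `brute_load` are illustrative helpers, not the Newman algorithm implemented above:

```python
from collections import deque
from itertools import permutations

def shortest_paths(adj, s, t):
    """All shortest s-t paths by BFS layering (small graphs only)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    paths = []
    def extend(path):
        u = path[-1]
        if u == t:
            paths.append(path)
            return
        for w in adj[u]:
            if dist.get(w) == dist[u] + 1:   # follow shortest-path DAG
                extend(path + [w])
    extend([s])
    return paths

def brute_load(adj, normalized=True):
    n = len(adj)
    score = dict.fromkeys(adj, 0.0)
    for s, t in permutations(adj, 2):        # ordered pairs, as in load
        paths = shortest_paths(adj, s, t)
        for p in paths:
            for v in p[1:-1]:                # interior nodes only
                score[v] += 1.0 / len(paths)
    if normalized:
        for v in score:
            score[v] /= (n - 1) * (n - 2)
    return score

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # path graph P4
load = brute_load(adj)
assert abs(load[1] - 2.0 / 3.0) < 1e-12        # 4 ordered pairs / 6
assert load[0] == 0.0 and load[3] == 0.0       # endpoints carry no load
```

On graphs with multiple shortest paths, load and betweenness diverge because load splits flow evenly at each node rather than evenly over paths.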
from __future__ import division import functools import networkx as nx __author__ = "\n".join(['Alessandro Luongo (alessandro.luongo@studenti.unimi.it']) __all__ = ['harmonic_centrality'] def harmonic_centrality(G, distance=None): r"""Compute harmonic centrality for nodes. Harmonic centrality [1]_ of a node `u` is the sum of the reciprocal of the shortest path distances from all other nodes to `u` .. math:: C(u) = \sum_{v \neq u} \frac{1}{d(v, u)} where `d(v, u)` is the shortest-path distance between `v` and `u`. Notice that higher values indicate higher centrality. Parameters ---------- G : graph A NetworkX graph distance : edge attribute key, optional (default=None) Use the specified edge attribute as the edge distance in shortest path calculations. If `None`, then each edge will have distance equal to 1. Returns ------- nodes : dictionary Dictionary of nodes with harmonic centrality as the value. See Also -------- betweenness_centrality, load_centrality, eigenvector_centrality, degree_centrality, closeness_centrality Notes ----- If the 'distance' keyword is set to an edge attribute key then the shortest-path length will be computed using Dijkstra's algorithm with that edge attribute as the edge weight. References ---------- .. [1] Boldi, Paolo, and Sebastiano Vigna. "Axioms for centrality." Internet Mathematics 10.3-4 (2014): 222-262. 
""" if distance is not None: # use Dijkstra's algorithm with specified attribute as edge weight path_length = functools.partial(nx.all_pairs_dijkstra_path_length, weight=distance) else: path_length = nx.all_pairs_shortest_path_length nodes = G.nodes() harmonic_centrality = {} if len(G) <= 1: for singleton in nodes: harmonic_centrality[singleton] = 0.0 return harmonic_centrality sp = path_length(G.reverse() if G.is_directed() else G) for n in nodes: harmonic_centrality[n] = sum([1/i if i > 0 else 0 for i in sp[n].values()]) return harmonic_centrality networkx-1.11/networkx/algorithms/centrality/degree_alg.py0000644000175000017500000000714712637544500024052 0ustar aricaric00000000000000""" Degree centrality measures. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Sasha Gutfraind (ag362@cornell.edu)']) __all__ = ['degree_centrality', 'in_degree_centrality', 'out_degree_centrality'] import networkx as nx def degree_centrality(G): """Compute the degree centrality for nodes. The degree centrality for a node v is the fraction of nodes it is connected to. Parameters ---------- G : graph A networkx graph Returns ------- nodes : dictionary Dictionary of nodes with degree centrality as the value. See Also -------- betweenness_centrality, load_centrality, eigenvector_centrality Notes ----- The degree centrality values are normalized by dividing by the maximum possible degree in a simple graph n-1 where n is the number of nodes in G. For multigraphs or graphs with self loops the maximum degree might be higher than n-1 and values of degree centrality greater than 1 are possible. """ centrality={} s=1.0/(len(G)-1.0) centrality=dict((n,d*s) for n,d in G.degree_iter()) return centrality def in_degree_centrality(G): """Compute the in-degree centrality for nodes. 
The in-degree centrality for a node v is the fraction of nodes its incoming edges are connected to. Parameters ---------- G : graph A NetworkX graph Returns ------- nodes : dictionary Dictionary of nodes with in-degree centrality as values. Raises ------ NetworkXError If the graph is undirected. See Also -------- degree_centrality, out_degree_centrality Notes ----- The degree centrality values are normalized by dividing by the maximum possible degree in a simple graph n-1 where n is the number of nodes in G. For multigraphs or graphs with self loops the maximum degree might be higher than n-1 and values of degree centrality greater than 1 are possible. """ if not G.is_directed(): raise nx.NetworkXError(\ "in_degree_centrality() not defined for undirected graphs.") centrality={} s=1.0/(len(G)-1.0) centrality=dict((n,d*s) for n,d in G.in_degree_iter()) return centrality def out_degree_centrality(G): """Compute the out-degree centrality for nodes. The out-degree centrality for a node v is the fraction of nodes its outgoing edges are connected to. Parameters ---------- G : graph A NetworkX graph Returns ------- nodes : dictionary Dictionary of nodes with out-degree centrality as values. Raises ------ NetworkXError If the graph is undirected. See Also -------- degree_centrality, in_degree_centrality Notes ----- The degree centrality values are normalized by dividing by the maximum possible degree in a simple graph n-1 where n is the number of nodes in G. For multigraphs or graphs with self loops the maximum degree might be higher than n-1 and values of degree centrality greater than 1 are possible. 
""" if not G.is_directed(): raise nx.NetworkXError(\ "out_degree_centrality() not defined for undirected graphs.") centrality={} s=1.0/(len(G)-1.0) centrality=dict((n,d*s) for n,d in G.out_degree_iter()) return centrality networkx-1.11/networkx/algorithms/centrality/flow_matrix.py0000644000175000017500000001024012637544500024313 0ustar aricaric00000000000000# Helpers for current-flow betweenness and current-flow closness # Lazy computations for inverse Laplacian and flow-matrix rows. import networkx as nx def flow_matrix_row(G, weight='weight', dtype=float, solver='lu'): # Generate a row of the current-flow matrix import numpy as np from scipy import sparse from scipy.sparse import linalg solvername={"full" :FullInverseLaplacian, "lu": SuperLUInverseLaplacian, "cg": CGInverseLaplacian} n = G.number_of_nodes() L = laplacian_sparse_matrix(G, nodelist=range(n), weight=weight, dtype=dtype, format='csc') C = solvername[solver](L, dtype=dtype) # initialize solver w = C.w # w is the Laplacian matrix width # row-by-row flow matrix for u,v,d in G.edges_iter(data=True): B = np.zeros(w, dtype=dtype) c = d.get(weight,1.0) B[u%w] = c B[v%w] = -c # get only the rows needed in the inverse laplacian # and multiply to get the flow matrix row row = np.dot(B, C.get_rows(u,v)) yield row,(u,v) # Class to compute the inverse laplacian only for specified rows # Allows computation of the current-flow matrix without storing entire # inverse laplacian matrix class InverseLaplacian(object): def __init__(self, L, width=None, dtype=None): global np import numpy as np (n,n) = L.shape self.dtype = dtype self.n = n if width is None: self.w = self.width(L) else: self.w = width self.C = np.zeros((self.w,n), dtype=dtype) self.L1 = L[1:,1:] self.init_solver(L) def init_solver(self,L): pass def solve(self,r): raise("Implement solver") def solve_inverse(self,r): raise("Implement solver") def get_rows(self, r1, r2): for r in range(r1, r2+1): self.C[r%self.w, 1:] = self.solve_inverse(r) return self.C def 
get_row(self, r): self.C[r%self.w, 1:] = self.solve_inverse(r) return self.C[r%self.w] def width(self,L): m=0 for i,row in enumerate(L): w=0 x,y = np.nonzero(row) if len(y) > 0: v = y-i w=v.max()-v.min()+1 m = max(w,m) return m class FullInverseLaplacian(InverseLaplacian): def init_solver(self,L): self.IL = np.zeros(L.shape, dtype=self.dtype) self.IL[1:,1:] = np.linalg.inv(self.L1.todense()) def solve(self,rhs): s = np.zeros(rhs.shape, dtype=self.dtype) s = np.dot(self.IL,rhs) return s def solve_inverse(self,r): return self.IL[r,1:] class SuperLUInverseLaplacian(InverseLaplacian): def init_solver(self,L): from scipy.sparse import linalg self.lusolve = linalg.factorized(self.L1.tocsc()) def solve_inverse(self,r): rhs = np.zeros(self.n, dtype=self.dtype) rhs[r]=1 return self.lusolve(rhs[1:]) def solve(self,rhs): s = np.zeros(rhs.shape, dtype=self.dtype) s[1:]=self.lusolve(rhs[1:]) return s class CGInverseLaplacian(InverseLaplacian): def init_solver(self,L): global linalg from scipy.sparse import linalg ilu= linalg.spilu(self.L1.tocsc()) n=self.n-1 self.M = linalg.LinearOperator(shape=(n,n), matvec=ilu.solve) def solve(self,rhs): s = np.zeros(rhs.shape, dtype=self.dtype) s[1:]=linalg.cg(self.L1, rhs[1:], M=self.M)[0] return s def solve_inverse(self,r): rhs = np.zeros(self.n, self.dtype) rhs[r] = 1 return linalg.cg(self.L1, rhs[1:], M=self.M)[0] # graph laplacian, sparse version, will move to linalg/laplacianmatrix.py def laplacian_sparse_matrix(G, nodelist=None, weight='weight', dtype=None, format='csr'): import numpy as np import scipy.sparse A = nx.to_scipy_sparse_matrix(G, nodelist=nodelist, weight=weight, dtype=dtype, format=format) (n,n) = A.shape data = np.asarray(A.sum(axis=1).T) D = scipy.sparse.spdiags(data,0,n,n, format=format) return D - A networkx-1.11/networkx/algorithms/centrality/closeness.py0000644000175000017500000000667212637544500023774 0ustar aricaric00000000000000""" Closeness centrality measures. 
""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import functools import networkx as nx __author__ = "\n".join(['Aric Hagberg ', 'Pieter Swart (swart@lanl.gov)', 'Sasha Gutfraind (ag362@cornell.edu)']) __all__ = ['closeness_centrality'] def closeness_centrality(G, u=None, distance=None, normalized=True): r"""Compute closeness centrality for nodes. Closeness centrality [1]_ of a node `u` is the reciprocal of the sum of the shortest path distances from `u` to all `n-1` other nodes. Since the sum of distances depends on the number of nodes in the graph, closeness is normalized by the sum of minimum possible distances `n-1`. .. math:: C(u) = \frac{n - 1}{\sum_{v=1}^{n-1} d(v, u)}, where `d(v, u)` is the shortest-path distance between `v` and `u`, and `n` is the number of nodes in the graph. Notice that higher values of closeness indicate higher centrality. Parameters ---------- G : graph A NetworkX graph u : node, optional Return only the value for node u distance : edge attribute key, optional (default=None) Use the specified edge attribute as the edge distance in shortest path calculations normalized : bool, optional If True (default) normalize by the number of nodes in the connected part of the graph. Returns ------- nodes : dictionary Dictionary of nodes with closeness centrality as the value. See Also -------- betweenness_centrality, load_centrality, eigenvector_centrality, degree_centrality Notes ----- The closeness centrality is normalized to `(n-1)/(|G|-1)` where `n` is the number of nodes in the connected part of graph containing the node. If the graph is not completely connected, this algorithm computes the closeness centrality for each connected part separately. If the 'distance' keyword is set to an edge attribute key then the shortest-path length will be computed using Dijkstra's algorithm with that edge attribute as the edge weight. References ---------- .. [1] Linton C. 
Freeman: Centrality in networks: I. Conceptual clarification. Social Networks 1:215-239, 1979. http://leonidzhukov.ru/hse/2013/socialnetworks/papers/freeman79-centrality.pdf """ if distance is not None: # use Dijkstra's algorithm with specified attribute as edge weight path_length = functools.partial(nx.single_source_dijkstra_path_length, weight=distance) else: path_length = nx.single_source_shortest_path_length if u is None: nodes = G.nodes() else: nodes = [u] closeness_centrality = {} for n in nodes: sp = path_length(G,n) totsp = sum(sp.values()) if totsp > 0.0 and len(G) > 1: closeness_centrality[n] = (len(sp)-1.0) / totsp # normalize to number of nodes-1 in connected part if normalized: s = (len(sp)-1.0) / ( len(G) - 1 ) closeness_centrality[n] *= s else: closeness_centrality[n] = 0.0 if u is not None: return closeness_centrality[u] else: return closeness_centrality networkx-1.11/networkx/algorithms/centrality/betweenness_subset.py0000644000175000017500000002051012637544500025670 0ustar aricaric00000000000000""" Betweenness centrality measures for subsets of nodes. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __author__ = """Aric Hagberg (hagberg@lanl.gov)""" __all__ = ['betweenness_centrality_subset', 'edge_betweenness_centrality_subset', 'betweenness_centrality_source'] import networkx as nx from networkx.algorithms.centrality.betweenness import\ _single_source_dijkstra_path_basic as dijkstra from networkx.algorithms.centrality.betweenness import\ _single_source_shortest_path_basic as shortest_path def betweenness_centrality_subset(G,sources,targets, normalized=False, weight=None): r"""Compute betweenness centrality for a subset of nodes. .. 
math:: c_B(v) =\sum_{s\in S, t \in T} \frac{\sigma(s, t|v)}{\sigma(s, t)} where `S` is the set of sources, `T` is the set of targets, `\sigma(s, t)` is the number of shortest `(s, t)`-paths, and `\sigma(s, t|v)` is the number of those paths passing through some node `v` other than `s, t`. If `s = t`, `\sigma(s, t) = 1`, and if `v \in {s, t}`, `\sigma(s, t|v) = 0` [2]_. Parameters ---------- G : graph sources: list of nodes Nodes to use as sources for shortest paths in betweenness targets: list of nodes Nodes to use as targets for shortest paths in betweenness normalized : bool, optional If True the betweenness values are normalized by `2/((n-1)(n-2))` for graphs, and `1/((n-1)(n-2))` for directed graphs where `n` is the number of nodes in G. weight : None or string, optional If None, all edge weights are considered equal. Otherwise holds the name of the edge attribute used as weight. Returns ------- nodes : dictionary Dictionary of nodes with betweenness centrality as the value. See Also -------- edge_betweenness_centrality load_centrality Notes ----- The basic algorithm is from [1]_. For weighted graphs the edge weights must be greater than zero. Zero edge weights can produce an infinite number of equal length paths between pairs of nodes. The normalization might seem a little strange but it is the same as in betweenness_centrality() and is designed to make betweenness_centrality(G) be the same as betweenness_centrality_subset(G,sources=G.nodes(),targets=G.nodes()). References ---------- .. [1] Ulrik Brandes, A Faster Algorithm for Betweenness Centrality. Journal of Mathematical Sociology 25(2):163-177, 2001. http://www.inf.uni-konstanz.de/algo/publications/b-fabc-01.pdf .. [2] Ulrik Brandes: On Variants of Shortest-Path Betweenness Centrality and their Generic Computation. Social Networks 30(2):136-145, 2008. 
http://www.inf.uni-konstanz.de/algo/publications/b-vspbc-08.pdf """ b=dict.fromkeys(G,0.0) # b[v]=0 for v in G for s in sources: # single source shortest paths if weight is None: # use BFS S,P,sigma=shortest_path(G,s) else: # use Dijkstra's algorithm S,P,sigma=dijkstra(G,s,weight) b=_accumulate_subset(b,S,P,sigma,s,targets) b=_rescale(b,len(G),normalized=normalized,directed=G.is_directed()) return b def edge_betweenness_centrality_subset(G,sources,targets, normalized=False, weight=None): r"""Compute betweenness centrality for edges for a subset of nodes. .. math:: c_B(v) =\sum_{s\in S,t \in T} \frac{\sigma(s, t|e)}{\sigma(s, t)} where `S` is the set of sources, `T` is the set of targets, `\sigma(s, t)` is the number of shortest `(s, t)`-paths, and `\sigma(s, t|e)` is the number of those paths passing through edge `e` [2]_. Parameters ---------- G : graph A networkx graph sources: list of nodes Nodes to use as sources for shortest paths in betweenness targets: list of nodes Nodes to use as targets for shortest paths in betweenness normalized : bool, optional If True the betweenness values are normalized by `2/(n(n-1))` for graphs, and `1/(n(n-1))` for directed graphs where `n` is the number of nodes in G. weight : None or string, optional If None, all edge weights are considered equal. Otherwise holds the name of the edge attribute used as weight. Returns ------- edges : dictionary Dictionary of edges with Betweenness centrality as the value. See Also -------- betweenness_centrality edge_load Notes ----- The basic algorithm is from [1]_. For weighted graphs the edge weights must be greater than zero. Zero edge weights can produce an infinite number of equal length paths between pairs of nodes. The normalization might seem a little strange but it is the same as in edge_betweenness_centrality() and is designed to make edge_betweenness_centrality(G) be the same as edge_betweenness_centrality_subset(G,sources=G.nodes(),targets=G.nodes()). References ---------- .. 
[1] Ulrik Brandes, A Faster Algorithm for Betweenness Centrality. Journal of Mathematical Sociology 25(2):163-177, 2001. http://www.inf.uni-konstanz.de/algo/publications/b-fabc-01.pdf .. [2] Ulrik Brandes: On Variants of Shortest-Path Betweenness Centrality and their Generic Computation. Social Networks 30(2):136-145, 2008. http://www.inf.uni-konstanz.de/algo/publications/b-vspbc-08.pdf """ b=dict.fromkeys(G,0.0) # b[v]=0 for v in G b.update(dict.fromkeys(G.edges(),0.0)) # b[e] for e in G.edges() for s in sources: # single source shortest paths if weight is None: # use BFS S,P,sigma=shortest_path(G,s) else: # use Dijkstra's algorithm S,P,sigma=dijkstra(G,s,weight) b=_accumulate_edges_subset(b,S,P,sigma,s,targets) for n in G: # remove nodes to only return edges del b[n] b=_rescale_e(b,len(G),normalized=normalized,directed=G.is_directed()) return b # obsolete name def betweenness_centrality_source(G,normalized=True,weight=None,sources=None): if sources is None: sources=G.nodes() targets=G.nodes() return betweenness_centrality_subset(G,sources,targets,normalized,weight) def _accumulate_subset(betweenness,S,P,sigma,s,targets): delta=dict.fromkeys(S,0) target_set=set(targets) while S: w=S.pop() for v in P[w]: if w in target_set: delta[v]+=(sigma[v]/sigma[w])*(1.0+delta[w]) else: delta[v]+=delta[w]/len(P[w]) if w != s: betweenness[w]+=delta[w] return betweenness def _accumulate_edges_subset(betweenness,S,P,sigma,s,targets): delta=dict.fromkeys(S,0) target_set=set(targets) while S: w=S.pop() for v in P[w]: if w in target_set: c=(sigma[v]/sigma[w])*(1.0+delta[w]) else: c=delta[w]/len(P[w]) if (v,w) not in betweenness: betweenness[(w,v)]+=c else: betweenness[(v,w)]+=c delta[v]+=c if w != s: betweenness[w]+=delta[w] return betweenness def _rescale(betweenness,n,normalized,directed=False): if normalized is True: if n <=2: scale=None # no normalization b=0 for all nodes else: scale=1.0/((n-1)*(n-2)) else: # rescale by 2 for undirected graphs if not directed: scale=1.0/2.0 
else: scale=None if scale is not None: for v in betweenness: betweenness[v] *= scale return betweenness def _rescale_e(betweenness,n,normalized,directed=False): if normalized is True: if n <=1: scale=None # no normalization b=0 for all nodes else: scale=1.0/(n*(n-1)) else: # rescale by 2 for undirected graphs if not directed: scale=1.0/2.0 else: scale=None if scale is not None: for v in betweenness: betweenness[v] *= scale return betweenness networkx-1.11/networkx/algorithms/centrality/current_flow_betweenness.py0000644000175000017500000003216612637544500027106 0ustar aricaric00000000000000""" Current-flow betweenness centrality measures. """ # Copyright (C) 2010-2012 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import random import networkx as nx from networkx.algorithms.centrality.flow_matrix import * __author__ = """Aric Hagberg (hagberg@lanl.gov)""" __all__ = ['current_flow_betweenness_centrality', 'approximate_current_flow_betweenness_centrality', 'edge_current_flow_betweenness_centrality'] def approximate_current_flow_betweenness_centrality(G, normalized=True, weight='weight', dtype=float, solver='full', epsilon=0.5, kmax=10000): r"""Compute the approximate current-flow betweenness centrality for nodes. Approximates the current-flow betweenness centrality within absolute error of epsilon with high probability [1]_. Parameters ---------- G : graph A NetworkX graph normalized : bool, optional (default=True) If True the betweenness values are normalized by 2/[(n-1)(n-2)] where n is the number of nodes in G. weight : string or None, optional (default='weight') Key for edge data used as the edge weight. If None, then use 1 as each edge weight. dtype: data type (float) Default data type for internal matrices. Set to np.float32 for lower memory consumption. solver: string (default='lu') Type of linear solver to use for computing the flow matrix. 
Options are "full" (uses most memory), "lu" (recommended), and "cg" (uses least memory). epsilon: float Absolute error tolerance. kmax: int Maximum number of sample node pairs to use for approximation. Returns ------- nodes : dictionary Dictionary of nodes with betweenness centrality as the value. See Also -------- current_flow_betweenness_centrality Notes ----- The running time is `O((1/\epsilon^2)m{\sqrt k} \log n)` and the space required is `O(m)` for n nodes and m edges. If the edges have a 'weight' attribute they will be used as weights in this algorithm. Unspecified weights are set to 1. References ---------- .. [1] Ulrik Brandes and Daniel Fleischer: Centrality Measures Based on Current Flow. Proc. 22nd Symp. Theoretical Aspects of Computer Science (STACS '05). LNCS 3404, pp. 533-544. Springer-Verlag, 2005. http://www.inf.uni-konstanz.de/algo/publications/bf-cmbcf-05.pdf """ from networkx.utils import reverse_cuthill_mckee_ordering try: import numpy as np except ImportError: raise ImportError('current_flow_betweenness_centrality requires NumPy ', 'http://scipy.org/') try: from scipy import sparse from scipy.sparse import linalg except ImportError: raise ImportError('current_flow_betweenness_centrality requires SciPy ', 'http://scipy.org/') if G.is_directed(): raise nx.NetworkXError('current_flow_betweenness_centrality() ', 'not defined for digraphs.') if not nx.is_connected(G): raise nx.NetworkXError("Graph not connected.") solvername={"full" :FullInverseLaplacian, "lu": SuperLUInverseLaplacian, "cg": CGInverseLaplacian} n = G.number_of_nodes() ordering = list(reverse_cuthill_mckee_ordering(G)) # make a copy with integer labels according to rcm ordering # this could be done without a copy if we really wanted to H = nx.relabel_nodes(G,dict(zip(ordering,range(n)))) L = laplacian_sparse_matrix(H, nodelist=range(n), weight=weight, dtype=dtype, format='csc') C = solvername[solver](L, dtype=dtype) # initialize solver betweenness = dict.fromkeys(H,0.0) nb = 
(n-1.0)*(n-2.0) # normalization factor cstar = n*(n-1)/nb l = 1 # parameter in approximation, adjustable k = l*int(np.ceil((cstar/epsilon)**2*np.log(n))) if k > kmax: raise nx.NetworkXError('Number random pairs k>kmax (%d>%d) '%(k,kmax), 'Increase kmax or epsilon') cstar2k = cstar/(2*k) for i in range(k): s,t = random.sample(range(n),2) b = np.zeros(n, dtype=dtype) b[s] = 1 b[t] = -1 p = C.solve(b) for v in H: if v==s or v==t: continue for nbr in H[v]: w = H[v][nbr].get(weight,1.0) betweenness[v] += w*np.abs(p[v]-p[nbr])*cstar2k if normalized: factor = 1.0 else: factor = nb/2.0 # remap to original node names and "unnormalize" if required return dict((ordering[k],float(v*factor)) for k,v in betweenness.items()) def current_flow_betweenness_centrality(G, normalized=True, weight='weight', dtype=float, solver='full'): r"""Compute current-flow betweenness centrality for nodes. Current-flow betweenness centrality uses an electrical current model for information spreading in contrast to betweenness centrality which uses shortest paths. Current-flow betweenness centrality is also known as random-walk betweenness centrality [2]_. Parameters ---------- G : graph A NetworkX graph normalized : bool, optional (default=True) If True the betweenness values are normalized by 2/[(n-1)(n-2)] where n is the number of nodes in G. weight : string or None, optional (default='weight') Key for edge data used as the edge weight. If None, then use 1 as each edge weight. dtype: data type (float) Default data type for internal matrices. Set to np.float32 for lower memory consumption. solver: string (default='lu') Type of linear solver to use for computing the flow matrix. Options are "full" (uses most memory), "lu" (recommended), and "cg" (uses least memory). Returns ------- nodes : dictionary Dictionary of nodes with betweenness centrality as the value. 
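A useful way to see the electrical-current model in action: on a tree, all current between a source/sink pair is forced along the unique connecting path, so current-flow betweenness coincides with shortest-path betweenness there. A sketch (assumes SciPy is installed, as the function requires):

```python
import networkx as nx

# A path graph is a tree, so the electrical model reproduces
# shortest-path betweenness exactly: no alternative routes exist
# for the current to spread over.
G = nx.path_graph(4)
cfb = nx.current_flow_betweenness_centrality(G, normalized=True)
spb = nx.betweenness_centrality(G, normalized=True)
# Both give 0 for the endpoints and 2/3 for the interior nodes.
```

On graphs with cycles the two measures diverge, since current splits over parallel routes while shortest-path betweenness ignores non-geodesic paths.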
See Also -------- approximate_current_flow_betweenness_centrality betweenness_centrality edge_betweenness_centrality edge_current_flow_betweenness_centrality Notes ----- Current-flow betweenness can be computed in `O(I(n-1)+mn \log n)` time [1]_, where `I(n-1)` is the time needed to compute the inverse Laplacian. For a full matrix this is `O(n^3)` but using sparse methods you can achieve `O(nm{\sqrt k})` where `k` is the Laplacian matrix condition number. The space required is `O(nw)` where `w` is the width of the sparse Laplacian matrix. Worse case is `w=n` for `O(n^2)`. If the edges have a 'weight' attribute they will be used as weights in this algorithm. Unspecified weights are set to 1. References ---------- .. [1] Centrality Measures Based on Current Flow. Ulrik Brandes and Daniel Fleischer, Proc. 22nd Symp. Theoretical Aspects of Computer Science (STACS '05). LNCS 3404, pp. 533-544. Springer-Verlag, 2005. http://www.inf.uni-konstanz.de/algo/publications/bf-cmbcf-05.pdf .. [2] A measure of betweenness centrality based on random walks, M. E. J. Newman, Social Networks 27, 39-54 (2005). 
""" from networkx.utils import reverse_cuthill_mckee_ordering try: import numpy as np except ImportError: raise ImportError('current_flow_betweenness_centrality requires NumPy ', 'http://scipy.org/') try: import scipy except ImportError: raise ImportError('current_flow_betweenness_centrality requires SciPy ', 'http://scipy.org/') if G.is_directed(): raise nx.NetworkXError('current_flow_betweenness_centrality() ', 'not defined for digraphs.') if not nx.is_connected(G): raise nx.NetworkXError("Graph not connected.") n = G.number_of_nodes() ordering = list(reverse_cuthill_mckee_ordering(G)) # make a copy with integer labels according to rcm ordering # this could be done without a copy if we really wanted to H = nx.relabel_nodes(G,dict(zip(ordering,range(n)))) betweenness = dict.fromkeys(H,0.0) # b[v]=0 for v in H for row,(s,t) in flow_matrix_row(H, weight=weight, dtype=dtype, solver=solver): pos = dict(zip(row.argsort()[::-1],range(n))) for i in range(n): betweenness[s] += (i-pos[i])*row[i] betweenness[t] += (n-i-1-pos[i])*row[i] if normalized: nb = (n-1.0)*(n-2.0) # normalization factor else: nb = 2.0 for i,v in enumerate(H): # map integers to nodes betweenness[v] = float((betweenness[v]-i)*2.0/nb) return dict((ordering[k],v) for k,v in betweenness.items()) def edge_current_flow_betweenness_centrality(G, normalized=True, weight='weight', dtype=float, solver='full'): """Compute current-flow betweenness centrality for edges. Current-flow betweenness centrality uses an electrical current model for information spreading in contrast to betweenness centrality which uses shortest paths. Current-flow betweenness centrality is also known as random-walk betweenness centrality [2]_. Parameters ---------- G : graph A NetworkX graph normalized : bool, optional (default=True) If True the betweenness values are normalized by 2/[(n-1)(n-2)] where n is the number of nodes in G. weight : string or None, optional (default='weight') Key for edge data used as the edge weight. 
If None, then use 1 as each edge weight. dtype: data type (float) Default data type for internal matrices. Set to np.float32 for lower memory consumption. solver: string (default='lu') Type of linear solver to use for computing the flow matrix. Options are "full" (uses most memory), "lu" (recommended), and "cg" (uses least memory). Returns ------- nodes : dictionary Dictionary of edge tuples with betweenness centrality as the value. See Also -------- betweenness_centrality edge_betweenness_centrality current_flow_betweenness_centrality Notes ----- Current-flow betweenness can be computed in `O(I(n-1)+mn \log n)` time [1]_, where `I(n-1)` is the time needed to compute the inverse Laplacian. For a full matrix this is `O(n^3)` but using sparse methods you can achieve `O(nm{\sqrt k})` where `k` is the Laplacian matrix condition number. The space required is `O(nw) where `w` is the width of the sparse Laplacian matrix. Worse case is `w=n` for `O(n^2)`. If the edges have a 'weight' attribute they will be used as weights in this algorithm. Unspecified weights are set to 1. References ---------- .. [1] Centrality Measures Based on Current Flow. Ulrik Brandes and Daniel Fleischer, Proc. 22nd Symp. Theoretical Aspects of Computer Science (STACS '05). LNCS 3404, pp. 533-544. Springer-Verlag, 2005. http://www.inf.uni-konstanz.de/algo/publications/bf-cmbcf-05.pdf .. [2] A measure of betweenness centrality based on random walks, M. E. J. Newman, Social Networks 27, 39-54 (2005). 
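For the edge variant, a minimal sketch (again assuming SciPy is available): on a path graph the middle edge separates the two largest halves, so it carries the most current in total.

```python
import networkx as nx

G = nx.path_graph(4)
ecf = nx.edge_current_flow_betweenness_centrality(G, normalized=True)
# Keys are edge tuples in the graph's node labels; the orientation of
# each tuple may vary, so compare edges as unordered node sets.
top = max(ecf, key=ecf.get)
```

Here `top` is the middle edge `{1, 2}` regardless of tuple orientation.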
""" from networkx.utils import reverse_cuthill_mckee_ordering try: import numpy as np except ImportError: raise ImportError('current_flow_betweenness_centrality requires NumPy ', 'http://scipy.org/') try: import scipy except ImportError: raise ImportError('current_flow_betweenness_centrality requires SciPy ', 'http://scipy.org/') if G.is_directed(): raise nx.NetworkXError('edge_current_flow_betweenness_centrality ', 'not defined for digraphs.') if not nx.is_connected(G): raise nx.NetworkXError("Graph not connected.") n = G.number_of_nodes() ordering = list(reverse_cuthill_mckee_ordering(G)) # make a copy with integer labels according to rcm ordering # this could be done without a copy if we really wanted to H = nx.relabel_nodes(G,dict(zip(ordering,range(n)))) betweenness=(dict.fromkeys(H.edges(),0.0)) if normalized: nb=(n-1.0)*(n-2.0) # normalization factor else: nb=2.0 for row,(e) in flow_matrix_row(H, weight=weight, dtype=dtype, solver=solver): pos=dict(zip(row.argsort()[::-1],range(1,n+1))) for i in range(n): betweenness[e]+=(i+1-pos[i])*row[i] betweenness[e]+=(n-i-pos[i])*row[i] betweenness[e]/=nb return dict(((ordering[s],ordering[t]),float(v)) for (s,t),v in betweenness.items()) # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy import scipy except: raise SkipTest("NumPy not available") networkx-1.11/networkx/algorithms/centrality/__init__.py0000644000175000017500000000060512637544450023527 0ustar aricaric00000000000000from .betweenness import * from .betweenness_subset import * from .closeness import * from .communicability_alg import * from .current_flow_closeness import * from .current_flow_betweenness import * from .current_flow_betweenness_subset import * from .degree_alg import * from .dispersion import * from .eigenvector import * from .harmonic import * from .katz import * from .load import * networkx-1.11/networkx/algorithms/centrality/katz.py0000644000175000017500000002516512637544500022745 0ustar 
aricaric00000000000000# coding=utf8 """ Katz centrality. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.utils import not_implemented_for __author__ = "\n".join(['Aric Hagberg (aric.hagberg@gmail.com)', 'Pieter Swart (swart@lanl.gov)', 'Sasha Gutfraind (ag362@cornell.edu)', 'Vincent Gauthier (vgauthier@luxbulb.org)']) __all__ = ['katz_centrality', 'katz_centrality_numpy'] @not_implemented_for('multigraph') def katz_centrality(G, alpha=0.1, beta=1.0, max_iter=1000, tol=1.0e-6, nstart=None, normalized=True, weight = 'weight'): r"""Compute the Katz centrality for the nodes of the graph G. Katz centrality computes the centrality for a node based on the centrality of its neighbors. It is a generalization of the eigenvector centrality. The Katz centrality for node `i` is .. math:: x_i = \alpha \sum_{j} A_{ij} x_j + \beta, where `A` is the adjacency matrix of the graph G with eigenvalues `\lambda`. The parameter `\beta` controls the initial centrality and .. math:: \alpha < \frac{1}{\lambda_{max}}. Katz centrality computes the relative influence of a node within a network by measuring the number of the immediate neighbors (first degree nodes) and also all other nodes in the network that connect to the node under consideration through these immediate neighbors. Extra weight can be provided to immediate neighbors through the parameter :math:`\beta`. Connections made with distant neighbors are, however, penalized by an attenuation factor `\alpha` which should be strictly less than the inverse largest eigenvalue of the adjacency matrix in order for the Katz centrality to be computed correctly. More information is provided in [1]_ . Parameters ---------- G : graph A NetworkX graph alpha : float Attenuation factor beta : scalar or dictionary, optional (default=1.0) Weight attributed to the immediate neighborhood. 
If not a scalar, the dictionary must have an value for every node. max_iter : integer, optional (default=1000) Maximum number of iterations in power method. tol : float, optional (default=1.0e-6) Error tolerance used to check convergence in power method iteration. nstart : dictionary, optional Starting value of Katz iteration for each node. normalized : bool, optional (default=True) If True normalize the resulting values. weight : None or string, optional If None, all edge weights are considered equal. Otherwise holds the name of the edge attribute used as weight. Returns ------- nodes : dictionary Dictionary of nodes with Katz centrality as the value. Raises ------ NetworkXError If the parameter `beta` is not a scalar but lacks a value for at least one node Examples -------- >>> import math >>> G = nx.path_graph(4) >>> phi = (1+math.sqrt(5))/2.0 # largest eigenvalue of adj matrix >>> centrality = nx.katz_centrality(G,1/phi-0.01) >>> for n,c in sorted(centrality.items()): ... print("%d %0.2f"%(n,c)) 0 0.37 1 0.60 2 0.60 3 0.37 See Also -------- katz_centrality_numpy eigenvector_centrality eigenvector_centrality_numpy pagerank hits Notes ----- Katz centrality was introduced by [2]_. This algorithm it uses the power method to find the eigenvector corresponding to the largest eigenvalue of the adjacency matrix of G. The constant alpha should be strictly less than the inverse of largest eigenvalue of the adjacency matrix for the algorithm to converge. The iteration will stop after max_iter iterations or an error tolerance of number_of_nodes(G)*tol has been reached. When `\alpha = 1/\lambda_{max}` and `\beta=0`, Katz centrality is the same as eigenvector centrality. For directed graphs this finds "left" eigenvectors which corresponds to the in-edges in the graph. For out-edges Katz centrality first reverse the graph with G.reverse(). References ---------- .. [1] Mark E. J. Newman: Networks: An Introduction. Oxford University Press, USA, 2010, p. 720. .. 
[2] Leo Katz: A New Status Index Derived from Sociometric Index. Psychometrika 18(1):39–43, 1953 http://phya.snu.ac.kr/~dkim/PRL87278701.pdf """ from math import sqrt if len(G) == 0: return {} nnodes = G.number_of_nodes() if nstart is None: # choose starting vector with entries of 0 x = dict([(n,0) for n in G]) else: x = nstart try: b = dict.fromkeys(G,float(beta)) except (TypeError,ValueError,AttributeError): b = beta if set(beta) != set(G): raise nx.NetworkXError('beta dictionary ' 'must have a value for every node') # make up to max_iter iterations for i in range(max_iter): xlast = x x = dict.fromkeys(xlast, 0) # do the multiplication y^T = Alpha * x^T A - Beta for n in x: for nbr in G[n]: x[nbr] += xlast[n] * G[n][nbr].get(weight, 1) for n in x: x[n] = alpha*x[n] + b[n] # check convergence err = sum([abs(x[n]-xlast[n]) for n in x]) if err < nnodes*tol: if normalized: # normalize vector try: s = 1.0/sqrt(sum(v**2 for v in x.values())) # this should never be zero? except ZeroDivisionError: s = 1.0 else: s = 1 for n in x: x[n] *= s return x raise nx.NetworkXError('Power iteration failed to converge in ' '%d iterations.' % max_iter) @not_implemented_for('multigraph') def katz_centrality_numpy(G, alpha=0.1, beta=1.0, normalized=True, weight = 'weight'): r"""Compute the Katz centrality for the graph G. Katz centrality computes the centrality for a node based on the centrality of its neighbors. It is a generalization of the eigenvector centrality. The Katz centrality for node `i` is .. math:: x_i = \alpha \sum_{j} A_{ij} x_j + \beta, where `A` is the adjacency matrix of the graph G with eigenvalues `\lambda`. The parameter `\beta` controls the initial centrality and .. math:: \alpha < \frac{1}{\lambda_{max}}. 
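The convergence bound above can be exercised directly, mirroring the docstring examples: for the path graph on four nodes the largest adjacency eigenvalue is the golden ratio, so any `alpha` strictly below its reciprocal is valid for both the power-iteration and direct-solver implementations.

```python
import math
import networkx as nx

G = nx.path_graph(4)
phi = (1 + math.sqrt(5)) / 2.0  # largest adjacency eigenvalue of this graph
alpha = 1 / phi - 0.01          # strictly below the 1/lambda_max bound
c_power = nx.katz_centrality(G, alpha)        # power iteration
c_solve = nx.katz_centrality_numpy(G, alpha)  # direct linear solve
# Both yield the symmetric pattern ~0.37, 0.60, 0.60, 0.37.
```

Choosing `alpha` at or above `1/phi` would make the power iteration fail to converge and the linear system (nearly) singular.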
Katz centrality computes the relative influence of a node within a network by measuring the number of the immediate neighbors (first degree nodes) and also all other nodes in the network that connect to the node under consideration through these immediate neighbors. Extra weight can be provided to immediate neighbors through the parameter :math:`\beta`. Connections made with distant neighbors are, however, penalized by an attenuation factor `\alpha` which should be strictly less than the inverse largest eigenvalue of the adjacency matrix in order for the Katz centrality to be computed correctly. More information is provided in [1]_ . Parameters ---------- G : graph A NetworkX graph alpha : float Attenuation factor beta : scalar or dictionary, optional (default=1.0) Weight attributed to the immediate neighborhood. If not a scalar the dictionary must have an value for every node. normalized : bool If True normalize the resulting values. weight : None or string, optional If None, all edge weights are considered equal. Otherwise holds the name of the edge attribute used as weight. Returns ------- nodes : dictionary Dictionary of nodes with Katz centrality as the value. Raises ------ NetworkXError If the parameter `beta` is not a scalar but lacks a value for at least one node Examples -------- >>> import math >>> G = nx.path_graph(4) >>> phi = (1+math.sqrt(5))/2.0 # largest eigenvalue of adj matrix >>> centrality = nx.katz_centrality_numpy(G,1/phi) >>> for n,c in sorted(centrality.items()): ... print("%d %0.2f"%(n,c)) 0 0.37 1 0.60 2 0.60 3 0.37 See Also -------- katz_centrality eigenvector_centrality_numpy eigenvector_centrality pagerank hits Notes ----- Katz centrality was introduced by [2]_. This algorithm uses a direct linear solver to solve the above equation. The constant alpha should be strictly less than the inverse of largest eigenvalue of the adjacency matrix for there to be a solution. 
When `\alpha = 1/\lambda_{max}` and `\beta=0`, Katz centrality is the same as eigenvector centrality. For directed graphs this finds "left" eigenvectors, which correspond to the in-edges in the graph. For the out-edge version, first reverse the graph with G.reverse(). References ---------- .. [1] Mark E. J. Newman: Networks: An Introduction. Oxford University Press, USA, 2010, p. 720. .. [2] Leo Katz: A New Status Index Derived from Sociometric Analysis. Psychometrika 18(1):39–43, 1953 http://phya.snu.ac.kr/~dkim/PRL87278701.pdf """ try: import numpy as np except ImportError: raise ImportError('Requires NumPy: http://scipy.org/') if len(G) == 0: return {} try: nodelist = beta.keys() if set(nodelist) != set(G): raise nx.NetworkXError('beta dictionary ' 'must have a value for every node') b = np.array(list(beta.values()), dtype=float) except AttributeError: nodelist = G.nodes() try: b = np.ones((len(nodelist),1))*float(beta) except (TypeError,ValueError,AttributeError): raise nx.NetworkXError('beta must be a number') A = nx.adj_matrix(G, nodelist=nodelist, weight=weight).todense().T n = np.array(A).shape[0] centrality = np.linalg.solve(np.eye(n,n) - (alpha * A), b) if normalized: norm = np.sign(sum(centrality)) * np.linalg.norm(centrality) else: norm = 1.0 centrality = dict(zip(nodelist, map(float, centrality/norm))) return centrality # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy import scipy except ImportError: raise SkipTest("SciPy not available") networkx-1.11/networkx/algorithms/centrality/betweenness.py # coding=utf8 """ Betweenness centrality measures. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license.
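The power-iteration update that `katz_centrality` implements above can be sketched in a dependency-free form; the helper name `katz_power_iteration` and the dict-of-lists adjacency format are illustrative, not part of NetworkX:

```python
from math import sqrt

def katz_power_iteration(adj, alpha, beta=1.0, max_iter=1000, tol=1.0e-10):
    """Katz centrality by power iteration on an unweighted graph.

    adj maps each node to an iterable of its neighbors; for an
    undirected graph, list each edge in both directions.
    """
    x = {n: 0.0 for n in adj}
    for _ in range(max_iter):
        xlast, x = x, {n: 0.0 for n in adj}
        for n in adj:                        # x <- A^T xlast
            for nbr in adj[n]:
                x[nbr] += xlast[n]
        for n in x:                          # x <- alpha*x + beta
            x[n] = alpha * x[n] + beta
        if sum(abs(x[n] - xlast[n]) for n in x) < len(adj) * tol:
            s = 1.0 / sqrt(sum(v * v for v in x.values()))  # L2-normalize
            return {n: v * s for n, v in x.items()}
    raise RuntimeError('power iteration failed to converge')

# Path graph P3 with alpha=0.1: converges to the same values the
# test suite expects from katz_centrality (c[1] > c[0] == c[2])
adj = {0: [1], 1: [0, 2], 2: [1]}
c = katz_power_iteration(adj, alpha=0.1)
```

The iteration converges because `alpha` times the largest eigenvalue of the adjacency matrix (here sqrt(2)) is below 1, so the map is a contraction toward the linear-solve fixed point used by `katz_centrality_numpy`.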
from heapq import heappush, heappop from itertools import count import networkx as nx import random __author__ = """Aric Hagberg (hagberg@lanl.gov)""" __all__ = ['betweenness_centrality', 'edge_betweenness_centrality', 'edge_betweenness'] def betweenness_centrality(G, k=None, normalized=True, weight=None, endpoints=False, seed=None): r"""Compute the shortest-path betweenness centrality for nodes. Betweenness centrality of a node `v` is the sum of the fraction of all-pairs shortest paths that pass through `v` .. math:: c_B(v) =\sum_{s,t \in V} \frac{\sigma(s, t|v)}{\sigma(s, t)} where `V` is the set of nodes, `\sigma(s, t)` is the number of shortest `(s, t)`-paths, and `\sigma(s, t|v)` is the number of those paths passing through some node `v` other than `s, t`. If `s = t`, `\sigma(s, t) = 1`, and if `v \in {s, t}`, `\sigma(s, t|v) = 0` [2]_. Parameters ---------- G : graph A NetworkX graph k : int, optional (default=None) If k is not None use k node samples to estimate betweenness. The value of k <= n where n is the number of nodes in the graph. Higher values give better approximation. normalized : bool, optional If True the betweenness values are normalized by `2/((n-1)(n-2))` for graphs, and `1/((n-1)(n-2))` for directed graphs where `n` is the number of nodes in G. weight : None or string, optional If None, all edge weights are considered equal. Otherwise holds the name of the edge attribute used as weight. endpoints : bool, optional If True include the endpoints in the shortest path counts. Returns ------- nodes : dictionary Dictionary of nodes with betweenness centrality as the value. See Also -------- edge_betweenness_centrality load_centrality Notes ----- The algorithm is from Ulrik Brandes [1]_. See [4]_ for the original first published version and [2]_ for details on algorithms for variations and related metrics. For approximate betweenness calculations set k=#samples to use k nodes ("pivots") to estimate the betweenness values. 
For an estimate of the number of pivots needed see [3]_. For weighted graphs the edge weights must be greater than zero. Zero edge weights can produce an infinite number of equal length paths between pairs of nodes. References ---------- .. [1] Ulrik Brandes: A Faster Algorithm for Betweenness Centrality. Journal of Mathematical Sociology 25(2):163-177, 2001. http://www.inf.uni-konstanz.de/algo/publications/b-fabc-01.pdf .. [2] Ulrik Brandes: On Variants of Shortest-Path Betweenness Centrality and their Generic Computation. Social Networks 30(2):136-145, 2008. http://www.inf.uni-konstanz.de/algo/publications/b-vspbc-08.pdf .. [3] Ulrik Brandes and Christian Pich: Centrality Estimation in Large Networks. International Journal of Bifurcation and Chaos 17(7):2303-2318, 2007. http://www.inf.uni-konstanz.de/algo/publications/bp-celn-06.pdf .. [4] Linton C. Freeman: A set of measures of centrality based on betweenness. Sociometry 40: 35–41, 1977 http://moreno.ss.uci.edu/23.pdf """ betweenness = dict.fromkeys(G, 0.0) # b[v]=0 for v in G if k is None: nodes = G else: random.seed(seed) nodes = random.sample(G.nodes(), k) for s in nodes: # single source shortest paths if weight is None: # use BFS S, P, sigma = _single_source_shortest_path_basic(G, s) else: # use Dijkstra's algorithm S, P, sigma = _single_source_dijkstra_path_basic(G, s, weight) # accumulation if endpoints: betweenness = _accumulate_endpoints(betweenness, S, P, sigma, s) else: betweenness = _accumulate_basic(betweenness, S, P, sigma, s) # rescaling betweenness = _rescale(betweenness, len(G), normalized=normalized, directed=G.is_directed(), k=k) return betweenness def edge_betweenness_centrality(G, k=None, normalized=True, weight=None, seed=None): r"""Compute betweenness centrality for edges. Betweenness centrality of an edge `e` is the sum of the fraction of all-pairs shortest paths that pass through `e` .. 
math:: c_B(e) =\sum_{s,t \in V} \frac{\sigma(s, t|e)}{\sigma(s, t)} where `V` is the set of nodes,`\sigma(s, t)` is the number of shortest `(s, t)`-paths, and `\sigma(s, t|e)` is the number of those paths passing through edge `e` [2]_. Parameters ---------- G : graph A NetworkX graph k : int, optional (default=None) If k is not None use k node samples to estimate betweenness. The value of k <= n where n is the number of nodes in the graph. Higher values give better approximation. normalized : bool, optional If True the betweenness values are normalized by `2/(n(n-1))` for graphs, and `1/(n(n-1))` for directed graphs where `n` is the number of nodes in G. weight : None or string, optional If None, all edge weights are considered equal. Otherwise holds the name of the edge attribute used as weight. Returns ------- edges : dictionary Dictionary of edges with betweenness centrality as the value. See Also -------- betweenness_centrality edge_load Notes ----- The algorithm is from Ulrik Brandes [1]_. For weighted graphs the edge weights must be greater than zero. Zero edge weights can produce an infinite number of equal length paths between pairs of nodes. References ---------- .. [1] A Faster Algorithm for Betweenness Centrality. Ulrik Brandes, Journal of Mathematical Sociology 25(2):163-177, 2001. http://www.inf.uni-konstanz.de/algo/publications/b-fabc-01.pdf .. [2] Ulrik Brandes: On Variants of Shortest-Path Betweenness Centrality and their Generic Computation. Social Networks 30(2):136-145, 2008. 
http://www.inf.uni-konstanz.de/algo/publications/b-vspbc-08.pdf """ betweenness = dict.fromkeys(G, 0.0) # b[v]=0 for v in G # b[e]=0 for e in G.edges() betweenness.update(dict.fromkeys(G.edges(), 0.0)) if k is None: nodes = G else: random.seed(seed) nodes = random.sample(G.nodes(), k) for s in nodes: # single source shortest paths if weight is None: # use BFS S, P, sigma = _single_source_shortest_path_basic(G, s) else: # use Dijkstra's algorithm S, P, sigma = _single_source_dijkstra_path_basic(G, s, weight) # accumulation betweenness = _accumulate_edges(betweenness, S, P, sigma, s) # rescaling for n in G: # remove nodes to only return edges del betweenness[n] betweenness = _rescale_e(betweenness, len(G), normalized=normalized, directed=G.is_directed()) return betweenness # obsolete name def edge_betweenness(G, k=None, normalized=True, weight=None, seed=None): return edge_betweenness_centrality(G, k, normalized, weight, seed) # helpers for betweenness centrality def _single_source_shortest_path_basic(G, s): S = [] P = {} for v in G: P[v] = [] sigma = dict.fromkeys(G, 0.0) # sigma[v]=0 for v in G D = {} sigma[s] = 1.0 D[s] = 0 Q = [s] while Q: # use BFS to find shortest paths v = Q.pop(0) S.append(v) Dv = D[v] sigmav = sigma[v] for w in G[v]: if w not in D: Q.append(w) D[w] = Dv + 1 if D[w] == Dv + 1: # this is a shortest path, count paths sigma[w] += sigmav P[w].append(v) # predecessors return S, P, sigma def _single_source_dijkstra_path_basic(G, s, weight='weight'): # modified from Eppstein S = [] P = {} for v in G: P[v] = [] sigma = dict.fromkeys(G, 0.0) # sigma[v]=0 for v in G D = {} sigma[s] = 1.0 push = heappush pop = heappop seen = {s: 0} c = count() Q = [] # use Q as heap with (distance,node id) tuples push(Q, (0, next(c), s, s)) while Q: (dist, _, pred, v) = pop(Q) if v in D: continue # already searched this node. 
sigma[v] += sigma[pred] # count paths S.append(v) D[v] = dist for w, edgedata in G[v].items(): vw_dist = dist + edgedata.get(weight, 1) if w not in D and (w not in seen or vw_dist < seen[w]): seen[w] = vw_dist push(Q, (vw_dist, next(c), v, w)) sigma[w] = 0.0 P[w] = [v] elif vw_dist == seen[w]: # handle equal paths sigma[w] += sigma[v] P[w].append(v) return S, P, sigma def _accumulate_basic(betweenness, S, P, sigma, s): delta = dict.fromkeys(S, 0) while S: w = S.pop() coeff = (1.0 + delta[w]) / sigma[w] for v in P[w]: delta[v] += sigma[v] * coeff if w != s: betweenness[w] += delta[w] return betweenness def _accumulate_endpoints(betweenness, S, P, sigma, s): betweenness[s] += len(S) - 1 delta = dict.fromkeys(S, 0) while S: w = S.pop() coeff = (1.0 + delta[w]) / sigma[w] for v in P[w]: delta[v] += sigma[v] * coeff if w != s: betweenness[w] += delta[w] + 1 return betweenness def _accumulate_edges(betweenness, S, P, sigma, s): delta = dict.fromkeys(S, 0) while S: w = S.pop() coeff = (1.0 + delta[w]) / sigma[w] for v in P[w]: c = sigma[v] * coeff if (v, w) not in betweenness: betweenness[(w, v)] += c else: betweenness[(v, w)] += c delta[v] += c if w != s: betweenness[w] += delta[w] return betweenness def _rescale(betweenness, n, normalized, directed=False, k=None): if normalized is True: if n <= 2: scale = None # no normalization b=0 for all nodes else: scale = 1.0 / ((n - 1) * (n - 2)) else: # rescale by 2 for undirected graphs if not directed: scale = 1.0 / 2.0 else: scale = None if scale is not None: if k is not None: scale = scale * n / k for v in betweenness: betweenness[v] *= scale return betweenness def _rescale_e(betweenness, n, normalized, directed=False, k=None): if normalized is True: if n <= 1: scale = None # no normalization b=0 for all nodes else: scale = 1.0 / (n * (n - 1)) else: # rescale by 2 for undirected graphs if not directed: scale = 1.0 / 2.0 else: scale = None if scale is not None: if k is not None: scale = scale * n / k for v in betweenness: 
betweenness[v] *= scale return betweenness networkx-1.11/networkx/algorithms/centrality/tests/0000755000175000017500000000000012653231454022552 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/centrality/tests/test_eigenvector_centrality.py0000644000175000017500000001123612637544500030737 0ustar aricaric00000000000000#!/usr/bin/env python import math from nose import SkipTest from nose.tools import * import networkx class TestEigenvectorCentrality(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise SkipTest('SciPy not available.') def test_K5(self): """Eigenvector centrality: K5""" G=networkx.complete_graph(5) b=networkx.eigenvector_centrality(G) v=math.sqrt(1/5.0) b_answer=dict.fromkeys(G,v) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) nstart = dict([(n,1) for n in G]) b=networkx.eigenvector_centrality(G,nstart=nstart) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) b=networkx.eigenvector_centrality_numpy(G) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_P3(self): """Eigenvector centrality: P3""" G=networkx.path_graph(3) b_answer={0: 0.5, 1: 0.7071, 2: 0.5} b=networkx.eigenvector_centrality_numpy(G) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=4) def test_P3_unweighted(self): """Eigenvector centrality: P3""" G=networkx.path_graph(3) b_answer={0: 0.5, 1: 0.7071, 2: 0.5} b=networkx.eigenvector_centrality_numpy(G, weight=None) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=4) @raises(networkx.NetworkXError) def test_maxiter(self): G=networkx.path_graph(3) b=networkx.eigenvector_centrality(G,max_iter=0) class TestEigenvectorCentralityDirected(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise 
SkipTest('SciPy not available.') def setUp(self): G=networkx.DiGraph() edges=[(1,2),(1,3),(2,4),(3,2),(3,5),(4,2),(4,5),(4,6),\ (5,6),(5,7),(5,8),(6,8),(7,1),(7,5),\ (7,8),(8,6),(8,7)] G.add_edges_from(edges,weight=2.0) self.G=G.reverse() self.G.evc=[0.25368793, 0.19576478, 0.32817092, 0.40430835, 0.48199885, 0.15724483, 0.51346196, 0.32475403] H=networkx.DiGraph() edges=[(1,2),(1,3),(2,4),(3,2),(3,5),(4,2),(4,5),(4,6),\ (5,6),(5,7),(5,8),(6,8),(7,1),(7,5),\ (7,8),(8,6),(8,7)] G.add_edges_from(edges) self.H=G.reverse() self.H.evc=[0.25368793, 0.19576478, 0.32817092, 0.40430835, 0.48199885, 0.15724483, 0.51346196, 0.32475403] def test_eigenvector_centrality_weighted(self): G=self.G p=networkx.eigenvector_centrality(G) for (a,b) in zip(list(p.values()),self.G.evc): assert_almost_equal(a,b,places=4) def test_eigenvector_centrality_weighted_numpy(self): G=self.G p=networkx.eigenvector_centrality_numpy(G) for (a,b) in zip(list(p.values()),self.G.evc): assert_almost_equal(a,b) def test_eigenvector_centrality_unweighted(self): G=self.H p=networkx.eigenvector_centrality(G) for (a,b) in zip(list(p.values()),self.G.evc): assert_almost_equal(a,b,places=4) def test_eigenvector_centrality_unweighted_numpy(self): G=self.H p=networkx.eigenvector_centrality_numpy(G) for (a,b) in zip(list(p.values()),self.G.evc): assert_almost_equal(a,b) class TestEigenvectorCentralityExceptions(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise SkipTest('SciPy not available.') numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @raises(networkx.NetworkXException) def test_multigraph(self): e = networkx.eigenvector_centrality(networkx.MultiGraph()) @raises(networkx.NetworkXException) def test_multigraph_numpy(self): e = networkx.eigenvector_centrality_numpy(networkx.MultiGraph()) @raises(networkx.NetworkXException) def test_empty(self): e = 
networkx.eigenvector_centrality(networkx.Graph()) @raises(networkx.NetworkXException) def test_empty_numpy(self): e = networkx.eigenvector_centrality_numpy(networkx.Graph()) networkx-1.11/networkx/algorithms/centrality/tests/test_closeness_centrality.py0000644000175000017500000000527212637544450030432 0ustar aricaric00000000000000""" Tests for degree centrality. """ from nose.tools import * import networkx as nx class TestClosenessCentrality: def setUp(self): self.K = nx.krackhardt_kite_graph() self.P3 = nx.path_graph(3) self.P4 = nx.path_graph(4) self.K5 = nx.complete_graph(5) self.C4=nx.cycle_graph(4) self.T=nx.balanced_tree(r=2, h=2) self.Gb = nx.Graph() self.Gb.add_edges_from([(0,1), (0,2), (1,3), (2,3), (2,4), (4,5), (3,5)]) F = nx.florentine_families_graph() self.F = F def test_k5_closeness(self): c=nx.closeness_centrality(self.K5) d={0: 1.000, 1: 1.000, 2: 1.000, 3: 1.000, 4: 1.000} for n in sorted(self.K5): assert_almost_equal(c[n],d[n],places=3) def test_p3_closeness(self): c=nx.closeness_centrality(self.P3) d={0: 0.667, 1: 1.000, 2: 0.667} for n in sorted(self.P3): assert_almost_equal(c[n],d[n],places=3) def test_krackhardt_closeness(self): c=nx.closeness_centrality(self.K) d={0: 0.529, 1: 0.529, 2: 0.500, 3: 0.600, 4: 0.500, 5: 0.643, 6: 0.643, 7: 0.600, 8: 0.429, 9: 0.310} for n in sorted(self.K): assert_almost_equal(c[n],d[n],places=3) def test_florentine_families_closeness(self): c=nx.closeness_centrality(self.F) d={'Acciaiuoli': 0.368, 'Albizzi': 0.483, 'Barbadori': 0.4375, 'Bischeri': 0.400, 'Castellani': 0.389, 'Ginori': 0.333, 'Guadagni': 0.467, 'Lamberteschi': 0.326, 'Medici': 0.560, 'Pazzi': 0.286, 'Peruzzi': 0.368, 'Ridolfi': 0.500, 'Salviati': 0.389, 'Strozzi': 0.4375, 'Tornabuoni': 0.483} for n in sorted(self.F): assert_almost_equal(c[n],d[n],places=3) def test_weighted_closeness(self): XG=nx.Graph() XG.add_weighted_edges_from([('s','u',10), ('s','x',5), ('u','v',1), ('u','x',2), ('v','y',1), ('x','u',3), ('x','v',5), ('x','y',2), 
('y','s',7), ('y','v',6)]) c=nx.closeness_centrality(XG,distance='weight') d={'y': 0.200, 'x': 0.286, 's': 0.138, 'u': 0.235, 'v': 0.200} for n in sorted(XG): assert_almost_equal(c[n],d[n],places=3) networkx-1.11/networkx/algorithms/centrality/tests/test_katz_centrality.py # -*- coding: utf-8 -*- import math from nose import SkipTest from nose.tools import * import networkx class TestKatzCentrality(object): def test_K5(self): """Katz centrality: K5""" G = networkx.complete_graph(5) alpha = 0.1 b = networkx.katz_centrality(G, alpha) v = math.sqrt(1 / 5.0) b_answer = dict.fromkeys(G, v) for n in sorted(G): assert_almost_equal(b[n], b_answer[n]) nstart = dict([(n, 1) for n in G]) b = networkx.katz_centrality(G, alpha, nstart=nstart) for n in sorted(G): assert_almost_equal(b[n], b_answer[n]) def test_P3(self): """Katz centrality: P3""" alpha = 0.1 G = networkx.path_graph(3) b_answer = {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162} b = networkx.katz_centrality(G, alpha) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=4) @raises(networkx.NetworkXError) def test_maxiter(self): alpha = 0.1 G = networkx.path_graph(3) max_iter = 0 try: b = networkx.katz_centrality(G, alpha, max_iter=max_iter) except networkx.NetworkXError as e: assert str(max_iter) in e.args[0], "max_iter value not in error msg" raise # So that the decorator sees the exception.
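As a cross-check on the betweenness implementation earlier in this section, Brandes' two-phase scheme (BFS path counting, then dependency accumulation in reverse BFS order) can be sketched in plain Python; the function name `brandes_betweenness` and the dict-of-lists adjacency are illustrative, not part of NetworkX:

```python
from collections import deque

def brandes_betweenness(adj, normalized=True):
    """Unweighted node betweenness centrality (Brandes 2001).

    adj maps each node to a list of neighbors; an undirected graph
    lists each edge in both directions.
    """
    bc = dict.fromkeys(adj, 0.0)
    for s in adj:
        # Phase 1: BFS from s, counting shortest paths (sigma) and
        # recording the predecessors (P) that lie on them.
        S, P = [], {v: [] for v in adj}
        sigma = dict.fromkeys(adj, 0.0)
        sigma[s] = 1.0
        D = {s: 0}
        Q = deque([s])
        while Q:
            v = Q.popleft()
            S.append(v)
            for w in adj[v]:
                if w not in D:
                    D[w] = D[v] + 1
                    Q.append(w)
                if D[w] == D[v] + 1:   # edge (v, w) is on a shortest path
                    sigma[w] += sigma[v]
                    P[w].append(v)
        # Phase 2: accumulate dependencies in reverse order of discovery.
        delta = dict.fromkeys(adj, 0.0)
        while S:
            w = S.pop()
            for v in P[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    n = len(adj)
    if normalized and n > 2:
        # Each unordered pair is counted from both endpoints, so scaling
        # by 1/((n-1)(n-2)) matches the documented 2/((n-1)(n-2)) per pair.
        for v in bc:
            bc[v] *= 1.0 / ((n - 1) * (n - 2))
    return bc

# Path graph P4: only the two interior nodes carry betweenness
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
bc = brandes_betweenness(adj)
```

On this graph the interior nodes each sit on two of the six ordered shortest paths between other node pairs, giving normalized values of 2/3 for nodes 1 and 2 and 0 for the endpoints.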
def test_beta_as_scalar(self): alpha = 0.1 beta = 0.1 b_answer = {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162} G = networkx.path_graph(3) b = networkx.katz_centrality(G, alpha, beta) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=4) def test_beta_as_dict(self): alpha = 0.1 beta = {0: 1.0, 1: 1.0, 2: 1.0} b_answer = {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162} G = networkx.path_graph(3) b = networkx.katz_centrality(G, alpha, beta) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=4) def test_multiple_alpha(self): alpha_list = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6] for alpha in alpha_list: b_answer = {0.1: {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162}, 0.2: {0: 0.5454545454545454, 1: 0.6363636363636365, 2: 0.5454545454545454}, 0.3: {0: 0.5333964609104419, 1: 0.6564879518897746, 2: 0.5333964609104419}, 0.4: {0: 0.5232045649263551, 1: 0.6726915834767423, 2: 0.5232045649263551}, 0.5: {0: 0.5144957746691622, 1: 0.6859943117075809, 2: 0.5144957746691622}, 0.6: {0: 0.5069794004195823, 1: 0.6970966755769258, 2: 0.5069794004195823}} G = networkx.path_graph(3) b = networkx.katz_centrality(G, alpha) for n in sorted(G): assert_almost_equal(b[n], b_answer[alpha][n], places=4) @raises(networkx.NetworkXException) def test_multigraph(self): e = networkx.katz_centrality(networkx.MultiGraph(), 0.1) def test_empty(self): e = networkx.katz_centrality(networkx.Graph(), 0.1) assert_equal(e, {}) @raises(networkx.NetworkXException) def test_bad_beta(self): G = networkx.Graph([(0,1)]) beta = {0:77} e = networkx.katz_centrality(G, 0.1,beta=beta) @raises(networkx.NetworkXException) def test_bad_beta_numbe(self): G = networkx.Graph([(0,1)]) e = networkx.katz_centrality(G, 0.1,beta='foo') class TestKatzCentralityNumpy(object): numpy = 1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except 
ImportError: raise SkipTest('SciPy not available.') def test_K5(self): """Katz centrality: K5""" G = networkx.complete_graph(5) alpha = 0.1 b = networkx.katz_centrality(G, alpha) v = math.sqrt(1 / 5.0) b_answer = dict.fromkeys(G, v) for n in sorted(G): assert_almost_equal(b[n], b_answer[n]) nstart = dict([(n, 1) for n in G]) b = networkx.eigenvector_centrality_numpy(G) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=3) def test_P3(self): """Katz centrality: P3""" alpha = 0.1 G = networkx.path_graph(3) b_answer = {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162} b = networkx.katz_centrality_numpy(G, alpha) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=4) def test_beta_as_scalar(self): alpha = 0.1 beta = 0.1 b_answer = {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162} G = networkx.path_graph(3) b = networkx.katz_centrality_numpy(G, alpha, beta) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=4) def test_beta_as_dict(self): alpha = 0.1 beta = {0: 1.0, 1: 1.0, 2: 1.0} b_answer = {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162} G = networkx.path_graph(3) b = networkx.katz_centrality_numpy(G, alpha, beta) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=4) def test_multiple_alpha(self): alpha_list = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6] for alpha in alpha_list: b_answer = {0.1: {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162}, 0.2: {0: 0.5454545454545454, 1: 0.6363636363636365, 2: 0.5454545454545454}, 0.3: {0: 0.5333964609104419, 1: 0.6564879518897746, 2: 0.5333964609104419}, 0.4: {0: 0.5232045649263551, 1: 0.6726915834767423, 2: 0.5232045649263551}, 0.5: {0: 0.5144957746691622, 1: 0.6859943117075809, 2: 0.5144957746691622}, 0.6: {0: 0.5069794004195823, 1: 0.6970966755769258, 2: 0.5069794004195823}} G = networkx.path_graph(3) b = networkx.katz_centrality_numpy(G, alpha) for n in sorted(G): assert_almost_equal(b[n], 
b_answer[alpha][n], places=4) @raises(networkx.NetworkXException) def test_multigraph(self): e = networkx.katz_centrality(networkx.MultiGraph(), 0.1) def test_empty(self): e = networkx.katz_centrality(networkx.Graph(), 0.1) assert_equal(e, {}) @raises(networkx.NetworkXException) def test_bad_beta(self): G = networkx.Graph([(0,1)]) beta = {0:77} e = networkx.katz_centrality_numpy(G, 0.1,beta=beta) @raises(networkx.NetworkXException) def test_bad_beta_numbe(self): G = networkx.Graph([(0,1)]) e = networkx.katz_centrality_numpy(G, 0.1,beta='foo') def test_K5_unweighted(self): """Katz centrality: K5""" G = networkx.complete_graph(5) alpha = 0.1 b = networkx.katz_centrality(G, alpha, weight=None) v = math.sqrt(1 / 5.0) b_answer = dict.fromkeys(G, v) for n in sorted(G): assert_almost_equal(b[n], b_answer[n]) nstart = dict([(n, 1) for n in G]) b = networkx.eigenvector_centrality_numpy(G) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=3) def test_P3_unweighted(self): """Katz centrality: P3""" alpha = 0.1 G = networkx.path_graph(3) b_answer = {0: 0.5598852584152165, 1: 0.6107839182711449, 2: 0.5598852584152162} b = networkx.katz_centrality_numpy(G, alpha, weight=None) for n in sorted(G): assert_almost_equal(b[n], b_answer[n], places=4) class TestKatzCentralityDirected(object): def setUp(self): G = networkx.DiGraph() edges = [(1, 2),(1, 3),(2, 4),(3, 2),(3, 5),(4, 2),(4, 5),(4, 6),(5, 6), (5, 7),(5, 8),(6, 8),(7, 1),(7, 5),(7, 8),(8, 6),(8, 7)] G.add_edges_from(edges, weight=2.0) self.G = G.reverse() self.G.alpha = 0.1 self.G.evc = [ 0.3289589783189635, 0.2832077296243516, 0.3425906003685471, 0.3970420865198392, 0.41074871061646284, 0.272257430756461, 0.4201989685435462, 0.34229059218038554, ] H = networkx.DiGraph(edges) self.H = G.reverse() self.H.alpha = 0.1 self.H.evc = [ 0.3289589783189635, 0.2832077296243516, 0.3425906003685471, 0.3970420865198392, 0.41074871061646284, 0.272257430756461, 0.4201989685435462, 0.34229059218038554, ] def 
test_katz_centrality_weighted(self): G = self.G alpha = self.G.alpha p = networkx.katz_centrality(G, alpha) for (a, b) in zip(list(p.values()), self.G.evc): assert_almost_equal(a, b) def test_katz_centrality_unweighted(self): G = self.H alpha = self.H.alpha p = networkx.katz_centrality(G, alpha) for (a, b) in zip(list(p.values()), self.G.evc): assert_almost_equal(a, b) class TestKatzCentralityDirectedNumpy(TestKatzCentralityDirected): numpy = 1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise SkipTest('SciPy not available.') def test_katz_centrality_weighted(self): G = self.G alpha = self.G.alpha p = networkx.katz_centrality_numpy(G, alpha) for (a, b) in zip(list(p.values()), self.G.evc): assert_almost_equal(a, b) def test_katz_centrality_unweighted(self): G = self.H alpha = self.H.alpha p = networkx.katz_centrality_numpy(G, alpha) for (a, b) in zip(list(p.values()), self.G.evc): assert_almost_equal(a, b) class TestKatzEigenvectorVKatz(object): numpy = 1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np global eigvals try: import numpy as np import scipy from numpy.linalg import eigvals except ImportError: raise SkipTest('SciPy not available.') def test_eigenvector_v_katz_random(self): G = networkx.gnp_random_graph(10,0.5, seed=1234) l = float(max(eigvals(networkx.adjacency_matrix(G).todense()))) e = networkx.eigenvector_centrality_numpy(G) k = networkx.katz_centrality_numpy(G, 1.0/l) for n in G: assert_almost_equal(e[n], k[n]) networkx-1.11/networkx/algorithms/centrality/tests/test_degree_centrality.py0000644000175000017500000000574612637544450027675 0ustar aricaric00000000000000""" Unit tests for degree centrality. 
""" from nose.tools import * import networkx as nx class TestDegreeCentrality: def __init__(self): self.K = nx.krackhardt_kite_graph() self.P3 = nx.path_graph(3) self.K5 = nx.complete_graph(5) F = nx.Graph() # Florentine families F.add_edge('Acciaiuoli','Medici') F.add_edge('Castellani','Peruzzi') F.add_edge('Castellani','Strozzi') F.add_edge('Castellani','Barbadori') F.add_edge('Medici','Barbadori') F.add_edge('Medici','Ridolfi') F.add_edge('Medici','Tornabuoni') F.add_edge('Medici','Albizzi') F.add_edge('Medici','Salviati') F.add_edge('Salviati','Pazzi') F.add_edge('Peruzzi','Strozzi') F.add_edge('Peruzzi','Bischeri') F.add_edge('Strozzi','Ridolfi') F.add_edge('Strozzi','Bischeri') F.add_edge('Ridolfi','Tornabuoni') F.add_edge('Tornabuoni','Guadagni') F.add_edge('Albizzi','Ginori') F.add_edge('Albizzi','Guadagni') F.add_edge('Bischeri','Guadagni') F.add_edge('Guadagni','Lamberteschi') self.F = F G = nx.DiGraph() G.add_edge(0,5) G.add_edge(1,5) G.add_edge(2,5) G.add_edge(3,5) G.add_edge(4,5) G.add_edge(5,6) G.add_edge(5,7) G.add_edge(5,8) self.G = G def test_degree_centrality_1(self): d = nx.degree_centrality(self.K5) exact = dict(zip(range(5), [1]*5)) for n,dc in d.items(): assert_almost_equal(exact[n], dc) def test_degree_centrality_2(self): d = nx.degree_centrality(self.P3) exact = {0:0.5, 1:1, 2:0.5} for n,dc in d.items(): assert_almost_equal(exact[n], dc) def test_degree_centrality_3(self): d = nx.degree_centrality(self.K) exact = {0:.444, 1:.444, 2:.333, 3:.667, 4:.333, 5:.556, 6:.556, 7:.333, 8:.222, 9:.111} for n,dc in d.items(): assert_almost_equal(exact[n], float("%5.3f" % dc)) def test_degree_centrality_4(self): d = nx.degree_centrality(self.F) names = sorted(self.F.nodes()) dcs = [0.071, 0.214, 0.143, 0.214, 0.214, 0.071, 0.286, 0.071, 0.429, 0.071, 0.214, 0.214, 0.143, 0.286, 0.214] exact = dict(zip(names, dcs)) for n,dc in d.items(): assert_almost_equal(exact[n], float("%5.3f" % dc)) def test_indegree_centrality(self): d = 
nx.in_degree_centrality(self.G) exact = {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0, 5: 0.625, 6: 0.125, 7: 0.125, 8: 0.125} for n,dc in d.items(): assert_almost_equal(exact[n], dc) def test_outdegree_centrality(self): d = nx.out_degree_centrality(self.G) exact = {0: 0.125, 1: 0.125, 2: 0.125, 3: 0.125, 4: 0.125, 5: 0.375, 6: 0.0, 7: 0.0, 8: 0.0} for n,dc in d.items(): assert_almost_equal(exact[n], dc) networkx-1.11/networkx/algorithms/centrality/tests/test_betweenness_centrality_subset.py0000644000175000017500000002174112637544450032342 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx from networkx import betweenness_centrality_subset,\ edge_betweenness_centrality_subset class TestSubsetBetweennessCentrality: def test_K5(self): """Betweenness centrality: K5""" G=networkx.complete_graph(5) b=betweenness_centrality_subset(G, sources=[0], targets=[1,3], weight=None) b_answer={0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P5_directed(self): """Betweenness centrality: P5 directed""" G=networkx.DiGraph() G.add_path(list(range(5))) b_answer={0:0,1:1,2:1,3:0,4:0,5:0} b=betweenness_centrality_subset(G, sources=[0], targets=[3], weight=None) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P5(self): """Betweenness centrality: P5""" G=networkx.Graph() G.add_path(list(range(5))) b_answer={0:0,1:0.5,2:0.5,3:0,4:0,5:0} b=betweenness_centrality_subset(G, sources=[0], targets=[3], weight=None) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P5_multiple_target(self): """Betweenness centrality: P5 multiple target""" G=networkx.Graph() G.add_path(list(range(5))) b_answer={0:0,1:1,2:1,3:0.5,4:0,5:0} b=betweenness_centrality_subset(G, sources=[0], targets=[3,4], weight=None) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_box(self): """Betweenness centrality: box""" G=networkx.Graph() G.add_edge(0,1) G.add_edge(0,2) G.add_edge(1,3) 
G.add_edge(2,3) b_answer={0:0,1:0.25,2:0.25,3:0} b=betweenness_centrality_subset(G, sources=[0], targets=[3], weight=None) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_box_and_path(self): """Betweenness centrality: box and path""" G=networkx.Graph() G.add_edge(0,1) G.add_edge(0,2) G.add_edge(1,3) G.add_edge(2,3) G.add_edge(3,4) G.add_edge(4,5) b_answer={0:0,1:0.5,2:0.5,3:0.5,4:0,5:0} b=betweenness_centrality_subset(G, sources=[0], targets=[3,4], weight=None) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_box_and_path2(self): """Betweenness centrality: box and path multiple target""" G=networkx.Graph() G.add_edge(0,1) G.add_edge(1,2) G.add_edge(2,3) G.add_edge(1,20) G.add_edge(20,3) G.add_edge(3,4) b_answer={0:0,1:1.0,2:0.5,20:0.5,3:0.5,4:0} b=betweenness_centrality_subset(G, sources=[0], targets=[3,4]) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) class TestBetweennessCentralitySources: def test_K5(self): """Betweenness centrality: K5""" G=networkx.complete_graph(5) b=networkx.betweenness_centrality_source(G, weight=None, normalized=False) b_answer={0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P3(self): """Betweenness centrality: P3""" G=networkx.path_graph(3) b_answer={0: 0.0, 1: 1.0, 2: 0.0} b=networkx.betweenness_centrality_source(G, weight=None, normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) class TestEdgeSubsetBetweennessCentrality: def test_K5(self): """Edge betweenness centrality: K5""" G=networkx.complete_graph(5) b=edge_betweenness_centrality_subset(G, sources=[0], targets=[1,3], weight=None) b_answer=dict.fromkeys(G.edges(),0) b_answer[(0,3)]=0.5 b_answer[(0,1)]=0.5 for n in sorted(G.edges()): print(n,b[n]) assert_almost_equal(b[n],b_answer[n]) def test_P5_directed(self): """Edge betweenness centrality: P5 directed""" G=networkx.DiGraph() G.add_path(list(range(5))) b_answer=dict.fromkeys(G.edges(),0) 
b_answer[(0,1)]=1 b_answer[(1,2)]=1 b_answer[(2,3)]=1 b=edge_betweenness_centrality_subset(G, sources=[0], targets=[3], weight=None) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_P5(self): """Edge betweenness centrality: P5""" G=networkx.Graph() G.add_path(list(range(5))) b_answer=dict.fromkeys(G.edges(),0) b_answer[(0,1)]=0.5 b_answer[(1,2)]=0.5 b_answer[(2,3)]=0.5 b=edge_betweenness_centrality_subset(G, sources=[0], targets=[3], weight=None) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_P5_multiple_target(self): """Edge betweenness centrality: P5 multiple target""" G=networkx.Graph() G.add_path(list(range(5))) b_answer=dict.fromkeys(G.edges(),0) b_answer[(0,1)]=1 b_answer[(1,2)]=1 b_answer[(2,3)]=1 b_answer[(3,4)]=0.5 b=edge_betweenness_centrality_subset(G, sources=[0], targets=[3,4], weight=None) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_box(self): """Edge betweenness centrality: box""" G=networkx.Graph() G.add_edge(0,1) G.add_edge(0,2) G.add_edge(1,3) G.add_edge(2,3) b_answer=dict.fromkeys(G.edges(),0) b_answer[(0,1)]=0.25 b_answer[(0,2)]=0.25 b_answer[(1,3)]=0.25 b_answer[(2,3)]=0.25 b=edge_betweenness_centrality_subset(G, sources=[0], targets=[3], weight=None) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_box_and_path(self): """Edge betweenness centrality: box and path""" G=networkx.Graph() G.add_edge(0,1) G.add_edge(0,2) G.add_edge(1,3) G.add_edge(2,3) G.add_edge(3,4) G.add_edge(4,5) b_answer=dict.fromkeys(G.edges(),0) b_answer[(0,1)]=1.0/2 b_answer[(0,2)]=1.0/2 b_answer[(1,3)]=1.0/2 b_answer[(2,3)]=1.0/2 b_answer[(3,4)]=1.0/2 b=edge_betweenness_centrality_subset(G, sources=[0], targets=[3,4], weight=None) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_box_and_path2(self): """Edge betweenness centrality: box and path multiple target""" G=networkx.Graph() G.add_edge(0,1) G.add_edge(1,2) G.add_edge(2,3)
G.add_edge(1,20) G.add_edge(20,3) G.add_edge(3,4) b_answer=dict.fromkeys(G.edges(),0) b_answer[(0,1)]=1.0 b_answer[(1,20)]=1.0/2 b_answer[(3,20)]=1.0/2 b_answer[(1,2)]=1.0/2 b_answer[(2,3)]=1.0/2 b_answer[(3,4)]=1.0/2 b=edge_betweenness_centrality_subset(G, sources=[0], targets=[3,4], weight=None) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) networkx-1.11/networkx/algorithms/centrality/tests/test_load_centrality.py0000644000175000017500000002033012637544500027337 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestLoadCentrality: def setUp(self): G=nx.Graph(); G.add_edge(0,1,weight=3) G.add_edge(0,2,weight=2) G.add_edge(0,3,weight=6) G.add_edge(0,4,weight=4) G.add_edge(1,3,weight=5) G.add_edge(1,5,weight=5) G.add_edge(2,4,weight=1) G.add_edge(3,4,weight=2) G.add_edge(3,5,weight=1) G.add_edge(4,5,weight=4) self.G=G self.exact_weighted={0: 4.0, 1: 0.0, 2: 8.0, 3: 6.0, 4: 8.0, 5: 0.0} self.K = nx.krackhardt_kite_graph() self.P3 = nx.path_graph(3) self.P4 = nx.path_graph(4) self.K5 = nx.complete_graph(5) self.C4=nx.cycle_graph(4) self.T=nx.balanced_tree(r=2, h=2) self.Gb = nx.Graph() self.Gb.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3), (2, 4), (4, 5), (3, 5)]) self.F = nx.florentine_families_graph() self.D = nx.cycle_graph(3, create_using=nx.DiGraph()) self.D.add_edges_from([(3, 0), (4, 3)]) def test_not_strongly_connected(self): b = nx.load_centrality(self.D) result = {0: 5./12, 1: 1./4, 2: 1./12, 3: 1./4, 4: 0.000} for n in sorted(self.D): assert_almost_equal(result[n], b[n], places=3) assert_almost_equal(result[n], nx.load_centrality(self.D, n), places=3) def test_weighted_load(self): b=nx.load_centrality(self.G,weight='weight',normalized=False) for n in sorted(self.G): assert_equal(b[n],self.exact_weighted[n]) def test_k5_load(self): G=self.K5 c=nx.load_centrality(G) d={0: 0.000, 1: 0.000, 2: 0.000, 3: 0.000, 4: 0.000} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) def 
test_p3_load(self): G=self.P3 c=nx.load_centrality(G) d={0: 0.000, 1: 1.000, 2: 0.000} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) c=nx.load_centrality(G,v=1) assert_almost_equal(c,1.0) c=nx.load_centrality(G,v=1,normalized=True) assert_almost_equal(c,1.0) def test_p2_load(self): G=nx.path_graph(2) c=nx.load_centrality(G) d={0: 0.000, 1: 0.000} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) def test_krackhardt_load(self): G=self.K c=nx.load_centrality(G) d={0: 0.023, 1: 0.023, 2: 0.000, 3: 0.102, 4: 0.000, 5: 0.231, 6: 0.231, 7: 0.389, 8: 0.222, 9: 0.000} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) def test_florentine_families_load(self): G=self.F c=nx.load_centrality(G) d={'Acciaiuoli': 0.000, 'Albizzi': 0.211, 'Barbadori': 0.093, 'Bischeri': 0.104, 'Castellani': 0.055, 'Ginori': 0.000, 'Guadagni': 0.251, 'Lamberteschi': 0.000, 'Medici': 0.522, 'Pazzi': 0.000, 'Peruzzi': 0.022, 'Ridolfi': 0.117, 'Salviati': 0.143, 'Strozzi': 0.106, 'Tornabuoni': 0.090} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) def test_unnormalized_k5_load(self): G=self.K5 c=nx.load_centrality(G,normalized=False) d={0: 0.000, 1: 0.000, 2: 0.000, 3: 0.000, 4: 0.000} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) def test_unnormalized_p3_load(self): G=self.P3 c=nx.load_centrality(G,normalized=False) d={0: 0.000, 1: 2.000, 2: 0.000} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) def test_unnormalized_krackhardt_load(self): G=self.K c=nx.load_centrality(G,normalized=False) d={0: 1.667, 1: 1.667, 2: 0.000, 3: 7.333, 4: 0.000, 5: 16.667, 6: 16.667, 7: 28.000, 8: 16.000, 9: 0.000} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) def test_unnormalized_florentine_families_load(self): G=self.F c=nx.load_centrality(G,normalized=False) d={'Acciaiuoli': 0.000, 'Albizzi': 38.333, 'Barbadori': 17.000, 'Bischeri': 19.000, 'Castellani': 10.000, 'Ginori': 0.000, 'Guadagni': 45.667, 'Lamberteschi': 0.000, 'Medici': 
95.000, 'Pazzi': 0.000, 'Peruzzi': 4.000, 'Ridolfi': 21.333, 'Salviati': 26.000, 'Strozzi': 19.333, 'Tornabuoni': 16.333} for n in sorted(G): assert_almost_equal(c[n],d[n],places=3) def test_load_betweenness_difference(self): # Difference Between Load and Betweenness # --------------------------------------- The smallest graph # that shows the difference between load and betweenness is # G=ladder_graph(3) (Graph B below) # Graph A and B are from Tao Zhou, Jian-Guo Liu, Bing-Hong # Wang: Comment on ``Scientific collaboration # networks. II. Shortest paths, weighted networks, and # centrality". http://arxiv.org/pdf/physics/0511084 # Notice that unlike here, their calculation adds 1 to the # betweenness of every node i for every path from i to every # other node. This is exactly what it should be, based on # Eqn. (1) in their paper: the eqn is B(v) = \sum_{s\neq t, # s\neq v}{\frac{\sigma_{st}(v)}{\sigma_{st}}}, therefore, # they allow v to be the target node. # We follow Brandes 2001, who follows Freeman 1977 in making # the sum for betweenness of v exclude paths where v is either # the source or target node. To agree with their numbers, we # must additionally remove edge (4,8) from the graph, see AC # example following (there is a mistake in the figure in their # paper - personal communication).
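The load-vs-betweenness distinction described in the comment above can be checked directly; a minimal sketch (assuming a NetworkX installation), using the same ladder graph B and the values these tests assert:

```python
import networkx as nx

# Ladder graph B from the comment: the smallest graph on which load
# centrality and betweenness centrality disagree.
B = nx.Graph()
B.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3), (2, 4), (4, 5), (3, 5)])

load = nx.load_centrality(B, normalized=False)
btw = nx.betweenness_centrality(B, normalized=False)

# Load splits flow evenly at each branch point; betweenness counts exact
# shortest-path fractions. On node 0: load gives 1.75, while twice the
# (halved, undirected) betweenness gives 1.667 - close, but not equal.
assert abs(load[0] - 1.75) < 1e-6
assert abs(load[0] - 2 * btw[0]) > 0.05
```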
# A = nx.Graph() # A.add_edges_from([(0,1), (1,2), (1,3), (2,4), # (3,5), (4,6), (4,7), (4,8), # (5,8), (6,9), (7,9), (8,9)]) B = nx.Graph() # ladder_graph(3) B.add_edges_from([(0,1), (0,2), (1,3), (2,3), (2,4), (4,5), (3,5)]) c = nx.load_centrality(B,normalized=False) d={0: 1.750, 1: 1.750, 2: 6.500, 3: 6.500, 4: 1.750, 5: 1.750} for n in sorted(B): assert_almost_equal(c[n],d[n],places=3) def test_c4_edge_load(self): G=self.C4 c = nx.edge_load(G) d={(0, 1): 6.000, (0, 3): 6.000, (1, 2): 6.000, (2, 3): 6.000} for n in G.edges(): assert_almost_equal(c[n],d[n],places=3) def test_p4_edge_load(self): G=self.P4 c = nx.edge_load(G) d={(0, 1): 6.000, (1, 2): 8.000, (2, 3): 6.000} for n in G.edges(): assert_almost_equal(c[n],d[n],places=3) def test_k5_edge_load(self): G=self.K5 c = nx.edge_load(G) d={(0, 1): 5.000, (0, 2): 5.000, (0, 3): 5.000, (0, 4): 5.000, (1, 2): 5.000, (1, 3): 5.000, (1, 4): 5.000, (2, 3): 5.000, (2, 4): 5.000, (3, 4): 5.000} for n in G.edges(): assert_almost_equal(c[n],d[n],places=3) def test_tree_edge_load(self): G=self.T c = nx.edge_load(G) d={(0, 1): 24.000, (0, 2): 24.000, (1, 3): 12.000, (1, 4): 12.000, (2, 5): 12.000, (2, 6): 12.000} for n in G.edges(): assert_almost_equal(c[n],d[n],places=3) networkx-1.11/networkx/algorithms/centrality/tests/test_dispersion.py0000644000175000017500000000261012637544450026346 0ustar aricaric00000000000000import networkx as nx from nose.tools import * def small_ego_G(): """The sample network from http://arxiv.org/pdf/1310.6753v1.pdf""" edges=[('a','b'), ('a','c'), ('b','c'), ('b','d'), ('b', 'e'),('b','f'),('c','d'),('c','f'),('c','h'),('d','f'), ('e','f'), ('f','h'),('h','j'), ('h','k'),('i','j'), ('i','k'), ('j','k'), ('u','a'), ('u','b'), ('u','c'), ('u','d'), ('u','e'), ('u','f'), ('u','g'), ('u','h'), ('u','i'), ('u','j'), ('u','k')] G = nx.Graph() G.add_edges_from(edges) return G class TestDispersion(object): def test_article(self): """our algorithm matches article's""" G = small_ego_G() disp_uh = 
nx.dispersion(G, 'u', 'h', normalized=False) disp_ub = nx.dispersion(G, 'u', 'b', normalized=False) assert disp_uh == 4 assert disp_ub == 1 def test_results_length(self): """there is a result for every node""" G = small_ego_G() disp = nx.dispersion(G) disp_Gu = nx.dispersion(G, 'u') disp_uv = nx.dispersion(G, 'u', 'h') assert len(disp) == len(G) assert len(disp_Gu) == len(G) - 1 assert type(disp_uv) is float def test_impossible_things(self): G=nx.karate_club_graph() disp = nx.dispersion(G) for u in disp: for v in disp[u]: assert disp[u][v] >= 0 networkx-1.11/networkx/algorithms/centrality/tests/test_betweenness_centrality.py0000644000175000017500000004123512637544500030751 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx def weighted_G(): G=nx.Graph(); G.add_edge(0,1,weight=3) G.add_edge(0,2,weight=2) G.add_edge(0,3,weight=6) G.add_edge(0,4,weight=4) G.add_edge(1,3,weight=5) G.add_edge(1,5,weight=5) G.add_edge(2,4,weight=1) G.add_edge(3,4,weight=2) G.add_edge(3,5,weight=1) G.add_edge(4,5,weight=4) return G class TestBetweennessCentrality(object): def test_K5(self): """Betweenness centrality: K5""" G=nx.complete_graph(5) b=nx.betweenness_centrality(G, weight=None, normalized=False) b_answer={0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_K5_endpoints(self): """Betweenness centrality: K5 endpoints""" G=nx.complete_graph(5) b=nx.betweenness_centrality(G, weight=None, normalized=False, endpoints=True) b_answer={0: 4.0, 1: 4.0, 2: 4.0, 3: 4.0, 4: 4.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P3_normalized(self): """Betweenness centrality: P3 normalized""" G=nx.path_graph(3) b=nx.betweenness_centrality(G, weight=None, normalized=True) b_answer={0: 0.0, 1: 1.0, 2: 0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P3(self): """Betweenness centrality: P3""" G=nx.path_graph(3) b_answer={0: 0.0, 1: 1.0, 2: 0.0} 
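The dispersion tests above use the sample ego network from the cited article; a self-contained sketch of the same check (assuming a NetworkX installation), rebuilding the graph from small_ego_G above:

```python
import networkx as nx

# The sample ego network from http://arxiv.org/pdf/1310.6753v1.pdf,
# as built by small_ego_G() in the test module.
edges = [('a','b'), ('a','c'), ('b','c'), ('b','d'), ('b','e'), ('b','f'),
         ('c','d'), ('c','f'), ('c','h'), ('d','f'), ('e','f'), ('f','h'),
         ('h','j'), ('h','k'), ('i','j'), ('i','k'), ('j','k'),
         ('u','a'), ('u','b'), ('u','c'), ('u','d'), ('u','e'), ('u','f'),
         ('u','g'), ('u','h'), ('u','i'), ('u','j'), ('u','k')]
G = nx.Graph(edges)

# Unnormalized dispersion of (u, v) scores how poorly connected the mutual
# neighbors of u and v are to one another; the article (and test_article)
# gives 4 for (u, h) and 1 for (u, b).
assert nx.dispersion(G, 'u', 'h', normalized=False) == 4
assert nx.dispersion(G, 'u', 'b', normalized=False) == 1
```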
b=nx.betweenness_centrality(G, weight=None, normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P3_endpoints(self): """Betweenness centrality: P3 endpoints""" G=nx.path_graph(3) b_answer={0: 2.0, 1: 3.0, 2: 2.0} b=nx.betweenness_centrality(G, weight=None, normalized=False, endpoints=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_krackhardt_kite_graph(self): """Betweenness centrality: Krackhardt kite graph""" G=nx.krackhardt_kite_graph() b_answer={0: 1.667,1: 1.667,2: 0.000,3: 7.333,4: 0.000, 5: 16.667,6: 16.667,7: 28.000,8: 16.000,9: 0.000} for b in b_answer: b_answer[b]/=2.0 b=nx.betweenness_centrality(G, weight=None, normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_krackhardt_kite_graph_normalized(self): """Betweenness centrality: Krackhardt kite graph normalized""" G=nx.krackhardt_kite_graph() b_answer={0:0.023,1:0.023,2:0.000,3:0.102,4:0.000, 5:0.231,6:0.231,7:0.389,8:0.222,9:0.000} b=nx.betweenness_centrality(G, weight=None, normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_florentine_families_graph(self): """Betweenness centrality: Florentine families graph""" G=nx.florentine_families_graph() b_answer=\ {'Acciaiuoli': 0.000, 'Albizzi': 0.212, 'Barbadori': 0.093, 'Bischeri': 0.104, 'Castellani': 0.055, 'Ginori': 0.000, 'Guadagni': 0.255, 'Lamberteschi': 0.000, 'Medici': 0.522, 'Pazzi': 0.000, 'Peruzzi': 0.022, 'Ridolfi': 0.114, 'Salviati': 0.143, 'Strozzi': 0.103, 'Tornabuoni': 0.092} b=nx.betweenness_centrality(G, weight=None, normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_ladder_graph(self): """Betweenness centrality: Ladder graph""" G = nx.Graph() # ladder_graph(3) G.add_edges_from([(0,1), (0,2), (1,3), (2,3), (2,4), (4,5), (3,5)]) b_answer={0:1.667,1: 1.667,2: 6.667, 3: 6.667,4: 1.667,5: 1.667} for b in b_answer: b_answer[b]/=2.0 b=nx.betweenness_centrality(G, 
weight=None, normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_disconnected_path(self): """Betweenness centrality: disconnected path""" G=nx.Graph() G.add_path([0,1,2]) G.add_path([3,4,5,6]) b_answer={0:0,1:1,2:0,3:0,4:2,5:2,6:0} b=nx.betweenness_centrality(G, weight=None, normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_disconnected_path_endpoints(self): """Betweenness centrality: disconnected path endpoints""" G=nx.Graph() G.add_path([0,1,2]) G.add_path([3,4,5,6]) b_answer={0:2,1:3,2:2,3:3,4:5,5:5,6:3} b=nx.betweenness_centrality(G, weight=None, normalized=False, endpoints=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_directed_path(self): """Betweenness centrality: directed path""" G=nx.DiGraph() G.add_path([0,1,2]) b=nx.betweenness_centrality(G, weight=None, normalized=False) b_answer={0: 0.0, 1: 1.0, 2: 0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_directed_path_normalized(self): """Betweenness centrality: directed path normalized""" G=nx.DiGraph() G.add_path([0,1,2]) b=nx.betweenness_centrality(G, weight=None, normalized=True) b_answer={0: 0.0, 1: 0.5, 2: 0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) class TestWeightedBetweennessCentrality(object): def test_K5(self): """Weighted betweenness centrality: K5""" G=nx.complete_graph(5) b=nx.betweenness_centrality(G, weight='weight', normalized=False) b_answer={0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P3_normalized(self): """Weighted betweenness centrality: P3 normalized""" G=nx.path_graph(3) b=nx.betweenness_centrality(G, weight='weight', normalized=True) b_answer={0: 0.0, 1: 1.0, 2: 0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P3(self): """Weighted betweenness centrality: P3""" G=nx.path_graph(3) b_answer={0: 0.0, 1: 1.0, 2: 0.0} b=nx.betweenness_centrality(G, 
weight='weight', normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_krackhardt_kite_graph(self): """Weighted betweenness centrality: Krackhardt kite graph""" G=nx.krackhardt_kite_graph() b_answer={0: 1.667,1: 1.667,2: 0.000,3: 7.333,4: 0.000, 5: 16.667,6: 16.667,7: 28.000,8: 16.000,9: 0.000} for b in b_answer: b_answer[b]/=2.0 b=nx.betweenness_centrality(G, weight='weight', normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_krackhardt_kite_graph_normalized(self): """Weighted betweenness centrality: Krackhardt kite graph normalized """ G=nx.krackhardt_kite_graph() b_answer={0:0.023,1:0.023,2:0.000,3:0.102,4:0.000, 5:0.231,6:0.231,7:0.389,8:0.222,9:0.000} b=nx.betweenness_centrality(G, weight='weight', normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_florentine_families_graph(self): """Weighted betweenness centrality: Florentine families graph""" G=nx.florentine_families_graph() b_answer=\ {'Acciaiuoli': 0.000, 'Albizzi': 0.212, 'Barbadori': 0.093, 'Bischeri': 0.104, 'Castellani': 0.055, 'Ginori': 0.000, 'Guadagni': 0.255, 'Lamberteschi': 0.000, 'Medici': 0.522, 'Pazzi': 0.000, 'Peruzzi': 0.022, 'Ridolfi': 0.114, 'Salviati': 0.143, 'Strozzi': 0.103, 'Tornabuoni': 0.092} b=nx.betweenness_centrality(G, weight='weight', normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_ladder_graph(self): """Weighted betweenness centrality: Ladder graph""" G = nx.Graph() # ladder_graph(3) G.add_edges_from([(0,1), (0,2), (1,3), (2,3), (2,4), (4,5), (3,5)]) b_answer={0:1.667,1: 1.667,2: 6.667, 3: 6.667,4: 1.667,5: 1.667} for b in b_answer: b_answer[b]/=2.0 b=nx.betweenness_centrality(G, weight='weight', normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n],places=3) def test_G(self): """Weighted betweenness centrality: G""" G = weighted_G() b_answer={0: 2.0, 1: 0.0, 2: 4.0, 3: 3.0, 4: 4.0, 5: 0.0} 
b=nx.betweenness_centrality(G, weight='weight', normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_G2(self): """Weighted betweenness centrality: G2""" G=nx.DiGraph() G.add_weighted_edges_from([('s','u',10) ,('s','x',5) , ('u','v',1) ,('u','x',2) , ('v','y',1) ,('x','u',3) , ('x','v',5) ,('x','y',2) , ('y','s',7) ,('y','v',6)]) b_answer={'y':5.0,'x':5.0,'s':4.0,'u':2.0,'v':2.0} b=nx.betweenness_centrality(G, weight='weight', normalized=False) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) class TestEdgeBetweennessCentrality(object): def test_K5(self): """Edge betweenness centrality: K5""" G=nx.complete_graph(5) b=nx.edge_betweenness_centrality(G, weight=None, normalized=False) b_answer=dict.fromkeys(G.edges(),1) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_normalized_K5(self): """Edge betweenness centrality: K5""" G=nx.complete_graph(5) b=nx.edge_betweenness_centrality(G, weight=None, normalized=True) b_answer=dict.fromkeys(G.edges(),1/10.0) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_C4(self): """Edge betweenness centrality: C4""" G=nx.cycle_graph(4) b=nx.edge_betweenness_centrality(G, weight=None, normalized=True) b_answer={(0, 1):2,(0, 3):2, (1, 2):2, (2, 3): 2} for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]/6.0) def test_P4(self): """Edge betweenness centrality: P4""" G=nx.path_graph(4) b=nx.edge_betweenness_centrality(G, weight=None, normalized=False) b_answer={(0, 1):3,(1, 2):4, (2, 3):3} for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_normalized_P4(self): """Edge betweenness centrality: P4""" G=nx.path_graph(4) b=nx.edge_betweenness_centrality(G, weight=None, normalized=True) b_answer={(0, 1):3,(1, 2):4, (2, 3):3} for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]/6.0) def test_balanced_tree(self): """Edge betweenness centrality: balanced tree""" G=nx.balanced_tree(r=2,h=2) 
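The edge-betweenness cases being tested here reduce to counting shortest paths that cross each edge; a minimal sketch (assuming a NetworkX installation), using the P4 values asserted in test_P4:

```python
import networkx as nx

# Unnormalized edge betweenness on the path 0-1-2-3: each edge is scored by
# the number of node pairs whose shortest path crosses it. The two end edges
# carry 3 pairs each, the middle edge carries 4.
G = nx.path_graph(4)
b = nx.edge_betweenness_centrality(G, weight=None, normalized=False)
assert b[(0, 1)] == 3 and b[(1, 2)] == 4 and b[(2, 3)] == 3
```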
b=nx.edge_betweenness_centrality(G, weight=None, normalized=False) b_answer={(0, 1):12,(0, 2):12, (1, 3):6,(1, 4):6,(2, 5):6,(2,6):6} for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) class TestWeightedEdgeBetweennessCentrality(object): def test_K5(self): """Edge betweenness centrality: K5""" G=nx.complete_graph(5) b=nx.edge_betweenness_centrality(G, weight='weight', normalized=False) b_answer=dict.fromkeys(G.edges(),1) for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_C4(self): """Edge betweenness centrality: C4""" G=nx.cycle_graph(4) b=nx.edge_betweenness_centrality(G, weight='weight', normalized=False) b_answer={(0, 1):2,(0, 3):2, (1, 2):2, (2, 3): 2} for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_P4(self): """Edge betweenness centrality: P4""" G=nx.path_graph(4) b=nx.edge_betweenness_centrality(G, weight='weight', normalized=False) b_answer={(0, 1):3,(1, 2):4, (2, 3):3} for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_balanced_tree(self): """Edge betweenness centrality: balanced tree""" G=nx.balanced_tree(r=2,h=2) b=nx.edge_betweenness_centrality(G, weight='weight', normalized=False) b_answer={(0, 1):12,(0, 2):12, (1, 3):6,(1, 4):6,(2, 5):6,(2,6):6} for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_weighted_graph(self): eList = [(0, 1, 5), (0, 2, 4), (0, 3, 3), (0, 4, 2), (1, 2, 4), (1, 3, 1), (1, 4, 3), (2, 4, 5), (3, 4, 4)] G = nx.Graph() G.add_weighted_edges_from(eList) b = nx.edge_betweenness_centrality(G, weight='weight', normalized=False) b_answer={(0, 1):0.0, (0, 2):1.0, (0, 3):2.0, (0, 4):1.0, (1, 2):2.0, (1, 3):3.5, (1, 4):1.5, (2, 4):1.0, (3, 4):0.5} for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]) def test_normalized_weighted_graph(self): eList = [(0, 1, 5), (0, 2, 4), (0, 3, 3), (0, 4, 2), (1, 2, 4), (1, 3, 1), (1, 4, 3), (2, 4, 5), (3, 4, 4)] G = nx.Graph() G.add_weighted_edges_from(eList) b = 
nx.edge_betweenness_centrality(G, weight='weight', normalized=True) b_answer={(0, 1):0.0, (0, 2):1.0, (0, 3):2.0, (0, 4):1.0, (1, 2):2.0, (1, 3):3.5, (1, 4):1.5, (2, 4):1.0, (3, 4):0.5} norm = len(G)*(len(G)-1)/2.0 for n in sorted(G.edges()): assert_almost_equal(b[n],b_answer[n]/norm) networkx-1.11/networkx/algorithms/centrality/tests/test_communicability.py0000644000175000017500000001245512637544450027367 0ustar aricaric00000000000000from collections import defaultdict from nose.tools import * from nose import SkipTest import networkx as nx from networkx.algorithms.centrality.communicability_alg import * class TestCommunicability: @classmethod def setupClass(cls): global numpy global scipy try: import numpy except ImportError: raise SkipTest('NumPy not available.') try: import scipy except ImportError: raise SkipTest('SciPy not available.') def test_communicability_centrality(self): answer={0: 1.5430806348152433, 1: 1.5430806348152433} result=communicability_centrality(nx.path_graph(2)) for k,v in result.items(): assert_almost_equal(answer[k],result[k],places=7) answer1={'1': 1.6445956054135658, 'Albert': 2.4368257358712189, 'Aric': 2.4368257358712193, 'Dan':3.1306328496328168, 'Franck': 2.3876142275231915} G1=nx.Graph([('Franck','Aric'),('Aric','Dan'),('Dan','Albert'), ('Albert','Franck'),('Dan','1'),('Franck','Albert')]) result1=communicability_centrality(G1) for k,v in result1.items(): assert_almost_equal(answer1[k],result1[k],places=7) result1=communicability_centrality_exp(G1) for k,v in result1.items(): assert_almost_equal(answer1[k],result1[k],places=7) def test_communicability_betweenness_centrality(self): answer={0: 0.07017447951484615, 1: 0.71565598701107991, 2: 0.71565598701107991, 3: 0.07017447951484615} result=communicability_betweenness_centrality(nx.path_graph(4)) for k,v in result.items(): assert_almost_equal(answer[k],result[k],places=7) answer1={'1': 0.060039074193949521, 'Albert': 0.315470761661372, 'Aric': 0.31547076166137211, 'Dan': 
0.68297778678316201, 'Franck': 0.21977926617449497} G1=nx.Graph([('Franck','Aric'), ('Aric','Dan'),('Dan','Albert'),('Albert','Franck'), ('Dan','1'),('Franck','Albert')]) result1=communicability_betweenness_centrality(G1) for k,v in result1.items(): assert_almost_equal(answer1[k],result1[k],places=7) def test_communicability_betweenness_centrality_small(self): G = nx.Graph([(1,2)]) result=communicability_betweenness_centrality(G) assert_equal(result, {1:0,2:0}) def test_communicability(self): answer={0 :{0: 1.5430806348152435, 1: 1.1752011936438012 }, 1 :{0: 1.1752011936438012, 1: 1.5430806348152435 } } # answer={(0, 0): 1.5430806348152435, # (0, 1): 1.1752011936438012, # (1, 0): 1.1752011936438012, # (1, 1): 1.5430806348152435} result=communicability(nx.path_graph(2)) for k1,val in result.items(): for k2 in val: assert_almost_equal(answer[k1][k2],result[k1][k2],places=7) def test_communicability2(self): answer_orig ={('1', '1'): 1.6445956054135658, ('1', 'Albert'): 0.7430186221096251, ('1', 'Aric'): 0.7430186221096251, ('1', 'Dan'): 1.6208126320442937, ('1', 'Franck'): 0.42639707170035257, ('Albert', '1'): 0.7430186221096251, ('Albert', 'Albert'): 2.4368257358712189, ('Albert', 'Aric'): 1.4368257358712191, ('Albert', 'Dan'): 2.0472097037446453, ('Albert', 'Franck'): 1.8340111678944691, ('Aric', '1'): 0.7430186221096251, ('Aric', 'Albert'): 1.4368257358712191, ('Aric', 'Aric'): 2.4368257358712193, ('Aric', 'Dan'): 2.0472097037446457, ('Aric', 'Franck'): 1.8340111678944691, ('Dan', '1'): 1.6208126320442937, ('Dan', 'Albert'): 2.0472097037446453, ('Dan', 'Aric'): 2.0472097037446457, ('Dan', 'Dan'): 3.1306328496328168, ('Dan', 'Franck'): 1.4860372442192515, ('Franck', '1'): 0.42639707170035257, ('Franck', 'Albert'): 1.8340111678944691, ('Franck', 'Aric'): 1.8340111678944691, ('Franck', 'Dan'): 1.4860372442192515, ('Franck', 'Franck'): 2.3876142275231915} answer=defaultdict(dict) for (k1,k2),v in answer_orig.items(): answer[k1][k2]=v 
G1=nx.Graph([('Franck','Aric'),('Aric','Dan'),('Dan','Albert'), ('Albert','Franck'),('Dan','1'),('Franck','Albert')]) result=communicability(G1) for k1,val in result.items(): for k2 in val: assert_almost_equal(answer[k1][k2],result[k1][k2],places=7) result=communicability_exp(G1) for k1,val in result.items(): for k2 in val: assert_almost_equal(answer[k1][k2],result[k1][k2],places=7) def test_estrada_index(self): answer=1041.2470334195475 result=estrada_index(nx.karate_club_graph()) assert_almost_equal(answer,result,places=7) networkx-1.11/networkx/algorithms/centrality/tests/test_harmonic_centrality.py0000644000175000017500000000630412637544450030231 0ustar aricaric00000000000000""" Tests for degree centrality. """ from nose.tools import * import networkx as nx from networkx.algorithms.centrality import harmonic_centrality class TestClosenessCentrality: def setUp(self): self.P3 = nx.path_graph(3) self.P4 = nx.path_graph(4) self.K5 = nx.complete_graph(5) self.C4 = nx.cycle_graph(4) self.C5 = nx.cycle_graph(5) self.T = nx.balanced_tree(r=2, h=2) self.Gb = nx.DiGraph() self.Gb.add_edges_from([(0, 1), (0, 2), (0, 4), (2, 1), (2, 3), (4, 3)]) def test_p3_harmonic(self): c = harmonic_centrality(self.P3) d = {0: 1.5, 1: 2, 2: 1.5} for n in sorted(self.P3): assert_almost_equal(c[n], d[n], places=3) def test_p4_harmonic(self): c = harmonic_centrality(self.P4) d = {0: 1.8333333, 1: 2.5, 2: 2.5, 3: 1.8333333} for n in sorted(self.P4): assert_almost_equal(c[n], d[n], places=3) def test_clique_complete(self): c = harmonic_centrality(self.K5) d = {0: 4, 1: 4, 2: 4, 3: 4, 4: 4} for n in sorted(self.P3): assert_almost_equal(c[n], d[n],places=3) def test_cycle_C4(self): c = harmonic_centrality(self.C4) d = {0: 2.5, 1: 2.5, 2: 2.5, 3: 2.5,} for n in sorted(self.C4): assert_almost_equal(c[n], d[n], places=3) def test_cycle_C5(self): c = harmonic_centrality(self.C5) d={0: 3, 1: 3, 2: 3, 3: 3, 4: 3, 5: 4} for n in sorted(self.C5): assert_almost_equal(c[n], d[n], places=3) def 
test_bal_tree(self): c = harmonic_centrality(self.T) d = {0: 4.0, 1: 4.1666, 2: 4.1666, 3: 2.8333, 4: 2.8333, 5: 2.8333, 6: 2.8333} for n in sorted(self.T): assert_almost_equal(c[n], d[n], places=3) def test_exampleGraph(self): c = harmonic_centrality(self.Gb) d = {0: 0, 1: 2, 2: 1, 3: 2.5, 4: 1} for n in sorted(self.Gb): assert_almost_equal(c[n], d[n], places=3) def test_weighted_harmonic(self): XG = nx.DiGraph() XG.add_weighted_edges_from([('a','b',10), ('d','c',5), ('a','c',1), ('e','f',2), ('f','c',1), ('a','f',3), ]) c = harmonic_centrality(XG, distance='weight') d = {'a': 0, 'b': 0.1, 'c': 2.533, 'd': 0, 'e': 0, 'f': 0.83333} for n in sorted(XG): assert_almost_equal(c[n], d[n], places=3) def test_empty(self): G = nx.DiGraph() c = harmonic_centrality(G, distance='weight') d = {} assert_equal(c, d) def test_singleton(self): G = nx.DiGraph() G.add_node(0) c = harmonic_centrality(G, distance='weight') d = {0: 0} assert_equal(c, d) networkx-1.11/networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality.py0000644000175000017500000001705112637544450033545 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx from nose.plugins.attrib import attr from networkx import edge_current_flow_betweenness_centrality \ as edge_current_flow from networkx import approximate_current_flow_betweenness_centrality \ as approximate_cfbc class TestFlowBetweennessCentrality(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise SkipTest('NumPy not available.') def test_K4_normalized(self): """Betweenness centrality: K4""" G=networkx.complete_graph(4) b=networkx.current_flow_betweenness_centrality(G,normalized=True) b_answer={0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) G.add_edge(0,1,{'weight':0.5,'other':0.3}) 
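The harmonic centrality tests above (P3, P4, trees) all reduce to summing reciprocal distances; a minimal sketch (assuming a NetworkX installation), with the P4 values asserted in test_p4_harmonic:

```python
import networkx as nx

# Harmonic centrality of u sums 1/d(u, v) over all other nodes v, so it is
# well defined even on disconnected graphs (unreachable nodes contribute 0).
G = nx.path_graph(4)
c = nx.harmonic_centrality(G)

# End nodes: 1 + 1/2 + 1/3 = 11/6; interior nodes: 1 + 1 + 1/2 = 5/2.
assert abs(c[0] - 11.0 / 6.0) < 1e-9
assert abs(c[1] - 2.5) < 1e-9
```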
b=networkx.current_flow_betweenness_centrality(G,normalized=True,weight=None) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) wb_answer={0: 0.2222222, 1: 0.2222222, 2: 0.30555555, 3: 0.30555555} b=networkx.current_flow_betweenness_centrality(G,normalized=True) for n in sorted(G): assert_almost_equal(b[n],wb_answer[n]) wb_answer={0: 0.2051282, 1: 0.2051282, 2: 0.33974358, 3: 0.33974358} b=networkx.current_flow_betweenness_centrality(G,normalized=True,weight='other') for n in sorted(G): assert_almost_equal(b[n],wb_answer[n]) def test_K4(self): """Betweenness centrality: K4""" G=networkx.complete_graph(4) for solver in ['full','lu','cg']: b=networkx.current_flow_betweenness_centrality(G, normalized=False, solver=solver) b_answer={0: 0.75, 1: 0.75, 2: 0.75, 3: 0.75} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P4_normalized(self): """Betweenness centrality: P4 normalized""" G=networkx.path_graph(4) b=networkx.current_flow_betweenness_centrality(G,normalized=True) b_answer={0: 0, 1: 2./3, 2: 2./3, 3:0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P4(self): """Betweenness centrality: P4""" G=networkx.path_graph(4) b=networkx.current_flow_betweenness_centrality(G,normalized=False) b_answer={0: 0, 1: 2, 2: 2, 3: 0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_star(self): """Betweenness centrality: star """ G=networkx.Graph() G.add_star(['a','b','c','d']) b=networkx.current_flow_betweenness_centrality(G,normalized=True) b_answer={'a': 1.0, 'b': 0.0, 'c': 0.0, 'd':0.0} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_solvers(self): """Betweenness centrality: alternate solvers""" G=networkx.complete_graph(4) for solver in ['full','lu','cg']: b=networkx.current_flow_betweenness_centrality(G,normalized=False, solver=solver) b_answer={0: 0.75, 1: 0.75, 2: 0.75, 3: 0.75} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) class TestApproximateFlowBetweennessCentrality(object):
numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np global assert_allclose try: import numpy as np import scipy from numpy.testing import assert_allclose except ImportError: raise SkipTest('NumPy not available.') def test_K4_normalized(self): "Approximate current-flow betweenness centrality: K4 normalized" G=networkx.complete_graph(4) b=networkx.current_flow_betweenness_centrality(G,normalized=True) epsilon=0.1 ba = approximate_cfbc(G,normalized=True, epsilon=0.5*epsilon) for n in sorted(G): assert_allclose(b[n],ba[n],atol=epsilon) def test_K4(self): "Approximate current-flow betweenness centrality: K4" G=networkx.complete_graph(4) b=networkx.current_flow_betweenness_centrality(G,normalized=False) epsilon=0.1 ba = approximate_cfbc(G,normalized=False, epsilon=0.5*epsilon) for n in sorted(G): assert_allclose(b[n],ba[n],atol=epsilon*len(G)**2) def test_star(self): "Approximate current-flow betweenness centrality: star" G=networkx.Graph() G.add_star(['a','b','c','d']) b=networkx.current_flow_betweenness_centrality(G,normalized=True) epsilon=0.1 ba = approximate_cfbc(G,normalized=True, epsilon=0.5*epsilon) for n in sorted(G): assert_allclose(b[n],ba[n],atol=epsilon) def test_grid(self): "Approximate current-flow betweenness centrality: 2d grid" G=networkx.grid_2d_graph(4,4) b=networkx.current_flow_betweenness_centrality(G,normalized=True) epsilon=0.1 ba = approximate_cfbc(G,normalized=True, epsilon=0.5*epsilon) for n in sorted(G): assert_allclose(b[n],ba[n],atol=epsilon) def test_solvers(self): "Approximate current-flow betweenness centrality: solvers" G=networkx.complete_graph(4) epsilon=0.1 for solver in ['full','lu','cg']: b=approximate_cfbc(G,normalized=False,solver=solver, epsilon=0.5*epsilon) b_answer={0: 0.75, 1: 0.75, 2: 0.75, 3: 0.75} for n in sorted(G): assert_allclose(b[n],b_answer[n],atol=epsilon) class TestWeightedFlowBetweennessCentrality(object): pass class 
TestEdgeFlowBetweennessCentrality(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise SkipTest('NumPy not available.') def test_K4_normalized(self): """Edge flow betweenness centrality: K4 normalized""" G=networkx.complete_graph(4) b=edge_current_flow(G,normalized=True) b_answer=dict.fromkeys(G.edges(),0.25) for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) def test_K4(self): """Edge flow betweenness centrality: K4""" G=networkx.complete_graph(4) b=edge_current_flow(G,normalized=False) b_answer=dict.fromkeys(G.edges(),0.75) for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) def test_C4(self): """Edge flow betweenness centrality: C4""" G=networkx.cycle_graph(4) b=edge_current_flow(G,normalized=False) b_answer={(0, 1):1.25,(0, 3):1.25, (1, 2):1.25, (2, 3): 1.25} for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) def test_P4(self): """Edge betweenness centrality: P4""" G=networkx.path_graph(4) b=edge_current_flow(G,normalized=False) b_answer={(0, 1):1.5,(1, 2):2.0, (2, 3):1.5} for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) networkx-1.11/networkx/algorithms/centrality/tests/test_current_flow_closeness.py #!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx class TestFlowClosenessCentrality(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise SkipTest('NumPy not available.') def test_K4(self): """Closeness centrality: K4""" G=networkx.complete_graph(4) b=networkx.current_flow_closeness_centrality(G) b_answer={0: 2.0/3, 1: 2.0/3, 2: 2.0/3, 3:
2.0/3} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P4(self): """Closeness centrality: P4""" G=networkx.path_graph(4) b=networkx.current_flow_closeness_centrality(G) b_answer={0: 1.0/6, 1: 1.0/4, 2: 1.0/4, 3:1.0/6} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_star(self): """Closeness centrality: star """ G=networkx.Graph() G.add_star(['a','b','c','d']) b=networkx.current_flow_closeness_centrality(G) b_answer={'a': 1.0/3, 'b': 0.6/3, 'c': 0.6/3, 'd':0.6/3} for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) class TestWeightedFlowClosenessCentrality(object): pass ././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000networkx-1.11/networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality_subset.pynetworkx-1.11/networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality_subset.p0000644000175000017500000001703412637544500034736 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx from nose.plugins.attrib import attr from networkx import edge_current_flow_betweenness_centrality \ as edge_current_flow from networkx import edge_current_flow_betweenness_centrality_subset \ as edge_current_flow_subset class TestFlowBetweennessCentrality(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise SkipTest('NumPy not available.') def test_K4_normalized(self): """Betweenness centrality: K4""" G=networkx.complete_graph(4) b=networkx.current_flow_betweenness_centrality_subset(G, G.nodes(), G.nodes(), normalized=True) b_answer=networkx.current_flow_betweenness_centrality(G,normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_K4(self): """Betweenness centrality: K4""" G=networkx.complete_graph(4) 
b=networkx.current_flow_betweenness_centrality_subset(G, G.nodes(), G.nodes(), normalized=True) b_answer=networkx.current_flow_betweenness_centrality(G,normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) # test weighted network G.add_edge(0,1,{'weight':0.5,'other':0.3}) b=networkx.current_flow_betweenness_centrality_subset(G, G.nodes(), G.nodes(), normalized=True, weight=None) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) b=networkx.current_flow_betweenness_centrality_subset(G, G.nodes(), G.nodes(), normalized=True) b_answer=networkx.current_flow_betweenness_centrality(G,normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) b=networkx.current_flow_betweenness_centrality_subset(G, G.nodes(), G.nodes(), normalized=True, weight='other') b_answer=networkx.current_flow_betweenness_centrality(G,normalized=True,weight='other') for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P4_normalized(self): """Betweenness centrality: P4 normalized""" G=networkx.path_graph(4) b=networkx.current_flow_betweenness_centrality_subset(G, G.nodes(), G.nodes(), normalized=True) b_answer=networkx.current_flow_betweenness_centrality(G,normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_P4(self): """Betweenness centrality: P4""" G=networkx.path_graph(4) b=networkx.current_flow_betweenness_centrality_subset(G, G.nodes(), G.nodes(), normalized=True) b_answer=networkx.current_flow_betweenness_centrality(G,normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) def test_star(self): """Betweenness centrality: star """ G=networkx.Graph() G.add_star(['a','b','c','d']) b=networkx.current_flow_betweenness_centrality_subset(G, G.nodes(), G.nodes(), normalized=True) b_answer=networkx.current_flow_betweenness_centrality(G,normalized=True) for n in sorted(G): assert_almost_equal(b[n],b_answer[n]) # class TestWeightedFlowBetweennessCentrality(): # pass class 
TestEdgeFlowBetweennessCentrality(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np try: import numpy as np import scipy except ImportError: raise SkipTest('NumPy not available.') def test_K4_normalized(self): """Betweenness centrality: K4""" G=networkx.complete_graph(4) b=edge_current_flow_subset(G,G.nodes(),G.nodes(),normalized=True) b_answer=edge_current_flow(G,normalized=True) for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) def test_K4(self): """Betweenness centrality: K4""" G=networkx.complete_graph(4) b=edge_current_flow_subset(G,G.nodes(),G.nodes(),normalized=False) b_answer=edge_current_flow(G,normalized=False) for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) # test weighted network G.add_edge(0,1,{'weight':0.5,'other':0.3}) b=edge_current_flow_subset(G,G.nodes(),G.nodes(),normalized=False,weight=None) # weight is None => same as unweighted network for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) b=edge_current_flow_subset(G,G.nodes(),G.nodes(),normalized=False) b_answer=edge_current_flow(G,normalized=False) for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) b=edge_current_flow_subset(G,G.nodes(),G.nodes(),normalized=False,weight='other') b_answer=edge_current_flow(G,normalized=False,weight='other') for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) def test_C4(self): """Edge betweenness centrality: C4""" G=networkx.cycle_graph(4) b=edge_current_flow_subset(G,G.nodes(),G.nodes(),normalized=True) b_answer=edge_current_flow(G,normalized=True) for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) def test_P4(self): """Edge betweenness centrality: P4""" G=networkx.path_graph(4) b=edge_current_flow_subset(G,G.nodes(),G.nodes(),normalized=True) 
b_answer=edge_current_flow(G,normalized=True) for (s,t),v1 in b_answer.items(): v2=b.get((s,t),b.get((t,s))) assert_almost_equal(v1,v2) networkx-1.11/networkx/algorithms/centrality/current_flow_closeness.py """Current-flow closeness centrality measures. """ # Copyright (C) 2010-2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.algorithms.centrality.flow_matrix import * __author__ = """Aric Hagberg """ __all__ = ['current_flow_closeness_centrality', 'information_centrality'] def current_flow_closeness_centrality(G, weight='weight', dtype=float, solver='lu'): """Compute current-flow closeness centrality for nodes. Current-flow closeness centrality is a variant of closeness centrality based on effective resistance between nodes in a network. This metric is also known as information centrality. Parameters ---------- G : graph A NetworkX graph weight : string or None, optional (default='weight') Edge data key to use as weight; if None, all edge weights are considered equal. dtype: data type (float) Default data type for internal matrices. Set to np.float32 for lower memory consumption. solver: string (default='lu') Type of linear solver to use for computing the flow matrix. Options are "full" (uses most memory), "lu" (recommended), and "cg" (uses least memory). Returns ------- nodes : dictionary Dictionary of nodes with current flow closeness centrality as the value. See Also -------- closeness_centrality Notes ----- The algorithm is from Brandes [1]_. See also [2]_ for the original definition of information centrality. References ---------- .. [1] Ulrik Brandes and Daniel Fleischer, Centrality Measures Based on Current Flow. Proc. 22nd Symp. Theoretical Aspects of Computer Science (STACS '05). LNCS 3404, pp. 533-544. Springer-Verlag, 2005. http://www.inf.uni-konstanz.de/algo/publications/bf-cmbcf-05.pdf .. [2] Karen Stephenson and Marvin Zelen: Rethinking centrality: Methods and examples. Social Networks 11(1):1-37, 1989.
http://dx.doi.org/10.1016/0378-8733(89)90016-6 """ from networkx.utils import reverse_cuthill_mckee_ordering import numpy as np import scipy if G.is_directed(): raise nx.NetworkXError( "current_flow_closeness_centrality() not defined for digraphs.") if not nx.is_connected(G): raise nx.NetworkXError("Graph not connected.") solvername = {"full": FullInverseLaplacian, "lu": SuperLUInverseLaplacian, "cg": CGInverseLaplacian} n = G.number_of_nodes() ordering = list(reverse_cuthill_mckee_ordering(G)) # make a copy with integer labels according to rcm ordering # this could be done without a copy if we really wanted to H = nx.relabel_nodes(G, dict(zip(ordering, range(n)))) betweenness = dict.fromkeys(H, 0.0) # b[v]=0 for v in H n = H.number_of_nodes() L = laplacian_sparse_matrix(H, nodelist=range(n), weight=weight, dtype=dtype, format='csc') C2 = solvername[solver](L, width=1, dtype=dtype) # initialize solver for v in H: col = C2.get_row(v) for w in H: betweenness[v] += col[v]-2*col[w] betweenness[w] += col[v] for v in H: betweenness[v] = 1.0 / (betweenness[v]) return dict((ordering[k], float(v)) for k, v in betweenness.items()) information_centrality = current_flow_closeness_centrality # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except ImportError: raise SkipTest("NumPy not available") networkx-1.11/networkx/algorithms/smetric.py import networkx as nx #from networkx.generators.smax import li_smax_graph def s_metric(G, normalized=True): """Return the s-metric of graph. The s-metric is defined as the sum of the products deg(u)*deg(v) for every edge (u,v) in G. If ``normalized`` is true, construct the s-max graph, compute its s-metric, and return the normalized s value Parameters ---------- G : graph The graph used to compute the s-metric. normalized : bool (optional) Normalize the value. Returns ------- s : float The s-metric of the graph.
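The unnormalized sum described in this docstring can be computed directly from an edge list; a minimal pure-Python sketch (the helper name `s_metric` and the sample graph are mine, standing in for the NetworkX call):

```python
from collections import Counter

def s_metric(edges):
    """Unnormalized s-metric: sum of deg(u)*deg(v) over all edges (u, v)."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return float(sum(deg[u] * deg[v] for u, v in edges))

path4 = [(0, 1), (1, 2), (2, 3)]  # P4: degrees are 1, 2, 2, 1
print(s_metric(path4))  # 1*2 + 2*2 + 2*1 = 8.0
```

The sum rewards graphs whose high-degree nodes connect to each other, which is why normalizing against the s-max graph (the maximal-s graph with the same degree sequence) is the natural comparison.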
References ---------- .. [1] Lun Li, David Alderson, John C. Doyle, and Walter Willinger, Towards a Theory of Scale-Free Graphs: Definition, Properties, and Implications (Extended Version), 2005. http://arxiv.org/abs/cond-mat/0501169 """ if normalized: raise nx.NetworkXError("Normalization not implemented") # Gmax = li_smax_graph(list(G.degree().values())) # return s_metric(G,normalized=False)/s_metric(Gmax,normalized=False) # else: return float(sum([G.degree(u)*G.degree(v) for (u,v) in G.edges_iter()])) networkx-1.11/networkx/algorithms/core.py0000644000175000017500000002170012637544500020535 0ustar aricaric00000000000000""" Find the k-cores of a graph. The k-core is found by recursively pruning nodes with degrees less than k. See the following reference for details: An O(m) Algorithm for Cores Decomposition of Networks Vladimir Batagelj and Matjaz Zaversnik, 2003. http://arxiv.org/abs/cs.DS/0310049 """ __author__ = "\n".join(['Dan Schult (dschult@colgate.edu)', 'Jason Grout (jason-sage@creativetrax.com)', 'Aric Hagberg (hagberg@lanl.gov)']) # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['core_number','k_core','k_shell','k_crust','k_corona','find_cores'] import networkx as nx def core_number(G): """Return the core number for each vertex. A k-core is a maximal subgraph that contains nodes of degree k or more. The core number of a node is the largest value k of a k-core containing that node. Parameters ---------- G : NetworkX graph A graph or directed graph Returns ------- core_number : dictionary A dictionary keyed by node to the core number. Raises ------ NetworkXError The k-core is not defined for graphs with self loops or parallel edges. Notes ----- Not implemented for graphs with parallel edges or self loops. For directed graphs the node degree is defined to be the in-degree + out-degree. References ---------- .. 
[1] An O(m) Algorithm for Cores Decomposition of Networks Vladimir Batagelj and Matjaz Zaversnik, 2003. http://arxiv.org/abs/cs.DS/0310049 """ if G.is_multigraph(): raise nx.NetworkXError( 'MultiGraph and MultiDiGraph types not supported.') if G.number_of_selfloops()>0: raise nx.NetworkXError( 'Input graph has self loops; the core number is not defined.', 'Consider using G.remove_edges_from(G.selfloop_edges()).') if G.is_directed(): import itertools def neighbors(v): return itertools.chain.from_iterable([G.predecessors_iter(v), G.successors_iter(v)]) else: neighbors=G.neighbors_iter degrees=G.degree() # sort nodes by degree nodes=sorted(degrees,key=degrees.get) bin_boundaries=[0] curr_degree=0 for i,v in enumerate(nodes): if degrees[v]>curr_degree: bin_boundaries.extend([i]*(degrees[v]-curr_degree)) curr_degree=degrees[v] node_pos = dict((v,pos) for pos,v in enumerate(nodes)) # initial guesses for core is degree core=degrees nbrs=dict((v,set(neighbors(v))) for v in G) for v in nodes: for u in nbrs[v]: if core[u] > core[v]: nbrs[u].remove(v) pos=node_pos[u] bin_start=bin_boundaries[core[u]] node_pos[u]=bin_start node_pos[nodes[bin_start]]=pos nodes[bin_start],nodes[pos]=nodes[pos],nodes[bin_start] bin_boundaries[core[u]]+=1 core[u]-=1 return core find_cores=core_number def k_core(G,k=None,core_number=None): """Return the k-core of G. A k-core is a maximal subgraph that contains nodes of degree k or more. Parameters ---------- G : NetworkX graph A graph or directed graph k : int, optional The order of the core. If not specified return the main core. core_number : dictionary, optional Precomputed core numbers for the graph G. Returns ------- G : NetworkX graph The k-core subgraph Raises ------ NetworkXError The k-core is not defined for graphs with self loops or parallel edges. Notes ----- The main core is the core with the largest degree. Not implemented for graphs with parallel edges or self loops. 
For directed graphs the node degree is defined to be the in-degree + out-degree. Graph, node, and edge attributes are copied to the subgraph. See Also -------- core_number References ---------- .. [1] An O(m) Algorithm for Cores Decomposition of Networks Vladimir Batagelj and Matjaz Zaversnik, 2003. http://arxiv.org/abs/cs.DS/0310049 """ if core_number is None: core_number=nx.core_number(G) if k is None: k=max(core_number.values()) # max core nodes=(n for n in core_number if core_number[n]>=k) return G.subgraph(nodes).copy() def k_shell(G,k=None,core_number=None): """Return the k-shell of G. The k-shell is the subgraph of nodes in the k-core but not in the (k+1)-core. Parameters ---------- G : NetworkX graph A graph or directed graph. k : int, optional The order of the shell. If not specified return the main shell. core_number : dictionary, optional Precomputed core numbers for the graph G. Returns ------- G : NetworkX graph The k-shell subgraph Raises ------ NetworkXError The k-shell is not defined for graphs with self loops or parallel edges. Notes ----- This is similar to k_corona but in that case only neighbors in the k-core are considered. Not implemented for graphs with parallel edges or self loops. For directed graphs the node degree is defined to be the in-degree + out-degree. Graph, node, and edge attributes are copied to the subgraph. See Also -------- core_number k_corona References ---------- .. [1] A model of Internet topology using k-shell decomposition Shai Carmi, Shlomo Havlin, Scott Kirkpatrick, Yuval Shavitt, and Eran Shir, PNAS July 3, 2007 vol. 104 no. 27 11150-11154 http://www.pnas.org/content/104/27/11150.full """ if core_number is None: core_number=nx.core_number(G) if k is None: k=max(core_number.values()) # max core nodes=(n for n in core_number if core_number[n]==k) return G.subgraph(nodes).copy() def k_crust(G,k=None,core_number=None): """Return the k-crust of G. The k-crust is the graph G with the k-core removed. 
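The `k_core` and `k_shell` subgraphs above are all cut from the same core-number map. The peeling idea behind it can be sketched in plain Python; this is a simplified quadratic reference version, not the O(m) bin-sort algorithm of Batagelj and Zaversnik used by `core_number`, and the function name and adjacency-dict input are mine:

```python
def core_number(adj):
    """Core numbers by repeatedly peeling a minimum-degree node.

    `adj` maps each node to a set of neighbours (undirected, no self loops).
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    core = {}
    k = 0
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))  # current minimum-degree node
        k = max(k, len(adj[v]))                  # core level never decreases
        core[v] = k
        for u in adj[v]:                         # remove v from the graph
            adj[u].discard(v)
        del adj[v]
    return core

# Triangle 0-1-2 with a pendant node 3: the triangle is the 2-core.
graph = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(core_number(graph))  # {3: 1, 0: 2, 1: 2, 2: 2}
```

Given the core numbers, the k-core keeps nodes with `core[n] >= k`, the k-shell those with `core[n] == k`, and the k-crust those with `core[n] <= k`.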
Parameters ---------- G : NetworkX graph A graph or directed graph. k : int, optional The order of the shell. If not specified return the main crust. core_number : dictionary, optional Precomputed core numbers for the graph G. Returns ------- G : NetworkX graph The k-crust subgraph Raises ------ NetworkXError The k-crust is not defined for graphs with self loops or parallel edges. Notes ----- This definition of k-crust is different from the definition in [1]_. The k-crust in [1]_ is equivalent to the k+1 crust of this algorithm. Not implemented for graphs with parallel edges or self loops. For directed graphs the node degree is defined to be the in-degree + out-degree. Graph, node, and edge attributes are copied to the subgraph. See Also -------- core_number References ---------- .. [1] A model of Internet topology using k-shell decomposition Shai Carmi, Shlomo Havlin, Scott Kirkpatrick, Yuval Shavitt, and Eran Shir, PNAS July 3, 2007 vol. 104 no. 27 11150-11154 http://www.pnas.org/content/104/27/11150.full """ if core_number is None: core_number=nx.core_number(G) if k is None: k=max(core_number.values())-1 nodes=(n for n in core_number if core_number[n]<=k) return G.subgraph(nodes).copy() def k_corona(G, k, core_number=None): """Return the k-corona of G. The k-corona is the subgraph of nodes in the k-core which have exactly k neighbours in the k-core. Parameters ---------- G : NetworkX graph A graph or directed graph k : int The order of the corona. core_number : dictionary, optional Precomputed core numbers for the graph G. Returns ------- G : NetworkX graph The k-corona subgraph Raises ------ NetworkXError The k-corona is not defined for graphs with self loops or parallel edges. Notes ----- Not implemented for graphs with parallel edges or self loops. For directed graphs the node degree is defined to be the in-degree + out-degree. Graph, node, and edge attributes are copied to the subgraph. See Also -------- core_number References ---------- ..
[1] k -core (bootstrap) percolation on complex networks: Critical phenomena and nonlocal effects, A. V. Goltsev, S. N. Dorogovtsev, and J. F. F. Mendes, Phys. Rev. E 73, 056101 (2006) http://link.aps.org/doi/10.1103/PhysRevE.73.056101 """ if core_number is None: core_number = nx.core_number(G) nodes = (n for n in core_number if core_number[n] == k and len([v for v in G[n] if core_number[v] >= k]) == k) return G.subgraph(nodes).copy() networkx-1.11/networkx/algorithms/tree/0000755000175000017500000000000012653231454020171 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/tree/recognition.py0000644000175000017500000001533212637544500023070 0ustar aricaric00000000000000#-*- coding: utf-8 -*- """ Recognition Tests ================= A *forest* is an acyclic, undirected graph, and a *tree* is a connected forest. Depending on the subfield, there are various conventions for generalizing these definitions to directed graphs. In one convention, directed variants of forest and tree are defined in an identical manner, except that the direction of the edges is ignored. In effect, each directed edge is treated as a single undirected edge. Then, additional restrictions are imposed to define *branchings* and *arborescences*. In another convention, directed variants of forest and tree correspond to the previous convention's branchings and arborescences, respectively. Then two new terms, *polyforest* and *polytree*, are defined to correspond to the other convention's forest and tree. Summarizing:: +-----------------------------+ | Convention A | Convention B | +=============================+ | forest | polyforest | | tree | polytree | | branching | forest | | arborescence | tree | +-----------------------------+ Each convention has its reasons. The first convention emphasizes definitional similarity in that directed forests and trees are only concerned with acyclicity and do not have an in-degree constraint, just as their undirected counterparts do not. 
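The generator expression at the heart of `k_corona` above selects nodes of core number k that have exactly k neighbours of core number at least k. A standalone sketch of that filter (function name, dict inputs, and the sample graph are mine):

```python
def k_corona(core, adj, k):
    """Nodes of core number k with exactly k neighbours whose core number is >= k.

    `core` maps node -> core number; `adj` maps node -> set of neighbours.
    """
    return {n for n, c in core.items()
            if c == k and sum(1 for v in adj[n] if core[v] >= k) == k}

# Triangle 0-1-2 plus pendant node 3: core numbers are {0: 2, 1: 2, 2: 2, 3: 1}.
core = {0: 2, 1: 2, 2: 2, 3: 1}
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(k_corona(core, adj, 2))  # {0, 1, 2}: each triangle node has 2 neighbours in the 2-core
```

Node 2 also has the pendant neighbour 3, but since `core[3] < 2` it does not count toward the 2-corona condition.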
The second convention emphasizes functional similarity in the sense that the directed analog of a spanning tree is a spanning arborescence. That is, take any spanning tree and choose one node as the root. Then every edge is assigned a direction such that there is a directed path from the root to every other node. The result is a spanning arborescence. NetworkX follows convention "A". Explicitly, these are: undirected forest An undirected graph with no undirected cycles. undirected tree A connected, undirected forest. directed forest A directed graph with no undirected cycles. Equivalently, the underlying graph structure (which ignores edge orientations) is an undirected forest. In convention B, this is known as a polyforest. directed tree A weakly connected, directed forest. Equivalently, the underlying graph structure (which ignores edge orientations) is an undirected tree. In convention B, this is known as a polytree. branching A directed forest with each node having, at most, one parent. So the maximum in-degree is equal to 1. In convention B, this is known as a forest. arborescence A directed tree with each node having, at most, one parent. So the maximum in-degree is equal to 1. In convention B, this is known as a tree. For trees and arborescences, the adjective "spanning" may be added to designate that the graph, when considered as a forest/branching, consists of a single tree/arborescence that includes all nodes in the graph. It is true, by definition, that every tree/arborescence is spanning with respect to the nodes that define the tree/arborescence and so, it might seem redundant to introduce the notion of "spanning". However, the nodes may represent a subset of nodes from a larger graph, and it is in this context that the term "spanning" becomes a useful notion.
""" import networkx as nx __author__ = """\n""".join([ 'Ferdinando Papale ', 'chebee7i ', ]) __all__ = ['is_arborescence', 'is_branching', 'is_forest', 'is_tree'] @nx.utils.not_implemented_for('undirected') def is_arborescence(G): """ Returns ``True`` if ``G`` is an arborescence. An arborescence is a directed tree with maximum in-degree equal to 1. Parameters ---------- G : graph The graph to test. Returns ------- b : bool A boolean that is ``True`` if ``G`` is an arborescence. Notes ----- In another convention, an arborescence is known as a *tree*. See Also -------- is_tree """ if not is_tree(G): return False if max(G.in_degree().values()) > 1: return False return True @nx.utils.not_implemented_for('undirected') def is_branching(G): """ Returns ``True`` if ``G`` is a branching. A branching is a directed forest with maximum in-degree equal to 1. Parameters ---------- G : directed graph The directed graph to test. Returns ------- b : bool A boolean that is ``True`` if ``G`` is a branching. Notes ----- In another convention, a branching is also known as a *forest*. See Also -------- is_forest """ if not is_forest(G): return False if max(G.in_degree().values()) > 1: return False return True def is_forest(G): """ Returns ``True`` if ``G`` is a forest. A forest is a graph with no undirected cycles. For directed graphs, ``G`` is a forest if the underlying graph is a forest. The underlying graph is obtained by treating each directed edge as a single undirected edge in a multigraph. Parameters ---------- G : graph The graph to test. Returns ------- b : bool A boolean that is ``True`` if ``G`` is a forest. Notes ----- In another convention, a directed forest is known as a *polyforest* and then *forest* corresponds to a *branching*. 
See Also -------- is_branching """ if len(G) == 0: raise nx.exception.NetworkXPointlessConcept('G has no nodes.') if G.is_directed(): components = nx.weakly_connected_component_subgraphs else: components = nx.connected_component_subgraphs for component in components(G): # Make sure the component is a tree. if component.number_of_edges() != component.number_of_nodes() - 1: return False return True def is_tree(G): """ Returns ``True`` if ``G`` is a tree. A tree is a connected graph with no undirected cycles. For directed graphs, ``G`` is a tree if the underlying graph is a tree. The underlying graph is obtained by treating each directed edge as a single undirected edge in a multigraph. Parameters ---------- G : graph The graph to test. Returns ------- b : bool A boolean that is ``True`` if ``G`` is a tree. Notes ----- In another convention, a directed tree is known as a *polytree* and then *tree* corresponds to an *arborescence*. See Also -------- is_arborescence """ if len(G) == 0: raise nx.exception.NetworkXPointlessConcept('G has no nodes.') # A connected graph with no cycles has n-1 edges. if G.number_of_edges() != len(G) - 1: return False if G.is_directed(): is_connected = nx.is_weakly_connected else: is_connected = nx.is_connected return is_connected(G) networkx-1.11/networkx/algorithms/tree/branchings.py0000644000175000017500000006122512637544500022670 0ustar aricaric00000000000000# encoding: utf-8 """ Algorithms for finding optimum branchings and spanning arborescences. This implementation is based on: J. Edmonds, Optimum branchings, J. Res. Natl. Bur. Standards 71B (1967), 233–240. 
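The edge-count test used by `is_tree` above (a graph is a tree iff it is connected and has exactly n - 1 edges) can be sketched without NetworkX; function name and the node/edge-list inputs are mine:

```python
from collections import deque

def is_tree(nodes, edges):
    """A tree is a connected graph with exactly n - 1 edges (hence no cycles)."""
    nodes = list(nodes)
    if not nodes:
        raise ValueError("not defined for the null graph")
    if len(edges) != len(nodes) - 1:
        return False  # too few edges (disconnected) or too many (contains a cycle)
    adj = {v: [] for v in nodes}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    # Edge count is right, so the graph is a tree iff it is connected: BFS.
    seen = {nodes[0]}
    queue = deque(seen)
    while queue:
        for w in adj[queue.popleft()]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == len(nodes)

print(is_tree([0, 1, 2, 3], [(0, 1), (1, 2), (1, 3)]))  # True (a star)
print(is_tree([0, 1, 2], [(0, 1), (1, 2), (2, 0)]))     # False (a cycle)
```

`is_forest` drops the connectivity requirement and instead applies the same n - 1 edge count to each connected component, which is exactly what the component loop above does.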
URL: http://archive.org/details/jresv71Bn4p233 """ # TODO: Implement method from Gabow, Galil, Spence and Tarjan: # #@article{ # year={1986}, # issn={0209-9683}, # journal={Combinatorica}, # volume={6}, # number={2}, # doi={10.1007/BF02579168}, # title={Efficient algorithms for finding minimum spanning trees in # undirected and directed graphs}, # url={http://dx.doi.org/10.1007/BF02579168}, # publisher={Springer-Verlag}, # keywords={68 B 15; 68 C 05}, # author={Gabow, Harold N. and Galil, Zvi and Spencer, Thomas and Tarjan, # Robert E.}, # pages={109-122}, # language={English} #} from __future__ import division from __future__ import print_function import string import random from operator import itemgetter import networkx as nx from .recognition import * __all__ = [ 'branching_weight', 'greedy_branching', 'maximum_branching', 'minimum_branching', 'maximum_spanning_arborescence', 'minimum_spanning_arborescence', 'Edmonds' ] KINDS = set(['max', 'min']) STYLES = { 'branching': 'branching', 'arborescence': 'arborescence', 'spanning arborescence': 'arborescence' } INF = float('inf') def edge_subgraph(G, ebunch): """Return the subgraph induced on edges in `ebunch`. The induced subgraph of the graph contains the edges appearing in `ebunch` and only the nodes that appear in some edge in `ebunch`. Parameters ---------- ebunch : list, iterable A container of edges as 3-tuples (u, v, key), which will be iterated through exactly once. Returns ------- H : MultiDiGraph A subgraph of the graph with the same graph, node, and edge attributes. Notes ----- The graph, edge or node attributes just point to the original graph. So changes to the node or edge structure will not be reflected in the original graph while changes to the attributes will. 
To create a subgraph with its own copy of the edge/node attributes use: nx.Graph(G.subgraph(nbunch)) If edge attributes are containers, a deep copy can be obtained using: G.subgraph(nbunch).copy() For an inplace reduction of a graph to a subgraph you can remove nodes: G.remove_nodes_from([n for n in G if n not in set(nbunch)]) Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> H = G.subgraph([0,1,2]) >>> H.edges() [(0, 1), (1, 2)] """ # TODO: Add all the bells and whistles that G.subgraph has. # Note: MultiDiGraph seems to point to node/graph and copy edge attributes. # Seems inconsistent. # create new graph and copy subgraph into it H = G.__class__() # add edges G_succ = G.succ for u, v, key in ebunch: try: attrs = G_succ[u][v][key] except KeyError: raise KeyError('Invalid edge: ({0}, {1}, {2})'.format(u, v, key)) H.add_edge(u, v, key=key, **attrs) # copy node and graph attributes for u in H: H.node[u] = G.node[u].copy() H.graph = G.graph.copy() return H def random_string(L=15, seed=None): random.seed(seed) return ''.join([random.choice(string.ascii_letters) for n in range(L)]) def _min_weight(weight): return -weight def _max_weight(weight): return weight def branching_weight(G, attr='weight', default=1): """ Returns the total weight of a branching. """ return sum(edge[2].get(attr, default) for edge in G.edges(data=True)) def greedy_branching(G, attr='weight', default=1, kind='max'): """ Returns a branching obtained through a greedy algorithm. This algorithm is wrong, and cannot give a proper optimal branching. However, we include it for pedagogical reasons, as it can be helpful to see what its outputs are. The output is a branching, and possibly, a spanning arborescence. However, it is not guaranteed to be optimal in either case. Parameters ---------- G : DiGraph The directed graph to scan. attr : str The attribute to use as weights. If None, then each edge will be treated equally with a weight of 1.
default : float When `attr` is not None, then if an edge does not have that attribute, `default` specifies what value it should take. kind : str The type of optimum to search for: 'min' or 'max' greedy branching. Returns ------- B : directed graph The greedily obtained branching. """ if kind not in KINDS: raise nx.NetworkXException("Unknown value for `kind`.") if kind == 'min': reverse = False else: reverse = True if attr is None: # Generate a random string the graph probably won't have. attr = random_string() edges = [(u, v, data.get(attr, default)) for (u, v, data) in G.edges(data=True)] # We sort by weight, but also by nodes to normalize behavior across runs. try: edges.sort(key=itemgetter(2, 0, 1), reverse=reverse) except TypeError: # This will fail in Python 3.x if the nodes are of varying types. # In that case, we use the arbitrary order. edges.sort(key=itemgetter(2), reverse=reverse) # The branching begins with a forest of no edges. B = nx.DiGraph() B.add_nodes_from(G) # Now we add edges greedily so long as we maintain the branching. uf = nx.utils.UnionFind() for i, (u, v, w) in enumerate(edges): if uf[u] == uf[v]: # Adding this edge would form a directed cycle. continue elif B.in_degree(v) == 1: # The edge would increase the degree to be greater than one. continue else: # If attr was None, then don't insert weights... data = {} if attr is not None: data[attr] = w B.add_edge(u, v, **data) uf.union(u, v) return B class MultiDiGraph_EdgeKey(nx.MultiDiGraph): """ MultiDiGraph which assigns unique keys to every edge. Adds a dictionary edge_index which maps edge keys to (u, v, data) tuples. This is not a complete implementation. For Edmonds algorithm, we only use add_node and add_edge, so that is all that is implemented here. During additions, any specified keys are ignored---this means that you also cannot update edge attributes through add_node and add_edge. Why do we need this?
Edmonds' algorithm requires that we track edges reliably across graph mutations, even as the head, tail, or weight of an edge changes. """ def __init__(self, data=None, **attr): cls = super(MultiDiGraph_EdgeKey, self) cls.__init__(data=data, **attr) self._cls = cls self.edge_index = {} def remove_node(self, n): keys = set([]) for keydict in self.pred[n].values(): keys.update(keydict) for keydict in self.succ[n].values(): keys.update(keydict) for key in keys: del self.edge_index[key] self._cls.remove_node(n) def remove_nodes_from(self, nbunch): for n in nbunch: self.remove_node(n) def add_edge(self, u, v, key, attr_dict=None, **attr): """ Key is now required. """ if key in self.edge_index: uu, vv, _ = self.edge_index[key] if (u != uu) or (v != vv): raise Exception("Key {0!r} is already in use.".format(key)) self._cls.add_edge(u, v, key=key, attr_dict=attr_dict, **attr) self.edge_index[key] = (u, v, self.succ[u][v][key]) def add_edges_from(self, ebunch, attr_dict=None, **attr): raise NotImplementedError def remove_edge_with_key(self, key): try: u, v, _ = self.edge_index[key] except KeyError: raise KeyError('Invalid edge key {0!r}'.format(key)) else: del self.edge_index[key] self._cls.remove_edge(u, v, key) def remove_edges_from(self, ebunch): raise NotImplementedError def get_path(G, u, v): """ Returns the edge keys of the unique path between u and v. This is not a generic function. G must be a branching and an instance of MultiDiGraph_EdgeKey. """ nodes = nx.shortest_path(G, u, v) # We are guaranteed that there is only one edge connecting each pair of # consecutive nodes in the shortest path. def first_key(i, vv): # Needed for 2.x/3.x compatibility keys = G[nodes[i]][vv].keys() # Normalize behavior keys = list(keys) return keys[0] edges = [first_key(i, vv) for i, vv in enumerate(nodes[1:])] return nodes, edges class Edmonds(object): """ Edmonds' algorithm for finding optimal branchings and spanning arborescences.
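The bookkeeping idea behind `MultiDiGraph_EdgeKey` can be shown in miniature: give every edge a stable key so it can still be located after a contraction rewrites its endpoints. The `edge_index`, `add_edge_sketch`, and `contract_sketch` names below are illustrative standalone code, not the class's actual API:

```python
edge_index = {}  # key -> (u, v, weight); the key never changes

def add_edge_sketch(key, u, v, weight):
    edge_index[key] = (u, v, weight)

def contract_sketch(circuit_nodes, new_node):
    """Redirect every endpoint inside `circuit_nodes` to `new_node`."""
    for key, (u, v, w) in list(edge_index.items()):
        uu = new_node if u in circuit_nodes else u
        vv = new_node if v in circuit_nodes else v
        edge_index[key] = (uu, vv, w)

add_edge_sketch('e0', 'a', 'b', 2)
add_edge_sketch('e1', 'b', 'c', 3)
contract_sketch({'a', 'b'}, 'ab')
# 'e1' is still findable by its key even though its tail moved; 'e0' became
# a self-loop, which the real algorithm simply deletes as a circuit edge.
print(edge_index['e1'])  # -> ('ab', 'c', 3)
```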
""" def __init__(self, G, seed=None): self.G_original = G # Need to fix this. We need the whole tree. self.store = True # The final answer. self.edges = [] # Since we will be creating graphs with new nodes, we need to make # sure that our node names do not conflict with the real node names. self.template = random_string(seed=seed) + '_{0}' def _init(self, attr, default, kind, style): if kind not in KINDS: raise nx.NetworkXException("Unknown value for `kind`.") # Store inputs. self.attr = attr self.default = default self.kind = kind self.style = style # Determine how we are going to transform the weights. if kind == 'min': self.trans = trans = _min_weight else: self.trans = trans = _max_weight if attr is None: # Generate a random attr the graph probably won't have. attr = random_string() # This is the actual attribute used by the algorithm. self._attr = attr # The object we manipulate at each step is a multidigraph. self.G = G = MultiDiGraph_EdgeKey() for key, (u, v, data) in enumerate(self.G_original.edges(data=True)): d = {attr: trans(data.get(attr, default))} G.add_edge(u, v, key, **d) self.level = 0 # These are the "buckets" from the paper. # # As in the paper, G^i are modified versions of the original graph. # D^i and E^i are nodes and edges of the maximal edges that are # consistent with G^i. These are dashed edges in figures A-F of the # paper. In this implementation, we store D^i and E^i together as a # graph B^i. So we will have strictly more B^i than the paper does. self.B = MultiDiGraph_EdgeKey() self.B.edge_index = {} self.graphs = [] # G^i self.branchings = [] # B^i self.uf = nx.utils.UnionFind() # A list of lists of edge indexes. Each list is a circuit for graph G^i. # Note the edge list will not, in general, be a circuit in graph G^0. self.circuits = [] # Stores the index of the minimum edge in the circuit found in G^i and B^i. # The ordering of the edges seems to preserve the weight ordering from G^0. 
# So even if the circuit does not form a circuit in G^0, it is still true # that the minimum edge of the circuit in G^i is still the minimum edge # of the circuit in G^0 (despite their weights being different). self.minedge_circuit = [] def find_optimum(self, attr='weight', default=1, kind='max', style='branching'): """ Returns a branching from G. Parameters ---------- attr : str The edge attribute used in determining optimality. default : float The value of the edge attribute used if an edge does not have the attribute `attr`. kind : {'min', 'max'} The type of optimum to search for, either 'min' or 'max'. style : {'branching', 'arborescence'} If 'branching', then an optimal branching is found. If `style` is 'arborescence', then a branching is found, such that if the branching is also an arborescence, then the branching is an optimal spanning arborescence. A given graph G need not have an optimal spanning arborescence. Returns ------- H : (multi)digraph The branching. """ self._init(attr, default, kind, style) uf = self.uf # This enormous while loop could use some refactoring... G, B = self.G, self.B D = set([]) nodes = iter(list(G.nodes())) attr = self._attr G_pred = G.pred def desired_edge(v): """ Find the edge directed toward v with maximal weight. """ edge = None weight = -INF for u, _, key, data in G.in_edges(v, data=True, keys=True): new_weight = data[attr] if new_weight > weight: weight = new_weight edge = (u, v, key, new_weight) return edge, weight while True: # (I1): Choose a node v in G^i not in D^i. try: v = next(nodes) except StopIteration: # If there are no more new nodes to consider, then we *should* # meet the break condition (b) from the paper: # (b) every node of G^i is in D^i and E^i is a branching # Construction guarantees that it's a branching. assert( len(G) == len(B) ) if len(B): assert( is_branching(B) ) if self.store: self.graphs.append(G.copy()) self.branchings.append(B.copy()) # Add these to keep the lengths equal.
Element i is the # circuit at level i that was merged to form branching i+1. # There is no circuit for the last level. self.circuits.append([]) self.minedge_circuit.append(None) break else: if v in D: #print("v in D", v) continue # Put v into bucket D^i. #print("Adding node {0}".format(v)) D.add(v) B.add_node(v) edge, weight = desired_edge(v) #print("Max edge is {0!r}".format(edge)) if edge is None: # If there is no edge, continue with a new node at (I1). continue else: # Determine if adding the edge to E^i would mean it's no longer # a branching. Presently, v has indegree 0 in B---it is a root. u = edge[0] if uf[u] == uf[v]: # Then adding the edge will create a circuit. Then B # contains a unique path P from v to u. So condition (a) # from the paper does hold. We need to store the circuit # for future reference. Q_nodes, Q_edges = get_path(B, v, u) Q_edges.append(edge[2]) else: # Then B with the edge is still a branching and condition # (a) from the paper does not hold. Q_nodes, Q_edges = None, None # Conditions for adding the edge. # If weight < 0, then it cannot help in finding a maximum branching. if self.style == 'branching' and weight <= 0: acceptable = False else: acceptable = True #print("Edge is acceptable: {0}".format(acceptable)) if acceptable: dd = {attr: weight} B.add_edge(u, v, key=edge[2], **dd) G[u][v][edge[2]]['candidate'] = True uf.union(u, v) if Q_edges is not None: #print("Edge introduced a simple cycle:") #print(Q_nodes, Q_edges) # Move to method # Previous meaning of u and v is no longer important. # Apply (I2). # Get the edge in the cycle with the minimum weight. # Also, save the incoming weights for each node.
minweight = INF minedge = None Q_incoming_weight = {} for edge_key in Q_edges: u, v, data = B.edge_index[edge_key] w = data[attr] Q_incoming_weight[v] = w if w < minweight: minweight = w minedge = edge_key self.circuits.append(Q_edges) self.minedge_circuit.append(minedge) if self.store: self.graphs.append(G.copy()) # Always need the branching with circuits. self.branchings.append(B.copy()) # Now we mutate it. new_node = self.template.format(self.level) #print(minweight, minedge, Q_incoming_weight) G.add_node(new_node) new_edges = [] for u, v, key, data in G.edges(data=True, keys=True): if u in Q_incoming_weight: if v in Q_incoming_weight: # Circuit edge, do nothing for now. # Eventually delete it. continue else: # Outgoing edge. Make it from new node dd = data.copy() new_edges.append((new_node, v, key, dd)) else: if v in Q_incoming_weight: # Incoming edge. Change its weight w = data[attr] w += minweight - Q_incoming_weight[v] dd = data.copy() dd[attr] = w new_edges.append((u, new_node, key, dd)) else: # Outside edge. No modification necessary. continue G.remove_nodes_from(Q_nodes) B.remove_nodes_from(Q_nodes) D.difference_update(set(Q_nodes)) for u, v, key, data in new_edges: G.add_edge(u, v, key, **data) if 'candidate' in data: del data['candidate'] B.add_edge(u, v, key, **data) uf.union(u, v) nodes = iter(list(G.nodes())) self.level += 1 # (I3) Branch construction. #print(self.level) H = self.G_original.__class__() def is_root(G, u, edgekeys): """ Returns True if `u` is a root node in G. Node `u` will be a root node if its in-degree, restricted to the specified edges, is equal to 0. """ if u not in G: #print(G.nodes(), u) raise Exception('{0!r} not in G'.format(u)) for v in G.pred[u]: for edgekey in G.pred[u][v]: if edgekey in edgekeys: return False, edgekey else: return True, None # Start with the branching edges in the last level. 
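The reweighting applied to incoming edges above can be isolated: when a circuit is contracted, an edge entering the circuit at node v gets weight `w + minweight - Q_incoming_weight[v]`, so that choosing it implicitly pays for discarding v's circuit edge. A small sketch of just that arithmetic (`adjusted_incoming_weight` is a hypothetical helper, not part of the implementation):

```python
def adjusted_incoming_weight(w, minweight, circuit_in_weight):
    # w: weight of the external edge entering the circuit at node v
    # minweight: weight of the lightest edge on the circuit
    # circuit_in_weight: weight of the circuit edge entering v
    return w + minweight - circuit_in_weight

# An external edge of weight 10 enters at a node whose circuit edge weighs 7;
# the lightest circuit edge weighs 5, so the contracted edge weighs 8.
print(adjusted_incoming_weight(10, 5, 7))  # -> 8
```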
edges = set(self.branchings[self.level].edge_index) while self.level > 0: self.level -= 1 # The current level is i, and we start counting from 0. # We need the node at level i+1 that results from merging a circuit # at level i. randomname_0 is the first merged node and this # happens at level 1. That is, randomname_0 is a node at level 1 # that results from merging a circuit at level 0. merged_node = self.template.format(self.level) # The circuit at level i that was merged as a node the graph # at level i+1. circuit = self.circuits[self.level] #print #print(merged_node, self.level, circuit) #print("before", edges) # Note, we ask if it is a root in the full graph, not the branching. # The branching alone doesn't have all the edges. isroot, edgekey = is_root(self.graphs[self.level + 1], merged_node, edges) edges.update(circuit) if isroot: minedge = self.minedge_circuit[self.level] if minedge is None: raise Exception # Remove the edge in the cycle with minimum weight. edges.remove(minedge) else: # We have identified an edge at next higher level that # transitions into the merged node at the level. That edge # transitions to some corresponding node at the current level. # We want to remove an edge from the cycle that transitions # into the corresponding node. #print("edgekey is: ", edgekey) #print("circuit is: ", circuit) # The branching at level i G = self.graphs[self.level] #print(G.edge_index) target = G.edge_index[edgekey][1] for edgekey in circuit: u, v, data = G.edge_index[edgekey] if v == target: break else: raise Exception("Couldn't find edge incoming to merged node.") #print("not a root. removing {0}".format(edgekey)) edges.remove(edgekey) self.edges = edges H.add_nodes_from(self.G_original) for edgekey in edges: u, v, d = self.graphs[0].edge_index[edgekey] dd = {self.attr: self.trans(d[self.attr])} # TODO: make this preserve the key. In fact, make this use the # same edge attributes as the original graph. 
H.add_edge(u, v, **dd) return H def maximum_branching(G, attr='weight', default=1): ed = Edmonds(G) B = ed.find_optimum(attr, default, kind='max', style='branching') return B def minimum_branching(G, attr='weight', default=1): ed = Edmonds(G) B = ed.find_optimum(attr, default, kind='min', style='branching') return B def maximum_spanning_arborescence(G, attr='weight', default=1): ed = Edmonds(G) B = ed.find_optimum(attr, default, kind='max', style='arborescence') if not is_arborescence(B): msg = 'No maximum spanning arborescence in G.' raise nx.exception.NetworkXException(msg) return B def minimum_spanning_arborescence(G, attr='weight', default=1): ed = Edmonds(G) B = ed.find_optimum(attr, default, kind='min', style='arborescence') if not is_arborescence(B): msg = 'No minimum spanning arborescence in G.' raise nx.exception.NetworkXException(msg) return B docstring_branching = """ Returns a {kind} {style} from G. Parameters ---------- G : (multi)digraph-like The graph to be searched. attr : str The edge attribute used in determining optimality. default : float The value of the edge attribute used if an edge does not have the attribute `attr`. Returns ------- B : (multi)digraph-like A {kind} {style}.
""" maximum_branching.__doc__ = \ docstring_branching.format(kind='maximum', style='branching') minimum_branching.__doc__ = \ docstring_branching.format(kind='minimum', style='branching') maximum_spanning_arborescence.__doc__ = \ docstring_arborescence.format(kind='maximum', style='spanning arborescence') minimum_spanning_arborescence.__doc__ = \ docstring_arborescence.format(kind='minimum', style='spanning arborescence') networkx-1.11/networkx/algorithms/tree/__init__.py0000644000175000017500000000006512637544500022304 0ustar aricaric00000000000000from .recognition import * from .branchings import * networkx-1.11/networkx/algorithms/tree/tests/0000755000175000017500000000000012653231454021333 5ustar aricaric00000000000000networkx-1.11/networkx/algorithms/tree/tests/test_branchings.py0000644000175000017500000002331712637544500025071 0ustar aricaric00000000000000from nose import SkipTest from nose.tools import * import networkx as nx try: import numpy as np except: raise SkipTest('NumPy not available.') from networkx.algorithms.tree import branchings from networkx.algorithms.tree import recognition from networkx.testing import * # # Explicitly discussed examples from Edmonds paper. # # Used in Figures A-F. # G_array = np.array([ # 0 1 2 3 4 5 6 7 8 [ 0, 0, 12, 0, 12, 0, 0, 0, 0], # 0 [ 4, 0, 0, 0, 0, 13, 0, 0, 0], # 1 [ 0, 17, 0, 21, 0, 12, 0, 0, 0], # 2 [ 5, 0, 0, 0, 17, 0, 18, 0, 0], # 3 [ 0, 0, 0, 0, 0, 0, 0, 12, 0], # 4 [ 0, 0, 0, 0, 0, 0, 14, 0, 12], # 5 [ 0, 0, 21, 0, 0, 0, 0, 0, 15], # 6 [ 0, 0, 0, 19, 0, 0, 15, 0, 0], # 7 [ 0, 0, 0, 0, 0, 0, 0, 18, 0], # 8 ], dtype=int) # We convert to MultiDiGraph after using from_numpy_matrix # https://github.com/networkx/networkx/pull/1305 def G1(): G = nx.DiGraph() G = nx.from_numpy_matrix(G_array, create_using=G) G = nx.MultiDiGraph(G) return G def G2(): # Now we shift all the weights by -10. # Should not affect optimal arborescence, but does affect optimal branching. 
G = nx.DiGraph() Garr = G_array.copy() Garr[np.nonzero(Garr)] -= 10 G = nx.from_numpy_matrix(Garr, create_using=G) G = nx.MultiDiGraph(G) return G # An optimal branching for G1 that is also a spanning arborescence. So it is # also an optimal spanning arborescence. # optimal_arborescence_1 = [ (0, 2, 12), (2, 1, 17), (2, 3, 21), (1, 5, 13), (3, 4, 17), (3, 6, 18), (6, 8, 15), (8, 7, 18), ] # For G2, the optimal branching of G1 (with shifted weights) is no longer # an optimal branching, but it is still an optimal spanning arborescence # (just with shifted weights). An optimal branching for G2 is similar to what # appears in figure G (this is greedy_subopt_branching_1a below), but with the # edge (3, 0, 5), which is now (3, 0, -5), removed. Thus, the optimal branching # is not a spanning arborescence. The code finds optimal_branching_2a. # An alternative and equivalent branching is optimal_branching_2b. We would # need to modify the code to iterate through all equivalent optimal branchings. # # These are maximal branchings or arborescences. optimal_branching_2a = [ (5, 6, 4), (6, 2, 11), (6, 8, 5), (8, 7, 8), (2, 1, 7), (2, 3, 11), (3, 4, 7), ] optimal_branching_2b = [ (8, 7, 8), (7, 3, 9), (3, 4, 7), (3, 6, 8), (6, 2, 11), (2, 1, 7), (1, 5, 3), ] optimal_arborescence_2 = [ (0, 2, 2), (2, 1, 7), (2, 3, 11), (1, 5, 3), (3, 4, 7), (3, 6, 8), (6, 8, 5), (8, 7, 8), ] # Two suboptimal maximal branchings on G1 obtained from a greedy algorithm. # 1a matches what is shown in Figure G in Edmonds's paper. 
greedy_subopt_branching_1a = [ (5, 6, 14), (6, 2, 21), (6, 8, 15), (8, 7, 18), (2, 1, 17), (2, 3, 21), (3, 0, 5), (3, 4, 17), ] greedy_subopt_branching_1b = [ (8, 7, 18), (7, 6, 15), (6, 2, 21), (2, 1, 17), (2, 3, 21), (1, 5, 13), (3, 0, 5), (3, 4, 17), ] def build_branching(edges): G = nx.DiGraph() for u, v, weight in edges: G.add_edge(u, v, weight=weight) return G def sorted_edges(G, attr='weight', default=1): edges = [(u,v,data.get(attr, default)) for (u,v,data) in G.edges(data=True)] edges = sorted(edges, key=lambda x: x[2]) return edges def assert_equal_branchings(G1, G2, attr='weight', default=1): edges1 = G1.edges(data=True) edges2 = G2.edges(data=True) # Grab the weights only. e1 = sorted_edges(G1, attr, default) e2 = sorted_edges(G2, attr, default) # If we have an exception, let's see the edges. print(e1) print(e2) print for a, b in zip(e1, e2): assert_equal(a[:2], b[:2]) np.testing.assert_almost_equal(a[2], b[2]) assert_equal(len(edges1), len(edges2)) ################ def test_optimal_branching1(): G = build_branching(optimal_arborescence_1) assert_true(recognition.is_arborescence(G), True) assert_equal(branchings.branching_weight(G), 131) def test_optimal_branching2a(): G = build_branching(optimal_branching_2a) assert_true(recognition.is_arborescence(G), True) assert_equal(branchings.branching_weight(G), 53) def test_optimal_branching2b(): G = build_branching(optimal_branching_2b) assert_true(recognition.is_arborescence(G), True) assert_equal(branchings.branching_weight(G), 53) def test_optimal_arborescence2(): G = build_branching(optimal_arborescence_2) assert_true(recognition.is_arborescence(G), True) assert_equal(branchings.branching_weight(G), 51) def test_greedy_suboptimal_branching1a(): G = build_branching(greedy_subopt_branching_1a) assert_true(recognition.is_arborescence(G), True) assert_equal(branchings.branching_weight(G), 128) def test_greedy_suboptimal_branching1b(): G = build_branching(greedy_subopt_branching_1b) 
assert_true(recognition.is_arborescence(G), True) assert_equal(branchings.branching_weight(G), 127) def test_greedy_max1(): # Standard test. # G = G1() B = branchings.greedy_branching(G) # There are only two possible greedy branchings. The sorting is such # that it should equal the second suboptimal branching: 1b. B_ = build_branching(greedy_subopt_branching_1b) assert_equal_branchings(B, B_) def test_greedy_max2(): # Different default weight. # G = G1() del G[1][0][0]['weight'] B = branchings.greedy_branching(G, default=6) # Chosen so that edge (3,0,5) is not selected and (1,0,6) is instead. edges = [ (1, 0, 6), (1, 5, 13), (7, 6, 15), (2, 1, 17), (3, 4, 17), (8, 7, 18), (2, 3, 21), (6, 2, 21), ] B_ = build_branching(edges) assert_equal_branchings(B, B_) def test_greedy_max3(): # All equal weights. # G = G1() B = branchings.greedy_branching(G, attr=None) # This is mostly arbitrary...the output was generated by running the algo. edges = [ (2, 1, 1), (3, 0, 1), (3, 4, 1), (5, 8, 1), (6, 2, 1), (7, 3, 1), (7, 6, 1), (8, 7, 1), ] B_ = build_branching(edges) assert_equal_branchings(B, B_, default=1) def test_greedy_min(): G = G1() B = branchings.greedy_branching(G, kind='min') edges = [ (1, 0, 4), (0, 2, 12), (0, 4, 12), (2, 5, 12), (4, 7, 12), (5, 8, 12), (5, 6, 14), (7, 3, 19) ] B_ = build_branching(edges) assert_equal_branchings(B, B_) def test_edmonds1_maxbranch(): G = G1() x = branchings.maximum_branching(G) x_ = build_branching(optimal_arborescence_1) assert_equal_branchings(x, x_) def test_edmonds1_maxarbor(): G = G1() x = branchings.maximum_spanning_arborescence(G) x_ = build_branching(optimal_arborescence_1) assert_equal_branchings(x, x_) def test_edmonds2_maxbranch(): G = G2() x = branchings.maximum_branching(G) x_ = build_branching(optimal_branching_2a) assert_equal_branchings(x, x_) def test_edmonds2_maxarbor(): G = G2() x = branchings.maximum_spanning_arborescence(G) x_ = build_branching(optimal_arborescence_2) assert_equal_branchings(x, x_) def 
test_edmonds2_minarbor(): G = G1() x = branchings.minimum_spanning_arborescence(G) # This was obtained from algorithm. Need to verify it independently. # Branch weight is: 96 edges = [ (3, 0, 5), (0, 2, 12), (0, 4, 12), (2, 5, 12), (4, 7, 12), (5, 8, 12), (5, 6, 14), (2, 1, 17) ] x_ = build_branching(edges) assert_equal_branchings(x, x_) def test_edmonds3_minbranch1(): G = G1() x = branchings.minimum_branching(G) edges = [] x_ = build_branching(edges) assert_equal_branchings(x, x_) def test_edmonds3_minbranch2(): G = G1() G.add_edge(8, 9, weight=-10) x = branchings.minimum_branching(G) edges = [(8, 9, -10)] x_ = build_branching(edges) assert_equal_branchings(x, x_) # Need more tests def test_mst(): # Make sure we get the same results for undirected graphs. # Example from: http://en.wikipedia.org/wiki/Kruskal's_algorithm G = nx.Graph() edgelist = [(0, 3, [('weight', 5)]), (0, 1, [('weight', 7)]), (1, 3, [('weight', 9)]), (1, 2, [('weight', 8)]), (1, 4, [('weight', 7)]), (3, 4, [('weight', 15)]), (3, 5, [('weight', 6)]), (2, 4, [('weight', 5)]), (4, 5, [('weight', 8)]), (4, 6, [('weight', 9)]), (5, 6, [('weight', 11)])] G.add_edges_from(edgelist) G = G.to_directed() x = branchings.minimum_spanning_arborescence(G) edges = [(set([0, 1]), 7), (set([0, 3]), 5), (set([3, 5]), 6), (set([1, 4]), 7), (set([4, 2]), 5), (set([4, 6]), 9)] assert_equal(x.number_of_edges(), len(edges)) for u, v, d in x.edges(data=True): assert_true( (set([u,v]), d['weight']) in edges ) def test_mixed_nodetypes(): # Smoke test to make sure no TypeError is raised for mixed node types. G = nx.Graph() edgelist = [(0, 3, [('weight', 5)]), (0, '1', [('weight', 5)])] G.add_edges_from(edgelist) G = G.to_directed() x = branchings.minimum_spanning_arborescence(G) def test_edmonds1_minbranch(): # Using -G_array and min should give the same as optimal_arborescence_1, # but with all edges negative. 
edges = [ (u, v, -w) for (u, v, w) in optimal_arborescence_1 ] G = nx.DiGraph() G = nx.from_numpy_matrix(-G_array, create_using=G) # Quickly make sure max branching is empty. x = branchings.maximum_branching(G) x_ = build_branching([]) assert_equal_branchings(x, x_) # Now test the min branching. x = branchings.minimum_branching(G) x_ = build_branching(edges) assert_equal_branchings(x, x_) networkx-1.11/networkx/algorithms/tree/tests/test_recognition.py0000644000175000017500000000773512637544450025305 0ustar aricaric00000000000000 from nose.tools import * import networkx as nx class TestTreeRecognition(object): graph = nx.Graph multigraph = nx.MultiGraph def setUp(self): self.T1 = self.graph() self.T2 = self.graph() self.T2.add_node(1) self.T3 = self.graph() self.T3.add_nodes_from(range(5)) edges = [(i,i+1) for i in range(4)] self.T3.add_edges_from(edges) self.T5 = self.multigraph() self.T5.add_nodes_from(range(5)) edges = [(i,i+1) for i in range(4)] self.T5.add_edges_from(edges) self.T6 = self.graph() self.T6.add_nodes_from([6,7]) self.T6.add_edge(6,7) self.F1 = nx.compose(self.T6, self.T3) self.N4 = self.graph() self.N4.add_node(1) self.N4.add_edge(1,1) self.N5 = self.graph() self.N5.add_nodes_from(range(5)) self.N6 = self.graph() self.N6.add_nodes_from(range(3)) self.N6.add_edges_from([(0,1),(1,2),(2,0)]) self.NF1 = nx.compose(self.T6,self.N6) @raises(nx.NetworkXPointlessConcept) def test_null_tree(self): nx.is_tree(self.graph()) nx.is_tree(self.multigraph()) @raises(nx.NetworkXPointlessConcept) def test_null_forest(self): nx.is_forest(self.graph()) nx.is_forest(self.multigraph()) def test_is_tree(self): assert_true(nx.is_tree(self.T2)) assert_true(nx.is_tree(self.T3)) assert_true(nx.is_tree(self.T5)) def test_is_not_tree(self): assert_false(nx.is_tree(self.N4)) assert_false(nx.is_tree(self.N5)) assert_false(nx.is_tree(self.N6)) def test_is_forest(self): assert_true(nx.is_forest(self.T2)) assert_true(nx.is_forest(self.T3)) assert_true(nx.is_forest(self.T5)) 
assert_true(nx.is_forest(self.F1)) assert_true(nx.is_forest(self.N5)) def test_is_not_forest(self): assert_false(nx.is_forest(self.N4)) assert_false(nx.is_forest(self.N6)) assert_false(nx.is_forest(self.NF1)) class TestDirectedTreeRecognition(TestTreeRecognition): graph = nx.DiGraph multigraph = nx.MultiDiGraph def test_disconnected_graph(): # https://github.com/networkx/networkx/issues/1144 G = nx.Graph() G.add_edges_from([(0, 1), (1, 2), (2, 0), (3, 4)]) assert_false(nx.is_tree(G)) G = nx.DiGraph() G.add_edges_from([(0, 1), (1, 2), (2, 0), (3, 4)]) assert_false(nx.is_tree(G)) def test_dag_nontree(): G = nx.DiGraph() G.add_edges_from([(0,1), (0,2), (1,2)]) assert_false(nx.is_tree(G)) assert_true(nx.is_directed_acyclic_graph(G)) def test_multicycle(): G = nx.MultiDiGraph() G.add_edges_from([(0,1), (0,1)]) assert_false(nx.is_tree(G)) assert_true(nx.is_directed_acyclic_graph(G)) def test_emptybranch(): G = nx.DiGraph() G.add_nodes_from(range(10)) assert_true(nx.is_branching(G)) assert_false(nx.is_arborescence(G)) def test_path(): G = nx.DiGraph() G.add_path(range(5)) assert_true(nx.is_branching(G)) assert_true(nx.is_arborescence(G)) def test_notbranching1(): # Acyclic violation. G = nx.MultiDiGraph() G.add_nodes_from(range(10)) G.add_edges_from([(0,1),(1,0)]) assert_false(nx.is_branching(G)) assert_false(nx.is_arborescence(G)) def test_notbranching2(): # In-degree violation. G = nx.MultiDiGraph() G.add_nodes_from(range(10)) G.add_edges_from([(0,1),(0,2),(3,2)]) assert_false(nx.is_branching(G)) assert_false(nx.is_arborescence(G)) def test_notarborescence1(): # Not an arborescence due to not spanning. G = nx.MultiDiGraph() G.add_nodes_from(range(10)) G.add_edges_from([(0,1),(0,2),(1,3),(5,6)]) assert_true(nx.is_branching(G)) assert_false(nx.is_arborescence(G)) def test_notarborescence2(): # Not an arborescence due to in-degree violation. 
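The two conditions these recognition tests exercise — acyclicity and in-degree at most one — can be checked directly on an edge list. A hypothetical standalone sketch (`is_branching_sketch` is not the NetworkX implementation, which operates on graph objects):

```python
def is_branching_sketch(edges, nodes):
    """True iff (nodes, edges) is a branching: a forest with in-degree <= 1."""
    indegree = {n: 0 for n in nodes}
    parent = {}  # toy union-find over the underlying undirected graph

    def find(x):
        while parent.get(x, x) != x:
            x = parent[x]
        return x

    for u, v in edges:
        indegree[v] += 1
        if indegree[v] > 1:
            return False  # in-degree violation
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # edge closes a cycle, so not a forest
        parent[ru] = rv
    return True

print(is_branching_sketch([(0, 1), (0, 2), (1, 3)], range(4)))  # -> True
print(is_branching_sketch([(0, 1), (2, 1)], range(3)))          # -> False
```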
G = nx.MultiDiGraph() G.add_path(range(5)) G.add_edge(6, 4) assert_false(nx.is_branching(G)) assert_false(nx.is_arborescence(G)) networkx-1.11/networkx/algorithms/triads.py0000644000175000017500000001261412637544500021077 0ustar aricaric00000000000000# triads.py - functions for analyzing triads of a graph # # Copyright 2015 NetworkX developers. # Copyright 2011 Reya Group # Copyright 2011 Alex Levenson # Copyright 2011 Diederik van Liere # # This file is part of NetworkX. # # NetworkX is distributed under a BSD license; see LICENSE.txt for more # information. """Functions for analyzing triads of a graph.""" from __future__ import division import networkx as nx __author__ = '\n'.join(['Alex Levenson (alex@isnontinvain.com)', 'Diederik van Liere (diederik.vanliere@rotman.utoronto.ca)']) __all__ = ['triadic_census'] #: The names of each type of triad. TRIAD_NAMES = ('003', '012', '102', '021D', '021U', '021C', '111D', '111U', '030T', '030C', '201', '120D', '120U', '120C', '210', '300') #: The integer codes representing each type of triad. #: #: Triads that are the same up to symmetry have the same code. TRICODES = (1, 2, 2, 3, 2, 4, 6, 8, 2, 6, 5, 7, 3, 8, 7, 11, 2, 6, 4, 8, 5, 9, 9, 13, 6, 10, 9, 14, 7, 14, 12, 15, 2, 5, 6, 7, 6, 9, 10, 14, 4, 9, 9, 12, 8, 13, 14, 15, 3, 7, 8, 11, 7, 12, 14, 15, 8, 14, 13, 15, 11, 15, 15, 16) #: A dictionary mapping triad code to triad name. TRICODE_TO_NAME = {i: TRIAD_NAMES[code - 1] for i, code in enumerate(TRICODES)} def triad_graphs(): """Returns a dictionary mapping triad name to triad graph.""" def abc_graph(): """Returns a directed graph on three nodes, named ``'a'``, ``'b'``, and ``'c'``. 
""" G = nx.DiGraph() G.add_nodes_from('abc') return G tg = {name: abc_graph() for name in TRIAD_NAMES} tg['012'].add_edges_from([('a', 'b')]) tg['102'].add_edges_from([('a', 'b'), ('b', 'a')]) tg['102'].add_edges_from([('a', 'b'), ('b', 'a')]) tg['021D'].add_edges_from([('b', 'a'), ('b', 'c')]) tg['021U'].add_edges_from([('a', 'b'), ('c', 'b')]) tg['021C'].add_edges_from([('a', 'b'), ('b', 'c')]) tg['111D'].add_edges_from([('a', 'c'), ('c', 'a'), ('b', 'c')]) tg['111U'].add_edges_from([('a', 'c'), ('c', 'a'), ('c', 'b')]) tg['030T'].add_edges_from([('a', 'b'), ('c', 'b'), ('a', 'c')]) tg['030C'].add_edges_from([('b', 'a'), ('c', 'b'), ('a', 'c')]) tg['201'].add_edges_from([('a', 'b'), ('b', 'a'), ('a', 'c'), ('c', 'a')]) tg['120D'].add_edges_from([('b', 'c'), ('b', 'a'), ('a', 'c'), ('c', 'a')]) tg['120C'].add_edges_from([('a', 'b'), ('b', 'c'), ('a', 'c'), ('c', 'a')]) tg['120U'].add_edges_from([('a', 'b'), ('c', 'b'), ('a', 'c'), ('c', 'a')]) tg['210'].add_edges_from([('a', 'b'), ('b', 'c'), ('c', 'b'), ('a', 'c'), ('c', 'a')]) tg['300'].add_edges_from([('a', 'b'), ('b', 'a'), ('b', 'c'), ('c', 'b'), ('a', 'c'), ('c', 'a')]) return tg def _tricode(G, v, u, w): """Returns the integer code of the given triad. This is some fancy magic that comes from Batagelj and Mrvar's paper. It treats each edge joining a pair of ``v``, ``u``, and ``w`` as a bit in the binary representation of an integer. """ combos = ((v, u, 1), (u, v, 2), (v, w, 4), (w, v, 8), (u, w, 16), (w, u, 32)) return sum(x for u, v, x in combos if v in G[u]) def triadic_census(G): """Determines the triadic census of a directed graph. The triadic census is a count of how many of the 16 possible types of triads are present in a directed graph. Parameters ---------- G : digraph A NetworkX DiGraph Returns ------- census : dict Dictionary with triad names as keys and number of occurrences as values. Notes ----- This algorithm has complexity `O(m)` where `m` is the number of edges in the graph. 
References ---------- .. [1] Vladimir Batagelj and Andrej Mrvar, A subquadratic triad census algorithm for large sparse networks with small maximum degree, University of Ljubljana, http://vlado.fmf.uni-lj.si/pub/networks/doc/triads/triads.pdf """ if not G.is_directed(): raise nx.NetworkXError('Not defined for undirected graphs.') # Initialize the count for each triad to be zero. census = {name: 0 for name in TRIAD_NAMES} n = len(G) # m = dict(zip(G, range(n))) m = {v: i for i, v in enumerate(G)} for v in G: vnbrs = set(G.pred[v]) | set(G.succ[v]) for u in vnbrs: if m[u] <= m[v]: continue neighbors = (vnbrs | set(G.succ[u]) | set(G.pred[u])) - {u, v} # Calculate dyadic triads instead of counting them. if v in G[u] and u in G[v]: census['102'] += n - len(neighbors) - 2 else: census['012'] += n - len(neighbors) - 2 # Count connected triads. for w in neighbors: if m[u] < m[w] or (m[v] < m[w] and m[w] < m[u] and v not in G.pred[w] and v not in G.succ[w]): code = _tricode(G, v, u, w) census[TRICODE_TO_NAME[code]] += 1 # null triads = total number of possible triads - all found triads # # Use integer division here, since we know this formula guarantees an # integral value. census['003'] = ((n * (n - 1) * (n - 2)) // 6) - sum(census.values()) return census networkx-1.11/networkx/external/0000755000175000017500000000000012653231454016703 5ustar aricaric00000000000000networkx-1.11/networkx/external/__init__.py0000644000175000017500000000000012637544450021007 0ustar aricaric00000000000000networkx-1.11/networkx/release.py0000644000175000017500000001712012653165361017057 0ustar aricaric00000000000000"""Release data for NetworkX. When NetworkX is imported a number of steps are followed to determine the version information. 1) If the release is not a development release (dev=False), then version information is read from version.py, a file containing statically defined version information. 
This file should exist on every downloadable release of NetworkX since setup.py creates it during packaging/installation. However, version.py might not exist if one is running NetworkX from the mercurial repository. In the event that version.py does not exist, then no vcs information will be available. 2) If the release is a development release, then version information is read dynamically, when possible. If no dynamic information can be read, then an attempt is made to read the information from version.py. If version.py does not exist, then no vcs information will be available. Clarification: version.py is created only by setup.py When setup.py creates version.py, it does so before packaging/installation. So the created file is included in the source distribution. When a user downloads a tar.gz file and extracts the files, the files will not be in a live version control repository. So when the user runs setup.py to install NetworkX, we must make sure write_versionfile() does not overwrite the revision information contained in the version.py that was included in the tar.gz file. This is why write_versionfile() includes an early escape. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from __future__ import absolute_import import os import sys import time import datetime basedir = os.path.abspath(os.path.split(__file__)[0]) def write_versionfile(): """Creates a static file containing version information.""" versionfile = os.path.join(basedir, 'version.py') text = '''""" Version information for NetworkX, created during installation. Do not add this file to the repository. """ import datetime version = %(version)r date = %(date)r # Was NetworkX built from a development version? If so, remember that the major # and minor versions reference the "target" (rather than "current") release. 
dev = %(dev)r # Format: (name, major, min, revision) version_info = %(version_info)r # Format: a 'datetime.datetime' instance date_info = %(date_info)r # Format: (vcs, vcs_tuple) vcs_info = %(vcs_info)r ''' # Try to update all information date, date_info, version, version_info, vcs_info = get_info(dynamic=True) def writefile(): fh = open(versionfile, 'w') subs = { 'dev' : dev, 'version': version, 'version_info': version_info, 'date': date, 'date_info': date_info, 'vcs_info': vcs_info } fh.write(text % subs) fh.close() if vcs_info[0] == 'mercurial': # Then, we want to update version.py. writefile() else: if os.path.isfile(versionfile): # This is *good*, and the most likely place users will be when # running setup.py. We do not want to overwrite version.py. # Grab the version so that setup can use it. sys.path.insert(0, basedir) from version import version del sys.path[0] else: # This is *bad*. It means the user might have a tarball that # does not include version.py. Let this error raise so we can # fix the tarball. ##raise Exception('version.py not found!') # We no longer require that prepared tarballs include a version.py # So we use the possibly trunctated value from get_info() # Then we write a new file. writefile() return version def get_revision(): """Returns revision and vcs information, dynamically obtained.""" vcs, revision, tag = None, None, None hgdir = os.path.join(basedir, '..', '.hg') gitdir = os.path.join(basedir, '..', '.git') if os.path.isdir(gitdir): vcs = 'git' # For now, we are not bothering with revision and tag. 
vcs_info = (vcs, (revision, tag)) return revision, vcs_info def get_info(dynamic=True): ## Date information date_info = datetime.datetime.now() date = time.asctime(date_info.timetuple()) revision, version, version_info, vcs_info = None, None, None, None import_failed = False dynamic_failed = False if dynamic: revision, vcs_info = get_revision() if revision is None: dynamic_failed = True if dynamic_failed or not dynamic: # This is where most final releases of NetworkX will be. # All info should come from version.py. If it does not exist, then # no vcs information will be provided. sys.path.insert(0, basedir) try: from version import date, date_info, version, version_info, vcs_info except ImportError: import_failed = True vcs_info = (None, (None, None)) else: revision = vcs_info[1][0] del sys.path[0] if import_failed or (dynamic and not dynamic_failed): # We are here if: # we failed to determine static versioning info, or # we successfully obtained dynamic revision info version = ''.join([str(major), '.', str(minor)]) if dev: version += '.dev_' + date_info.strftime("%Y%m%d%H%M%S") version_info = (name, major, minor, revision) return date, date_info, version, version_info, vcs_info ## Version information name = 'networkx' major = '1' minor = '11' ## Declare current release as a development release. ## Change to False before tagging a release; then change back. dev = False description = "Python package for creating and manipulating graphs and networks" long_description = \ """ NetworkX is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks. 
""" license = 'BSD' authors = {'Hagberg' : ('Aric Hagberg','hagberg@lanl.gov'), 'Schult' : ('Dan Schult','dschult@colgate.edu'), 'Swart' : ('Pieter Swart','swart@lanl.gov') } maintainer = "NetworkX Developers" maintainer_email = "networkx-discuss@googlegroups.com" url = 'http://networkx.github.io/' download_url= 'https://pypi.python.org/pypi/networkx/' platforms = ['Linux','Mac OSX','Windows','Unix'] keywords = ['Networks', 'Graph Theory', 'Mathematics', 'network', 'graph', 'discrete mathematics', 'math'] classifiers = [ 'Development Status :: 5 - Production/Stable', 'Intended Audience :: Developers', 'Intended Audience :: Science/Research', 'License :: OSI Approved :: BSD License', 'Operating System :: OS Independent', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 2.7', 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.3', 'Programming Language :: Python :: 3.4', 'Topic :: Software Development :: Libraries :: Python Modules', 'Topic :: Scientific/Engineering :: Bio-Informatics', 'Topic :: Scientific/Engineering :: Information Analysis', 'Topic :: Scientific/Engineering :: Mathematics', 'Topic :: Scientific/Engineering :: Physics'] date, date_info, version, version_info, vcs_info = get_info() if __name__ == '__main__': # Write versionfile for nightly snapshots. write_versionfile() networkx-1.11/networkx/readwrite/0000755000175000017500000000000012653231454017047 5ustar aricaric00000000000000networkx-1.11/networkx/readwrite/leda.py0000644000175000017500000000543712637544450020344 0ustar aricaric00000000000000""" Read graphs in LEDA format. LEDA is a C++ class library for efficient data types and algorithms. Format ------ See http://www.algorithmic-solutions.info/leda_guide/graphs/leda_native_graph_fileformat.html """ # Original author: D. Eppstein, UC Irvine, August 12, 2003. # The original code at http://www.ics.uci.edu/~eppstein/PADS/ is public domain. 
__author__ = """Aric Hagberg (hagberg@lanl.gov)""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['read_leda', 'parse_leda'] import networkx as nx from networkx.exception import NetworkXError from networkx.utils import open_file, is_string_like @open_file(0,mode='rb') def read_leda(path, encoding='UTF-8'): """Read graph in LEDA format from path. Parameters ---------- path : file or string File or filename to read. Filenames ending in .gz or .bz2 will be uncompressed. Returns ------- G : NetworkX graph Examples -------- G=nx.read_leda('file.leda') References ---------- .. [1] http://www.algorithmic-solutions.info/leda_guide/graphs/leda_native_graph_fileformat.html """ lines=(line.decode(encoding) for line in path) G=parse_leda(lines) return G def parse_leda(lines): """Read graph in LEDA format from string or iterable. Parameters ---------- lines : string or iterable Data in LEDA format. Returns ------- G : NetworkX graph Examples -------- G=nx.parse_leda(string) References ---------- .. 
[1] http://www.algorithmic-solutions.info/leda_guide/graphs/leda_native_graph_fileformat.html """ if is_string_like(lines): lines=iter(lines.split('\n')) lines = iter([line.rstrip('\n') for line in lines \ if not (line.startswith('#') or line.startswith('\n') or line=='')]) for i in range(3): next(lines) # Graph du = int(next(lines)) # -1=directed, -2=undirected if du==-1: G = nx.DiGraph() else: G = nx.Graph() # Nodes n =int(next(lines)) # number of nodes node={} for i in range(1,n+1): # LEDA counts from 1 to n symbol=next(lines).rstrip().strip('|{}| ') if symbol=="": symbol=str(i) # use int if no label - could be trouble node[i]=symbol G.add_nodes_from([s for i,s in node.items()]) # Edges m = int(next(lines)) # number of edges for i in range(m): try: s,t,reversal,label=next(lines).split() except: raise NetworkXError('Too few fields in LEDA.GRAPH edge %d'%(i+1)) # BEWARE: no handling of reversal edges G.add_edge(node[int(s)],node[int(t)],label=label[2:-2]) return G networkx-1.11/networkx/readwrite/p2g.py0000644000175000017500000000631712637544500020121 0ustar aricaric00000000000000""" This module provides the following: read and write of p2g format used in metabolic pathway studies. See http://www.cs.purdue.edu/homes/koyuturk/pathway/ for a description. The summary is included here: A file that describes a uniquely labeled graph (with extension ".gr") format looks like the following: name 3 4 a 1 2 b c 0 2 "name" is simply a description of what the graph corresponds to. The second line displays the number of nodes and number of edges, respectively. This sample graph contains three nodes labeled "a", "b", and "c". The rest of the graph contains two lines for each node. The first line for a node contains the node label. After the declaration of the node label, the out-edges of that node in the graph are provided. For instance, "a" is linked to nodes 1 and 2, which are labeled "b" and "c", while the node labeled "b" has no outgoing edges. 
Observe that node labeled "c" has an outgoing edge to itself. Indeed, self-loops are allowed. Node index starts from 0. """ # Copyright (C) 2008-2012 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx from networkx.utils import is_string_like,open_file __author__ = '\n'.join(['Willem Ligtenberg (w.p.a.ligtenberg@tue.nl)', 'Aric Hagberg (aric.hagberg@gmail.com)']) @open_file(1,mode='w') def write_p2g(G, path, encoding = 'utf-8'): """Write NetworkX graph in p2g format. Notes ----- This format is meant to be used with directed graphs with possible self loops. """ path.write(("%s\n"%G.name).encode(encoding)) path.write(("%s %s\n"%(G.order(),G.size())).encode(encoding)) nodes = G.nodes() # make dictionary mapping nodes to integers nodenumber=dict(zip(nodes,range(len(nodes)))) for n in nodes: path.write(("%s\n"%n).encode(encoding)) for nbr in G.neighbors(n): path.write(("%s "%nodenumber[nbr]).encode(encoding)) path.write("\n".encode(encoding)) @open_file(0,mode='r') def read_p2g(path, encoding='utf-8'): """Read graph in p2g format from path. Returns ------- MultiDiGraph Notes ----- If you want a DiGraph (with no self loops allowed and no edge data) use D=networkx.DiGraph(read_p2g(path)) """ lines = (line.decode(encoding) for line in path) G=parse_p2g(lines) return G def parse_p2g(lines): """Parse p2g format graph from string or iterable. Returns ------- MultiDiGraph """ description = next(lines).strip() # are multiedges (parallel edges) allowed? 
G=networkx.MultiDiGraph(name=description,selfloops=True) nnodes,nedges=map(int,next(lines).split()) nodelabel={} nbrs={} # loop over the nodes keeping track of node labels and out neighbors # defer adding edges until all node labels are known for i in range(nnodes): n=next(lines).strip() nodelabel[i]=n G.add_node(n) nbrs[n]=map(int,next(lines).split()) # now we know all of the node labels so we can add the edges # with the correct labels for n in G: for nbr in nbrs[n]: G.add_edge(n,nodelabel[nbr]) return G networkx-1.11/networkx/readwrite/json_graph/0000755000175000017500000000000012653231454021201 5ustar aricaric00000000000000networkx-1.11/networkx/readwrite/json_graph/__init__.py0000644000175000017500000000115212637544450023316 0ustar aricaric00000000000000""" ********* JSON data ********* Generate and parse JSON serializable data for NetworkX graphs. These formats are suitable for use with the d3.js examples http://d3js.org/ The three formats that you can generate with NetworkX are: - node-link like in the d3.js example http://bl.ocks.org/mbostock/4062045 - tree like in the d3.js example http://bl.ocks.org/mbostock/4063550 - adjacency like in the d3.js example http://bost.ocks.org/mike/miserables/ """ from networkx.readwrite.json_graph.node_link import * from networkx.readwrite.json_graph.adjacency import * from networkx.readwrite.json_graph.tree import * networkx-1.11/networkx/readwrite/json_graph/node_link.py0000644000175000017500000001235512637544500023524 0ustar aricaric00000000000000# Copyright (C) 2011-2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
from itertools import chain, count import json import networkx as nx from networkx.utils import make_str __author__ = """Aric Hagberg """ __all__ = ['node_link_data', 'node_link_graph'] _attrs = dict(id='id', source='source', target='target', key='key') def node_link_data(G, attrs=_attrs): """Return data in node-link format that is suitable for JSON serialization and use in Javascript documents. Parameters ---------- G : NetworkX graph attrs : dict A dictionary that contains four keys 'id', 'source', 'target' and 'key'. The corresponding values provide the attribute names for storing NetworkX-internal graph data. The values should be unique. Default value: :samp:`dict(id='id', source='source', target='target', key='key')`. If some user-defined graph data use these attribute names as data keys, they may be silently dropped. Returns ------- data : dict A dictionary with node-link formatted data. Raises ------ NetworkXError If values in attrs are not unique. Examples -------- >>> from networkx.readwrite import json_graph >>> G = nx.Graph([(1,2)]) >>> data = json_graph.node_link_data(G) To serialize with json >>> import json >>> s = json.dumps(data) Notes ----- Graph, node, and link attributes are stored in this format. Note that attribute keys will be converted to strings in order to comply with JSON. The default value of attrs will be changed in a future release of NetworkX. See Also -------- node_link_graph, adjacency_data, tree_data """ multigraph = G.is_multigraph() id_ = attrs['id'] source = attrs['source'] target = attrs['target'] # Allow 'key' to be omitted from attrs if the graph is not a multigraph. 
key = None if not multigraph else attrs['key'] if len(set([source, target, key])) < 3: raise nx.NetworkXError('Attribute names are not unique.') mapping = dict(zip(G, count())) data = {} data['directed'] = G.is_directed() data['multigraph'] = multigraph data['graph'] = G.graph data['nodes'] = [dict(chain(G.node[n].items(), [(id_, n)])) for n in G] if multigraph: data['links'] = [ dict(chain(d.items(), [(source, mapping[u]), (target, mapping[v]), (key, k)])) for u, v, k, d in G.edges_iter(keys=True, data=True)] else: data['links'] = [ dict(chain(d.items(), [(source, mapping[u]), (target, mapping[v])])) for u, v, d in G.edges_iter(data=True)] return data def node_link_graph(data, directed=False, multigraph=True, attrs=_attrs): """Return graph from node-link data format. Parameters ---------- data : dict node-link formatted graph data directed : bool If True, and direction not specified in data, return a directed graph. multigraph : bool If True, and multigraph not specified in data, return a multigraph. attrs : dict A dictionary that contains four keys 'id', 'source', 'target' and 'key'. The corresponding values provide the attribute names for storing NetworkX-internal graph data. Default value: :samp:`dict(id='id', source='source', target='target', key='key')`. Returns ------- G : NetworkX graph A NetworkX graph object Examples -------- >>> from networkx.readwrite import json_graph >>> G = nx.Graph([(1,2)]) >>> data = json_graph.node_link_data(G) >>> H = json_graph.node_link_graph(data) Notes ----- The default value of attrs will be changed in a future release of NetworkX. See Also -------- node_link_data, adjacency_data, tree_data """ multigraph = data.get('multigraph', multigraph) directed = data.get('directed', directed) if multigraph: graph = nx.MultiGraph() else: graph = nx.Graph() if directed: graph = graph.to_directed() id_ = attrs['id'] source = attrs['source'] target = attrs['target'] # Allow 'key' to be omitted from attrs if the graph is not a multigraph. 
key = None if not multigraph else attrs['key'] mapping = [] graph.graph = data.get('graph', {}) c = count() for d in data['nodes']: node = d.get(id_, next(c)) mapping.append(node) nodedata = dict((make_str(k), v) for k, v in d.items() if k != id_) graph.add_node(node, **nodedata) for d in data['links']: src = d[source] tgt = d[target] if not multigraph: edgedata = dict((make_str(k), v) for k, v in d.items() if k != source and k != target) graph.add_edge(mapping[src], mapping[tgt], **edgedata) else: ky = d.get(key, None) edgedata = dict((make_str(k), v) for k, v in d.items() if k != source and k != target and k != key) graph.add_edge(mapping[src], mapping[tgt], ky, **edgedata) return graph networkx-1.11/networkx/readwrite/json_graph/tests/0000755000175000017500000000000012653231454022343 5ustar aricaric00000000000000networkx-1.11/networkx/readwrite/json_graph/tests/test_node_link.py0000644000175000017500000000366012637544450025730 0ustar aricaric00000000000000# -*- coding: utf-8 -*- import json from nose.tools import assert_equal, assert_raises, assert_not_equal, assert_true, raises import networkx as nx from networkx.readwrite.json_graph import * class TestNodeLink: def test_graph(self): G = nx.path_graph(4) H = node_link_graph(node_link_data(G)) nx.is_isomorphic(G,H) def test_graph_attributes(self): G = nx.path_graph(4) G.add_node(1,color='red') G.add_edge(1,2,width=7) G.graph[1]='one' G.graph['foo']='bar' H = node_link_graph(node_link_data(G)) assert_equal(H.graph['foo'],'bar') assert_equal(H.node[1]['color'],'red') assert_equal(H[1][2]['width'],7) d = json.dumps(node_link_data(G)) H = node_link_graph(json.loads(d)) assert_equal(H.graph['foo'],'bar') assert_equal(H.graph['1'],'one') assert_equal(H.node[1]['color'],'red') assert_equal(H[1][2]['width'],7) def test_digraph(self): G = nx.DiGraph() H = node_link_graph(node_link_data(G)) assert_true(H.is_directed()) def test_multigraph(self): G = nx.MultiGraph() G.add_edge(1,2,key='first') 
G.add_edge(1,2,key='second',color='blue') H = node_link_graph(node_link_data(G)) nx.is_isomorphic(G,H) assert_equal(H[1][2]['second']['color'],'blue') def test_unicode_keys(self): try: q = unicode("qualité",'utf-8') except NameError: q = "qualité" G = nx.Graph() G.add_node(1, {q:q}) s = node_link_data(G) output = json.dumps(s, ensure_ascii=False) data = json.loads(output) H = node_link_graph(data) assert_equal(H.node[1][q], q) @raises(nx.NetworkXError) def test_exception(self): G = nx.MultiDiGraph() attrs = dict(id='id', source='node', target='node', key='node') node_link_data(G, attrs) networkx-1.11/networkx/readwrite/json_graph/tests/test_adjacency.py0000644000175000017500000000341312637544450025703 0ustar aricaric00000000000000import json from nose.tools import assert_equal, assert_raises, assert_not_equal, assert_true, raises import networkx as nx from networkx.readwrite.json_graph import * class TestAdjacency: def test_graph(self): G = nx.path_graph(4) H = adjacency_graph(adjacency_data(G)) nx.is_isomorphic(G,H) def test_graph_attributes(self): G = nx.path_graph(4) G.add_node(1,color='red') G.add_edge(1,2,width=7) G.graph['foo']='bar' G.graph[1]='one' H = adjacency_graph(adjacency_data(G)) assert_equal(H.graph['foo'],'bar') assert_equal(H.node[1]['color'],'red') assert_equal(H[1][2]['width'],7) d = json.dumps(adjacency_data(G)) H = adjacency_graph(json.loads(d)) assert_equal(H.graph['foo'],'bar') assert_equal(H.graph[1],'one') assert_equal(H.node[1]['color'],'red') assert_equal(H[1][2]['width'],7) def test_digraph(self): G = nx.DiGraph() G.add_path([1,2,3]) H = adjacency_graph(adjacency_data(G)) assert_true(H.is_directed()) nx.is_isomorphic(G,H) def test_multidigraph(self): G = nx.MultiDiGraph() G.add_path([1,2,3]) H = adjacency_graph(adjacency_data(G)) assert_true(H.is_directed()) assert_true(H.is_multigraph()) def test_multigraph(self): G = nx.MultiGraph() G.add_edge(1,2,key='first') G.add_edge(1,2,key='second',color='blue') H = 
adjacency_graph(adjacency_data(G)) nx.is_isomorphic(G,H) assert_equal(H[1][2]['second']['color'],'blue') @raises(nx.NetworkXError) def test_exception(self): G = nx.MultiDiGraph() attrs = dict(id='node', key='node') adjacency_data(G, attrs) networkx-1.11/networkx/readwrite/json_graph/tests/test_tree.py0000644000175000017500000000202512637544450024717 0ustar aricaric00000000000000import json from nose.tools import assert_equal, assert_raises, assert_not_equal, assert_true, raises import networkx as nx from networkx.readwrite.json_graph import * class TestTree: def test_graph(self): G=nx.DiGraph() G.add_nodes_from([1,2,3],color='red') G.add_edge(1,2,foo=7) G.add_edge(1,3,foo=10) G.add_edge(3,4,foo=10) H = tree_graph(tree_data(G,1)) nx.is_isomorphic(G,H) def test_graph_attributes(self): G=nx.DiGraph() G.add_nodes_from([1,2,3],color='red') G.add_edge(1,2,foo=7) G.add_edge(1,3,foo=10) G.add_edge(3,4,foo=10) H = tree_graph(tree_data(G,1)) assert_equal(H.node[1]['color'],'red') d = json.dumps(tree_data(G,1)) H = tree_graph(json.loads(d)) assert_equal(H.node[1]['color'],'red') @raises(nx.NetworkXError) def test_exception(self): G = nx.MultiDiGraph() G.add_node(0) attrs = dict(id='node', children='node') tree_data(G, 0, attrs) networkx-1.11/networkx/readwrite/json_graph/adjacency.py0000644000175000017500000001144012637544500023475 0ustar aricaric00000000000000# Copyright (C) 2011-2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from itertools import chain, count import networkx as nx __author__ = """Aric Hagberg """ __all__ = ['adjacency_data', 'adjacency_graph'] _attrs = dict(id='id', key='key') def adjacency_data(G, attrs=_attrs): """Return data in adjacency format that is suitable for JSON serialization and use in Javascript documents. Parameters ---------- G : NetworkX graph attrs : dict A dictionary that contains two keys 'id' and 'key'. The corresponding values provide the attribute names for storing NetworkX-internal graph data. 
The values should be unique. Default value: :samp:`dict(id='id', key='key')`. If some user-defined graph data use these attribute names as data keys, they may be silently dropped. Returns ------- data : dict A dictionary with adjacency formatted data. Raises ------ NetworkXError If values in attrs are not unique. Examples -------- >>> from networkx.readwrite import json_graph >>> G = nx.Graph([(1,2)]) >>> data = json_graph.adjacency_data(G) To serialize with json >>> import json >>> s = json.dumps(data) Notes ----- Graph, node, and link attributes will be written when using this format but attribute keys must be strings if you want to serialize the resulting data with JSON. The default value of attrs will be changed in a future release of NetworkX. See Also -------- adjacency_graph, node_link_data, tree_data """ multigraph = G.is_multigraph() id_ = attrs['id'] # Allow 'key' to be omitted from attrs if the graph is not a multigraph. key = None if not multigraph else attrs['key'] if id_ == key: raise nx.NetworkXError('Attribute names are not unique.') data = {} data['directed'] = G.is_directed() data['multigraph'] = multigraph data['graph'] = list(G.graph.items()) data['nodes'] = [] data['adjacency'] = [] for n, nbrdict in G.adjacency_iter(): data['nodes'].append(dict(chain(G.node[n].items(), [(id_, n)]))) adj = [] if multigraph: for nbr, keys in nbrdict.items(): for k, d in keys.items(): adj.append(dict(chain(d.items(), [(id_, nbr), (key, k)]))) else: for nbr, d in nbrdict.items(): adj.append(dict(chain(d.items(), [(id_, nbr)]))) data['adjacency'].append(adj) return data def adjacency_graph(data, directed=False, multigraph=True, attrs=_attrs): """Return graph from adjacency data format. Parameters ---------- data : dict Adjacency list formatted graph data Returns ------- G : NetworkX graph A NetworkX graph object directed : bool If True, and direction not specified in data, return a directed graph. 
multigraph : bool If True, and multigraph not specified in data, return a multigraph. attrs : dict A dictionary that contains two keys 'id' and 'key'. The corresponding values provide the attribute names for storing NetworkX-internal graph data. The values should be unique. Default value: :samp:`dict(id='id', key='key')`. Examples -------- >>> from networkx.readwrite import json_graph >>> G = nx.Graph([(1,2)]) >>> data = json_graph.adjacency_data(G) >>> H = json_graph.adjacency_graph(data) Notes ----- The default value of attrs will be changed in a future release of NetworkX. See Also -------- adjacency_graph, node_link_data, tree_data """ multigraph = data.get('multigraph', multigraph) directed = data.get('directed', directed) if multigraph: graph = nx.MultiGraph() else: graph = nx.Graph() if directed: graph = graph.to_directed() id_ = attrs['id'] # Allow 'key' to be omitted from attrs if the graph is not a multigraph. key = None if not multigraph else attrs['key'] graph.graph = dict(data.get('graph', [])) mapping = [] for d in data['nodes']: node_data = d.copy() node = node_data.pop(id_) mapping.append(node) graph.add_node(node, attr_dict=node_data) for i, d in enumerate(data['adjacency']): source = mapping[i] for tdata in d: target_data = tdata.copy() target = target_data.pop(id_) if not multigraph: graph.add_edge(source, target, attr_dict=tdata) else: ky = target_data.pop(key, None) graph.add_edge(source, target, key=ky, attr_dict=tdata) return graph networkx-1.11/networkx/readwrite/json_graph/tree.py0000644000175000017500000001040612637544450022520 0ustar aricaric00000000000000# Copyright (C) 2011 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
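# Before the module body: a stdlib-only sketch of the recursive document
# that tree_data (defined below) builds — the root always carries a
# 'children' list, while interior nodes only get one when it is
# non-empty. The adjacency-dict input and the name tree_data_sketch are
# illustrative stand-ins for the DiGraph the real function expects.

```python
def tree_data_sketch(children_of, root):
    def add_children(n):
        out = []
        for c in children_of.get(n, []):
            d = {'id': c}
            grand = add_children(c)
            if grand:                   # omit empty child lists, as below
                d['children'] = grand
            out.append(d)
        return out
    return {'id': root, 'children': add_children(root)}
```

# For instance, the tree 1 -> {2, 3}, 3 -> {4} nests node 4 one level
# below node 3, while leaf node 2 carries no 'children' key at all.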
from itertools import chain, count import networkx as nx from networkx.utils import make_str __author__ = """Aric Hagberg (hagberg@lanl.gov))""" __all__ = ['tree_data', 'tree_graph'] _attrs = dict(id='id', children='children') def tree_data(G, root, attrs=_attrs): """Return data in tree format that is suitable for JSON serialization and use in Javascript documents. Parameters ---------- G : NetworkX graph G must be an oriented tree root : node The root of the tree attrs : dict A dictionary that contains two keys 'id' and 'children'. The corresponding values provide the attribute names for storing NetworkX-internal graph data. The values should be unique. Default value: :samp:`dict(id='id', children='children')`. If some user-defined graph data use these attribute names as data keys, they may be silently dropped. Returns ------- data : dict A dictionary with node-link formatted data. Raises ------ NetworkXError If values in attrs are not unique. Examples -------- >>> from networkx.readwrite import json_graph >>> G = nx.DiGraph([(1,2)]) >>> data = json_graph.tree_data(G,root=1) To serialize with json >>> import json >>> s = json.dumps(data) Notes ----- Node attributes are stored in this format but keys for attributes must be strings if you want to serialize with JSON. Graph and edge attributes are not stored. The default value of attrs will be changed in a future release of NetworkX. 
See Also -------- tree_graph, node_link_data, node_link_data """ if G.number_of_nodes() != G.number_of_edges() + 1: raise TypeError("G is not a tree.") if not G.is_directed(): raise TypeError("G is not directed.") id_ = attrs['id'] children = attrs['children'] if id_ == children: raise nx.NetworkXError('Attribute names are not unique.') def add_children(n, G): nbrs = G[n] if len(nbrs) == 0: return [] children_ = [] for child in nbrs: d = dict(chain(G.node[child].items(), [(id_, child)])) c = add_children(child, G) if c: d[children] = c children_.append(d) return children_ data = dict(chain(G.node[root].items(), [(id_, root)])) data[children] = add_children(root, G) return data def tree_graph(data, attrs=_attrs): """Return graph from tree data format. Parameters ---------- data : dict Tree formatted graph data Returns ------- G : NetworkX DiGraph attrs : dict A dictionary that contains two keys 'id' and 'children'. The corresponding values provide the attribute names for storing NetworkX-internal graph data. The values should be unique. Default value: :samp:`dict(id='id', children='children')`. Examples -------- >>> from networkx.readwrite import json_graph >>> G = nx.DiGraph([(1,2)]) >>> data = json_graph.tree_data(G,root=1) >>> H = json_graph.tree_graph(data) Notes ----- The default value of attrs will be changed in a future release of NetworkX. 
See Also -------- tree_graph, node_link_data, adjacency_data """ graph = nx.DiGraph() id_ = attrs['id'] children = attrs['children'] def add_children(parent, children_): for data in children_: child = data[id_] graph.add_edge(parent, child) grandchildren = data.get(children, []) if grandchildren: add_children(child, grandchildren) nodedata = dict((make_str(k), v) for k, v in data.items() if k != id_ and k != children) graph.add_node(child, attr_dict=nodedata) root = data[id_] children_ = data.get(children, []) nodedata = dict((make_str(k), v) for k, v in data.items() if k != id_ and k != children) graph.add_node(root, attr_dict=nodedata) add_children(root, children_) return graph networkx-1.11/networkx/readwrite/nx_shp.py0000644000175000017500000002060312637544500020722 0ustar aricaric00000000000000""" ********* Shapefile ********* Generates a networkx.DiGraph from point and line shapefiles. "The Esri Shapefile or simply a shapefile is a popular geospatial vector data format for geographic information systems software. It is developed and regulated by Esri as a (mostly) open specification for data interoperability among Esri and other software products." See http://en.wikipedia.org/wiki/Shapefile for additional information. """ # Copyright (C) 2004-2015 by # Ben Reilly # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx __author__ = """Ben Reilly (benwreilly@gmail.com)""" __all__ = ['read_shp', 'write_shp'] def read_shp(path, simplify=True): """Generates a networkx.DiGraph from shapefiles. Point geometries are translated into nodes, lines into edges. Coordinate tuples are used as keys. Attributes are preserved, line geometries are simplified into start and end coordinates. Accepts a single shapefile or directory of many shapefiles. "The Esri Shapefile or simply a shapefile is a popular geospatial vector data format for geographic information systems software [1]_." 
Parameters ---------- path : file or string File, directory, or filename to read. simplify: bool If ``True``, simplify line geometries to start and end coordinates. If ``False``, and line feature geometry has multiple segments, the non-geometric attributes for that feature will be repeated for each edge comprising that feature. Returns ------- G : NetworkX graph Examples -------- >>> G=nx.read_shp('test.shp') # doctest: +SKIP References ---------- .. [1] http://en.wikipedia.org/wiki/Shapefile """ try: from osgeo import ogr except ImportError: raise ImportError("read_shp requires OGR: http://www.gdal.org/") if not isinstance(path, str): return net = nx.DiGraph() shp = ogr.Open(path) for lyr in shp: fields = [x.GetName() for x in lyr.schema] for f in lyr: flddata = [f.GetField(f.GetFieldIndex(x)) for x in fields] g = f.geometry() attributes = dict(zip(fields, flddata)) attributes["ShpName"] = lyr.GetName() if g.GetGeometryType() == 1: # point net.add_node((g.GetPoint_2D(0)), attributes) if g.GetGeometryType() == 2: # linestring last = g.GetPointCount() - 1 if simplify: attributes["Wkb"] = g.ExportToWkb() attributes["Wkt"] = g.ExportToWkt() attributes["Json"] = g.ExportToJson() net.add_edge(g.GetPoint_2D(0), g.GetPoint_2D(last), attributes) else: # separate out each segment as individual edge for i in range(last): pt1 = g.GetPoint_2D(i) pt2 = g.GetPoint_2D(i + 1) segment = ogr.Geometry(ogr.wkbLineString) segment.AddPoint_2D(pt1[0], pt1[1]) segment.AddPoint_2D(pt2[0], pt2[1]) attributes["Wkb"] = segment.ExportToWkb() attributes["Wkt"] = segment.ExportToWkt() attributes["Json"] = segment.ExportToJson() del segment net.add_edge(pt1, pt2, attributes) return net def write_shp(G, outdir): """Writes a networkx.DiGraph to two shapefiles, edges and nodes. Nodes and edges are expected to have a Well Known Binary (Wkb) or Well Known Text (Wkt) key in order to generate geometries. Also acceptable are nodes with a numeric tuple key (x,y). 
"The Esri Shapefile or simply a shapefile is a popular geospatial vector data format for geographic information systems software [1]_." Parameters ---------- outdir : directory path Output directory for the two shapefiles. Returns ------- None Examples -------- nx.write_shp(digraph, '/shapefiles') # doctest +SKIP References ---------- .. [1] http://en.wikipedia.org/wiki/Shapefile """ try: from osgeo import ogr except ImportError: raise ImportError("write_shp requires OGR: http://www.gdal.org/") # easier to debug in python if ogr throws exceptions ogr.UseExceptions() def netgeometry(key, data): if 'Wkb' in data: geom = ogr.CreateGeometryFromWkb(data['Wkb']) elif 'Wkt' in data: geom = ogr.CreateGeometryFromWkt(data['Wkt']) elif type(key[0]).__name__ == 'tuple': # edge keys are packed tuples geom = ogr.Geometry(ogr.wkbLineString) _from, _to = key[0], key[1] try: geom.SetPoint(0, *_from) geom.SetPoint(1, *_to) except TypeError: # assume user used tuple of int and choked ogr _ffrom = [float(x) for x in _from] _fto = [float(x) for x in _to] geom.SetPoint(0, *_ffrom) geom.SetPoint(1, *_fto) else: geom = ogr.Geometry(ogr.wkbPoint) try: geom.SetPoint(0, *key) except TypeError: # assume user used tuple of int and choked ogr fkey = [float(x) for x in key] geom.SetPoint(0, *fkey) return geom # Create_feature with new optional attributes arg (should be dict type) def create_feature(geometry, lyr, attributes=None): feature = ogr.Feature(lyr.GetLayerDefn()) feature.SetGeometry(g) if attributes != None: # Loop through attributes, assigning data to each field for field, data in attributes.items(): feature.SetField(field, data) lyr.CreateFeature(feature) feature.Destroy() drv = ogr.GetDriverByName("ESRI Shapefile") shpdir = drv.CreateDataSource(outdir) # delete pre-existing output first otherwise ogr chokes try: shpdir.DeleteLayer("nodes") except: pass nodes = shpdir.CreateLayer("nodes", None, ogr.wkbPoint) for n in G: data = G.node[n] g = netgeometry(n, data) create_feature(g, 
nodes) try: shpdir.DeleteLayer("edges") except: pass edges = shpdir.CreateLayer("edges", None, ogr.wkbLineString) # New edge attribute write support merged into edge loop fields = {} # storage for field names and their data types attributes = {} # storage for attribute data (indexed by field names) # Conversion dict between python and ogr types OGRTypes = {int: ogr.OFTInteger, str: ogr.OFTString, float: ogr.OFTReal} # Edge loop for e in G.edges(data=True): data = G.get_edge_data(*e) g = netgeometry(e, data) # Loop through attribute data in edges for key, data in e[2].items(): # Reject spatial data not required for attribute table if (key != 'Json' and key != 'Wkt' and key != 'Wkb' and key != 'ShpName'): # For all edges check/add field and data type to fields dict if key not in fields: # Field not in previous edges so add to dict if type(data) in OGRTypes: fields[key] = OGRTypes[type(data)] else: # Data type not supported, default to string (char 80) fields[key] = ogr.OFTString # Create the new field newfield = ogr.FieldDefn(key, fields[key]) edges.CreateField(newfield) # Store the data from new field to dict for CreateLayer() attributes[key] = data else: # Field already exists, add data to dict for CreateLayer() attributes[key] = data # Create the feature with, passing new attribute data create_feature(g, edges, attributes) nodes, edges = None, None # fixture for nose tests def setup_module(module): from nose import SkipTest try: import ogr except: raise SkipTest("OGR not available") networkx-1.11/networkx/readwrite/__init__.py0000644000175000017500000000113012637544500021154 0ustar aricaric00000000000000""" A package for reading and writing graphs in various formats. 
""" from networkx.readwrite.adjlist import * from networkx.readwrite.multiline_adjlist import * from networkx.readwrite.edgelist import * from networkx.readwrite.gpickle import * from networkx.readwrite.pajek import * from networkx.readwrite.leda import * from networkx.readwrite.sparse6 import * from networkx.readwrite.graph6 import * from networkx.readwrite.nx_yaml import * from networkx.readwrite.gml import * from networkx.readwrite.graphml import * from networkx.readwrite.gexf import * from networkx.readwrite.nx_shp import * networkx-1.11/networkx/readwrite/edgelist.py0000644000175000017500000003331612637544500021230 0ustar aricaric00000000000000""" ********** Edge Lists ********** Read and write NetworkX graphs as edge lists. The multi-line adjacency list format is useful for graphs with nodes that can be meaningfully represented as strings. With the edgelist format simple edge data can be stored but node or graph data is not. There is no way of representing isolated nodes unless the node has a self-loop edge. Format ------ You can read or write three formats of edge lists with these functions. Node pairs with no data:: 1 2 Python dictionary as data:: 1 2 {'weight':7, 'color':'green'} Arbitrary data:: 1 2 7 green """ __author__ = """Aric Hagberg (hagberg@lanl.gov)\nDan Schult (dschult@colgate.edu)""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['generate_edgelist', 'write_edgelist', 'parse_edgelist', 'read_edgelist', 'read_weighted_edgelist', 'write_weighted_edgelist'] from networkx.utils import open_file, make_str import networkx as nx def generate_edgelist(G, delimiter=' ', data=True): """Generate a single line of the graph G in edge list format. Parameters ---------- G : NetworkX graph delimiter : string, optional Separator for node labels data : bool or list of keys If False generate no edge data. If True use a dictionary representation of edge data. 
If a list of keys use a list of data values corresponding to the keys. Returns ------- lines : string Lines of data in adjlist format. Examples -------- >>> G = nx.lollipop_graph(4, 3) >>> G[1][2]['weight'] = 3 >>> G[3][4]['capacity'] = 12 >>> for line in nx.generate_edgelist(G, data=False): ... print(line) 0 1 0 2 0 3 1 2 1 3 2 3 3 4 4 5 5 6 >>> for line in nx.generate_edgelist(G): ... print(line) 0 1 {} 0 2 {} 0 3 {} 1 2 {'weight': 3} 1 3 {} 2 3 {} 3 4 {'capacity': 12} 4 5 {} 5 6 {} >>> for line in nx.generate_edgelist(G,data=['weight']): ... print(line) 0 1 0 2 0 3 1 2 3 1 3 2 3 3 4 4 5 5 6 See Also -------- write_adjlist, read_adjlist """ if data is True or data is False: for e in G.edges(data=data): yield delimiter.join(map(make_str,e)) else: for u,v,d in G.edges(data=True): e=[u,v] try: e.extend(d[k] for k in data) except KeyError: pass # missing data for this edge, should warn? yield delimiter.join(map(make_str,e)) @open_file(1,mode='wb') def write_edgelist(G, path, comments="#", delimiter=' ', data=True, encoding = 'utf-8'): """Write graph as a list of edges. Parameters ---------- G : graph A NetworkX graph path : file or string File or filename to write. If a file is provided, it must be opened in 'wb' mode. Filenames ending in .gz or .bz2 will be compressed. comments : string, optional The character used to indicate the start of a comment delimiter : string, optional The string used to separate values. The default is whitespace. data : bool or list, optional If False write no edge data. If True write a string representation of the edge data dictionary.. If a list (or other iterable) is provided, write the keys specified in the list. encoding: string, optional Specify which encoding to use when writing file. 
Examples -------- >>> G=nx.path_graph(4) >>> nx.write_edgelist(G, "test.edgelist") >>> G=nx.path_graph(4) >>> fh=open("test.edgelist",'wb') >>> nx.write_edgelist(G, fh) >>> nx.write_edgelist(G, "test.edgelist.gz") >>> nx.write_edgelist(G, "test.edgelist.gz", data=False) >>> G=nx.Graph() >>> G.add_edge(1,2,weight=7,color='red') >>> nx.write_edgelist(G,'test.edgelist',data=False) >>> nx.write_edgelist(G,'test.edgelist',data=['color']) >>> nx.write_edgelist(G,'test.edgelist',data=['color','weight']) See Also -------- write_edgelist() write_weighted_edgelist() """ for line in generate_edgelist(G, delimiter, data): line+='\n' path.write(line.encode(encoding)) def parse_edgelist(lines, comments='#', delimiter=None, create_using=None, nodetype=None, data=True): """Parse lines of an edge list representation of a graph. Parameters ---------- lines : list or iterator of strings Input data in edgelist format comments : string, optional Marker for comment lines delimiter : string, optional Separator for node labels create_using: NetworkX graph container, optional Use given NetworkX graph for holding nodes or edges. nodetype : Python type, optional Convert nodes to this type. data : bool or list of (label,type) tuples If False generate no edge data or if True use a dictionary representation of edge data or a list tuples specifying dictionary key names and types for edge data. Returns ------- G: NetworkX Graph The graph corresponding to lines Examples -------- Edgelist with no data: >>> lines = ["1 2", ... "2 3", ... "3 4"] >>> G = nx.parse_edgelist(lines, nodetype = int) >>> G.nodes() [1, 2, 3, 4] >>> G.edges() [(1, 2), (2, 3), (3, 4)] Edgelist with data in Python dictionary representation: >>> lines = ["1 2 {'weight':3}", ... "2 3 {'weight':27}", ... 
"3 4 {'weight':3.0}"] >>> G = nx.parse_edgelist(lines, nodetype = int) >>> G.nodes() [1, 2, 3, 4] >>> G.edges(data = True) [(1, 2, {'weight': 3}), (2, 3, {'weight': 27}), (3, 4, {'weight': 3.0})] Edgelist with data in a list: >>> lines = ["1 2 3", ... "2 3 27", ... "3 4 3.0"] >>> G = nx.parse_edgelist(lines, nodetype = int, data=(('weight',float),)) >>> G.nodes() [1, 2, 3, 4] >>> G.edges(data = True) [(1, 2, {'weight': 3.0}), (2, 3, {'weight': 27.0}), (3, 4, {'weight': 3.0})] See Also -------- read_weighted_edgelist """ from ast import literal_eval if create_using is None: G=nx.Graph() else: try: G=create_using G.clear() except: raise TypeError("create_using input is not a NetworkX graph type") for line in lines: p=line.find(comments) if p>=0: line = line[:p] if not len(line): continue # split line, should have 2 or more s=line.strip().split(delimiter) if len(s)<2: continue u=s.pop(0) v=s.pop(0) d=s if nodetype is not None: try: u=nodetype(u) v=nodetype(v) except: raise TypeError("Failed to convert nodes %s,%s to type %s." %(u,v,nodetype)) if len(d)==0 or data is False: # no data or data type specified edgedata={} elif data is True: # no edge types specified try: # try to evaluate as dictionary edgedata=dict(literal_eval(' '.join(d))) except: raise TypeError( "Failed to convert edge data (%s) to dictionary."%(d)) else: # convert edge data to dictionary with specified keys and type if len(d)!=len(data): raise IndexError( "Edge data %s and data_keys %s are not the same length"% (d, data)) edgedata={} for (edge_key,edge_type),edge_value in zip(data,d): try: edge_value=edge_type(edge_value) except: raise TypeError( "Failed to convert %s data %s to type %s." %(edge_key, edge_value, edge_type)) edgedata.update({edge_key:edge_value}) G.add_edge(u, v, attr_dict=edgedata) return G @open_file(0,mode='rb') def read_edgelist(path, comments="#", delimiter=None, create_using=None, nodetype=None, data=True, edgetype=None, encoding='utf-8'): """Read a graph from a list of edges. 
Parameters ---------- path : file or string File or filename to read. If a file is provided, it must be opened in 'rb' mode. Filenames ending in .gz or .bz2 will be uncompressed. comments : string, optional The character used to indicate the start of a comment. delimiter : string, optional The string used to separate values. The default is whitespace. create_using : Graph container, optional, Use specified container to build graph. The default is networkx.Graph, an undirected graph. nodetype : int, float, str, Python type, optional Convert node data from strings to specified type data : bool or list of (label,type) tuples Tuples specifying dictionary key names and types for edge data edgetype : int, float, str, Python type, optional OBSOLETE Convert edge data from strings to specified type and use as 'weight' encoding: string, optional Specify which encoding to use when reading file. Returns ------- G : graph A networkx Graph or other type specified with create_using Examples -------- >>> nx.write_edgelist(nx.path_graph(4), "test.edgelist") >>> G=nx.read_edgelist("test.edgelist") >>> fh=open("test.edgelist", 'rb') >>> G=nx.read_edgelist(fh) >>> fh.close() >>> G=nx.read_edgelist("test.edgelist", nodetype=int) >>> G=nx.read_edgelist("test.edgelist",create_using=nx.DiGraph()) Edgelist with data in a list: >>> textline = '1 2 3' >>> fh = open('test.edgelist','w') >>> d = fh.write(textline) >>> fh.close() >>> G = nx.read_edgelist('test.edgelist', nodetype=int, data=(('weight',float),)) >>> G.nodes() [1, 2] >>> G.edges(data = True) [(1, 2, {'weight': 3.0})] See parse_edgelist() for more examples of formatting. See Also -------- parse_edgelist Notes ----- Since nodes must be hashable, the function nodetype must return hashable types (e.g. int, float, str, frozenset - or tuples of those, etc.) 
""" lines = (line.decode(encoding) for line in path) return parse_edgelist(lines,comments=comments, delimiter=delimiter, create_using=create_using, nodetype=nodetype, data=data) def write_weighted_edgelist(G, path, comments="#", delimiter=' ', encoding='utf-8'): """Write graph G as a list of edges with numeric weights. Parameters ---------- G : graph A NetworkX graph path : file or string File or filename to write. If a file is provided, it must be opened in 'wb' mode. Filenames ending in .gz or .bz2 will be compressed. comments : string, optional The character used to indicate the start of a comment delimiter : string, optional The string used to separate values. The default is whitespace. encoding: string, optional Specify which encoding to use when writing file. Examples -------- >>> G=nx.Graph() >>> G.add_edge(1,2,weight=7) >>> nx.write_weighted_edgelist(G, 'test.weighted.edgelist') See Also -------- read_edgelist() write_edgelist() write_weighted_edgelist() """ write_edgelist(G,path, comments=comments, delimiter=delimiter, data=('weight',), encoding = encoding) def read_weighted_edgelist(path, comments="#", delimiter=None, create_using=None, nodetype=None, encoding='utf-8'): """Read a graph as list of edges with numeric weights. Parameters ---------- path : file or string File or filename to read. If a file is provided, it must be opened in 'rb' mode. Filenames ending in .gz or .bz2 will be uncompressed. comments : string, optional The character used to indicate the start of a comment. delimiter : string, optional The string used to separate values. The default is whitespace. create_using : Graph container, optional, Use specified container to build graph. The default is networkx.Graph, an undirected graph. nodetype : int, float, str, Python type, optional Convert node data from strings to specified type encoding: string, optional Specify which encoding to use when reading file. 
Returns ------- G : graph A networkx Graph or other type specified with create_using Notes ----- Since nodes must be hashable, the function nodetype must return hashable types (e.g. int, float, str, frozenset - or tuples of those, etc.) Example edgelist file format. With numeric edge data:: # read with # >>> G=nx.read_weighted_edgelist(fh) # source target data a b 1 a c 3.14159 d e 42 """ return read_edgelist(path, comments=comments, delimiter=delimiter, create_using=create_using, nodetype=nodetype, data=(('weight',float),), encoding = encoding ) # fixture for nose tests def teardown_module(module): import os for fname in ['test.edgelist', 'test.edgelist.gz', 'test.weighted.edgelist']: if os.path.isfile(fname): os.unlink(fname) networkx-1.11/networkx/readwrite/pajek.py0000644000175000017500000001606612637544500020525 0ustar aricaric00000000000000""" ***** Pajek ***** Read graphs in Pajek format. This implementation handles directed and undirected graphs including those with self loops and parallel edges. Format ------ See http://vlado.fmf.uni-lj.si/pub/networks/pajek/doc/draweps.htm for format information. """ # Copyright (C) 2008-2014 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.utils import is_string_like, open_file, make_str __author__ = """Aric Hagberg """ __all__ = ['read_pajek', 'parse_pajek', 'generate_pajek', 'write_pajek'] def generate_pajek(G): """Generate lines in Pajek graph format. Parameters ---------- G : graph A Networkx graph References ---------- See http://vlado.fmf.uni-lj.si/pub/networks/pajek/doc/draweps.htm for format information. """ if G.name=='': name='NetworkX' else: name=G.name # Apparently many Pajek format readers can't process this line # So we'll leave it out for now. 
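The overall layout ``generate_pajek`` emits — a ``*vertices N`` section numbering nodes from 1, then ``*edges`` (or ``*arcs`` for directed graphs) listing endpoint numbers and a weight — can be sketched without NetworkX (``pajek_lines`` and ``qstr`` are illustrative names; ``qstr`` mirrors the module's ``make_qstr`` quoting rule):

```python
# Minimal sketch of the Pajek layout produced by generate_pajek.
def qstr(t):
    """Double-quote values containing a space, as Pajek requires."""
    t = str(t)
    return '"%s"' % t if ' ' in t else t

def pajek_lines(nodes, edges, directed=False):
    number = {n: i for i, n in enumerate(nodes, start=1)}  # node -> 1..N
    yield '*vertices %d' % len(nodes)
    for n in nodes:
        yield '%d %s' % (number[n], qstr(n))
    yield '*arcs' if directed else '*edges'
    for u, v, w in edges:
        yield '%s %s %s' % (number[u], number[v], w)

lines = list(pajek_lines(['a', 'my node'], [('a', 'my node', 2)]))
# ['*vertices 2', '1 a', '2 "my node"', '*edges', '1 2 2']
```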
# yield '*network %s'%name # write nodes with attributes yield '*vertices %s'%(G.order()) nodes = G.nodes() # make dictionary mapping nodes to integers nodenumber=dict(zip(nodes,range(1,len(nodes)+1))) for n in nodes: na=G.node.get(n,{}) x=na.get('x',0.0) y=na.get('y',0.0) id=int(na.get('id',nodenumber[n])) nodenumber[n]=id shape=na.get('shape','ellipse') s=' '.join(map(make_qstr,(id,n,x,y,shape))) for k,v in na.items(): s+=' %s %s'%(make_qstr(k),make_qstr(v)) yield s # write edges with attributes if G.is_directed(): yield '*arcs' else: yield '*edges' for u,v,edgedata in G.edges(data=True): d=edgedata.copy() value=d.pop('weight',1.0) # use 1 as default edge value s=' '.join(map(make_qstr,(nodenumber[u],nodenumber[v],value))) for k,v in d.items(): s+=' %s %s'%(make_qstr(k),make_qstr(v)) s+=' %s %s'%(k,v) yield s @open_file(1,mode='wb') def write_pajek(G, path, encoding='UTF-8'): """Write graph in Pajek format to path. Parameters ---------- G : graph A Networkx graph path : file or string File or filename to write. Filenames ending in .gz or .bz2 will be compressed. Examples -------- >>> G=nx.path_graph(4) >>> nx.write_pajek(G, "test.net") References ---------- See http://vlado.fmf.uni-lj.si/pub/networks/pajek/doc/draweps.htm for format information. """ for line in generate_pajek(G): line+='\n' path.write(line.encode(encoding)) @open_file(0, mode='rb') def read_pajek(path, encoding='UTF-8'): """Read graph in Pajek format from path. Parameters ---------- path : file or string File or filename to write. Filenames ending in .gz or .bz2 will be uncompressed. Returns ------- G : NetworkX MultiGraph or MultiDiGraph. Examples -------- >>> G=nx.path_graph(4) >>> nx.write_pajek(G, "test.net") >>> G=nx.read_pajek("test.net") To create a Graph instead of a MultiGraph use >>> G1=nx.Graph(G) References ---------- See http://vlado.fmf.uni-lj.si/pub/networks/pajek/doc/draweps.htm for format information. 
""" lines = (line.decode(encoding) for line in path) return parse_pajek(lines) def parse_pajek(lines): """Parse Pajek format graph from string or iterable. Parameters ---------- lines : string or iterable Data in Pajek format. Returns ------- G : NetworkX graph See Also -------- read_pajek() """ import shlex # multigraph=False if is_string_like(lines): lines=iter(lines.split('\n')) lines = iter([line.rstrip('\n') for line in lines]) G=nx.MultiDiGraph() # are multiedges allowed in Pajek? assume yes while lines: try: l=next(lines) except: #EOF break if l.lower().startswith("*network"): try: label, name = l.split() except ValueError: # Line was not of the form: *network NAME pass else: G.graph['name'] = name elif l.lower().startswith("*vertices"): nodelabels={} l,nnodes=l.split() for i in range(int(nnodes)): l = next(lines) try: splitline=[x.decode('utf-8') for x in shlex.split(make_str(l).encode('utf-8'))] except AttributeError: splitline = shlex.split(str(l)) id,label=splitline[0:2] G.add_node(label) nodelabels[id]=label G.node[label]={'id':id} try: x,y,shape=splitline[2:5] G.node[label].update({'x':float(x), 'y':float(y), 'shape':shape}) except: pass extra_attr=zip(splitline[5::2],splitline[6::2]) G.node[label].update(extra_attr) elif l.lower().startswith("*edges") or l.lower().startswith("*arcs"): if l.lower().startswith("*edge"): # switch from multidigraph to multigraph G=nx.MultiGraph(G) if l.lower().startswith("*arcs"): # switch to directed with multiple arcs for each existing edge G=G.to_directed() for l in lines: try: splitline = [x.decode('utf-8') for x in shlex.split(make_str(l).encode('utf-8'))] except AttributeError: splitline = shlex.split(str(l)) if len(splitline)<2: continue ui,vi=splitline[0:2] u=nodelabels.get(ui,ui) v=nodelabels.get(vi,vi) # parse the data attached to this edge and put in a dictionary edge_data={} try: # there should always be a single value on the edge? 
w=splitline[2:3] edge_data.update({'weight':float(w[0])}) except: pass # if there isn't, just assign a 1 # edge_data.update({'value':1}) extra_attr=zip(splitline[3::2],splitline[4::2]) edge_data.update(extra_attr) # if G.has_edge(u,v): # multigraph=True G.add_edge(u,v,**edge_data) return G def make_qstr(t): """Return the string representation of t. Add outer double-quotes if the string has a space. """ if not is_string_like(t): t = str(t) if " " in t: t=r'"%s"'%t return t # fixture for nose tests def teardown_module(module): import os os.unlink('test.net') networkx-1.11/networkx/readwrite/sparse6.py0000644000175000017500000001772412637544450021024 0ustar aricaric00000000000000"""Sparse6 Read and write graphs in sparse6 format. Format ------ "graph6 and sparse6 are formats for storing undirected graphs in a compact manner, using only printable ASCII characters. Files in these formats have text type and contain one line per graph." See http://cs.anu.edu.au/~bdm/data/formats.txt for details. """ # Original author: D. Eppstein, UC Irvine, August 12, 2003. # The original code at http://www.ics.uci.edu/~eppstein/PADS/ is public domain. # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # Tomas Gavenciak # All rights reserved. # BSD license. 
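Before the sparse6 machinery below, it helps to see the printable encoding graph6 and sparse6 share: every 6-bit value k (0..63) is stored as the ASCII character ``chr(k + 63)``, so all data characters fall in the printable range ``'?'`` (63) to ``'~'`` (126). A self-contained sketch:

```python
# The 6-bit printable encoding shared by graph6/sparse6.
def data_to_ascii(values):
    """Map 6-bit values (0..63) to their printable characters."""
    return ''.join(chr(v + 63) for v in values)

def ascii_to_data(s):
    """Inverse mapping: printable characters back to 6-bit values."""
    return [ord(c) - 63 for c in s]

# 0 -> '?', 63 -> '~'; the round trip is lossless.
```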
import networkx as nx from networkx.exception import NetworkXError from networkx.utils import open_file, not_implemented_for from networkx.readwrite.graph6 import data_to_graph6, graph6_to_data,\ data_to_n, n_to_data __author__ = """\n""".join(['Tomas Gavenciak ', 'Aric Hagberg >> G = nx.parse_sparse6(':A_') >>> sorted(G.edges()) [(0, 1), (0, 1), (0, 1)] See Also -------- generate_sparse6, read_sparse6, write_sparse6 References ---------- Sparse6 specification: http://cs.anu.edu.au/~bdm/data/formats.txt """ if string.startswith('>>sparse6<<'): string = string[11:] if not string.startswith(':'): raise NetworkXError('Expected leading colon in sparse6') n, data = data_to_n(graph6_to_data(string[1:])) k = 1 while 1<>dLen) & 1 # grab top remaining bit x = d & ((1<> (xLen - k)) # shift back the extra bits dLen = xLen - k yield b,x v = 0 G = nx.MultiGraph() G.add_nodes_from(range(n)) multigraph = False for b,x in parseData(): if b == 1: v += 1 # padding with ones can cause overlarge number here if x >= n or v >= n: break elif x > v: v = x else: if G.has_edge(x,v): multigraph = True G.add_edge(x,v) if not multigraph: G = nx.Graph(G) return G @open_file(0,mode='rt') def read_sparse6(path): """Read an undirected graph in sparse6 format from path. Parameters ---------- path : file or string File or filename to write. 
Returns ------- G : Graph/Multigraph or list of Graphs/MultiGraphs If the file contains multple lines then a list of graphs is returned Raises ------ NetworkXError If the string is unable to be parsed in sparse6 format Examples -------- >>> nx.write_sparse6(nx.Graph([(0,1),(0,1),(0,1)]), 'test.s6') >>> G = nx.read_sparse6('test.s6') >>> sorted(G.edges()) [(0, 1)] See Also -------- generate_sparse6, read_sparse6, parse_sparse6 References ---------- Sparse6 specification: http://cs.anu.edu.au/~bdm/data/formats.txt """ glist = [] for line in path: line = line.strip() if not len(line): continue glist.append(parse_sparse6(line)) if len(glist) == 1: return glist[0] else: return glist @not_implemented_for('directed') def generate_sparse6(G, nodes=None, header=True): """Generate sparse6 format string from an undirected graph. Parameters ---------- G : Graph (undirected) nodes: list or iterable Nodes are labeled 0...n-1 in the order provided. If None the ordering given by G.nodes() is used. header: bool If True add '>>sparse6<<' string to head of data Returns ------- s : string String in sparse6 format Raises ------ NetworkXError If the graph is directed Examples -------- >>> G = nx.MultiGraph([(0, 1), (0, 1), (0, 1)]) >>> nx.generate_sparse6(G) '>>sparse6<<:A_' See Also -------- read_sparse6, parse_sparse6, write_sparse6 Notes ----- The format does not support edge or node labels. References ---------- Sparse6 specification: http://cs.anu.edu.au/~bdm/data/formats.txt for details. 
""" n = G.order() k = 1 while 1< node else: ns = list(nodes) ndict = dict(((ns[i], i) for i in range(len(ns)))) # node -> number edges = [(ndict[u], ndict[v]) for (u, v) in G.edges()] edges = [(max(u,v), min(u,v)) for (u, v) in edges] edges.sort() bits = [] curv = 0 for (v, u) in edges: if v == curv: # current vertex edge bits.append(0) bits.extend(enc(u)) elif v == curv + 1: # next vertex edge curv += 1 bits.append(1) bits.extend(enc(u)) else: # skip to vertex v and then add edge to u curv = v bits.append(1) bits.extend(enc(v)) bits.append(0) bits.extend(enc(u)) if k < 6 and n == (1 << k) and ((-len(bits)) % 6) >= k and curv < (n - 1): # Padding special case: small k, n=2^k, # more than k bits of padding needed, # current vertex is not (n-1) -- # appending 1111... would add a loop on (n-1) bits.append(0) bits.extend([1] * ((-len(bits)) % 6)) else: bits.extend([1] * ((-len(bits)) % 6)) data = [(bits[i+0]<<5) + (bits[i+1]<<4) + (bits[i+2]<<3) + (bits[i+3]<<2) + (bits[i+4]<<1) + (bits[i+5]<<0) for i in range(0, len(bits), 6)] res = (':' + data_to_graph6(n_to_data(n)) + data_to_graph6(data)) if header: return '>>sparse6<<' + res else: return res @open_file(1, mode='wt') def write_sparse6(G, path, nodes=None, header=True): """Write graph G to given path in sparse6 format. Parameters ---------- G : Graph (undirected) path : file or string File or filename to write nodes: list or iterable Nodes are labeled 0...n-1 in the order provided. If None the ordering given by G.nodes() is used. header: bool If True add '>>sparse6<<' string to head of data Raises ------ NetworkXError If the graph is directed Examples -------- >>> G = nx.Graph([(0, 1), (0, 1), (0, 1)]) >>> nx.write_sparse6(G, 'test.s6') See Also -------- read_sparse6, parse_sparse6, generate_sparse6 Notes ----- The format does not support edge or node labels. References ---------- Sparse6 specification: http://cs.anu.edu.au/~bdm/data/formats.txt for details. 
""" path.write(generate_sparse6(G, nodes=nodes, header=header)) path.write('\n') def teardown_module(test): import os if os.path.isfile('test.s6'): os.unlink('test.s6') networkx-1.11/networkx/readwrite/gexf.py0000644000175000017500000010630212637544500020355 0ustar aricaric00000000000000""" **** GEXF **** Read and write graphs in GEXF format. GEXF (Graph Exchange XML Format) is a language for describing complex network structures, their associated data and dynamics. This implementation does not support mixed graphs (directed and undirected edges together). Format ------ GEXF is an XML format. See http://gexf.net/format/schema.html for the specification and http://gexf.net/format/basic.html for examples. """ # Copyright (C) 2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. # Based on GraphML NetworkX GraphML reader __author__ = """\n""".join(['Aric Hagberg ']) __all__ = ['write_gexf', 'read_gexf', 'relabel_gexf_graph', 'generate_gexf'] import itertools import networkx as nx from networkx.utils import open_file, make_str try: from xml.etree.cElementTree import Element, ElementTree, tostring except ImportError: try: from xml.etree.ElementTree import Element, ElementTree, tostring except ImportError: pass @open_file(1,mode='wb') def write_gexf(G, path, encoding='utf-8',prettyprint=True,version='1.1draft'): """Write G in GEXF format to path. "GEXF (Graph Exchange XML Format) is a language for describing complex networks structures, their associated data and dynamics" [1]_. Parameters ---------- G : graph A NetworkX graph path : file or string File or file name to write. File names ending in .gz or .bz2 will be compressed. encoding : string (optional) Encoding for text data. prettyprint : bool (optional) If True use line breaks and indenting in output XML. Examples -------- >>> G=nx.path_graph(4) >>> nx.write_gexf(G, "test.gexf") Notes ----- This implementation does not support mixed graphs (directed and undirected edges together). 
The node id attribute is set to be the string of the node label. If you want to specify an id use set it as node data, e.g. node['a']['id']=1 to set the id of node 'a' to 1. References ---------- .. [1] GEXF graph format, http://gexf.net/format/ """ writer = GEXFWriter(encoding=encoding,prettyprint=prettyprint, version=version) writer.add_graph(G) writer.write(path) def generate_gexf(G, encoding='utf-8',prettyprint=True,version='1.1draft'): """Generate lines of GEXF format representation of G" "GEXF (Graph Exchange XML Format) is a language for describing complex networks structures, their associated data and dynamics" [1]_. Parameters ---------- G : graph A NetworkX graph encoding : string (optional) Encoding for text data. prettyprint : bool (optional) If True use line breaks and indenting in output XML. Examples -------- >>> G=nx.path_graph(4) >>> linefeed=chr(10) # linefeed=\n >>> s=linefeed.join(nx.generate_gexf(G)) # doctest: +SKIP >>> for line in nx.generate_gexf(G): # doctest: +SKIP ... print line Notes ----- This implementation does not support mixed graphs (directed and undirected edges together). The node id attribute is set to be the string of the node label. If you want to specify an id use set it as node data, e.g. node['a']['id']=1 to set the id of node 'a' to 1. References ---------- .. [1] GEXF graph format, http://gexf.net/format/ """ writer = GEXFWriter(encoding=encoding,prettyprint=prettyprint, version=version) writer.add_graph(G) for line in str(writer).splitlines(): yield line @open_file(0,mode='rb') def read_gexf(path,node_type=None,relabel=False,version='1.1draft'): """Read graph in GEXF format from path. "GEXF (Graph Exchange XML Format) is a language for describing complex networks structures, their associated data and dynamics" [1]_. Parameters ---------- path : file or string File or file name to write. File names ending in .gz or .bz2 will be compressed. node_type: Python type (default: None) Convert node ids to this type if not None. 
relabel : bool (default: False) If True relabel the nodes to use the GEXF node "label" attribute instead of the node "id" attribute as the NetworkX node label. Returns ------- graph: NetworkX graph If no parallel edges are found a Graph or DiGraph is returned. Otherwise a MultiGraph or MultiDiGraph is returned. Notes ----- This implementation does not support mixed graphs (directed and undirected edges together). References ---------- .. [1] GEXF graph format, http://gexf.net/format/ """ reader = GEXFReader(node_type=node_type,version=version) if relabel: G=relabel_gexf_graph(reader(path)) else: G=reader(path) return G class GEXF(object): # global register_namespace versions={} d={'NS_GEXF':"http://www.gexf.net/1.1draft", 'NS_VIZ':"http://www.gexf.net/1.1draft/viz", 'NS_XSI':"http://www.w3.org/2001/XMLSchema-instance", 'SCHEMALOCATION':' '.join(['http://www.gexf.net/1.1draft', 'http://www.gexf.net/1.1draft/gexf.xsd' ]), 'VERSION':'1.1' } versions['1.1draft']=d d={'NS_GEXF':"http://www.gexf.net/1.2draft", 'NS_VIZ':"http://www.gexf.net/1.2draft/viz", 'NS_XSI':"http://www.w3.org/2001/XMLSchema-instance", 'SCHEMALOCATION':' '.join(['http://www.gexf.net/1.2draft', 'http://www.gexf.net/1.2draft/gexf.xsd' ]), 'VERSION':'1.2' } versions['1.2draft']=d types=[(int,"integer"), (float,"float"), (float,"double"), (bool,"boolean"), (list,"string"), (dict,"string"), ] try: # Python 3.x blurb = chr(1245) # just to trigger the exception types.extend([ (int, "long"), (str,"liststring"), (str,"anyURI"), (str,"string")]) except ValueError: # Python 2.6+ types.extend([ (long,"long"), (str,"liststring"), (str,"anyURI"), (str,"string"), (unicode,"liststring"), (unicode,"anyURI"), (unicode,"string")]) xml_type = dict(types) python_type = dict(reversed(a) for a in types) # http://www.w3.org/TR/xmlschema-2/#boolean convert_bool = { 'true': True, 'false': False, 'True': True, 'False': False, '0': False, 0: False, '1': False, 1: True } # try: # register_namespace = ET.register_namespace # 
except AttributeError: # def register_namespace(prefix, uri): # ET._namespace_map[uri] = prefix def set_version(self,version): d=self.versions.get(version) if d is None: raise nx.NetworkXError('Unknown GEXF version %s'%version) self.NS_GEXF = d['NS_GEXF'] self.NS_VIZ = d['NS_VIZ'] self.NS_XSI = d['NS_XSI'] self.SCHEMALOCATION = d['NS_XSI'] self.VERSION=d['VERSION'] self.version=version # register_namespace('viz', d['NS_VIZ']) class GEXFWriter(GEXF): # class for writing GEXF format files # use write_gexf() function def __init__(self, graph=None, encoding="utf-8", prettyprint=True, version='1.1draft'): try: import xml.etree.ElementTree except ImportError: raise ImportError('GEXF writer requires ' 'xml.elementtree.ElementTree') self.prettyprint=prettyprint self.encoding = encoding self.set_version(version) self.xml = Element("gexf", {'xmlns':self.NS_GEXF, 'xmlns:xsi':self.NS_XSI, 'xmlns:viz':self.NS_VIZ, 'xsi:schemaLocation':self.SCHEMALOCATION, 'version':self.VERSION}) # counters for edge and attribute identifiers self.edge_id=itertools.count() self.attr_id=itertools.count() # default attributes are stored in dictionaries self.attr={} self.attr['node']={} self.attr['edge']={} self.attr['node']['dynamic']={} self.attr['node']['static']={} self.attr['edge']['dynamic']={} self.attr['edge']['static']={} if graph is not None: self.add_graph(graph) def __str__(self): if self.prettyprint: self.indent(self.xml) s=tostring(self.xml).decode(self.encoding) return s def add_graph(self, G): # set graph attributes if G.graph.get('mode')=='dynamic': mode='dynamic' else: mode='static' # Add a graph element to the XML if G.is_directed(): default='directed' else: default='undirected' graph_element = Element("graph",defaultedgetype=default,mode=mode) self.graph_element=graph_element self.add_nodes(G,graph_element) self.add_edges(G,graph_element) self.xml.append(graph_element) def add_nodes(self, G, graph_element): nodes_element = Element('nodes') for node,data in 
G.nodes_iter(data=True): node_data=data.copy() node_id = make_str(node_data.pop('id', node)) kw={'id':node_id} label = make_str(node_data.pop('label', node)) kw['label']=label try: pid=node_data.pop('pid') kw['pid'] = make_str(pid) except KeyError: pass # add node element with attributes node_element = Element("node", **kw) # add node element and attr subelements default=G.graph.get('node_default',{}) node_data=self.add_parents(node_element, node_data) if self.version=='1.1': node_data=self.add_slices(node_element, node_data) else: node_data=self.add_spells(node_element, node_data) node_data=self.add_viz(node_element,node_data) node_data=self.add_attributes("node", node_element, node_data, default) nodes_element.append(node_element) graph_element.append(nodes_element) def add_edges(self, G, graph_element): def edge_key_data(G): # helper function to unify multigraph and graph edge iterator if G.is_multigraph(): for u,v,key,data in G.edges_iter(data=True,keys=True): edge_data=data.copy() edge_data.update(key=key) edge_id=edge_data.pop('id',None) if edge_id is None: edge_id=next(self.edge_id) yield u,v,edge_id,edge_data else: for u,v,data in G.edges_iter(data=True): edge_data=data.copy() edge_id=edge_data.pop('id',None) if edge_id is None: edge_id=next(self.edge_id) yield u,v,edge_id,edge_data edges_element = Element('edges') for u,v,key,edge_data in edge_key_data(G): kw={'id':make_str(key)} try: edge_weight=edge_data.pop('weight') kw['weight']=make_str(edge_weight) except KeyError: pass try: edge_type=edge_data.pop('type') kw['type']=make_str(edge_type) except KeyError: pass try: start=edge_data.pop('start') kw['start']=make_str(start) self.alter_graph_mode_timeformat(start) except KeyError: pass try: end=edge_data.pop('end') kw['end']=make_str(end) self.alter_graph_mode_timeformat(end) except KeyError: pass source_id = make_str(G.node[u].get('id', u)) target_id = make_str(G.node[v].get('id', v)) edge_element = Element("edge", source=source_id,target=target_id, **kw) 
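The writer above builds everything from `xml.etree` `Element` objects; its namespaced `<gexf>` root can be reproduced with the stdlib alone (URIs copied from the 1.2draft entry of the version table):

```python
from xml.etree.ElementTree import Element, tostring

NS_GEXF = "http://www.gexf.net/1.2draft"
NS_VIZ = "http://www.gexf.net/1.2draft/viz"
NS_XSI = "http://www.w3.org/2001/XMLSchema-instance"
SCHEMALOCATION = ("http://www.gexf.net/1.2draft "
                  "http://www.gexf.net/1.2draft/gexf.xsd")

# The namespace declarations ride along as plain attributes on the root,
# the same trick GEXFWriter uses instead of ET.register_namespace().
root = Element("gexf", {
    "xmlns": NS_GEXF,
    "xmlns:xsi": NS_XSI,
    "xmlns:viz": NS_VIZ,
    "xsi:schemaLocation": SCHEMALOCATION,
    "version": "1.2",
})
xml = tostring(root).decode("utf-8")
```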
default=G.graph.get('edge_default',{}) if self.version == '1.1': edge_data=self.add_slices(edge_element, edge_data) else: edge_data=self.add_spells(edge_element, edge_data) edge_data=self.add_viz(edge_element,edge_data) edge_data=self.add_attributes("edge", edge_element, edge_data, default) edges_element.append(edge_element) graph_element.append(edges_element) def add_attributes(self, node_or_edge, xml_obj, data, default): # Add attrvalues to node or edge attvalues=Element('attvalues') if len(data)==0: return data mode='static' for k,v in data.items(): # rename generic multigraph key to avoid any name conflict if k == 'key': k='networkx_key' val_type=type(v) if type(v)==list: # dynamic data for val,start,end in v: val_type = type(val) if start is not None or end is not None: mode='dynamic' self.alter_graph_mode_timeformat(start) self.alter_graph_mode_timeformat(end) break attr_id = self.get_attr_id(make_str(k), self.xml_type[val_type], node_or_edge, default, mode) for val,start,end in v: e=Element("attvalue") e.attrib['for']=attr_id e.attrib['value']=make_str(val) if start is not None: e.attrib['start']=make_str(start) if end is not None: e.attrib['end']=make_str(end) attvalues.append(e) else: # static data mode='static' attr_id = self.get_attr_id(make_str(k), self.xml_type[val_type], node_or_edge, default, mode) e=Element("attvalue") e.attrib['for']=attr_id if type(v) == bool: e.attrib['value']=make_str(v).lower() else: e.attrib['value']=make_str(v) attvalues.append(e) xml_obj.append(attvalues) return data def get_attr_id(self, title, attr_type, edge_or_node, default, mode): # find the id of the attribute or generate a new id try: return self.attr[edge_or_node][mode][title] except KeyError: # generate new id new_id=str(next(self.attr_id)) self.attr[edge_or_node][mode][title] = new_id attr_kwargs = {"id":new_id, "title":title, "type":attr_type} attribute=Element("attribute",**attr_kwargs) # add subelement for data default value if present 
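`get_attr_id` above is a cache: attribute ids are minted from `itertools.count` only when a (class, mode, title) triple has not been seen before. Stripped of the XML bookkeeping, the pattern is a few lines (the class name here is illustrative):

```python
import itertools

class AttrRegistry(object):
    # Minimal sketch of GEXFWriter.get_attr_id's caching: one shared
    # counter, nested dicts keyed by element class and mode, a new id
    # minted only on a cache miss.
    def __init__(self):
        self._count = itertools.count()
        self.attr = {'node': {'static': {}, 'dynamic': {}},
                     'edge': {'static': {}, 'dynamic': {}}}

    def get_attr_id(self, title, edge_or_node, mode):
        try:
            return self.attr[edge_or_node][mode][title]
        except KeyError:
            new_id = str(next(self._count))
            self.attr[edge_or_node][mode][title] = new_id
            return new_id

reg = AttrRegistry()
first = reg.get_attr_id('weight', 'edge', 'static')
again = reg.get_attr_id('weight', 'edge', 'static')   # cache hit
other = reg.get_attr_id('weight', 'node', 'static')   # different class
```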
default_title=default.get(title) if default_title is not None: default_element=Element("default") default_element.text=make_str(default_title) attribute.append(default_element) # new insert it into the XML attributes_element=None for a in self.graph_element.findall("attributes"): # find existing attributes element by class and mode a_class=a.get('class') a_mode=a.get('mode','static') # default mode is static if a_class==edge_or_node and a_mode==mode: attributes_element=a if attributes_element is None: # create new attributes element attr_kwargs = {"mode":mode,"class":edge_or_node} attributes_element=Element('attributes', **attr_kwargs) self.graph_element.insert(0,attributes_element) attributes_element.append(attribute) return new_id def add_viz(self,element,node_data): viz=node_data.pop('viz',False) if viz: color=viz.get('color') if color is not None: if self.VERSION=='1.1': e=Element("{%s}color"%self.NS_VIZ, r=str(color.get('r')), g=str(color.get('g')), b=str(color.get('b')), ) else: e=Element("{%s}color"%self.NS_VIZ, r=str(color.get('r')), g=str(color.get('g')), b=str(color.get('b')), a=str(color.get('a')), ) element.append(e) size=viz.get('size') if size is not None: e=Element("{%s}size"%self.NS_VIZ,value=str(size)) element.append(e) thickness=viz.get('thickness') if thickness is not None: e=Element("{%s}thickness"%self.NS_VIZ,value=str(thickness)) element.append(e) shape=viz.get('shape') if shape is not None: if shape.startswith('http'): e=Element("{%s}shape"%self.NS_VIZ, value='image',uri=str(shape)) else: e=Element("{%s}shape"%self.NS_VIZ,value=str(shape)) element.append(e) position=viz.get('position') if position is not None: e=Element("{%s}position"%self.NS_VIZ, x=str(position.get('x')), y=str(position.get('y')), z=str(position.get('z')), ) element.append(e) return node_data def add_parents(self,node_element,node_data): parents=node_data.pop('parents',False) if parents: parents_element=Element('parents') for p in parents: e=Element('parent') 
e.attrib['for']=str(p) parents_element.append(e) node_element.append(parents_element) return node_data def add_slices(self,node_or_edge_element,node_or_edge_data): slices=node_or_edge_data.pop('slices',False) if slices: slices_element=Element('slices') for start,end in slices: e=Element('slice',start=str(start),end=str(end)) slices_element.append(e) node_or_edge_element.append(slices_element) return node_or_edge_data def add_spells(self,node_or_edge_element,node_or_edge_data): spells=node_or_edge_data.pop('spells',False) if spells: spells_element=Element('spells') for start,end in spells: e=Element('spell') if start is not None: e.attrib['start']=make_str(start) self.alter_graph_mode_timeformat(start) if end is not None: e.attrib['end']=make_str(end) self.alter_graph_mode_timeformat(end) spells_element.append(e) node_or_edge_element.append(spells_element) return node_or_edge_data def alter_graph_mode_timeformat(self, start_or_end): # if 'start' or 'end' appears, alter Graph mode to dynamic and set timeformat if self.graph_element.get('mode') == 'static': if start_or_end is not None: if type(start_or_end) == str: timeformat = 'date' elif type(start_or_end) == float: timeformat = 'double' elif type(start_or_end) == int: timeformat = 'long' self.graph_element.set('timeformat', timeformat) self.graph_element.set('mode', 'dynamic') def write(self, fh): # Serialize graph G in GEXF to the open fh if self.prettyprint: self.indent(self.xml) document = ElementTree(self.xml) document.write(fh, encoding=self.encoding, xml_declaration=True) def indent(self, elem, level=0): # in-place prettyprint formatter i = "\n" + level*" " if len(elem): if not elem.text or not elem.text.strip(): elem.text = i + " " if not elem.tail or not elem.tail.strip(): elem.tail = i for elem in elem: self.indent(elem, level+1) if not elem.tail or not elem.tail.strip(): elem.tail = i else: if level and (not elem.tail or not elem.tail.strip()): elem.tail = i class GEXFReader(GEXF): # Class to read GEXF 
format files # use read_gexf() function def __init__(self, node_type=None,version='1.1draft'): try: import xml.etree.ElementTree except ImportError: raise ImportError('GEXF reader requires ' 'xml.elementtree.ElementTree') self.node_type=node_type # assume simple graph and test for multigraph on read self.simple_graph=True self.set_version(version) def __call__(self, stream): self.xml = ElementTree(file=stream) g=self.xml.find("{%s}graph" % self.NS_GEXF) if g is not None: return self.make_graph(g) # try all the versions for version in self.versions: self.set_version(version) g=self.xml.find("{%s}graph" % self.NS_GEXF) if g is not None: return self.make_graph(g) raise nx.NetworkXError("No element in GEXF file") def make_graph(self, graph_xml): # start with empty DiGraph or MultiDiGraph edgedefault = graph_xml.get("defaultedgetype", None) if edgedefault=='directed': G=nx.MultiDiGraph() else: G=nx.MultiGraph() # graph attributes graph_start=graph_xml.get('start') if graph_start is not None: G.graph['start']=graph_start graph_end=graph_xml.get('end') if graph_end is not None: G.graph['end']=graph_end graph_mode=graph_xml.get("mode", "") if graph_mode=='dynamic': G.graph['mode']='dynamic' else: G.graph['mode']='static' # timeformat self.timeformat=graph_xml.get('timeformat') if self.timeformat == 'date': self.timeformat = 'string' # node and edge attributes attributes_elements=graph_xml.findall("{%s}attributes"%self.NS_GEXF) # dictionaries to hold attributes and attribute defaults node_attr={} node_default={} edge_attr={} edge_default={} for a in attributes_elements: attr_class = a.get("class") if attr_class=='node': na,nd = self.find_gexf_attributes(a) node_attr.update(na) node_default.update(nd) G.graph['node_default']=node_default elif attr_class=='edge': ea,ed = self.find_gexf_attributes(a) edge_attr.update(ea) edge_default.update(ed) G.graph['edge_default']=edge_default else: raise # unknown attribute class # Hack to handle Gephi0.7beta bug # add weight attribute 
ea={'weight':{'type': 'double', 'mode': 'static', 'title': 'weight'}} ed={} edge_attr.update(ea) edge_default.update(ed) G.graph['edge_default']=edge_default # add nodes nodes_element=graph_xml.find("{%s}nodes" % self.NS_GEXF) if nodes_element is not None: for node_xml in nodes_element.findall("{%s}node" % self.NS_GEXF): self.add_node(G, node_xml, node_attr) # add edges edges_element=graph_xml.find("{%s}edges" % self.NS_GEXF) if edges_element is not None: for edge_xml in edges_element.findall("{%s}edge" % self.NS_GEXF): self.add_edge(G, edge_xml, edge_attr) # switch to Graph or DiGraph if no parallel edges were found. if self.simple_graph: if G.is_directed(): G=nx.DiGraph(G) else: G=nx.Graph(G) return G def add_node(self, G, node_xml, node_attr, node_pid=None): # add a single node with attributes to the graph # get attributes and subattributues for node data = self.decode_attr_elements(node_attr, node_xml) data = self.add_parents(data, node_xml) # add any parents if self.version=='1.1': data = self.add_slices(data, node_xml) # add slices else: data = self.add_spells(data, node_xml) # add spells data = self.add_viz(data, node_xml) # add viz data = self.add_start_end(data, node_xml) # add start/end # find the node id and cast it to the appropriate type node_id = node_xml.get("id") if self.node_type is not None: node_id=self.node_type(node_id) # every node should have a label node_label = node_xml.get("label") data['label']=node_label # parent node id node_pid = node_xml.get("pid", node_pid) if node_pid is not None: data['pid']=node_pid # check for subnodes, recursive subnodes=node_xml.find("{%s}nodes" % self.NS_GEXF) if subnodes is not None: for node_xml in subnodes.findall("{%s}node" % self.NS_GEXF): self.add_node(G, node_xml, node_attr, node_pid=node_id) G.add_node(node_id, data) def add_start_end(self, data, xml): # start and end times ttype = self.timeformat node_start = xml.get("start") if node_start is not None: data['start']=self.python_type[ttype](node_start) 
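`add_start_end` above casts the raw `start`/`end` attribute strings through `python_type[timeformat]`, after a 'date' timeformat has been downgraded to plain strings. A minimal standalone version of that dispatch (the `python_type` table here is a local stand-in for the class attribute):

```python
# Cast XML 'start'/'end' attributes using the graph's declared
# timeformat; 'date' values are kept as strings, as in GEXFReader.
python_type = {'long': int, 'double': float, 'float': float, 'string': str}

def add_start_end(data, xml_attrs, timeformat):
    if timeformat == 'date':
        timeformat = 'string'
    caster = python_type[timeformat]
    for key in ('start', 'end'):
        value = xml_attrs.get(key)
        if value is not None:
            data[key] = caster(value)
    return data

d = add_start_end({}, {'start': '1', 'end': '5'}, 'long')
```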
node_end = xml.get("end") if node_end is not None: data['end']=self.python_type[ttype](node_end) return data def add_viz(self, data, node_xml): # add viz element for node viz={} color=node_xml.find("{%s}color"%self.NS_VIZ) if color is not None: if self.VERSION=='1.1': viz['color']={'r':int(color.get('r')), 'g':int(color.get('g')), 'b':int(color.get('b'))} else: viz['color']={'r':int(color.get('r')), 'g':int(color.get('g')), 'b':int(color.get('b')), 'a':float(color.get('a', 1)), } size=node_xml.find("{%s}size"%self.NS_VIZ) if size is not None: viz['size']=float(size.get('value')) thickness=node_xml.find("{%s}thickness"%self.NS_VIZ) if thickness is not None: viz['thickness']=float(thickness.get('value')) shape=node_xml.find("{%s}shape"%self.NS_VIZ) if shape is not None: viz['shape']=shape.get('shape') if viz['shape']=='image': viz['shape']=shape.get('uri') position=node_xml.find("{%s}position"%self.NS_VIZ) if position is not None: viz['position']={'x':float(position.get('x',0)), 'y':float(position.get('y',0)), 'z':float(position.get('z',0))} if len(viz)>0: data['viz']=viz return data def add_parents(self, data, node_xml): parents_element=node_xml.find("{%s}parents"%self.NS_GEXF) if parents_element is not None: data['parents']=[] for p in parents_element.findall("{%s}parent"%self.NS_GEXF): parent=p.get('for') data['parents'].append(parent) return data def add_slices(self, data, node_or_edge_xml): slices_element=node_or_edge_xml.find("{%s}slices"%self.NS_GEXF) if slices_element is not None: data['slices']=[] for s in slices_element.findall("{%s}slice"%self.NS_GEXF): start=s.get('start') end=s.get('end') data['slices'].append((start,end)) return data def add_spells(self, data, node_or_edge_xml): spells_element=node_or_edge_xml.find("{%s}spells"%self.NS_GEXF) if spells_element is not None: data['spells']=[] ttype = self.timeformat for s in spells_element.findall("{%s}spell"%self.NS_GEXF): start=self.python_type[ttype](s.get('start')) 
end=self.python_type[ttype](s.get('end')) data['spells'].append((start,end)) return data def add_edge(self, G, edge_element, edge_attr): # add an edge to the graph # raise error if we find mixed directed and undirected edges edge_direction = edge_element.get("type") if G.is_directed() and edge_direction=='undirected': raise nx.NetworkXError(\ "Undirected edge found in directed graph.") if (not G.is_directed()) and edge_direction=='directed': raise nx.NetworkXError(\ "Directed edge found in undirected graph.") # Get source and target and recast type if required source = edge_element.get("source") target = edge_element.get("target") if self.node_type is not None: source=self.node_type(source) target=self.node_type(target) data = self.decode_attr_elements(edge_attr, edge_element) data = self.add_start_end(data,edge_element) if self.version=='1.1': data = self.add_slices(data, edge_element) # add slices else: data = self.add_spells(data, edge_element) # add spells # GEXF stores edge ids as an attribute # NetworkX uses them as keys in multigraphs # if networkx_key is not specified as an attribute edge_id = edge_element.get("id") if edge_id is not None: data["id"] = edge_id # check if there is a 'multigraph_key' and use that as edge_id multigraph_key = data.pop('networkx_key',None) if multigraph_key is not None: edge_id=multigraph_key weight = edge_element.get('weight') if weight is not None: data['weight']=float(weight) edge_label = edge_element.get("label") if edge_label is not None: data['label']=edge_label if G.has_edge(source,target): # seen this edge before - this is a multigraph self.simple_graph=False G.add_edge(source, target, key=edge_id, **data) if edge_direction=='mutual': G.add_edge(target, source, key=edge_id, **data) def decode_attr_elements(self, gexf_keys, obj_xml): # Use the key information to decode the attr XML attr = {} # look for outer "" element attr_element=obj_xml.find("{%s}attvalues" % self.NS_GEXF) if attr_element is not None: # loop over 
elements for a in attr_element.findall("{%s}attvalue" % self.NS_GEXF): key = a.get('for') # for is required try: # should be in our gexf_keys dictionary title=gexf_keys[key]['title'] except KeyError: raise nx.NetworkXError("No attribute defined for=%s"%key) atype=gexf_keys[key]['type'] value=a.get('value') if atype=='boolean': value=self.convert_bool[value] else: value=self.python_type[atype](value) if gexf_keys[key]['mode']=='dynamic': # for dynamic graphs use list of three-tuples # [(value1,start1,end1), (value2,start2,end2), etc] ttype = self.timeformat start=self.python_type[ttype](a.get('start')) end=self.python_type[ttype](a.get('end')) if title in attr: attr[title].append((value,start,end)) else: attr[title]=[(value,start,end)] else: # for static graphs just assign the value attr[title] = value return attr def find_gexf_attributes(self, attributes_element): # Extract all the attributes and defaults attrs = {} defaults = {} mode=attributes_element.get('mode') for k in attributes_element.findall("{%s}attribute" % self.NS_GEXF): attr_id = k.get("id") title=k.get('title') atype=k.get('type') attrs[attr_id]={'title':title,'type':atype,'mode':mode} # check for the "default" subelement of key element and add default=k.find("{%s}default" % self.NS_GEXF) if default is not None: if atype=='boolean': value=self.convert_bool[default.text] else: value=self.python_type[atype](default.text) defaults[title]=value return attrs,defaults def relabel_gexf_graph(G): """Relabel graph using "label" node keyword for node label. Parameters ---------- G : graph A NetworkX graph read from GEXF data Returns ------- H : graph A NetworkX graph with relabed nodes Notes ----- This function relabels the nodes in a NetworkX graph with the "label" attribute. It also handles relabeling the specific GEXF node attributes "parents", and "pid". 
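`decode_attr_elements` above resolves each attvalue's `for` id against the key table built by `find_gexf_attributes`, then casts the value by its declared type. The same logic run against a tiny XML fragment, with the GEXF namespace omitted for brevity (`decode_attvalues` is an illustrative helper, not the NetworkX method):

```python
from xml.etree.ElementTree import fromstring

# Key table as find_gexf_attributes would build it: id -> title/type/mode.
gexf_keys = {'0': {'title': 'weight', 'type': 'double', 'mode': 'static'},
             '1': {'title': 'alive', 'type': 'boolean', 'mode': 'static'}}
python_type = {'double': float, 'long': int, 'string': str}
convert_bool = {'true': True, 'false': False}

def decode_attvalues(obj_xml):
    attr = {}
    for a in obj_xml.findall('attvalues/attvalue'):
        key = a.get('for')            # 'for' is a required attribute
        info = gexf_keys[key]
        value = a.get('value')
        if info['type'] == 'boolean':
            value = convert_bool[value]
        else:
            value = python_type[info['type']](value)
        attr[info['title']] = value
    return attr

node = fromstring('<node id="0"><attvalues>'
                  '<attvalue for="0" value="1.5"/>'
                  '<attvalue for="1" value="true"/>'
                  '</attvalues></node>')
decoded = decode_attvalues(node)
```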
""" # build mapping of node labels, do some error checking try: mapping=[(u,G.node[u]['label']) for u in G] except KeyError: raise nx.NetworkXError('Failed to relabel nodes: ' 'missing node labels found. ' 'Use relabel=False.') x,y=zip(*mapping) if len(set(y))!=len(G): raise nx.NetworkXError('Failed to relabel nodes: ' 'duplicate node labels found. ' 'Use relabel=False.') mapping=dict(mapping) H=nx.relabel_nodes(G,mapping) # relabel attributes for n in G: m=mapping[n] H.node[m]['id']=n H.node[m].pop('label') if 'pid' in H.node[m]: H.node[m]['pid']=mapping[G.node[n]['pid']] if 'parents' in H.node[m]: H.node[m]['parents']=[mapping[p] for p in G.node[n]['parents']] return H # fixture for nose tests def setup_module(module): from nose import SkipTest try: import xml.etree.cElementTree except: raise SkipTest("xml.etree.cElementTree not available") # fixture for nose tests def teardown_module(module): import os try: os.unlink('test.gexf') except: pass networkx-1.11/networkx/readwrite/multiline_adjlist.py0000644000175000017500000002726212637544500023147 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ ************************* Multi-line Adjacency List ************************* Read and write NetworkX graphs as multi-line adjacency lists. The multi-line adjacency list format is useful for graphs with nodes that can be meaningfully represented as strings. With this format simple edge data can be stored but node or graph data is not. Format ------ The first label in a line is the source node label followed by the node degree d. The next d lines are target node labels and optional edge data. That pattern repeats for all nodes in the graph. The graph with edges a-b, a-c, d-e can be represented as the following adjacency list (anything following the # in a line is a comment):: # example.multiline-adjlist a 2 b c d 1 e """ __author__ = '\n'.join(['Aric Hagberg ', 'Dan Schult ', 'Loïc Séguin-C. 
']) # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['generate_multiline_adjlist', 'write_multiline_adjlist', 'parse_multiline_adjlist', 'read_multiline_adjlist'] from networkx.utils import make_str, open_file import networkx as nx def generate_multiline_adjlist(G, delimiter = ' '): """Generate a single line of the graph G in multiline adjacency list format. Parameters ---------- G : NetworkX graph delimiter : string, optional Separator for node labels Returns ------- lines : string Lines of data in multiline adjlist format. Examples -------- >>> G = nx.lollipop_graph(4, 3) >>> for line in nx.generate_multiline_adjlist(G): ... print(line) 0 3 1 {} 2 {} 3 {} 1 2 2 {} 3 {} 2 1 3 {} 3 1 4 {} 4 1 5 {} 5 1 6 {} 6 0 See Also -------- write_multiline_adjlist, read_multiline_adjlist """ if G.is_directed(): if G.is_multigraph(): for s,nbrs in G.adjacency_iter(): nbr_edges=[ (u,data) for u,datadict in nbrs.items() for key,data in datadict.items()] deg=len(nbr_edges) yield make_str(s)+delimiter+"%i"%(deg) for u,d in nbr_edges: if d is None: yield make_str(u) else: yield make_str(u)+delimiter+make_str(d) else: # directed single edges for s,nbrs in G.adjacency_iter(): deg=len(nbrs) yield make_str(s)+delimiter+"%i"%(deg) for u,d in nbrs.items(): if d is None: yield make_str(u) else: yield make_str(u)+delimiter+make_str(d) else: # undirected if G.is_multigraph(): seen=set() # helper dict used to avoid duplicate edges for s,nbrs in G.adjacency_iter(): nbr_edges=[ (u,data) for u,datadict in nbrs.items() if u not in seen for key,data in datadict.items()] deg=len(nbr_edges) yield make_str(s)+delimiter+"%i"%(deg) for u,d in nbr_edges: if d is None: yield make_str(u) else: yield make_str(u)+delimiter+make_str(d) seen.add(s) else: # undirected single edges seen=set() # helper dict used to avoid duplicate edges for s,nbrs in G.adjacency_iter(): nbr_edges=[ (u,d) for u,d in nbrs.items() if u not in seen] 
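The undirected branch of the generator above emits each edge only once by tracking completed sources in a `seen` set. The same scheme for a plain adjacency dict, with no NetworkX dependency (`generate_lines` is a hypothetical helper):

```python
# Emit multiline-adjlist lines (source + degree, then that many target
# lines) for an undirected adjacency dict, writing each edge once.
def generate_lines(adj, delimiter=' '):
    seen = set()
    for s, nbrs in adj.items():
        targets = [u for u in nbrs if u not in seen]
        yield "%s%s%d" % (s, delimiter, len(targets))
        for u in targets:
            yield str(u)
        seen.add(s)

adj = {'a': ['b', 'c'], 'b': ['a'], 'c': ['a'], 'd': ['e'], 'e': ['d']}
lines = list(generate_lines(adj))
```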
deg=len(nbr_edges) yield make_str(s)+delimiter+"%i"%(deg) for u,d in nbr_edges: if d is None: yield make_str(u) else: yield make_str(u)+delimiter+make_str(d) seen.add(s) @open_file(1,mode='wb') def write_multiline_adjlist(G, path, delimiter=' ', comments='#', encoding = 'utf-8'): """ Write the graph G in multiline adjacency list format to path Parameters ---------- G : NetworkX graph comments : string, optional Marker for comment lines delimiter : string, optional Separator for node labels encoding : string, optional Text encoding. Examples -------- >>> G=nx.path_graph(4) >>> nx.write_multiline_adjlist(G,"test.adjlist") The path can be a file handle or a string with the name of the file. If a file handle is provided, it has to be opened in 'wb' mode. >>> fh=open("test.adjlist",'wb') >>> nx.write_multiline_adjlist(G,fh) Filenames ending in .gz or .bz2 will be compressed. >>> nx.write_multiline_adjlist(G,"test.adjlist.gz") See Also -------- read_multiline_adjlist """ import sys import time pargs=comments+" ".join(sys.argv) header = ("%s\n" % (pargs) + comments + " GMT %s\n" % (time.asctime(time.gmtime())) + comments + " %s\n" % (G.name)) path.write(header.encode(encoding)) for multiline in generate_multiline_adjlist(G, delimiter): multiline+='\n' path.write(multiline.encode(encoding)) def parse_multiline_adjlist(lines, comments = '#', delimiter = None, create_using = None, nodetype = None, edgetype = None): """Parse lines of a multiline adjacency list representation of a graph. Parameters ---------- lines : list or iterator of strings Input data in multiline adjlist format create_using: NetworkX graph container Use given NetworkX graph for holding nodes or edges. nodetype : Python type, optional Convert nodes to this type. comments : string, optional Marker for comment lines delimiter : string, optional Separator for node labels. The default is whitespace. create_using: NetworkX graph container Use given NetworkX graph for holding nodes or edges. 
Returns ------- G: NetworkX graph The graph corresponding to the lines in multiline adjacency list format. Examples -------- >>> lines = ['1 2', ... "2 {'weight':3, 'name': 'Frodo'}", ... "3 {}", ... "2 1", ... "5 {'weight':6, 'name': 'Saruman'}"] >>> G = nx.parse_multiline_adjlist(iter(lines), nodetype = int) >>> G.nodes() [1, 2, 3, 5] """ from ast import literal_eval if create_using is None: G=nx.Graph() else: try: G=create_using G.clear() except: raise TypeError("Input graph is not a networkx graph type") for line in lines: p=line.find(comments) if p>=0: line = line[:p] if not line: continue try: (u,deg)=line.strip().split(delimiter) deg=int(deg) except: raise TypeError("Failed to read node and degree on line (%s)"%line) if nodetype is not None: try: u=nodetype(u) except: raise TypeError("Failed to convert node (%s) to type %s"\ %(u,nodetype)) G.add_node(u) for i in range(deg): while True: try: line = next(lines) except StopIteration: msg = "Failed to find neighbor for node (%s)" % (u,) raise TypeError(msg) p=line.find(comments) if p>=0: line = line[:p] if line: break vlist=line.strip().split(delimiter) numb=len(vlist) if numb<1: continue # isolated node v=vlist.pop(0) data=''.join(vlist) if nodetype is not None: try: v=nodetype(v) except: raise TypeError( "Failed to convert node (%s) to type %s"\ %(v,nodetype)) if edgetype is not None: try: edgedata={'weight':edgetype(data)} except: raise TypeError( "Failed to convert edge data (%s) to type %s"\ %(data, edgetype)) else: try: # try to evaluate edgedata=literal_eval(data) except: edgedata={} G.add_edge(u,v,attr_dict=edgedata) return G @open_file(0,mode='rb') def read_multiline_adjlist(path, comments="#", delimiter=None, create_using=None, nodetype=None, edgetype=None, encoding = 'utf-8'): """Read graph in multi-line adjacency list format from path. Parameters ---------- path : string or file Filename or file handle to read. Filenames ending in .gz or .bz2 will be uncompressed. 
    create_using: NetworkX graph container
       Use given NetworkX graph for holding nodes or edges.

    nodetype : Python type, optional
       Convert nodes to this type.

    edgetype : Python type, optional
       Convert edge data to this type.

    comments : string, optional
       Marker for comment lines

    delimiter : string, optional
       Separator for node labels.  The default is whitespace.

    Returns
    -------
    G: NetworkX graph

    Examples
    --------
    >>> G=nx.path_graph(4)
    >>> nx.write_multiline_adjlist(G,"test.adjlist")
    >>> G=nx.read_multiline_adjlist("test.adjlist")

    The path can be a file or a string with the name of the file. If a
    file is provided, it has to be opened in 'rb' mode.

    >>> fh=open("test.adjlist", 'rb')
    >>> G=nx.read_multiline_adjlist(fh)

    Filenames ending in .gz or .bz2 will be compressed.

    >>> nx.write_multiline_adjlist(G,"test.adjlist.gz")
    >>> G=nx.read_multiline_adjlist("test.adjlist.gz")

    The optional nodetype is a function to convert node strings to nodetype.

    For example

    >>> G=nx.read_multiline_adjlist("test.adjlist", nodetype=int)

    will attempt to convert all nodes to integer type.

    The optional edgetype is a function to convert edge data strings to
    edgetype.

    >>> G=nx.read_multiline_adjlist("test.adjlist")

    The optional create_using parameter is a NetworkX graph container.
    The default is Graph(), an undirected graph.  To read the data as
    a directed graph use

    >>> G=nx.read_multiline_adjlist("test.adjlist", create_using=nx.DiGraph())

    Notes
    -----
    This format does not store graph, node, or edge data.
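Inside `parse_multiline_adjlist`, edge data is recovered by joining the tail of each neighbor line and feeding it to `ast.literal_eval`, with an empty dict as the fallback. A sketch of that per-line split (`split_neighbor_line` is an illustrative name, not part of the module):

```python
from ast import literal_eval

def split_neighbor_line(line, delimiter=None):
    # First field is the target node; the joined remainder, if it
    # parses as a Python literal, becomes the edge-data dict (else {}).
    vlist = line.strip().split(delimiter)
    v = vlist.pop(0)
    data = ''.join(vlist)
    try:
        edgedata = literal_eval(data)
    except (ValueError, SyntaxError):
        edgedata = {}
    return v, edgedata

pair = split_neighbor_line("2 {'weight':3, 'name': 'Frodo'}")
bare = split_neighbor_line("3")
```

Note that whitespace splitting shreds the dict literal, but joining the pieces back without spaces still yields a valid literal for `literal_eval`.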
See Also -------- write_multiline_adjlist """ lines = (line.decode(encoding) for line in path) return parse_multiline_adjlist(lines, comments = comments, delimiter = delimiter, create_using = create_using, nodetype = nodetype, edgetype = edgetype) # fixture for nose tests def teardown_module(module): import os for fname in ['test.adjlist', 'test.adjlist.gz']: if os.path.isfile(fname): os.unlink(fname) networkx-1.11/networkx/readwrite/gpickle.py0000644000175000017500000000537312637544450021054 0ustar aricaric00000000000000""" ************** Pickled Graphs ************** Read and write NetworkX graphs as Python pickles. "The pickle module implements a fundamental, but powerful algorithm for serializing and de-serializing a Python object structure. "Pickling" is the process whereby a Python object hierarchy is converted into a byte stream, and "unpickling" is the inverse operation, whereby a byte stream is converted back into an object hierarchy." Note that NetworkX graphs can contain any hashable Python object as node (not just integers and strings). For arbitrary data types it may be difficult to represent the data as text. In that case using Python pickles to store the graph data can be used. Format ------ See http://docs.python.org/library/pickle.html """ __author__ = """Aric Hagberg (hagberg@lanl.gov)\nDan Schult (dschult@colgate.edu)""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['read_gpickle', 'write_gpickle'] import networkx as nx from networkx.utils import open_file try: import cPickle as pickle except ImportError: import pickle @open_file(1, mode='wb') def write_gpickle(G, path, protocol=pickle.HIGHEST_PROTOCOL): """Write graph in Python pickle format. Pickles are a serialized byte stream of a Python object [1]_. This format will preserve Python objects used as nodes or edges. Parameters ---------- G : graph A NetworkX graph path : file or string File or filename to write. 
Filenames ending in .gz or .bz2 will be compressed. protocol : integer Pickling protocol to use. Default value: ``pickle.HIGHEST_PROTOCOL``. Examples -------- >>> G = nx.path_graph(4) >>> nx.write_gpickle(G, "test.gpickle") References ---------- .. [1] http://docs.python.org/library/pickle.html """ pickle.dump(G, path, protocol) @open_file(0, mode='rb') def read_gpickle(path): """Read graph object in Python pickle format. Pickles are a serialized byte stream of a Python object [1]_. This format will preserve Python objects used as nodes or edges. Parameters ---------- path : file or string File or filename to write. Filenames ending in .gz or .bz2 will be uncompressed. Returns ------- G : graph A NetworkX graph Examples -------- >>> G = nx.path_graph(4) >>> nx.write_gpickle(G, "test.gpickle") >>> G = nx.read_gpickle("test.gpickle") References ---------- .. [1] http://docs.python.org/library/pickle.html """ return pickle.load(path) # fixture for nose tests def teardown_module(module): import os os.unlink('test.gpickle') networkx-1.11/networkx/readwrite/graphml.py0000644000175000017500000005144012637544500021060 0ustar aricaric00000000000000""" ******* GraphML ******* Read and write graphs in GraphML format. This implementation does not support mixed graphs (directed and unidirected edges together), hyperedges, nested graphs, or ports. "GraphML is a comprehensive and easy-to-use file format for graphs. It consists of a language core to describe the structural properties of a graph and a flexible extension mechanism to add application-specific data. Its main features include support of * directed, undirected, and mixed graphs, * hypergraphs, * hierarchical graphs, * graphical representations, * references to external data, * application-specific attribute data, and * light-weight parsers. Unlike many other file formats for graphs, GraphML does not use a custom syntax. 
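The `write_gpickle`/`read_gpickle` round trip described above is ordinary pickling; it can be exercised in memory with `io.BytesIO` and a dict-of-dicts stand-in for a graph:

```python
import io
import pickle

# In-memory version of the gpickle round trip: any picklable Python
# objects (here a tuple node key) survive serialization intact.
graph = {('a', 1): {'b': {'weight': 3}}, 'b': {}}

buf = io.BytesIO()
pickle.dump(graph, buf, pickle.HIGHEST_PROTOCOL)
buf.seek(0)
restored = pickle.load(buf)
```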
Instead, it is based on XML and hence ideally suited as a common denominator for all kinds of services generating, archiving, or processing graphs." http://graphml.graphdrawing.org/ Format ------ GraphML is an XML format. See http://graphml.graphdrawing.org/specification.html for the specification and http://graphml.graphdrawing.org/primer/graphml-primer.html for examples. """ __author__ = """\n""".join(['Salim Fadhley', 'Aric Hagberg (hagberg@lanl.gov)' ]) __all__ = ['write_graphml', 'read_graphml', 'generate_graphml', 'parse_graphml', 'GraphMLWriter', 'GraphMLReader'] import networkx as nx from networkx.utils import open_file, make_str import warnings try: from xml.etree.cElementTree import Element, ElementTree, tostring, fromstring except ImportError: try: from xml.etree.ElementTree import Element, ElementTree, tostring, fromstring except ImportError: pass @open_file(1,mode='wb') def write_graphml(G, path, encoding='utf-8',prettyprint=True): """Write G in GraphML XML format to path Parameters ---------- G : graph A networkx graph path : file or string File or filename to write. Filenames ending in .gz or .bz2 will be compressed. encoding : string (optional) Encoding for text data. prettyprint : bool (optional) If True use line breaks and indenting in output XML. Examples -------- >>> G=nx.path_graph(4) >>> nx.write_graphml(G, "test.graphml") Notes ----- This implementation does not support mixed graphs (directed and unidirected edges together) hyperedges, nested graphs, or ports. """ writer = GraphMLWriter(encoding=encoding,prettyprint=prettyprint) writer.add_graph_element(G) writer.dump(path) def generate_graphml(G, encoding='utf-8',prettyprint=True): """Generate GraphML lines for G Parameters ---------- G : graph A networkx graph encoding : string (optional) Encoding for text data. prettyprint : bool (optional) If True use line breaks and indenting in output XML. 
Examples -------- >>> G=nx.path_graph(4) >>> linefeed=chr(10) # linefeed=\n >>> s=linefeed.join(nx.generate_graphml(G)) # doctest: +SKIP >>> for line in nx.generate_graphml(G): # doctest: +SKIP ... print(line) Notes ----- This implementation does not support mixed graphs (directed and undirected edges together), hyperedges, nested graphs, or ports. """ writer = GraphMLWriter(encoding=encoding,prettyprint=prettyprint) writer.add_graph_element(G) for line in str(writer).splitlines(): yield line @open_file(0,mode='rb') def read_graphml(path,node_type=str): """Read graph in GraphML format from path. Parameters ---------- path : file or string File or filename to read. Filenames ending in .gz or .bz2 will be uncompressed. node_type: Python type (default: str) Convert node ids to this type Returns ------- graph: NetworkX graph If no parallel edges are found a Graph or DiGraph is returned. Otherwise a MultiGraph or MultiDiGraph is returned. Notes ----- This implementation does not support mixed graphs (directed and undirected edges together), hypergraphs, nested graphs, or ports. For multigraphs the GraphML edge "id" will be used as the edge key. If not specified then the "key" attribute will be used. If there is no "key" attribute a default NetworkX multigraph edge key will be provided. Files with the yEd "yfiles" extension can be read but the graphics information is discarded. yEd compressed files ("file.graphmlz" extension) can be read by renaming the file to "file.graphml.gz". """ reader = GraphMLReader(node_type=node_type) # need to check for multiple graphs glist=list(reader(path=path)) return glist[0] def parse_graphml(graphml_string,node_type=str): """Read graph in GraphML format from string. Parameters ---------- graphml_string : string String containing graphml information (e.g., contents of a graphml file). 
node_type: Python type (default: str) Convert node ids to this type Returns ------- graph: NetworkX graph If no parallel edges are found a Graph or DiGraph is returned. Otherwise a MultiGraph or MultiDiGraph is returned. Examples -------- >>> G=nx.path_graph(4) >>> linefeed=chr(10) # linefeed=\n >>> s=linefeed.join(nx.generate_graphml(G)) >>> H=nx.parse_graphml(s) Notes ----- This implementation does not support mixed graphs (directed and undirected edges together), hypergraphs, nested graphs, or ports. For multigraphs the GraphML edge "id" will be used as the edge key. If not specified then the "key" attribute will be used. If there is no "key" attribute a default NetworkX multigraph edge key will be provided. """ reader = GraphMLReader(node_type=node_type) # need to check for multiple graphs glist=list(reader(string=graphml_string)) return glist[0] class GraphML(object): NS_GRAPHML = "http://graphml.graphdrawing.org/xmlns" NS_XSI = "http://www.w3.org/2001/XMLSchema-instance" #xmlns:y="http://www.yworks.com/xml/graphml" NS_Y = "http://www.yworks.com/xml/graphml" SCHEMALOCATION = \ ' '.join(['http://graphml.graphdrawing.org/xmlns', 'http://graphml.graphdrawing.org/xmlns/1.0/graphml.xsd']) try: chr(12345) # Fails on Py!=3. 
unicode = str # Py3k's str is our unicode type long = int # Py3k's int is our long type except ValueError: # Python 2.x pass types=[(int,"integer"), # for Gephi GraphML bug (str,"yfiles"),(str,"string"), (unicode,"string"), (int,"int"), (long,"long"), (float,"float"), (float,"double"), (bool, "boolean")] xml_type = dict(types) python_type = dict(reversed(a) for a in types) # http://www.w3.org/TR/xmlschema-2/#boolean convert_bool = { 'true': True, 'false': False, 'True': True, 'False': False, '0': False, 0: False, '1': True, 1: True } class GraphMLWriter(GraphML): def __init__(self, graph=None, encoding="utf-8",prettyprint=True): try: import xml.etree.ElementTree except ImportError: raise ImportError('GraphML writer requires ' 'xml.etree.ElementTree') self.prettyprint=prettyprint self.encoding = encoding self.xml = Element("graphml", {'xmlns':self.NS_GRAPHML, 'xmlns:xsi':self.NS_XSI, 'xsi:schemaLocation':self.SCHEMALOCATION} ) self.keys={} if graph is not None: self.add_graph_element(graph) def __str__(self): if self.prettyprint: self.indent(self.xml) s=tostring(self.xml).decode(self.encoding) return s def get_key(self, name, attr_type, scope, default): keys_key = (name, attr_type, scope) try: return self.keys[keys_key] except KeyError: new_id = "d%i" % len(list(self.keys)) self.keys[keys_key] = new_id key_kwargs = {"id":new_id, "for":scope, "attr.name":name, "attr.type":attr_type} key_element=Element("key",**key_kwargs) # add subelement for data default value if present if default is not None: default_element=Element("default") default_element.text=make_str(default) key_element.append(default_element) self.xml.insert(0,key_element) return new_id def add_data(self, name, element_type, value, scope="all", default=None): """ Make a data element for an edge or a node. Keep a log of the type in the keys table. 
""" if element_type not in self.xml_type: raise nx.NetworkXError('GraphML writer does not support ' '%s as data values.'%element_type) key_id = self.get_key(name, self.xml_type[element_type], scope, default) data_element = Element("data", key=key_id) data_element.text = make_str(value) return data_element def add_attributes(self, scope, xml_obj, data, default): """Appends attributes to edges or nodes. """ for k,v in data.items(): default_value=default.get(k) obj=self.add_data(make_str(k), type(v), make_str(v), scope=scope, default=default_value) xml_obj.append(obj) def add_nodes(self, G, graph_element): for node,data in G.nodes_iter(data=True): node_element = Element("node", id = make_str(node)) default=G.graph.get('node_default',{}) self.add_attributes("node", node_element, data, default) graph_element.append(node_element) def add_edges(self, G, graph_element): if G.is_multigraph(): for u,v,key,data in G.edges_iter(data=True,keys=True): edge_element = Element("edge",source=make_str(u), target=make_str(v)) default=G.graph.get('edge_default',{}) self.add_attributes("edge", edge_element, data, default) self.add_attributes("edge", edge_element, {'key':key}, default) graph_element.append(edge_element) else: for u,v,data in G.edges_iter(data=True): edge_element = Element("edge",source=make_str(u), target=make_str(v)) default=G.graph.get('edge_default',{}) self.add_attributes("edge", edge_element, data, default) graph_element.append(edge_element) def add_graph_element(self, G): """ Serialize graph G in GraphML to the stream. 
""" if G.is_directed(): default_edge_type='directed' else: default_edge_type='undirected' graphid=G.graph.pop('id',None) if graphid is None: graph_element = Element("graph", edgedefault = default_edge_type) else: graph_element = Element("graph", edgedefault = default_edge_type, id=graphid) default={} data=dict((k,v) for (k,v) in G.graph.items() if k not in ['node_default','edge_default']) self.add_attributes("graph", graph_element, data, default) self.add_nodes(G,graph_element) self.add_edges(G,graph_element) self.xml.append(graph_element) def add_graphs(self, graph_list): """ Add many graphs to this GraphML document. """ for G in graph_list: self.add_graph_element(G) def dump(self, stream): if self.prettyprint: self.indent(self.xml) document = ElementTree(self.xml) document.write(stream, encoding=self.encoding, xml_declaration=True) def indent(self, elem, level=0): # in-place prettyprint formatter i = "\n" + level*" " if len(elem): if not elem.text or not elem.text.strip(): elem.text = i + " " if not elem.tail or not elem.tail.strip(): elem.tail = i for elem in elem: self.indent(elem, level+1) if not elem.tail or not elem.tail.strip(): elem.tail = i else: if level and (not elem.tail or not elem.tail.strip()): elem.tail = i class GraphMLReader(GraphML): """Read a GraphML document. Produces NetworkX graph objects. 
""" def __init__(self, node_type=str): try: import xml.etree.ElementTree except ImportError: raise ImportError('GraphML reader requires ' 'xml.elementtree.ElementTree') self.node_type=node_type self.multigraph=False # assume multigraph and test for parallel edges def __call__(self, path=None, string=None): if path is not None: self.xml = ElementTree(file=path) elif string is not None: self.xml = fromstring(string) else: raise ValueError("Must specify either 'path' or 'string' as kwarg.") (keys,defaults) = self.find_graphml_keys(self.xml) for g in self.xml.findall("{%s}graph" % self.NS_GRAPHML): yield self.make_graph(g, keys, defaults) def make_graph(self, graph_xml, graphml_keys, defaults): # set default graph type edgedefault = graph_xml.get("edgedefault", None) if edgedefault=='directed': G=nx.MultiDiGraph() else: G=nx.MultiGraph() # set defaults for graph attributes G.graph['node_default']={} G.graph['edge_default']={} for key_id,value in defaults.items(): key_for=graphml_keys[key_id]['for'] name=graphml_keys[key_id]['name'] python_type=graphml_keys[key_id]['type'] if key_for=='node': G.graph['node_default'].update({name:python_type(value)}) if key_for=='edge': G.graph['edge_default'].update({name:python_type(value)}) # hyperedges are not supported hyperedge=graph_xml.find("{%s}hyperedge" % self.NS_GRAPHML) if hyperedge is not None: raise nx.NetworkXError("GraphML reader does not support hyperedges") # add nodes for node_xml in graph_xml.findall("{%s}node" % self.NS_GRAPHML): self.add_node(G, node_xml, graphml_keys) # add edges for edge_xml in graph_xml.findall("{%s}edge" % self.NS_GRAPHML): self.add_edge(G, edge_xml, graphml_keys) # add graph data data = self.decode_data_elements(graphml_keys, graph_xml) G.graph.update(data) # switch to Graph or DiGraph if no parallel edges were found. 
if not self.multigraph: if G.is_directed(): return nx.DiGraph(G) else: return nx.Graph(G) else: return G def add_node(self, G, node_xml, graphml_keys): """Add a node to the graph. """ # warn on finding unsupported ports tag ports=node_xml.find("{%s}port" % self.NS_GRAPHML) if ports is not None: warnings.warn("GraphML port tag not supported.") # find the node by id and cast it to the appropriate type node_id = self.node_type(node_xml.get("id")) # get data/attributes for node data = self.decode_data_elements(graphml_keys, node_xml) G.add_node(node_id, data) def add_edge(self, G, edge_element, graphml_keys): """Add an edge to the graph. """ # warn on finding unsupported ports tag ports=edge_element.find("{%s}port" % self.NS_GRAPHML) if ports is not None: warnings.warn("GraphML port tag not supported.") # raise error if we find mixed directed and undirected edges directed = edge_element.get("directed") if G.is_directed() and directed=='false': raise nx.NetworkXError(\ "directed=false edge found in directed graph.") if (not G.is_directed()) and directed=='true': raise nx.NetworkXError(\ "directed=true edge found in undirected graph.") source = self.node_type(edge_element.get("source")) target = self.node_type(edge_element.get("target")) data = self.decode_data_elements(graphml_keys, edge_element) # GraphML stores edge ids as an attribute # NetworkX uses them as keys in multigraphs too if no key # attribute is specified edge_id = edge_element.get("id") if edge_id: data["id"] = edge_id if G.has_edge(source,target): # mark this as a multigraph self.multigraph=True if edge_id is None: # no id specified, try using 'key' attribute as id edge_id=data.pop('key',None) G.add_edge(source, target, key=edge_id, **data) def decode_data_elements(self, graphml_keys, obj_xml): """Use the key information to decode the data XML if present.""" data = {} for data_element in obj_xml.findall("{%s}data" % self.NS_GRAPHML): key = data_element.get("key") try: data_name=graphml_keys[key]['name'] 
data_type=graphml_keys[key]['type'] except KeyError: raise nx.NetworkXError("Bad GraphML data: no key %s"%key) text=data_element.text # assume anything with subelements is a yfiles extension if text is not None and len(list(data_element))==0: if data_type==bool: data[data_name] = self.convert_bool[text] else: data[data_name] = data_type(text) elif len(list(data_element)) > 0: # Assume yfiles as subelements, try to extract node_label node_label = None for node_type in ['ShapeNode', 'SVGNode', 'ImageNode']: geometry = data_element.find("{%s}%s/{%s}Geometry" % (self.NS_Y, node_type, self.NS_Y)) if geometry is not None: data['x'] = geometry.get('x') data['y'] = geometry.get('y') if node_label is None: node_label = data_element.find("{%s}%s/{%s}NodeLabel" % (self.NS_Y, node_type, self.NS_Y)) if node_label is not None: data['label'] = node_label.text # check all the different types of edges available in yEd. for e in ['PolyLineEdge', 'SplineEdge', 'QuadCurveEdge', 'BezierEdge', 'ArcEdge']: edge_label = data_element.find("{%s}%s/{%s}EdgeLabel"% (self.NS_Y, e, (self.NS_Y))) if edge_label is not None: break if edge_label is not None: data['label'] = edge_label.text return data def find_graphml_keys(self, graph_element): """Extracts all the keys and key defaults from the xml. """ graphml_keys = {} graphml_key_defaults = {} for k in graph_element.findall("{%s}key" % self.NS_GRAPHML): attr_id = k.get("id") attr_type=k.get('attr.type') attr_name=k.get("attr.name") yfiles_type=k.get("yfiles.type") if yfiles_type is not None: attr_name = yfiles_type attr_type = 'yfiles' if attr_type is None: attr_type = "string" warnings.warn("No key type for id %s. 
Using string"%attr_id) if attr_name is None: raise nx.NetworkXError("Unknown key for id %s in file."%attr_id) graphml_keys[attr_id] = { "name":attr_name, "type":self.python_type[attr_type], "for":k.get("for")} # check for "default" subelement of key element default=k.find("{%s}default" % self.NS_GRAPHML) if default is not None: graphml_key_defaults[attr_id]=default.text return graphml_keys,graphml_key_defaults # fixture for nose tests def setup_module(module): from nose import SkipTest try: import xml.etree.ElementTree except: raise SkipTest("xml.etree.ElementTree not available") # fixture for nose tests def teardown_module(module): import os try: os.unlink('test.graphml') except: pass networkx-1.11/networkx/readwrite/tests/0000755000175000017500000000000012653231454020211 5ustar aricaric00000000000000networkx-1.11/networkx/readwrite/tests/test_p2g.py0000644000175000017500000000246012637544450022321 0ustar aricaric00000000000000from nose.tools import assert_equal, assert_raises, assert_not_equal import networkx as nx import io import tempfile import os from networkx.readwrite.p2g import * from networkx.testing import * class TestP2G: def setUp(self): self.G=nx.Graph(name="test") e=[('a','b'),('b','c'),('c','d'),('d','e'),('e','f'),('a','f')] self.G.add_edges_from(e) self.G.add_node('g') self.DG=nx.DiGraph(self.G) def test_read_p2g(self): s = b"""\ name 3 4 a 1 2 b c 0 2 """ bytesIO = io.BytesIO(s) G = read_p2g(bytesIO) assert_equal(G.name,'name') assert_equal(sorted(G),['a','b','c']) edges = [(str(u),str(v)) for u,v in G.edges()] assert_edges_equal(G.edges(),[('a','c'),('a','b'),('c','a'),('c','c')]) def test_write_p2g(self): s=b"""foo 3 2 1 1 2 2 3 """ fh=io.BytesIO() G=nx.DiGraph() G.name='foo' G.add_edges_from([(1,2),(2,3)]) write_p2g(G,fh) fh.seek(0) r=fh.read() assert_equal(r,s) def test_write_read_p2g(self): fh=io.BytesIO() G=nx.DiGraph() G.name='foo' G.add_edges_from([('a','b'),('b','c')]) write_p2g(G,fh) fh.seek(0) H=read_p2g(fh) 
assert_edges_equal(G.edges(),H.edges()) networkx-1.11/networkx/readwrite/tests/test_yaml.py0000644000175000017500000000250612637544500022570 0ustar aricaric00000000000000""" Unit tests for yaml. """ import os,tempfile from nose import SkipTest from nose.tools import assert_equal import networkx as nx from networkx.testing import assert_edges_equal, assert_nodes_equal class TestYaml(object): @classmethod def setupClass(cls): global yaml try: import yaml except ImportError: raise SkipTest('yaml not available.') def setUp(self): self.build_graphs() def build_graphs(self): self.G = nx.Graph(name="test") e = [('a','b'),('b','c'),('c','d'),('d','e'),('e','f'),('a','f')] self.G.add_edges_from(e) self.G.add_node('g') self.DG = nx.DiGraph(self.G) self.MG = nx.MultiGraph() self.MG.add_weighted_edges_from([(1,2,5),(1,2,5),(1,2,1),(3,3,42)]) def assert_equal(self, G, data=False): (fd, fname) = tempfile.mkstemp() nx.write_yaml(G, fname) Gin = nx.read_yaml(fname); assert_nodes_equal(G.nodes(), Gin.nodes()) assert_edges_equal(G.edges(data=data), Gin.edges(data=data)) os.close(fd) os.unlink(fname) def testUndirected(self): self.assert_equal(self.G, False) def testDirected(self): self.assert_equal(self.DG, False) def testMultiGraph(self): self.assert_equal(self.MG, True) networkx-1.11/networkx/readwrite/tests/test_graph6.py0000644000175000017500000000622312637544450023021 0ustar aricaric00000000000000#!/usr/bin/env python try: from StringIO import StringIO except ImportError: from io import StringIO from nose.tools import * import networkx as nx import networkx.readwrite.graph6 as g6 class TestGraph6Utils(object): def test_n_data_n_conversion(self): for i in [0, 1, 42, 62, 63, 64, 258047, 258048, 7744773, 68719476735]: assert_equal(g6.data_to_n(g6.n_to_data(i))[0], i) assert_equal(g6.data_to_n(g6.n_to_data(i))[1], []) assert_equal(g6.data_to_n(g6.n_to_data(i) + [42, 43])[1], [42, 43]) def test_data_sparse6_data_conversion(self): for data in [[], [0], [63], [63, 63], [0]*42, [0, 1, 
62, 42, 3, 11, 0, 11]]: assert_equal(g6.graph6_to_data(g6.data_to_graph6(data)), data) assert_equal(len(g6.data_to_graph6(data)), len(data)) class TestGraph6(object): def test_parse_graph6(self): data="""DF{""" G=nx.parse_graph6(data) assert_equal(sorted(G.nodes()),[0, 1, 2, 3, 4]) assert_equal([e for e in sorted(G.edges())], [(0, 3), (0, 4), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]) def test_read_graph6(self): data="""DF{""" G=nx.parse_graph6(data) fh = StringIO(data) Gin=nx.read_graph6(fh) assert_equal(sorted(G.nodes()),sorted(Gin.nodes())) assert_equal(sorted(G.edges()),sorted(Gin.edges())) def test_read_many_graph6(self): # Read many graphs into list data="""DF{\nD`{\nDqK\nD~{\n""" fh = StringIO(data) glist=nx.read_graph6(fh) assert_equal(len(glist),4) for G in glist: assert_equal(sorted(G.nodes()),[0, 1, 2, 3, 4]) def test_generate_graph6(self): assert_equal(nx.generate_graph6(nx.empty_graph(0)), '>>graph6<>graph6<<@') G1 = nx.complete_graph(4) assert_equal(nx.generate_graph6(G1, header=True), '>>graph6<>graph6<>sparse6<<:?') assert_equal(nx.generate_sparse6(nx.empty_graph(1)), '>>sparse6<<:@') assert_equal(nx.generate_sparse6(nx.empty_graph(5)), '>>sparse6<<:D') assert_equal(nx.generate_sparse6(nx.empty_graph(68)), '>>sparse6<<:~?@C') assert_equal(nx.generate_sparse6(nx.empty_graph(258049)), '>>sparse6<<:~~???~?@') G1 = nx.complete_graph(4) assert_equal(nx.generate_sparse6(G1, header=True), '>>sparse6<<:CcKI') assert_equal(nx.generate_sparse6(G1, header=False), ':CcKI') # Padding testing assert_equal(nx.generate_sparse6(nx.path_graph(4), header=False), ':Cdv') assert_equal(nx.generate_sparse6(nx.path_graph(5), header=False), ':DaYn') assert_equal(nx.generate_sparse6(nx.path_graph(6), header=False), ':EaYnN') assert_equal(nx.generate_sparse6(nx.path_graph(7), header=False), ':FaYnL') assert_equal(nx.generate_sparse6(nx.path_graph(8), header=False), ':GaYnLz') def test_write_sparse6(self): fh = StringIO() nx.write_sparse6(nx.complete_bipartite_graph(6,9), fh) 
fh.seek(0) assert_equal(fh.read(), '>>sparse6<<:Nk?G`cJ?G`cJ?G`cJ?G`'+ 'cJ?G`cJ?G`cJ?G`cJ?G`cJ?G`cJ\n') # Compared with sage def test_generate_and_parse_sparse6(self): for i in list(range(13)) + [31, 47, 62, 63, 64, 72]: m = min(2 * i, i * i // 2) g = nx.random_graphs.gnm_random_graph(i, m, seed=i) gstr = nx.generate_sparse6(g, header=False) g2 = nx.parse_sparse6(gstr) assert_equal(g2.order(), g.order()) assert_equal(sorted(g2.edges()), sorted(g.edges())) @raises(nx.NetworkXError) def directed_error(self): nx.generate_sparse6(nx.DiGraph()) networkx-1.11/networkx/readwrite/tests/test_pajek.py0000644000175000017500000000670012637544500022720 0ustar aricaric00000000000000#!/usr/bin/env python """ Pajek tests """ from nose.tools import assert_equal from networkx import * import os,tempfile from io import open from networkx.testing import * class TestPajek(object): def setUp(self): self.data="""*network Tralala\n*vertices 4\n 1 "A1" 0.0938 0.0896 ellipse x_fact 1 y_fact 1\n 2 "Bb" 0.8188 0.2458 ellipse x_fact 1 y_fact 1\n 3 "C" 0.3688 0.7792 ellipse x_fact 1\n 4 "D2" 0.9583 0.8563 ellipse x_fact 1\n*arcs\n1 1 1 h2 0 w 3 c Blue s 3 a1 -130 k1 0.6 a2 -130 k2 0.6 ap 0.5 l "Bezier loop" lc BlueViolet fos 20 lr 58 lp 0.3 la 360\n2 1 1 h2 0 a1 120 k1 1.3 a2 -120 k2 0.3 ap 25 l "Bezier arc" lphi 270 la 180 lr 19 lp 0.5\n1 2 1 h2 0 a1 40 k1 2.8 a2 30 k2 0.8 ap 25 l "Bezier arc" lphi 90 la 0 lp 0.65\n4 2 -1 h2 0 w 1 k1 -2 k2 250 ap 25 l "Circular arc" c Red lc OrangeRed\n3 4 1 p Dashed h2 0 w 2 c OliveGreen ap 25 l "Straight arc" lc PineGreen\n1 3 1 p Dashed h2 0 w 5 k1 -1 k2 -20 ap 25 l "Oval arc" c Brown lc Black\n3 3 -1 h1 6 w 1 h2 12 k1 -2 k2 -15 ap 0.5 l "Circular loop" c Red lc OrangeRed lphi 270 la 180""" self.G=nx.MultiDiGraph() self.G.add_nodes_from(['A1', 'Bb', 'C', 'D2']) self.G.add_edges_from([('A1', 'A1'), ('A1', 'Bb'), ('A1', 'C'), ('Bb', 'A1'),('C', 'C'), ('C', 'D2'), ('D2', 'Bb')]) self.G.graph['name']='Tralala' (fd,self.fname)=tempfile.mkstemp() with 
os.fdopen(fd, 'wb') as fh: fh.write(self.data.encode('UTF-8')) def tearDown(self): os.unlink(self.fname) def test_parse_pajek_simple(self): # Example without node positions or shape data="""*Vertices 2\n1 "1"\n2 "2"\n*Edges\n1 2\n2 1""" G=parse_pajek(data) assert_equal(sorted(G.nodes()), ['1', '2']) assert_edges_equal(G.edges(), [('1', '2'), ('1', '2')]) def test_parse_pajek(self): G=parse_pajek(self.data) assert_equal(sorted(G.nodes()), ['A1', 'Bb', 'C', 'D2']) assert_edges_equal(G.edges(), [('A1', 'A1'), ('A1', 'Bb'), ('A1', 'C'), ('Bb', 'A1'), ('C', 'C'), ('C', 'D2'), ('D2', 'Bb')]) def test_read_pajek(self): G=parse_pajek(self.data) Gin=read_pajek(self.fname) assert_equal(sorted(G.nodes()), sorted(Gin.nodes())) assert_edges_equal(G.edges(), Gin.edges()) assert_equal(self.G.graph,Gin.graph) for n in G.node: assert_equal(G.node[n],Gin.node[n]) def test_noname(self): # Make sure we can parse a line such as: *network # Issue #952 line = "*network\n" other_lines = self.data.split('\n')[1:] data = line + '\n'.join(other_lines) G = parse_pajek(data) def test_unicode(self): import io G = nx.Graph() try: # Python 3.x name1 = chr(2344) + chr(123) + chr(6543) name2 = chr(5543) + chr(1543) + chr(324) except ValueError: # Python 2.6+ name1 = unichr(2344) + unichr(123) + unichr(6543) name2 = unichr(5543) + unichr(1543) + unichr(324) G.add_edge(name1, 'Radiohead', attr_dict={'foo': name2}) fh = io.BytesIO() nx.write_pajek(G,fh) fh.seek(0) H=nx.read_pajek(fh) assert_nodes_equal(G.nodes(), H.nodes()) assert_edges_equal(G.edges(), H.edges()) assert_equal(G.graph,H.graph) networkx-1.11/networkx/readwrite/tests/test_gpickle.py0000644000175000017500000000375312637544500023251 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import assert_equal import os import tempfile import networkx as nx from networkx.testing.utils import * class TestGpickle(object): def setUp(self): G=nx.Graph(name="test") e=[('a','b'),('b','c'),('c','d'),('d','e'),('e','f'),('a','f')] 
G.add_edges_from(e,width=10) G.add_node('g',color='green') G.graph['number']=1 DG=nx.DiGraph(G) MG=nx.MultiGraph(G) MG.add_edge('a', 'a') MDG=nx.MultiDiGraph(G) MDG.add_edge('a', 'a') fG = G.copy() fDG = DG.copy() fMG = MG.copy() fMDG = MDG.copy() nx.freeze(fG) nx.freeze(fDG) nx.freeze(fMG) nx.freeze(fMDG) self.G=G self.DG=DG self.MG=MG self.MDG=MDG self.fG=fG self.fDG=fDG self.fMG=fMG self.fMDG=fMDG def test_gpickle(self): for G in [self.G, self.DG, self.MG, self.MDG, self.fG, self.fDG, self.fMG, self.fMDG]: (fd,fname)=tempfile.mkstemp() nx.write_gpickle(G,fname) Gin=nx.read_gpickle(fname) assert_nodes_equal(G.nodes(data=True), Gin.nodes(data=True)) assert_edges_equal(G.edges(data=True), Gin.edges(data=True)) assert_graphs_equal(G, Gin) os.close(fd) os.unlink(fname) def test_protocol(self): for G in [self.G, self.DG, self.MG, self.MDG, self.fG, self.fDG, self.fMG, self.fMDG]: with tempfile.TemporaryFile() as f: nx.write_gpickle(G, f, 0) f.seek(0) Gin = nx.read_gpickle(f) assert_nodes_equal(G.nodes(data=True), Gin.nodes(data=True)) assert_edges_equal(G.edges(data=True), Gin.edges(data=True)) assert_graphs_equal(G, Gin) networkx-1.11/networkx/readwrite/tests/test_graphml.py0000644000175000017500000004200312637544500023254 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx as nx import io import tempfile import os class TestGraph(object): @classmethod def setupClass(cls): try: import xml.etree.ElementTree except ImportError: raise SkipTest('xml.etree.ElementTree not available.') def setUp(self): self.simple_directed_data=""" """ self.simple_directed_graph=nx.DiGraph() self.simple_directed_graph.add_node('n10') self.simple_directed_graph.add_edge('n0','n2',id='foo') self.simple_directed_graph.add_edges_from([('n1','n2'), ('n2','n3'), ('n3','n5'), ('n3','n4'), ('n4','n6'), ('n6','n5'), ('n5','n7'), ('n6','n8'), ('n8','n7'), ('n8','n9'), ]) self.simple_directed_fh = \ 
io.BytesIO(self.simple_directed_data.encode('UTF-8')) self.attribute_data=""" yellow green blue red turquoise 1.0 1.0 2.0 1.1 """ self.attribute_graph=nx.DiGraph(id='G') self.attribute_graph.graph['node_default']={'color':'yellow'} self.attribute_graph.add_node('n0',color='green') self.attribute_graph.add_node('n2',color='blue') self.attribute_graph.add_node('n3',color='red') self.attribute_graph.add_node('n4') self.attribute_graph.add_node('n5',color='turquoise') self.attribute_graph.add_edge('n0','n2',id='e0',weight=1.0) self.attribute_graph.add_edge('n0','n1',id='e1',weight=1.0) self.attribute_graph.add_edge('n1','n3',id='e2',weight=2.0) self.attribute_graph.add_edge('n3','n2',id='e3') self.attribute_graph.add_edge('n2','n4',id='e4') self.attribute_graph.add_edge('n3','n5',id='e5') self.attribute_graph.add_edge('n5','n4',id='e6',weight=1.1) self.attribute_fh = io.BytesIO(self.attribute_data.encode('UTF-8')) self.simple_undirected_data=""" """ # self.simple_undirected_graph=nx.Graph() self.simple_undirected_graph.add_node('n10') self.simple_undirected_graph.add_edge('n0','n2',id='foo') self.simple_undirected_graph.add_edges_from([('n1','n2'), ('n2','n3'), ]) self.simple_undirected_fh = io.BytesIO(self.simple_undirected_data.encode('UTF-8')) def test_read_simple_directed_graphml(self): G=self.simple_directed_graph H=nx.read_graphml(self.simple_directed_fh) assert_equal(sorted(G.nodes()),sorted(H.nodes())) assert_equal(sorted(G.edges()),sorted(H.edges())) assert_equal(sorted(G.edges(data=True)), sorted(H.edges(data=True))) self.simple_directed_fh.seek(0) I=nx.parse_graphml(self.simple_directed_data) assert_equal(sorted(G.nodes()),sorted(I.nodes())) assert_equal(sorted(G.edges()),sorted(I.edges())) assert_equal(sorted(G.edges(data=True)), sorted(I.edges(data=True))) def test_write_read_simple_directed_graphml(self): G=self.simple_directed_graph fh=io.BytesIO() nx.write_graphml(G,fh) fh.seek(0) H=nx.read_graphml(fh) assert_equal(sorted(G.nodes()),sorted(H.nodes())) 
assert_equal(sorted(G.edges()),sorted(H.edges())) assert_equal(sorted(G.edges(data=True)), sorted(H.edges(data=True))) self.simple_directed_fh.seek(0) def test_read_simple_undirected_graphml(self): G=self.simple_undirected_graph H=nx.read_graphml(self.simple_undirected_fh) assert_equal(sorted(G.nodes()),sorted(H.nodes())) assert_equal( sorted(sorted(e) for e in G.edges()), sorted(sorted(e) for e in H.edges())) self.simple_undirected_fh.seek(0) I=nx.parse_graphml(self.simple_undirected_data) assert_equal(sorted(G.nodes()),sorted(I.nodes())) assert_equal( sorted(sorted(e) for e in G.edges()), sorted(sorted(e) for e in I.edges())) def test_read_attribute_graphml(self): G=self.attribute_graph H=nx.read_graphml(self.attribute_fh) assert_equal(sorted(G.nodes(True)),sorted(H.nodes(data=True))) ge=sorted(G.edges(data=True)) he=sorted(H.edges(data=True)) for a,b in zip(ge,he): assert_equal(a,b) self.attribute_fh.seek(0) I=nx.parse_graphml(self.attribute_data) assert_equal(sorted(G.nodes(True)),sorted(I.nodes(data=True))) ge=sorted(G.edges(data=True)) he=sorted(I.edges(data=True)) for a,b in zip(ge,he): assert_equal(a,b) def test_directed_edge_in_undirected(self): s=""" """ fh = io.BytesIO(s.encode('UTF-8')) assert_raises(nx.NetworkXError,nx.read_graphml,fh) assert_raises(nx.NetworkXError,nx.parse_graphml,s) def test_undirected_edge_in_directed(self): s=""" """ fh = io.BytesIO(s.encode('UTF-8')) assert_raises(nx.NetworkXError,nx.read_graphml,fh) assert_raises(nx.NetworkXError,nx.parse_graphml,s) def test_key_error(self): s=""" yellow green blue 1.0 """ fh = io.BytesIO(s.encode('UTF-8')) assert_raises(nx.NetworkXError,nx.read_graphml,fh) assert_raises(nx.NetworkXError,nx.parse_graphml,s) def test_hyperedge_error(self): s=""" yellow green blue """ fh = io.BytesIO(s.encode('UTF-8')) assert_raises(nx.NetworkXError,nx.read_graphml,fh) assert_raises(nx.NetworkXError,nx.parse_graphml,s) # remove test until we get the "name" issue sorted # https://networkx.lanl.gov/trac/ticket/544 
def test_default_attribute(self): G=nx.Graph() G.add_node(1,label=1,color='green') G.add_path([0,1,2,3]) G.add_edge(1,2,weight=3) G.graph['node_default']={'color':'yellow'} G.graph['edge_default']={'weight':7} fh = io.BytesIO() nx.write_graphml(G,fh) fh.seek(0) H=nx.read_graphml(fh,node_type=int) assert_equal(sorted(G.nodes()),sorted(H.nodes())) assert_equal( sorted(sorted(e) for e in G.edges()), sorted(sorted(e) for e in H.edges())) assert_equal(G.graph,H.graph) def test_multigraph_keys(self): # test that multigraphs use edge id attributes as key pass def test_multigraph_to_graph(self): # test converting multigraph to graph if no parallel edges are found pass def test_yfiles_extension(self): data=""" 1 2 """ fh = io.BytesIO(data.encode('UTF-8')) G=nx.read_graphml(fh) assert_equal(G.edges(),[('n0','n1')]) assert_equal(G['n0']['n1']['id'],'e0') assert_equal(G.node['n0']['label'],'1') assert_equal(G.node['n1']['label'],'2') H=nx.parse_graphml(data) assert_equal(H.edges(),[('n0','n1')]) assert_equal(H['n0']['n1']['id'],'e0') assert_equal(H.node['n0']['label'],'1') assert_equal(H.node['n1']['label'],'2') def test_unicode(self): G = nx.Graph() try: # Python 3.x name1 = chr(2344) + chr(123) + chr(6543) name2 = chr(5543) + chr(1543) + chr(324) node_type=str except ValueError: # Python 2.6+ name1 = unichr(2344) + unichr(123) + unichr(6543) name2 = unichr(5543) + unichr(1543) + unichr(324) node_type=unicode G.add_edge(name1, 'Radiohead', attr_dict={'foo': name2}) fd, fname = tempfile.mkstemp() nx.write_graphml(G, fname) H = nx.read_graphml(fname,node_type=node_type) assert_equal(G.adj, H.adj) os.close(fd) os.unlink(fname) def test_bool(self): s=""" false True False true false """ fh = io.BytesIO(s.encode('UTF-8')) G=nx.read_graphml(fh) assert_equal(G.node['n0']['test'],True) assert_equal(G.node['n2']['test'],False) H=nx.parse_graphml(s) assert_equal(H.node['n0']['test'],True) assert_equal(H.node['n2']['test'],False) 
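The two placeholder tests above, test_multigraph_keys and test_multigraph_to_graph, are left as bare pass statements. A minimal sketch of what they could assert, based only on the behavior documented in the read_graphml docstring (GraphML edge ids become multigraph edge keys, and a file with no parallel edges is returned as a plain Graph or DiGraph); the node and key names here are illustrative, not from the original suite:

```python
import io

import networkx as nx

# Round-trip a multigraph: the writer records each edge key, and the
# reader should use it as the multigraph edge key on the way back in.
G = nx.MultiGraph()
G.add_edge('a', 'b', key='first')
G.add_edge('a', 'b', key='second')

fh = io.BytesIO()
nx.write_graphml(G, fh)
fh.seek(0)
H = nx.read_graphml(fh)

assert H.is_multigraph()            # parallel edges were detected
assert H.number_of_edges('a', 'b') == 2
assert sorted(H['a']['b']) == ['first', 'second']   # keys round-tripped

# Without parallel edges the reader returns a plain Graph instead.
fh = io.BytesIO()
nx.write_graphml(nx.path_graph(3), fh)
fh.seek(0)
P = nx.read_graphml(fh, node_type=int)
assert not P.is_multigraph()
```

Both properties follow from GraphMLReader.make_graph above, which always builds a MultiGraph or MultiDiGraph first and only converts to Graph or DiGraph when no parallel edge was seen.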
networkx-1.11/networkx/readwrite/tests/test_gexf.py0000644000175000017500000002632112637544500022560 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from nose import SkipTest import networkx as nx import io class TestGEXF(object): @classmethod def setupClass(cls): try: import xml.etree.ElementTree except ImportError: raise SkipTest('xml.etree.ElementTree not available.') def setUp(self): self.simple_directed_data=""" """ self.simple_directed_graph=nx.DiGraph() self.simple_directed_graph.add_node('0',label='Hello') self.simple_directed_graph.add_node('1',label='World') self.simple_directed_graph.add_edge('0','1',id='0') self.simple_directed_fh = \ io.BytesIO(self.simple_directed_data.encode('UTF-8')) self.attribute_data=""" Gephi.org A Web network true """ self.attribute_graph=nx.DiGraph() self.attribute_graph.graph['node_default']={'frog':True} self.attribute_graph.add_node('0', label='Gephi', url='http://gephi.org', indegree=1, frog=False) self.attribute_graph.add_node('1', label='Webatlas', url='http://webatlas.fr', indegree=2, frog=False) self.attribute_graph.add_node('2', label='RTGI', url='http://rtgi.fr', indegree=1, frog=True) self.attribute_graph.add_node('3', label='BarabasiLab', url='http://barabasilab.com', indegree=1, frog=True) self.attribute_graph.add_edge('0','1',id='0') self.attribute_graph.add_edge('0','2',id='1') self.attribute_graph.add_edge('1','0',id='2') self.attribute_graph.add_edge('2','1',id='3') self.attribute_graph.add_edge('0','3',id='4') self.attribute_fh = io.BytesIO(self.attribute_data.encode('UTF-8')) self.simple_undirected_data=""" """ self.simple_undirected_graph=nx.Graph() self.simple_undirected_graph.add_node('0',label='Hello') self.simple_undirected_graph.add_node('1',label='World') self.simple_undirected_graph.add_edge('0','1',id='0') self.simple_undirected_fh = io.BytesIO(self.simple_undirected_data.encode('UTF-8')) def test_read_simple_directed_graphml(self): G=self.simple_directed_graph 
H=nx.read_gexf(self.simple_directed_fh) assert_equal(sorted(G.nodes()),sorted(H.nodes())) assert_equal(sorted(G.edges()),sorted(H.edges())) assert_equal(sorted(G.edges(data=True)), sorted(H.edges(data=True))) self.simple_directed_fh.seek(0) def test_write_read_simple_directed_graphml(self): G=self.simple_directed_graph fh=io.BytesIO() nx.write_gexf(G,fh) fh.seek(0) H=nx.read_gexf(fh) assert_equal(sorted(G.nodes()),sorted(H.nodes())) assert_equal(sorted(G.edges()),sorted(H.edges())) assert_equal(sorted(G.edges(data=True)), sorted(H.edges(data=True))) self.simple_directed_fh.seek(0) def test_read_simple_undirected_graphml(self): G=self.simple_undirected_graph H=nx.read_gexf(self.simple_undirected_fh) assert_equal(sorted(G.nodes()),sorted(H.nodes())) assert_equal( sorted(sorted(e) for e in G.edges()), sorted(sorted(e) for e in H.edges())) self.simple_undirected_fh.seek(0) def test_read_attribute_graphml(self): G=self.attribute_graph H=nx.read_gexf(self.attribute_fh) assert_equal(sorted(G.nodes(True)),sorted(H.nodes(data=True))) ge=sorted(G.edges(data=True)) he=sorted(H.edges(data=True)) for a,b in zip(ge,he): assert_equal(a,b) self.attribute_fh.seek(0) def test_directed_edge_in_undirected(self): s=""" """ fh = io.BytesIO(s.encode('UTF-8')) assert_raises(nx.NetworkXError,nx.read_gexf,fh) def test_undirected_edge_in_directed(self): s=""" """ fh = io.BytesIO(s.encode('UTF-8')) assert_raises(nx.NetworkXError,nx.read_gexf,fh) def test_key_error(self): s=""" """ fh = io.BytesIO(s.encode('UTF-8')) assert_raises(nx.NetworkXError,nx.read_gexf,fh) def test_relabel(self): s=""" """ fh = io.BytesIO(s.encode('UTF-8')) G=nx.read_gexf(fh,relabel=True) assert_equal(sorted(G.nodes()),["Hello","Word"]) def test_default_attribute(self): G=nx.Graph() G.add_node(1,label='1',color='green') G.add_path([0,1,2,3]) G.add_edge(1,2,foo=3) G.graph['node_default']={'color':'yellow'} G.graph['edge_default']={'foo':7} fh = io.BytesIO() nx.write_gexf(G,fh) fh.seek(0) H=nx.read_gexf(fh,node_type=int) 
assert_equal(sorted(G.nodes()),sorted(H.nodes())) assert_equal( sorted(sorted(e) for e in G.edges()), sorted(sorted(e) for e in H.edges())) # Reading a gexf graph always sets mode attribute to either # 'static' or 'dynamic'. Remove the mode attribute from the # read graph for the sake of comparing remaining attributes. del H.graph['mode'] assert_equal(G.graph,H.graph) def test_serialize_ints_to_strings(self): G=nx.Graph() G.add_node(1,id=7,label=77) fh = io.BytesIO() nx.write_gexf(G,fh) fh.seek(0) H=nx.read_gexf(fh,node_type=int) assert_equal(H.nodes(),[7]) assert_equal(H.node[7]['label'],'77') def test_write_with_node_attributes(self): # Addresses #673. G = nx.path_graph(4) for i in range(4): G.node[i]['id'] = i G.node[i]['label'] = i G.node[i]['pid'] = i expected = """ """ obtained = '\n'.join(nx.generate_gexf(G)) assert_equal( expected, obtained ) def test_bool(self): G=nx.Graph() G.add_node(1, testattr=True) fh = io.BytesIO() nx.write_gexf(G,fh) fh.seek(0) H=nx.read_gexf(fh,node_type=int) assert_equal(H.node[1]['testattr'], True) networkx-1.11/networkx/readwrite/adjlist.py0000644000175000017500000002054012637544500021055 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ ************** Adjacency List ************** Read and write NetworkX graphs as adjacency lists. Adjacency list format is useful for graphs without data associated with nodes or edges and for nodes that can be meaningfully represented as strings. Format ------ The adjacency list format consists of lines with node labels. The first label in a line is the source node. Further labels in the line are considered target nodes and are added to the graph along with an edge between the source node and target node. The graph with edges a-b, a-c, d-e can be represented as the following adjacency list (anything following the # in a line is a comment):: a b c # source target target d e """ __author__ = '\n'.join(['Aric Hagberg ', 'Dan Schult ', 'Loïc Séguin-C. 
']) # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['generate_adjlist', 'write_adjlist', 'parse_adjlist', 'read_adjlist'] from networkx.utils import make_str, open_file import networkx as nx def generate_adjlist(G, delimiter = ' '): """Generate a single line of the graph G in adjacency list format. Parameters ---------- G : NetworkX graph delimiter : string, optional Separator for node labels Returns ------- lines : string Lines of data in adjlist format. Examples -------- >>> G = nx.lollipop_graph(4, 3) >>> for line in nx.generate_adjlist(G): ... print(line) 0 1 2 3 1 2 3 2 3 3 4 4 5 5 6 6 See Also -------- write_adjlist, read_adjlist """ directed=G.is_directed() seen=set() for s,nbrs in G.adjacency_iter(): line = make_str(s)+delimiter for t,data in nbrs.items(): if not directed and t in seen: continue if G.is_multigraph(): for d in data.values(): line += make_str(t) + delimiter else: line += make_str(t) + delimiter if not directed: seen.add(s) yield line[:-len(delimiter)] @open_file(1,mode='wb') def write_adjlist(G, path, comments="#", delimiter=' ', encoding = 'utf-8'): """Write graph G in single-line adjacency-list format to path. Parameters ---------- G : NetworkX graph path : string or file Filename or file handle for data output. Filenames ending in .gz or .bz2 will be compressed. comments : string, optional Marker for comment lines delimiter : string, optional Separator for node labels encoding : string, optional Text encoding. Examples -------- >>> G=nx.path_graph(4) >>> nx.write_adjlist(G,"test.adjlist") The path can be a filehandle or a string with the name of the file. If a filehandle is provided, it has to be opened in 'wb' mode. >>> fh=open("test.adjlist",'wb') >>> nx.write_adjlist(G, fh) Notes ----- This format does not store graph, node, or edge data. 
See Also -------- read_adjlist, generate_adjlist """ import sys import time pargs=comments + " ".join(sys.argv) + '\n' header = (pargs + comments + " GMT %s\n" % (time.asctime(time.gmtime())) + comments + " %s\n" % (G.name)) path.write(header.encode(encoding)) for line in generate_adjlist(G, delimiter): line+='\n' path.write(line.encode(encoding)) def parse_adjlist(lines, comments = '#', delimiter = None, create_using = None, nodetype = None): """Parse lines of a graph adjacency list representation. Parameters ---------- lines : list or iterator of strings Input data in adjlist format create_using: NetworkX graph container Use given NetworkX graph for holding nodes or edges. nodetype : Python type, optional Convert nodes to this type. comments : string, optional Marker for comment lines delimiter : string, optional Separator for node labels. The default is whitespace. create_using: NetworkX graph container Use given NetworkX graph for holding nodes or edges. Returns ------- G: NetworkX graph The graph corresponding to the lines in adjacency list format. Examples -------- >>> lines = ['1 2 5', ... '2 3 4', ... '3 5', ... '4', ... 
'5'] >>> G = nx.parse_adjlist(lines, nodetype = int) >>> G.nodes() [1, 2, 3, 4, 5] >>> G.edges() [(1, 2), (1, 5), (2, 3), (2, 4), (3, 5)] See Also -------- read_adjlist """ if create_using is None: G=nx.Graph() else: try: G=create_using G.clear() except: raise TypeError("Input graph is not a NetworkX graph type") for line in lines: p=line.find(comments) if p>=0: line = line[:p] if not len(line): continue vlist=line.strip().split(delimiter) u=vlist.pop(0) # convert types if nodetype is not None: try: u=nodetype(u) except: raise TypeError("Failed to convert node (%s) to type %s"\ %(u,nodetype)) G.add_node(u) if nodetype is not None: try: vlist=map(nodetype,vlist) except: raise TypeError("Failed to convert nodes (%s) to type %s"\ %(','.join(vlist),nodetype)) G.add_edges_from([(u, v) for v in vlist]) return G @open_file(0,mode='rb') def read_adjlist(path, comments="#", delimiter=None, create_using=None, nodetype=None, encoding = 'utf-8'): """Read graph in adjacency list format from path. Parameters ---------- path : string or file Filename or file handle to read. Filenames ending in .gz or .bz2 will be uncompressed. create_using: NetworkX graph container Use given NetworkX graph for holding nodes or edges. nodetype : Python type, optional Convert nodes to this type. comments : string, optional Marker for comment lines delimiter : string, optional Separator for node labels. The default is whitespace. create_using: NetworkX graph container Use given NetworkX graph for holding nodes or edges. Returns ------- G: NetworkX graph The graph corresponding to the lines in adjacency list format. Examples -------- >>> G=nx.path_graph(4) >>> nx.write_adjlist(G, "test.adjlist") >>> G=nx.read_adjlist("test.adjlist") The path can be a filehandle or a string with the name of the file. If a filehandle is provided, it has to be opened in 'rb' mode. >>> fh=open("test.adjlist", 'rb') >>> G=nx.read_adjlist(fh) Filenames ending in .gz or .bz2 will be compressed. 
>>> nx.write_adjlist(G,"test.adjlist.gz") >>> G=nx.read_adjlist("test.adjlist.gz") The optional nodetype is a function to convert node strings to nodetype. For example >>> G=nx.read_adjlist("test.adjlist", nodetype=int) will attempt to convert all nodes to integer type. Since nodes must be hashable, the function nodetype must return hashable types (e.g. int, float, str, frozenset - or tuples of those, etc.) The optional create_using parameter is a NetworkX graph container. The default is Graph(), an undirected graph. To read the data as a directed graph use >>> G=nx.read_adjlist("test.adjlist", create_using=nx.DiGraph()) Notes ----- This format does not store graph or node data. See Also -------- write_adjlist """ lines = (line.decode(encoding) for line in path) return parse_adjlist(lines, comments = comments, delimiter = delimiter, create_using = create_using, nodetype = nodetype) # fixture for nose tests def teardown_module(module): import os for fname in ['test.adjlist', 'test.adjlist.gz']: if os.path.isfile(fname): os.unlink(fname) networkx-1.11/networkx/readwrite/nx_yaml.py0000644000175000017500000000470712637544450021105 0ustar aricaric00000000000000""" **** YAML **** Read and write NetworkX graphs in YAML format. "YAML is a data serialization format designed for human readability and interaction with scripting languages." See http://www.yaml.org for documentation. Format ------ http://pyyaml.org/wiki/PyYAML """ __author__ = """Aric Hagberg (hagberg@lanl.gov)""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __all__ = ['read_yaml', 'write_yaml'] import networkx as nx from networkx.utils import open_file @open_file(1,mode='w') def write_yaml(G, path, encoding='UTF-8', **kwds): """Write graph G in YAML format to path. YAML is a data serialization format designed for human readability and interaction with scripting languages [1]_. 
Parameters ---------- G : graph A NetworkX graph path : file or string File or filename to write. Filenames ending in .gz or .bz2 will be compressed. encoding: string, optional Specify which encoding to use when writing file. Examples -------- >>> G=nx.path_graph(4) >>> nx.write_yaml(G,'test.yaml') References ---------- .. [1] http://www.yaml.org """ try: import yaml except ImportError: raise ImportError("write_yaml() requires PyYAML: http://pyyaml.org/") yaml.dump(G, path, **kwds) @open_file(0,mode='r') def read_yaml(path): """Read graph in YAML format from path. YAML is a data serialization format designed for human readability and interaction with scripting languages [1]_. Parameters ---------- path : file or string File or filename to read. Filenames ending in .gz or .bz2 will be uncompressed. Returns ------- G : NetworkX graph Examples -------- >>> G=nx.path_graph(4) >>> nx.write_yaml(G,'test.yaml') >>> G=nx.read_yaml('test.yaml') References ---------- .. [1] http://www.yaml.org """ try: import yaml except ImportError: raise ImportError("read_yaml() requires PyYAML: http://pyyaml.org/") G=yaml.load(path) return G # fixture for nose tests def setup_module(module): from nose import SkipTest try: import yaml except: raise SkipTest("PyYAML not available") # fixture for nose tests def teardown_module(module): import os os.unlink('test.yaml') networkx-1.11/networkx/readwrite/graph6.py0000644000175000017500000001711012637544450020615 0ustar aricaric00000000000000"""Graph6 Read and write graphs in graph6 format. Format ------ "graph6 and sparse6 are formats for storing undirected graphs in a compact manner, using only printable ASCII characters. Files in these formats have text type and contain one line per graph." See http://cs.anu.edu.au/~bdm/data/formats.txt for details. """ # Original author: D. Eppstein, UC Irvine, August 12, 2003. # The original code at http://www.ics.uci.edu/~eppstein/PADS/ is public domain. 
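The graph6 format described above packs the upper triangle of the adjacency matrix, column by column, into 6-bit groups offset by 63 to land in printable ASCII. A minimal standalone sketch of the encoding for small graphs (a hypothetical `encode_graph6` helper written for illustration; the module's own `generate_graph6` below is the real implementation):

```python
def encode_graph6(n, edges):
    # graph6 encoding for a small simple graph (n <= 62): a one-character
    # size field chr(n + 63), then the upper-triangle adjacency bits taken
    # column by column, padded with zeros to a multiple of 6 bits, with
    # each 6-bit group emitted as chr(value + 63).
    edges = set(frozenset(e) for e in edges)
    bits = []
    for j in range(1, n):              # column j of the upper triangle
        for i in range(j):             # rows above the diagonal
            bits.append(1 if frozenset((i, j)) in edges else 0)
    while len(bits) % 6:               # pad to a whole number of 6-bit units
        bits.append(0)
    chars = [chr(n + 63)]              # one-unit size field for n <= 62
    for k in range(0, len(bits), 6):
        val = 0
        for b in bits[k:k + 6]:
            val = (val << 1) | b
        chars.append(chr(val + 63))
    return ''.join(chars)

print(encode_graph6(3, [(0, 1), (0, 2), (1, 2)]))  # triangle K3 -> 'Bw'
```

This matches the parser below: `parse_graph6('A_')` yields the single edge (0, 1), and the triangle encodes to 'Bw' since n=3 gives 'B' and the padded bit group 111000 is 56, i.e. chr(119) = 'w'.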
# Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # Tomas Gavenciak # All rights reserved. # BSD license. import networkx as nx from networkx.exception import NetworkXError from networkx.utils import open_file, not_implemented_for __author__ = """\n""".join(['Tomas Gavenciak ', 'Aric Hagberg >> G = nx.parse_graph6('A_') >>> sorted(G.edges()) [(0, 1)] See Also -------- generate_graph6, read_graph6, write_graph6 References ---------- Graph6 specification: http://cs.anu.edu.au/~bdm/data/formats.txt for details. """ def bits(): """Return sequence of individual bits from 6-bit-per-value list of data values.""" for d in data: for i in [5,4,3,2,1,0]: yield (d>>i)&1 if string.startswith('>>graph6<<'): string = string[10:] data = graph6_to_data(string) n, data = data_to_n(data) nd = (n*(n-1)//2 + 5) // 6 if len(data) != nd: raise NetworkXError(\ 'Expected %d bits but got %d in graph6' % (n*(n-1)//2, len(data)*6)) G=nx.Graph() G.add_nodes_from(range(n)) for (i,j),b in zip([(i,j) for j in range(1,n) for i in range(j)], bits()): if b: G.add_edge(i,j) return G @open_file(0,mode='rt') def read_graph6(path): """Read simple undirected graphs in graph6 format from path. Parameters ---------- path : file or string File or filename to write. Returns ------- G : Graph or list of Graphs If the file contains multiple lines then a list of graphs is returned Raises ------ NetworkXError If the string is unable to be parsed in graph6 format Examples -------- >>> nx.write_graph6(nx.Graph([(0,1)]), 'test.g6') >>> G = nx.read_graph6('test.g6') >>> sorted(G.edges()) [(0, 1)] See Also -------- generate_graph6, parse_graph6, write_graph6 References ---------- Graph6 specification: http://cs.anu.edu.au/~bdm/data/formats.txt for details. 
""" glist = [] for line in path: line = line.strip() if not len(line): continue glist.append(parse_graph6(line)) if len(glist) == 1: return glist[0] else: return glist @not_implemented_for('directed','multigraph') def generate_graph6(G, nodes = None, header=True): """Generate graph6 format string from a simple undirected graph. Parameters ---------- G : Graph (undirected) nodes: list or iterable Nodes are labeled 0...n-1 in the order provided. If None the ordering given by G.nodes() is used. header: bool If True add '>>graph6<<' string to head of data Returns ------- s : string String in graph6 format Raises ------ NetworkXError If the graph is directed or has parallel edges Examples -------- >>> G = nx.Graph([(0, 1)]) >>> nx.generate_graph6(G) '>>graph6<>graph6<<' string to head of data Raises ------ NetworkXError If the graph is directed or has parallel edges Examples -------- >>> G = nx.Graph([(0, 1)]) >>> nx.write_graph6(G, 'test.g6') See Also -------- generate_graph6, parse_graph6, read_graph6 Notes ----- The format does not support edge or node labels, parallel edges or self loops. If self loops are present they are silently ignored. References ---------- Graph6 specification: http://cs.anu.edu.au/~bdm/data/formats.txt for details. """ path.write(generate_graph6(G, nodes=nodes, header=header)) path.write('\n') # helper functions def graph6_to_data(string): """Convert graph6 character sequence to 6-bit integers.""" v = [ord(c)-63 for c in string] if len(v) > 0 and (min(v) < 0 or max(v) > 63): return None return v def data_to_graph6(data): """Convert 6-bit integer sequence to graph6 character sequence.""" if len(data) > 0 and (min(data) < 0 or max(data) > 63): raise NetworkXError("graph6 data units must be within 0..63") return ''.join([chr(d+63) for d in data]) def data_to_n(data): """Read initial one-, four- or eight-unit value from graph6 integer sequence. 
Return (value, rest of seq.)""" if data[0] <= 62: return data[0], data[1:] if data[1] <= 62: return (data[1]<<12) + (data[2]<<6) + data[3], data[4:] return ((data[2]<<30) + (data[3]<<24) + (data[4]<<18) + (data[5]<<12) + (data[6]<<6) + data[7], data[8:]) def n_to_data(n): """Convert an integer to one-, four- or eight-unit graph6 sequence.""" if n < 0: raise NetworkXError("Numbers in graph6 format must be non-negative.") if n <= 62: return [n] if n <= 258047: return [63, (n>>12) & 0x3f, (n>>6) & 0x3f, n & 0x3f] if n <= 68719476735: return [63, 63, (n>>30) & 0x3f, (n>>24) & 0x3f, (n>>18) & 0x3f, (n>>12) & 0x3f, (n>>6) & 0x3f, n & 0x3f] raise NetworkXError("Numbers above 68719476735 are not supported by graph6") def teardown_module(module): import os if os.path.isfile('test.g6'): os.unlink('test.g6') networkx-1.11/networkx/readwrite/gml.py0000644000175000017500000005632112637544500020210 0ustar aricaric00000000000000# encoding: utf-8 """ Read graphs in GML format. "GML, the G>raph Modelling Language, is our proposal for a portable file format for graphs. GML's key features are portability, simple syntax, extensibility and flexibility. A GML file consists of a hierarchical key-value lists. Graphs can be annotated with arbitrary data structures. The idea for a common file format was born at the GD'95; this proposal is the outcome of many discussions. GML is the standard file format in the Graphlet graph editor system. It has been overtaken and adapted by several other systems for drawing graphs." See http://www.infosun.fim.uni-passau.de/Graphlet/GML/gml-tr.html Format ------ See http://www.infosun.fim.uni-passau.de/Graphlet/GML/gml-tr.html for format specification. Example graphs in GML format: http://www-personal.umich.edu/~mejn/netdata/ """ __author__ = """Aric Hagberg (hagberg@lanl.gov)""" # Copyright (C) 2008-2010 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
__all__ = ['read_gml', 'parse_gml', 'generate_gml', 'write_gml'] try: try: from cStringIO import StringIO except ImportError: from StringIO import StringIO except ImportError: from io import StringIO from ast import literal_eval from collections import defaultdict from lib2to3.pgen2.parse import ParseError from lib2to3.pgen2.tokenize import TokenError from lib2to3.refactor import RefactoringTool import networkx as nx from networkx.exception import NetworkXError from networkx.utils import open_file import re try: import htmlentitydefs except ImportError: # Python 3.x import html.entities as htmlentitydefs try: long except NameError: long = int try: unicode except NameError: unicode = str try: unichr except NameError: unichr = chr try: literal_eval(r"u'\u4444'") except SyntaxError: # Remove 'u' prefixes in unicode literals in Python 3 rtp_fix_unicode = RefactoringTool(['lib2to3.fixes.fix_unicode'], {'print_function': True}) else: rtp_fix_unicode = None def escape(text): """Escape unprintable or non-ASCII characters, double quotes and ampersands in a string using XML character references. """ def fixup(m): ch = m.group(0) return '&#' + str(ord(ch)) + ';' text = re.sub('[^ -~]|[&"]', fixup, text) return text if isinstance(text, str) else str(text) def unescape(text): """Replace XML character references in a string with the referenced characters. """ def fixup(m): text = m.group(0) if text[1] == '#': # Character reference if text[2] == 'x': code = int(text[3:-1], 16) else: code = int(text[2:-1]) else: # Named entity try: code = htmlentitydefs.name2codepoint[text[1:-1]] except KeyError: return text # leave unchanged try: return chr(code) if code < 256 else unichr(code) except (ValueError, OverflowError): return text # leave unchanged return re.sub("&(?:[0-9A-Za-z]+|#(?:[0-9]+|x[0-9A-Fa-f]+));", fixup, text) def literal_destringizer(rep): """Convert a Python literal to the value it represents. Parameters ---------- rep : string A Python literal. 
Returns ------- value : object The value of the Python literal. Raises ------ ValueError If ``rep`` is not a Python literal. """ if isinstance(rep, (str, unicode)): orig_rep = rep try: # Python 3.2 does not recognize 'u' prefixes before string literals if rtp_fix_unicode: rep = str(rtp_fix_unicode.refactor_string( rep + '\n', ''))[:-1] return literal_eval(rep) except (ParseError, SyntaxError, TokenError): raise ValueError('%r is not a valid Python literal' % (orig_rep,)) else: raise ValueError('%r is not a string' % (rep,)) @open_file(0, mode='rb') def read_gml(path, label='label', destringizer=None): """Read graph in GML format from path. Parameters ---------- path : filename or filehandle The filename or filehandle to read from. label : string, optional If not None, the parsed nodes will be renamed according to node attributes indicated by ``label``. Default value: ``'label'``. destringizer : callable, optional A destringizer that recovers values stored as strings in GML. If it cannot convert a string to a value, a ``ValueError`` is raised. Default value : ``None``. Returns ------- G : NetworkX graph The parsed graph. Raises ------ NetworkXError If the input cannot be parsed. See Also -------- write_gml, parse_gml Notes ----- The GML specification says that files should be ASCII encoded, with any extended ASCII characters (iso8859-1) appearing as HTML character entities. 
References ---------- GML specification: http://www.infosun.fim.uni-passau.de/Graphlet/GML/gml-tr.html Examples -------- >>> G = nx.path_graph(4) >>> nx.write_gml(G, 'test.gml') >>> H = nx.read_gml('test.gml') """ def filter_lines(lines): for line in lines: try: line = line.decode('ascii') except UnicodeDecodeError: raise NetworkXError('input is not ASCII-encoded') if not isinstance(line, str): lines = str(lines) if line and line[-1] == '\n': line = line[:-1] yield line G = parse_gml_lines(filter_lines(path), label, destringizer) return G def parse_gml(lines, label='label', destringizer=None): """Parse GML graph from a string or iterable. Parameters ---------- lines : string or iterable of strings Data in GML format. label : string, optional If not None, the parsed nodes will be renamed according to node attributes indicated by ``label``. Default value: ``'label'``. destringizer : callable, optional A destringizer that recovers values stored as strings in GML. If it cannot convert a string to a value, a ``ValueError`` is raised. Default value : ``None``. Returns ------- G : NetworkX graph The parsed graph. Raises ------ NetworkXError If the input cannot be parsed. See Also -------- write_gml, read_gml Notes ----- This stores nested GML attributes as dictionaries in the NetworkX graph, node, and edge attribute structures. 
References ---------- GML specification: http://www.infosun.fim.uni-passau.de/Graphlet/GML/gml-tr.html """ def decode_line(line): if isinstance(line, bytes): try: line.decode('ascii') except UnicodeDecodeError: raise NetworkXError('input is not ASCII-encoded') if not isinstance(line, str): line = str(line) return line def filter_lines(lines): if isinstance(lines, (str, unicode)): lines = decode_line(lines) lines = lines.splitlines() for line in lines: yield line else: for line in lines: line = decode_line(line) if line and line[-1] == '\n': line = line[:-1] if line.find('\n') != -1: raise NetworkXError('input line contains newline') yield line G = parse_gml_lines(filter_lines(lines), label, destringizer) return G def parse_gml_lines(lines, label, destringizer): """Parse GML into a graph. """ def tokenize(): patterns = [ r'[A-Za-z][0-9A-Za-z]*\s+', # keys r'[+-]?(?:[0-9]*\.[0-9]+|[0-9]+\.[0-9]*)(?:[Ee][+-]?[0-9]+)?', # reals r'[+-]?[0-9]+', # ints r'".*?"', # strings r'\[', # dict start r'\]', # dict end r'#.*$|\s+' # comments and whitespaces ] tokens = re.compile( '|'.join('(' + pattern + ')' for pattern in patterns)) lineno = 0 for line in lines: length = len(line) pos = 0 while pos < length: match = tokens.match(line, pos) if match is not None: for i in range(len(patterns)): group = match.group(i + 1) if group is not None: if i == 0: # keys value = group.rstrip() elif i == 1: # reals value = float(group) elif i == 2: # ints value = int(group) else: value = group if i != 6: # comments and whitespaces yield (i, value, lineno + 1, pos + 1) pos += len(group) break else: raise NetworkXError('cannot tokenize %r at (%d, %d)' % (line[pos:], lineno + 1, pos + 1)) lineno += 1 yield (None, None, lineno + 1, 1) # EOF def unexpected(curr_token, expected): type, value, lineno, pos = curr_token raise NetworkXError( 'expected %s, found %s at (%d, %d)' % (expected, repr(value) if value is not None else 'EOF', lineno, pos)) def consume(curr_token, type, expected): if curr_token[0] 
== type: return next(tokens) unexpected(curr_token, expected) def parse_kv(curr_token): dct = defaultdict(list) while curr_token[0] == 0: # keys key = curr_token[1] curr_token = next(tokens) type = curr_token[0] if type == 1 or type == 2: # reals or ints value = curr_token[1] curr_token = next(tokens) elif type == 3: # strings value = unescape(curr_token[1][1:-1]) if destringizer: try: value = destringizer(value) except ValueError: pass curr_token = next(tokens) elif type == 4: # dict start curr_token, value = parse_dict(curr_token) else: unexpected(curr_token, "an int, float, string or '['") dct[key].append(value) dct = {key: (value if not isinstance(value, list) or len(value) != 1 else value[0]) for key, value in dct.items()} return curr_token, dct def parse_dict(curr_token): curr_token = consume(curr_token, 4, "'['") # dict start curr_token, dct = parse_kv(curr_token) curr_token = consume(curr_token, 5, "']'") # dict end return curr_token, dct def parse_graph(): curr_token, dct = parse_kv(next(tokens)) if curr_token[0] is not None: # EOF unexpected(curr_token, 'EOF') if 'graph' not in dct: raise NetworkXError('input contains no graph') graph = dct['graph'] if isinstance(graph, list): raise NetworkXError('input contains more than one graph') return graph tokens = tokenize() graph = parse_graph() directed = graph.pop('directed', False) multigraph = graph.pop('multigraph', False) if not multigraph: G = nx.DiGraph() if directed else nx.Graph() else: G = nx.MultiDiGraph() if directed else nx.MultiGraph() G.graph.update((key, value) for key, value in graph.items() if key != 'node' and key != 'edge') def pop_attr(dct, type, attr, i): try: return dct.pop(attr) except KeyError: raise NetworkXError( "%s #%d has no '%s' attribute" % (type, i, attr)) nodes = graph.get('node', []) mapping = {} labels = set() for i, node in enumerate(nodes if isinstance(nodes, list) else [nodes]): id = pop_attr(node, 'node', 'id', i) if id in G: raise NetworkXError('node id %r is duplicated' 
% (id,)) if label != 'id': label = pop_attr(node, 'node', 'label', i) if label in labels: raise NetworkXError('node label %r is duplicated' % (label,)) labels.add(label) mapping[id] = label G.add_node(id, node) edges = graph.get('edge', []) for i, edge in enumerate(edges if isinstance(edges, list) else [edges]): source = pop_attr(edge, 'edge', 'source', i) target = pop_attr(edge, 'edge', 'target', i) if source not in G: raise NetworkXError( 'edge #%d has an undefined source %r' % (i, source)) if target not in G: raise NetworkXError( 'edge #%d has an undefined target %r' % (i, target)) if not multigraph: if not G.has_edge(source, target): G.add_edge(source, target, edge) else: raise nx.NetworkXError( 'edge #%d (%r%s%r) is duplicated' % (i, source, '->' if directed else '--', target)) else: key = edge.pop('key', None) if key is not None and G.has_edge(source, target, key): raise nx.NetworkXError( 'edge #%d (%r%s%r, %r) is duplicated' % (i, source, '->' if directed else '--', target, key)) G.add_edge(source, target, key, edge) if label != 'id': G = nx.relabel_nodes(G, mapping) if 'name' in graph: G.graph['name'] = graph['name'] else: del G.graph['name'] return G def literal_stringizer(value): """Convert a value to a Python literal in GML representation. Parameters ---------- value : object The value to be converted to GML representation. Returns ------- rep : string A double-quoted Python literal representing value. Unprintable characters are replaced by XML character references. Raises ------ ValueError If ``value`` cannot be converted to GML. Notes ----- ``literal_stringizer`` is largely the same as ``repr`` in terms of functionality but attempts prefix ``unicode`` and ``bytes`` literals with ``u`` and ``b`` to provide better interoperability of data generated by Python 2 and Python 3. The original value can be recovered using the ``networkx.readwrite.gml.literal_destringizer`` function. 
""" def stringize(value): if isinstance(value, (int, long, bool)) or value is None: buf.write(str(value)) elif isinstance(value, unicode): text = repr(value) if text[0] != 'u': try: value.encode('latin1') except UnicodeEncodeError: text = 'u' + text buf.write(text) elif isinstance(value, (float, complex, str, bytes)): buf.write(repr(value)) elif isinstance(value, list): buf.write('[') first = True for item in value: if not first: buf.write(',') else: first = False stringize(item) buf.write(']') elif isinstance(value, tuple): if len(value) > 1: buf.write('(') first = True for item in value: if not first: buf.write(',') else: first = False stringize(item) buf.write(')') elif value: buf.write('(') stringize(value[0]) buf.write(',)') else: buf.write('()') elif isinstance(value, dict): buf.write('{') first = True for key, value in value.items(): if not first: buf.write(',') else: first = False stringize(key) buf.write(':') stringize(value) buf.write('}') elif isinstance(value, set): buf.write('{') first = True for item in value: if not first: buf.write(',') else: first = False stringize(item) buf.write('}') else: raise ValueError( '%r cannot be converted into a Python literal' % (value,)) buf = StringIO() stringize(value) return buf.getvalue() def generate_gml(G, stringizer=None): """Generate a single entry of the graph G in GML format. Parameters ---------- G : NetworkX graph The graph to be converted to GML. stringizer : callable, optional A stringizer which converts non-int/float/dict values into strings. If it cannot convert a value into a string, it should raise a ``ValueError`` raised to indicate that. Default value: ``None``. Returns ------- lines: generator of strings Lines of GML data. Newlines are not appended. Raises ------ NetworkXError If ``stringizer`` cannot convert a value into a string, or the value to convert is not a string while ``stringizer`` is ``None``. 
Notes ----- Graph attributes named ``'directed'``, ``'multigraph'``, ``'node'`` or ``'edge'``,node attributes named ``'id'`` or ``'label'``, edge attributes named ``'source'`` or ``'target'`` (or ``'key'`` if ``G`` is a multigraph) are ignored because these attribute names are used to encode the graph structure. """ valid_keys = re.compile('^[A-Za-z][0-9A-Za-z]*$') def stringize(key, value, ignored_keys, indent, in_list=False): if not isinstance(key, (str, unicode)): raise NetworkXError('%r is not a string' % (key,)) if not valid_keys.match(key): raise NetworkXError('%r is not a valid key' % (key,)) if not isinstance(key, str): key = str(key) if key not in ignored_keys: if isinstance(value, (int, long)): yield indent + key + ' ' + str(value) elif isinstance(value, float): text = repr(value).upper() # GML requires that a real literal contain a decimal point, but # repr may not output a decimal point when the mantissa is # integral and hence needs fixing. epos = text.rfind('E') if epos != -1 and text.find('.', 0, epos) == -1: text = text[:epos] + '.' 
+ text[epos:] yield indent + key + ' ' + text elif isinstance(value, dict): yield indent + key + ' [' next_indent = indent + ' ' for key, value in value.items(): for line in stringize(key, value, (), next_indent): yield line yield indent + ']' elif isinstance(value, list) and value and not in_list: next_indent = indent + ' ' for value in value: for line in stringize(key, value, (), next_indent, True): yield line else: if stringizer: try: value = stringizer(value) except ValueError: raise NetworkXError( '%r cannot be converted into a string' % (value,)) if not isinstance(value, (str, unicode)): raise NetworkXError('%r is not a string' % (value,)) yield indent + key + ' "' + escape(value) + '"' multigraph = G.is_multigraph() yield 'graph [' # Output graph attributes if G.is_directed(): yield ' directed 1' if multigraph: yield ' multigraph 1' ignored_keys = {'directed', 'multigraph', 'node', 'edge'} for attr, value in G.graph.items(): for line in stringize(attr, value, ignored_keys, ' '): yield line # Output node data node_id = dict(zip(G, range(len(G)))) ignored_keys = {'id', 'label'} for node, attrs in G.node.items(): yield ' node [' yield ' id ' + str(node_id[node]) for line in stringize('label', node, (), ' '): yield line for attr, value in attrs.items(): for line in stringize(attr, value, ignored_keys, ' '): yield line yield ' ]' # Output edge data ignored_keys = {'source', 'target'} kwargs = {'data': True} if multigraph: ignored_keys.add('key') kwargs['keys'] = True for e in G.edges_iter(**kwargs): yield ' edge [' yield ' source ' + str(node_id[e[0]]) yield ' target ' + str(node_id[e[1]]) if multigraph: for line in stringize('key', e[2], (), ' '): yield line for attr, value in e[-1].items(): for line in stringize(attr, value, ignored_keys, ' '): yield line yield ' ]' yield ']' @open_file(1, mode='wb') def write_gml(G, path, stringizer=None): """Write a graph ``G`` in GML format to the file or file handle ``path``. 
Parameters ---------- G : NetworkX graph The graph to be converted to GML. path : filename or filehandle The filename or filehandle to write. Files whose names end with .gz or .bz2 will be compressed. stringizer : callable, optional A stringizer which converts non-int/non-float/non-dict values into strings. If it cannot convert a value into a string, it should raise a ``ValueError`` to indicate that. Default value: ``None``. Raises ------ NetworkXError If ``stringizer`` cannot convert a value into a string, or the value to convert is not a string while ``stringizer`` is ``None``. See Also -------- read_gml, generate_gml Notes ----- Graph attributes named ``'directed'``, ``'multigraph'``, ``'node'`` or ``'edge'``, node attributes named ``'id'`` or ``'label'``, edge attributes named ``'source'`` or ``'target'`` (or ``'key'`` if ``G`` is a multigraph) are ignored because these attribute names are used to encode the graph structure. Examples -------- >>> G = nx.path_graph(4) >>> nx.write_gml(G, "test.gml") Filenames ending in .gz or .bz2 will be compressed. >>> nx.write_gml(G, "test.gml.gz") """ for line in generate_gml(G, stringizer): path.write((line + '\n').encode('ascii')) # fixture for nose def teardown_module(module): import os for fname in ['test.gml', 'test.gml.gz']: if os.path.isfile(fname): os.unlink(fname) networkx-1.11/networkx/convert.py0000644000175000017500000003164512653165361017127 0ustar aricaric00000000000000"""Functions to convert NetworkX graphs to and from other formats. The preferred way of converting data to a NetworkX graph is through the graph constructor. The constructor calls the to_networkx_graph() function which attempts to guess the input type and convert it automatically.
Examples -------- Create a graph with a single edge from a dictionary of dictionaries >>> d={0: {1: 1}} # dict-of-dicts single edge (0,1) >>> G=nx.Graph(d) See Also -------- nx_agraph, nx_pydot """ # Copyright (C) 2006-2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import warnings import networkx as nx __author__ = """\n""".join(['Aric Hagberg ', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) __all__ = ['to_networkx_graph', 'from_dict_of_dicts', 'to_dict_of_dicts', 'from_dict_of_lists', 'to_dict_of_lists', 'from_edgelist', 'to_edgelist'] def _prep_create_using(create_using): """Return a graph object ready to be populated. If create_using is None return the default (just networkx.Graph()) If create_using.clear() works, assume it returns a graph object. Otherwise raise an exception because create_using is not a networkx graph. """ if create_using is None: return nx.Graph() try: create_using.clear() except: raise TypeError("Input graph is not a networkx graph type") return create_using def to_networkx_graph(data,create_using=None,multigraph_input=False): """Make a NetworkX graph from a known data structure. The preferred way to call this is automatically from the class constructor >>> d={0: {1: {'weight':1}}} # dict-of-dicts single edge (0,1) >>> G=nx.Graph(d) instead of the equivalent >>> G=nx.from_dict_of_dicts(d) Parameters ---------- data : an object to be converted Current known types are: any NetworkX graph dict-of-dicts dict-of-lists list of edges numpy matrix numpy ndarray scipy sparse matrix pygraphviz agraph create_using : NetworkX graph Use specified graph for result. Otherwise a new graph is created. multigraph_input : bool (default False) If True and data is a dict_of_dicts, try to create a multigraph assuming dict_of_dict_of_lists. If data and create_using are both multigraphs then create a multigraph from a multigraph.
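A dict-of-lists input, for example, is read as one edge per (node, neighbour) pair; a minimal dependency-free sketch of that reading:

```python
# Plain-Python sketch (no networkx): a dict-of-lists adjacency flattens
# to one edge tuple per (node, neighbour) pair.
d = {0: [1, 2], 1: [2]}
edges = [(u, v) for u, nbrs in d.items() for v in nbrs]
assert edges == [(0, 1), (0, 2), (1, 2)]
```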
""" # NX graph if hasattr(data,"adj"): try: result= from_dict_of_dicts(data.adj,\ create_using=create_using,\ multigraph_input=data.is_multigraph()) if hasattr(data,'graph') and isinstance(data.graph,dict): result.graph=data.graph.copy() if hasattr(data,'node') and isinstance(data.node,dict): result.node=dict( (n,dd.copy()) for n,dd in data.node.items() ) return result except: raise nx.NetworkXError("Input is not a correct NetworkX graph.") # pygraphviz agraph if hasattr(data,"is_strict"): try: return nx.nx_agraph.from_agraph(data,create_using=create_using) except: raise nx.NetworkXError("Input is not a correct pygraphviz graph.") # dict of dicts/lists if isinstance(data,dict): try: return from_dict_of_dicts(data,create_using=create_using,\ multigraph_input=multigraph_input) except: try: return from_dict_of_lists(data,create_using=create_using) except: raise TypeError("Input is not known type.") # list or generator of edges if (isinstance(data,list) or isinstance(data,tuple) or hasattr(data,'next') or hasattr(data, '__next__')): try: return from_edgelist(data,create_using=create_using) except: raise nx.NetworkXError("Input is not a valid edge list") # Pandas DataFrame try: import pandas as pd if isinstance(data, pd.DataFrame): try: return nx.from_pandas_dataframe(data, create_using=create_using) except: msg = "Input is not a correct Pandas DataFrame." raise nx.NetworkXError(msg) except ImportError: msg = 'pandas not found, skipping conversion test.' 
warnings.warn(msg, ImportWarning) # numpy matrix or ndarray try: import numpy if isinstance(data,numpy.matrix) or \ isinstance(data,numpy.ndarray): try: return nx.from_numpy_matrix(data,create_using=create_using) except: raise nx.NetworkXError(\ "Input is not a correct numpy matrix or array.") except ImportError: warnings.warn('numpy not found, skipping conversion test.', ImportWarning) # scipy sparse matrix - any format try: import scipy if hasattr(data,"format"): try: return nx.from_scipy_sparse_matrix(data,create_using=create_using) except: raise nx.NetworkXError(\ "Input is not a correct scipy sparse matrix type.") except ImportError: warnings.warn('scipy not found, skipping conversion test.', ImportWarning) raise nx.NetworkXError(\ "Input is not a known data type for conversion.") return def convert_to_undirected(G): """Return a new undirected representation of the graph G.""" return G.to_undirected() def convert_to_directed(G): """Return a new directed representation of the graph G.""" return G.to_directed() def to_dict_of_lists(G,nodelist=None): """Return adjacency representation of graph as a dictionary of lists. Parameters ---------- G : graph A NetworkX graph nodelist : list Use only nodes specified in nodelist Notes ----- Completely ignores edge data for MultiGraph and MultiDiGraph. """ if nodelist is None: nodelist=G d = {} for n in nodelist: d[n]=[nbr for nbr in G.neighbors(n) if nbr in nodelist] return d def from_dict_of_lists(d,create_using=None): """Return a graph from a dictionary of lists. Parameters ---------- d : dictionary of lists A dictionary of lists adjacency representation. create_using : NetworkX graph Use specified graph for result. Otherwise a new graph is created. 
Examples -------- >>> dol= {0:[1]} # single edge (0,1) >>> G=nx.from_dict_of_lists(dol) or >>> G=nx.Graph(dol) # use Graph constructor """ G=_prep_create_using(create_using) G.add_nodes_from(d) if G.is_multigraph() and not G.is_directed(): # a dict_of_lists can't show multiedges. BUT for undirected graphs, # each edge shows up twice in the dict_of_lists. # So we need to treat this case separately. seen={} for node,nbrlist in d.items(): for nbr in nbrlist: if nbr not in seen: G.add_edge(node,nbr) seen[node]=1 # don't allow reverse edge to show up else: G.add_edges_from( ((node,nbr) for node,nbrlist in d.items() for nbr in nbrlist) ) return G def to_dict_of_dicts(G,nodelist=None,edge_data=None): """Return adjacency representation of graph as a dictionary of dictionaries. Parameters ---------- G : graph A NetworkX graph nodelist : list Use only nodes specified in nodelist edge_data : list, optional If provided, the value of the dictionary will be set to edge_data for all edges. This is useful to make an adjacency matrix type representation with 1 as the edge data. If edgedata is None, the edgedata in G is used to fill the values. If G is a multigraph, the edgedata is a dict for each pair (u,v). """ dod={} if nodelist is None: if edge_data is None: for u,nbrdict in G.adjacency_iter(): dod[u]=nbrdict.copy() else: # edge_data is not None for u,nbrdict in G.adjacency_iter(): dod[u]=dod.fromkeys(nbrdict, edge_data) else: # nodelist is not None if edge_data is None: for u in nodelist: dod[u]={} for v,data in ((v,data) for v,data in G[u].items() if v in nodelist): dod[u][v]=data else: # nodelist and edge_data are not None for u in nodelist: dod[u]={} for v in ( v for v in G[u] if v in nodelist): dod[u][v]=edge_data return dod def from_dict_of_dicts(d,create_using=None,multigraph_input=False): """Return a graph from a dictionary of dictionaries. Parameters ---------- d : dictionary of dictionaries A dictionary of dictionaries adjacency representation. 
create_using : NetworkX graph Use specified graph for result. Otherwise a new graph is created. multigraph_input : bool (default False) When True, the values of the inner dict are assumed to be containers of edge data for multiple edges. Otherwise this routine assumes the edge data are singletons. Examples -------- >>> dod= {0: {1:{'weight':1}}} # single edge (0,1) >>> G=nx.from_dict_of_dicts(dod) or >>> G=nx.Graph(dod) # use Graph constructor """ G=_prep_create_using(create_using) G.add_nodes_from(d) # is dict a MultiGraph or MultiDiGraph? if multigraph_input: # make a copy of the list of edge data (but not the edge data) if G.is_directed(): if G.is_multigraph(): G.add_edges_from( (u,v,key,data) for u,nbrs in d.items() for v,datadict in nbrs.items() for key,data in datadict.items() ) else: G.add_edges_from( (u,v,data) for u,nbrs in d.items() for v,datadict in nbrs.items() for key,data in datadict.items() ) else: # Undirected if G.is_multigraph(): seen=set() # don't add both directions of undirected graph for u,nbrs in d.items(): for v,datadict in nbrs.items(): if (u,v) not in seen: G.add_edges_from( (u,v,key,data) for key,data in datadict.items() ) seen.add((v,u)) else: seen=set() # don't add both directions of undirected graph for u,nbrs in d.items(): for v,datadict in nbrs.items(): if (u,v) not in seen: G.add_edges_from( (u,v,data) for key,data in datadict.items() ) seen.add((v,u)) else: # not a multigraph to multigraph transfer if G.is_multigraph() and not G.is_directed(): # d can have both representations u-v, v-u in dict. Only add one. 
# We don't need this check for digraphs since we add both directions, # or for Graph() since it is done implicitly (parallel edges not allowed) seen=set() for u,nbrs in d.items(): for v,data in nbrs.items(): if (u,v) not in seen: G.add_edge(u,v,attr_dict=data) seen.add((v,u)) else: G.add_edges_from( ( (u,v,data) for u,nbrs in d.items() for v,data in nbrs.items()) ) return G def to_edgelist(G,nodelist=None): """Return a list of edges in the graph. Parameters ---------- G : graph A NetworkX graph nodelist : list Use only nodes specified in nodelist """ if nodelist is None: return G.edges(data=True) else: return G.edges(nodelist,data=True) def from_edgelist(edgelist,create_using=None): """Return a graph from a list of edges. Parameters ---------- edgelist : list or iterator Edge tuples create_using : NetworkX graph Use specified graph for result. Otherwise a new graph is created. Examples -------- >>> edgelist= [(0,1)] # single edge (0,1) >>> G=nx.from_edgelist(edgelist) or >>> G=nx.Graph(edgelist) # use Graph constructor """ G=_prep_create_using(create_using) G.add_edges_from(edgelist) return G networkx-1.11/networkx/convert_matrix.py0000644000175000017500000010106112653165361020501 0ustar aricaric00000000000000"""Functions to convert NetworkX graphs to and from numpy/scipy matrices. The preferred way of converting data to a NetworkX graph is through the graph constructor. The constructor calls the to_networkx_graph() function which attempts to guess the input type and convert it automatically. Examples -------- Create a 10 node random graph from a numpy matrix >>> import numpy >>> a = numpy.reshape(numpy.random.random_integers(0,1,size=100),(10,10)) >>> D = nx.DiGraph(a) or equivalently >>> D = nx.to_networkx_graph(a,create_using=nx.DiGraph()) See Also -------- nx_agraph, nx_pydot """ # Copyright (C) 2006-2014 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license.
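# The (u, v, data) triples produced by to_edgelist carry the edge attribute
# dicts, so an equivalent adjacency can be rebuilt from them; a
# dependency-free sketch of the round trip that from_edgelist performs via
# G.add_edges_from:

```python
# Rebuild an undirected adjacency mapping from (u, v, data) edge triples.
edgelist = [(1, 2, {"weight": 7})]
adj = {}
for u, v, data in edgelist:
    adj.setdefault(u, {})[v] = data
    adj.setdefault(v, {})[u] = data  # undirected: record both directions
assert adj[2][1]["weight"] == 7
```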
import warnings import itertools import networkx as nx from networkx.convert import _prep_create_using from networkx.utils import not_implemented_for __author__ = """\n""".join(['Aric Hagberg ', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) __all__ = ['from_numpy_matrix', 'to_numpy_matrix', 'from_pandas_dataframe', 'to_pandas_dataframe', 'to_numpy_recarray', 'from_scipy_sparse_matrix', 'to_scipy_sparse_matrix'] def to_pandas_dataframe(G, nodelist=None, multigraph_weight=sum, weight='weight', nonedge=0.0): """Return the graph adjacency matrix as a Pandas DataFrame. Parameters ---------- G : graph The NetworkX graph used to construct the Pandas DataFrame. nodelist : list, optional The rows and columns are ordered according to the nodes in `nodelist`. If `nodelist` is None, then the ordering is produced by G.nodes(). multigraph_weight : {sum, min, max}, optional An operator that determines how weights in multigraphs are handled. The default is to sum the weights of the multiple edges. weight : string or None, optional The edge attribute that holds the numerical value used for the edge weight. If an edge does not have that attribute, then the value 1 is used instead. nonedge : float, optional The matrix values corresponding to nonedges are typically set to zero. However, this could be undesirable if there are matrix values corresponding to actual edges that also have the value zero. If so, one might prefer nonedges to have some other value, such as nan. Returns ------- df : Pandas DataFrame Graph adjacency matrix Notes ----- The DataFrame entries are assigned to the weight edge attribute. When an edge does not have a weight attribute, the value of the entry is set to the number 1. For multiple (parallel) edges, the values of the entries are determined by the 'multigraph_weight' parameter. The default is to sum the weight attributes for each of the parallel edges. 
When `nodelist` does not contain every node in `G`, the matrix is built from the subgraph of `G` that is induced by the nodes in `nodelist`. The convention used for self-loop edges in graphs is to assign the diagonal matrix entry value to the weight attribute of the edge (or the number 1 if the edge has no weight attribute). If the alternate convention of doubling the edge weight is desired the resulting Pandas DataFrame can be modified as follows: >>> import pandas as pd >>> import numpy as np >>> G = nx.Graph([(1,1)]) >>> df = nx.to_pandas_dataframe(G) >>> df 1 1 1 >>> df.values[np.diag_indices_from(df)] *= 2 >>> df 1 1 2 Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_edge(0,1,weight=2) >>> G.add_edge(1,0) >>> G.add_edge(2,2,weight=3) >>> G.add_edge(2,2) >>> nx.to_pandas_dataframe(G, nodelist=[0,1,2]) 0 1 2 0 0 2 0 1 1 0 0 2 0 0 4 """ import pandas as pd M = to_numpy_matrix(G, nodelist, None, None, multigraph_weight, weight, nonedge) if nodelist is None: nodelist = G.nodes() nodeset = set(nodelist) df = pd.DataFrame(data=M, index=nodelist, columns=nodelist) return df def from_pandas_dataframe(df, source, target, edge_attr=None, create_using=None): """Return a graph from Pandas DataFrame. The Pandas DataFrame should contain at least two columns of node names and zero or more columns of node attributes. Each row will be processed as one edge instance. Note: This function iterates over DataFrame.values, which is not guaranteed to retain the data type across columns in the row. This is only a problem if your row is entirely numeric and a mix of ints and floats. In that case, all values will be returned as floats. See the DataFrame.iterrows documentation for an example. Parameters ---------- df : Pandas DataFrame An edge list representation of a graph source : str or int A valid column name (string or integer) for the source nodes (for the directed case). target : str or int A valid column name (string or integer) for the target nodes (for the directed case).
edge_attr : str or int, iterable, True A valid column name (str or integer) or list of column names that will be used to retrieve items from the row and add them to the graph as edge attributes. If `True`, all of the remaining columns will be added. create_using : NetworkX graph Use specified graph for result. The default is Graph() See Also -------- to_pandas_dataframe Examples -------- Simple integer weights on edges: >>> import pandas as pd >>> import numpy as np >>> r = np.random.RandomState(seed=5) >>> ints = r.random_integers(1, 10, size=(3,2)) >>> a = ['A', 'B', 'C'] >>> b = ['D', 'A', 'E'] >>> df = pd.DataFrame(ints, columns=['weight', 'cost']) >>> df[0] = a >>> df['b'] = b >>> df weight cost 0 b 0 4 7 A D 1 7 1 B A 2 10 9 C E >>> G=nx.from_pandas_dataframe(df, 0, 'b', ['weight', 'cost']) >>> G['E']['C']['weight'] 10 >>> G['E']['C']['cost'] 9 """ g = _prep_create_using(create_using) # Index of source and target src_i = df.columns.get_loc(source) tar_i = df.columns.get_loc(target) if edge_attr: # If all additional columns requested, build up a list of tuples # [(name, index),...] if edge_attr is True: # Create a list of all columns indices, ignore nodes edge_i = [] for i, col in enumerate(df.columns): if col is not source and col is not target: edge_i.append((col, i)) # If a list or tuple of name is requested elif isinstance(edge_attr, (list, tuple)): edge_i = [(i, df.columns.get_loc(i)) for i in edge_attr] # If a string or int is passed else: edge_i = [(edge_attr, df.columns.get_loc(edge_attr)),] # Iteration on values returns the rows as Numpy arrays for row in df.values: g.add_edge(row[src_i], row[tar_i], {i:row[j] for i, j in edge_i}) # If no column names are given, then just return the edges. else: for row in df.values: g.add_edge(row[src_i], row[tar_i]) return g def to_numpy_matrix(G, nodelist=None, dtype=None, order=None, multigraph_weight=sum, weight='weight', nonedge=0.0): """Return the graph adjacency matrix as a NumPy matrix. 
Parameters ---------- G : graph The NetworkX graph used to construct the NumPy matrix. nodelist : list, optional The rows and columns are ordered according to the nodes in ``nodelist``. If ``nodelist`` is None, then the ordering is produced by G.nodes(). dtype : NumPy data type, optional A valid single NumPy data type used to initialize the array. This must be a simple type such as int or numpy.float64 and not a compound data type (see to_numpy_recarray) If None, then the NumPy default is used. order : {'C', 'F'}, optional Whether to store multidimensional data in C- or Fortran-contiguous (row- or column-wise) order in memory. If None, then the NumPy default is used. multigraph_weight : {sum, min, max}, optional An operator that determines how weights in multigraphs are handled. The default is to sum the weights of the multiple edges. weight : string or None optional (default = 'weight') The edge attribute that holds the numerical value used for the edge weight. If an edge does not have that attribute, then the value 1 is used instead. nonedge : float (default = 0.0) The matrix values corresponding to nonedges are typically set to zero. However, this could be undesirable if there are matrix values corresponding to actual edges that also have the value zero. If so, one might prefer nonedges to have some other value, such as nan. Returns ------- M : NumPy matrix Graph adjacency matrix See Also -------- to_numpy_recarray, from_numpy_matrix Notes ----- The matrix entries are assigned to the weight edge attribute. When an edge does not have a weight attribute, the value of the entry is set to the number 1. For multiple (parallel) edges, the values of the entries are determined by the ``multigraph_weight`` parameter. The default is to sum the weight attributes for each of the parallel edges. When ``nodelist`` does not contain every node in ``G``, the matrix is built from the subgraph of ``G`` that is induced by the nodes in ``nodelist``. 
The convention used for self-loop edges in graphs is to assign the diagonal matrix entry value to the weight attribute of the edge (or the number 1 if the edge has no weight attribute). If the alternate convention of doubling the edge weight is desired the resulting Numpy matrix can be modified as follows: >>> import numpy as np >>> G = nx.Graph([(1, 1)]) >>> A = nx.to_numpy_matrix(G) >>> A matrix([[ 1.]]) >>> A.A[np.diag_indices_from(A)] *= 2 >>> A matrix([[ 2.]]) Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_edge(0,1,weight=2) >>> G.add_edge(1,0) >>> G.add_edge(2,2,weight=3) >>> G.add_edge(2,2) >>> nx.to_numpy_matrix(G, nodelist=[0,1,2]) matrix([[ 0., 2., 0.], [ 1., 0., 0.], [ 0., 0., 4.]]) """ import numpy as np if nodelist is None: nodelist = G.nodes() nodeset = set(nodelist) if len(nodelist) != len(nodeset): msg = "Ambiguous ordering: `nodelist` contained duplicates." raise nx.NetworkXError(msg) nlen=len(nodelist) undirected = not G.is_directed() index=dict(zip(nodelist,range(nlen))) # Initially, we start with an array of nans. Then we populate the matrix # using data from the graph. Afterwards, any leftover nans will be # converted to the value of `nonedge`. Note, we use nans initially, # instead of zero, for two reasons: # # 1) It can be important to distinguish a real edge with the value 0 # from a nonedge with the value 0. # # 2) When working with multi(di)graphs, we must combine the values of all # edges between any two nodes in some manner. This often takes the # form of a sum, min, or max. Using the value 0 for a nonedge would # have undesirable effects with min and max, but using nanmin and # nanmax with initially nan values is not problematic at all. # # That said, there are still some drawbacks to this approach. Namely, if # a real edge is nan, then that value is a) not distinguishable from # nonedges and b) is ignored by the default combinator (nansum, nanmin, # nanmax) functions used for multi(di)graphs. 
If this becomes an issue, # an alternative approach is to use masked arrays. Initially, every # element is masked and set to some `initial` value. As we populate the # graph, elements are unmasked (automatically) when we combine the initial # value with the values given by real edges. At the end, we convert all # masked values to `nonedge`. Using masked arrays fully addresses reason 1, # but for reason 2, we would still have the issue with min and max if the # initial values were 0.0. Note: an initial value of +inf is appropriate # for min, while an initial value of -inf is appropriate for max. When # working with sum, an initial value of zero is appropriate. Ideally then, # we'd want to allow users to specify both a value for nonedges and also # an initial value. For multi(di)graphs, the choice of the initial value # will, in general, depend on the combinator function---sensible defaults # can be provided. if G.is_multigraph(): # Handle MultiGraphs and MultiDiGraphs M = np.zeros((nlen, nlen), dtype=dtype, order=order) + np.nan # use numpy nan-aware operations operator={sum:np.nansum, min:np.nanmin, max:np.nanmax} try: op=operator[multigraph_weight] except: raise ValueError('multigraph_weight must be sum, min, or max') for u,v,attrs in G.edges_iter(data=True): if (u in nodeset) and (v in nodeset): i, j = index[u], index[v] e_weight = attrs.get(weight, 1) M[i,j] = op([e_weight, M[i,j]]) if undirected: M[j,i] = M[i,j] else: # Graph or DiGraph, this is much faster than above M = np.zeros((nlen,nlen), dtype=dtype, order=order) + np.nan for u,nbrdict in G.adjacency_iter(): for v,d in nbrdict.items(): try: M[index[u],index[v]] = d.get(weight,1) except KeyError: # This occurs when there are fewer desired nodes than # there are nodes in the graph: len(nodelist) < len(G) pass M[np.isnan(M)] = nonedge M = np.asmatrix(M) return M def from_numpy_matrix(A, parallel_edges=False, create_using=None): """Return a graph from numpy matrix. 
The numpy matrix is interpreted as an adjacency matrix for the graph. Parameters ---------- A : numpy matrix An adjacency matrix representation of a graph parallel_edges : Boolean If this is ``True``, ``create_using`` is a multigraph, and ``A`` is an integer matrix, then entry *(i, j)* in the matrix is interpreted as the number of parallel edges joining vertices *i* and *j* in the graph. If it is ``False``, then the entries in the adjacency matrix are interpreted as the weight of a single edge joining the vertices. create_using : NetworkX graph Use specified graph for result. The default is Graph() Notes ----- If ``create_using`` is an instance of :class:`networkx.MultiGraph` or :class:`networkx.MultiDiGraph`, ``parallel_edges`` is ``True``, and the entries of ``A`` are of type ``int``, then this function returns a multigraph (of the same type as ``create_using``) with parallel edges. If ``create_using`` is an undirected multigraph, then only the edges indicated by the upper triangle of the matrix `A` will be added to the graph. If the numpy matrix has a single data type for each matrix entry it will be converted to an appropriate Python data type. If the numpy matrix has a user-specified compound data type the names of the data fields will be used as attribute keys in the resulting NetworkX graph. 
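A pure-NumPy sketch (no networkx required) of the ``parallel_edges`` reading described above, where an integer entry ``A[i, j] = k`` stands for ``k`` unit-weight parallel edges:

```python
import numpy as np

# Entry A[0, 1] == 2 expands to two parallel (0, 1) edges when
# parallel_edges is True and the target graph is a multigraph.
A = np.array([[0, 2], [0, 0]])
edges = [(int(i), int(j))
         for i, j in zip(*A.nonzero())
         for _ in range(int(A[i, j]))]
assert edges == [(0, 1), (0, 1)]
```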
See Also -------- to_numpy_matrix, to_numpy_recarray Examples -------- Simple integer weights on edges: >>> import numpy >>> A=numpy.matrix([[1, 1], [2, 1]]) >>> G=nx.from_numpy_matrix(A) If ``create_using`` is a multigraph and the matrix has only integer entries, the entries will be interpreted as weighted edges joining the vertices (without creating parallel edges): >>> import numpy >>> A = numpy.matrix([[1, 1], [1, 2]]) >>> G = nx.from_numpy_matrix(A, create_using = nx.MultiGraph()) >>> G[1][1] {0: {'weight': 2}} If ``create_using`` is a multigraph and the matrix has only integer entries but ``parallel_edges`` is ``True``, then the entries will be interpreted as the number of parallel edges joining those two vertices: >>> import numpy >>> A = numpy.matrix([[1, 1], [1, 2]]) >>> temp = nx.MultiGraph() >>> G = nx.from_numpy_matrix(A, parallel_edges = True, create_using = temp) >>> G[1][1] {0: {'weight': 1}, 1: {'weight': 1}} User defined compound data type on edges: >>> import numpy >>> dt = [('weight', float), ('cost', int)] >>> A = numpy.matrix([[(1.0, 2)]], dtype = dt) >>> G = nx.from_numpy_matrix(A) >>> G.edges() [(0, 0)] >>> G[0][0]['cost'] 2 >>> G[0][0]['weight'] 1.0 """ # This should never fail if you have created a numpy matrix with numpy... import numpy as np kind_to_python_type={'f':float, 'i':int, 'u':int, 'b':bool, 'c':complex, 'S':str, 'V':'void'} try: # Python 3.x blurb = chr(1245) # just to trigger the exception kind_to_python_type['U']=str except ValueError: # Python 2.6+ kind_to_python_type['U']=unicode G=_prep_create_using(create_using) n,m=A.shape if n!=m: raise nx.NetworkXError("Adjacency matrix is not square.", "nx,ny=%s"%(A.shape,)) dt=A.dtype try: python_type=kind_to_python_type[dt.kind] except: raise TypeError("Unknown numpy data type: %s"%dt) # Make sure we get even the isolated nodes of the graph. G.add_nodes_from(range(n)) # Get a list of all the entries in the matrix with nonzero entries. 
These # coordinates will become the edges in the graph. edges = zip(*(np.asarray(A).nonzero())) # handle numpy constructed data type if python_type is 'void': # Sort the fields by their offset, then by dtype, then by name. fields = sorted((offset, dtype, name) for name, (dtype, offset) in A.dtype.fields.items()) triples = ((u, v, {name: kind_to_python_type[dtype.kind](val) for (_, dtype, name), val in zip(fields, A[u, v])}) for u, v in edges) # If the entries in the adjacency matrix are integers, the graph is a # multigraph, and parallel_edges is True, then create parallel edges, each # with weight 1, for each entry in the adjacency matrix. Otherwise, create # one edge for each positive entry in the adjacency matrix and set the # weight of that edge to be the entry in the matrix. elif python_type is int and G.is_multigraph() and parallel_edges: chain = itertools.chain.from_iterable # The following line is equivalent to: # # for (u, v) in edges: # for d in range(A[u, v]): # G.add_edge(u, v, weight=1) # triples = chain(((u, v, dict(weight=1)) for d in range(A[u, v])) for (u, v) in edges) else: # basic data type triples = ((u, v, dict(weight=python_type(A[u, v]))) for u, v in edges) # If we are creating an undirected multigraph, only add the edges from the # upper triangle of the matrix. Otherwise, add all the edges. This relies # on the fact that the vertices created in the # ``_generated_weighted_edges()`` function are actually the row/column # indices for the matrix ``A``. # # Without this check, we run into a problem where each edge is added twice # when ``G.add_edges_from()`` is invoked below. if G.is_multigraph() and not G.is_directed(): triples = ((u, v, d) for u, v, d in triples if u <= v) G.add_edges_from(triples) return G @not_implemented_for('multigraph') def to_numpy_recarray(G,nodelist=None, dtype=[('weight',float)], order=None): """Return the graph adjacency matrix as a NumPy recarray. 
Parameters ---------- G : graph The NetworkX graph used to construct the NumPy matrix. nodelist : list, optional The rows and columns are ordered according to the nodes in `nodelist`. If `nodelist` is None, then the ordering is produced by G.nodes(). dtype : NumPy data-type, optional A valid NumPy named dtype used to initialize the NumPy recarray. The data type names are assumed to be keys in the graph edge attribute dictionary. order : {'C', 'F'}, optional Whether to store multidimensional data in C- or Fortran-contiguous (row- or column-wise) order in memory. If None, then the NumPy default is used. Returns ------- M : NumPy recarray The graph with specified edge data as a Numpy recarray Notes ----- When `nodelist` does not contain every node in `G`, the matrix is built from the subgraph of `G` that is induced by the nodes in `nodelist`. Examples -------- >>> G = nx.Graph() >>> G.add_edge(1,2,weight=7.0,cost=5) >>> A=nx.to_numpy_recarray(G,dtype=[('weight',float),('cost',int)]) >>> print(A.weight) [[ 0. 7.] [ 7. 0.]] >>> print(A.cost) [[0 5] [5 0]] """ import numpy as np if nodelist is None: nodelist = G.nodes() nodeset = set(nodelist) if len(nodelist) != len(nodeset): msg = "Ambiguous ordering: `nodelist` contained duplicates." raise nx.NetworkXError(msg) nlen=len(nodelist) undirected = not G.is_directed() index=dict(zip(nodelist,range(nlen))) M = np.zeros((nlen,nlen), dtype=dtype, order=order) names=M.dtype.names for u,v,attrs in G.edges_iter(data=True): if (u in nodeset) and (v in nodeset): i,j = index[u],index[v] values=tuple([attrs[n] for n in names]) M[i,j] = values if undirected: M[j,i] = M[i,j] return M.view(np.recarray) def to_scipy_sparse_matrix(G, nodelist=None, dtype=None, weight='weight', format='csr'): """Return the graph adjacency matrix as a SciPy sparse matrix. Parameters ---------- G : graph The NetworkX graph used to construct the NumPy matrix. nodelist : list, optional The rows and columns are ordered according to the nodes in `nodelist`. 
If `nodelist` is None, then the ordering is produced by G.nodes(). dtype : NumPy data-type, optional A valid NumPy dtype used to initialize the array. If None, then the NumPy default is used. weight : string or None optional (default='weight') The edge attribute that holds the numerical value used for the edge weight. If None then all edge weights are 1. format : str in {'bsr', 'csr', 'csc', 'coo', 'lil', 'dia', 'dok'} The type of the matrix to be returned (default 'csr'). For some algorithms different implementations of sparse matrices can perform better. See [1]_ for details. Returns ------- M : SciPy sparse matrix Graph adjacency matrix. Notes ----- The matrix entries are populated using the edge attribute held in parameter weight. When an edge does not have that attribute, the value of the entry is 1. For multiple edges the matrix values are the sums of the edge weights. When `nodelist` does not contain every node in `G`, the matrix is built from the subgraph of `G` that is induced by the nodes in `nodelist`. Uses coo_matrix format. To convert to other formats specify the format= keyword. The convention used for self-loop edges in graphs is to assign the diagonal matrix entry value to the weight attribute of the edge (or the number 1 if the edge has no weight attribute). If the alternate convention of doubling the edge weight is desired the resulting Scipy sparse matrix can be modified as follows: >>> import scipy as sp >>> G = nx.Graph([(1,1)]) >>> A = nx.to_scipy_sparse_matrix(G) >>> print(A.todense()) [[1]] >>> A.setdiag(A.diagonal()*2) >>> print(A.todense()) [[2]] Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_edge(0,1,weight=2) >>> G.add_edge(1,0) >>> G.add_edge(2,2,weight=3) >>> G.add_edge(2,2) >>> S = nx.to_scipy_sparse_matrix(G, nodelist=[0,1,2]) >>> print(S.todense()) [[0 2 0] [1 0 0] [0 0 4]] References ---------- .. [1] Scipy Dev. 
References, "Sparse Matrices", http://docs.scipy.org/doc/scipy/reference/sparse.html """ from scipy import sparse if nodelist is None: nodelist = G nlen = len(nodelist) if nlen == 0: raise nx.NetworkXError("Graph has no nodes or edges") if len(nodelist) != len(set(nodelist)): msg = "Ambiguous ordering: `nodelist` contained duplicates." raise nx.NetworkXError(msg) index = dict(zip(nodelist,range(nlen))) if G.number_of_edges() == 0: row,col,data=[],[],[] else: row,col,data = zip(*((index[u],index[v],d.get(weight,1)) for u,v,d in G.edges_iter(nodelist, data=True) if u in index and v in index)) if G.is_directed(): M = sparse.coo_matrix((data,(row,col)), shape=(nlen,nlen), dtype=dtype) else: # symmetrize matrix d = data + data r = row + col c = col + row # selfloop entries get double counted when symmetrizing # so we subtract the data on the diagonal selfloops = G.selfloop_edges(data=True) if selfloops: diag_index,diag_data = zip(*((index[u],-d.get(weight,1)) for u,v,d in selfloops if u in index and v in index)) d += diag_data r += diag_index c += diag_index M = sparse.coo_matrix((d, (r, c)), shape=(nlen,nlen), dtype=dtype) try: return M.asformat(format) except AttributeError: raise nx.NetworkXError("Unknown sparse matrix format: %s"%format) def _csr_gen_triples(A): """Converts a SciPy sparse matrix in **Compressed Sparse Row** format to an iterable of weighted edge triples. """ nrows = A.shape[0] data, indices, indptr = A.data, A.indices, A.indptr for i in range(nrows): for j in range(indptr[i], indptr[i+1]): yield i, indices[j], data[j] def _csc_gen_triples(A): """Converts a SciPy sparse matrix in **Compressed Sparse Column** format to an iterable of weighted edge triples. """ ncols = A.shape[1] data, indices, indptr = A.data, A.indices, A.indptr for i in range(ncols): for j in range(indptr[i], indptr[i+1]): yield indices[j], i, data[j] def _coo_gen_triples(A): """Converts a SciPy sparse matrix in **Coordinate** format to an iterable of weighted edge triples. 
""" row, col, data = A.row, A.col, A.data return zip(row, col, data) def _dok_gen_triples(A): """Converts a SciPy sparse matrix in **Dictionary of Keys** format to an iterable of weighted edge triples. """ for (r, c), v in A.items(): yield r, c, v def _generate_weighted_edges(A): """Returns an iterable over (u, v, w) triples, where u and v are adjacent vertices and w is the weight of the edge joining u and v. `A` is a SciPy sparse matrix (in any format). """ if A.format == 'csr': return _csr_gen_triples(A) if A.format == 'csc': return _csc_gen_triples(A) if A.format == 'dok': return _dok_gen_triples(A) # If A is in any other format (including COO), convert it to COO format. return _coo_gen_triples(A.tocoo()) def from_scipy_sparse_matrix(A, parallel_edges=False, create_using=None, edge_attribute='weight'): """Creates a new graph from an adjacency matrix given as a SciPy sparse matrix. Parameters ---------- A: scipy sparse matrix An adjacency matrix representation of a graph parallel_edges : Boolean If this is ``True``, `create_using` is a multigraph, and `A` is an integer matrix, then entry *(i, j)* in the matrix is interpreted as the number of parallel edges joining vertices *i* and *j* in the graph. If it is ``False``, then the entries in the adjacency matrix are interpreted as the weight of a single edge joining the vertices. create_using: NetworkX graph Use specified graph for result. The default is Graph() edge_attribute: string Name of edge attribute to store matrix numeric value. The data will have the same type as the matrix entry (int, float, (real,imag)). Notes ----- If `create_using` is an instance of :class:`networkx.MultiGraph` or :class:`networkx.MultiDiGraph`, `parallel_edges` is ``True``, and the entries of `A` are of type ``int``, then this function returns a multigraph (of the same type as `create_using`) with parallel edges. In this case, `edge_attribute` will be ignored. 
If `create_using` is an undirected multigraph, then only the edges indicated by the upper triangle of the matrix `A` will be added to the graph. Examples -------- >>> import scipy.sparse >>> A = scipy.sparse.eye(2,2,1) >>> G = nx.from_scipy_sparse_matrix(A) If `create_using` is a multigraph and the matrix has only integer entries, the entries will be interpreted as weighted edges joining the vertices (without creating parallel edges): >>> import scipy >>> A = scipy.sparse.csr_matrix([[1, 1], [1, 2]]) >>> G = nx.from_scipy_sparse_matrix(A, create_using=nx.MultiGraph()) >>> G[1][1] {0: {'weight': 2}} If `create_using` is a multigraph and the matrix has only integer entries but `parallel_edges` is ``True``, then the entries will be interpreted as the number of parallel edges joining those two vertices: >>> import scipy >>> A = scipy.sparse.csr_matrix([[1, 1], [1, 2]]) >>> G = nx.from_scipy_sparse_matrix(A, parallel_edges=True, ... create_using=nx.MultiGraph()) >>> G[1][1] {0: {'weight': 1}, 1: {'weight': 1}} """ G = _prep_create_using(create_using) n,m = A.shape if n != m: raise nx.NetworkXError(\ "Adjacency matrix is not square. nx,ny=%s"%(A.shape,)) # Make sure we get even the isolated nodes of the graph. G.add_nodes_from(range(n)) # Create an iterable over (u, v, w) triples and for each triple, add an # edge from u to v with weight w. triples = _generate_weighted_edges(A) # If the entries in the adjacency matrix are integers, the graph is a # multigraph, and parallel_edges is True, then create parallel edges, each # with weight 1, for each entry in the adjacency matrix. Otherwise, create # one edge for each positive entry in the adjacency matrix and set the # weight of that edge to be the entry in the matrix. 
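The comment above describes expanding an integer entry *(u, v) = w* into *w* parallel edges of weight 1 via `itertools.chain.from_iterable`. A self-contained sketch of that expansion, with an illustrative helper name (`expand_parallel_edges` is not part of the NetworkX API):

```python
# Expand (u, v, w) triples with integer weight w into w copies of
# (u, v, 1), mirroring the chain.from_iterable trick used above.
import itertools


def expand_parallel_edges(triples):
    """Turn each (u, v, w) with integer w into w parallel (u, v, 1) edges."""
    chain = itertools.chain.from_iterable
    # chain exhausts each inner generator before advancing the outer one,
    # so u, v, w are still bound to the right values when consumed.
    return chain(((u, v, 1) for _ in range(w)) for (u, v, w) in triples)


edges = list(expand_parallel_edges([(0, 1, 3), (1, 2, 1)]))
# (0, 1) appears three times with weight 1, (1, 2) once
```

This is why `edge_attribute` is ignored in the parallel-edges case: every expanded edge carries the constant weight 1 rather than the matrix entry.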
    if A.dtype.kind in ('i', 'u') and G.is_multigraph() and parallel_edges:
        chain = itertools.chain.from_iterable
        # The following line is equivalent to:
        #
        #     for (u, v, w) in triples:
        #         for d in range(w):
        #             G.add_edge(u, v, weight=1)
        #
        triples = chain(((u, v, 1) for d in range(w))
                        for (u, v, w) in triples)
    # If we are creating an undirected multigraph, only add the edges from the
    # upper triangle of the matrix. Otherwise, add all the edges. This relies
    # on the fact that the vertices created in the
    # ``_generate_weighted_edges()`` function are actually the row/column
    # indices for the matrix ``A``.
    #
    # Without this check, we run into a problem where each edge is added twice
    # when `G.add_weighted_edges_from()` is invoked below.
    if G.is_multigraph() and not G.is_directed():
        triples = ((u, v, d) for u, v, d in triples if u <= v)
    G.add_weighted_edges_from(triples, weight=edge_attribute)
    return G


# fixture for nose tests
def setup_module(module):
    from nose import SkipTest
    try:
        import numpy
    except ImportError:
        raise SkipTest("NumPy not available")
    try:
        import scipy
    except ImportError:
        raise SkipTest("SciPy not available")
    try:
        import pandas
    except ImportError:
        raise SkipTest("Pandas not available")

# networkx/__init__.py
"""
NetworkX
========

NetworkX (NX) is a Python package for the creation, manipulation, and study
of the structure, dynamics, and functions of complex networks.

https://networkx.lanl.gov/

Using
-----

Just write in Python

>>> import networkx as nx
>>> G = nx.Graph()
>>> G.add_edge(1, 2)
>>> G.add_node(42)
>>> print(sorted(G.nodes()))
[1, 2, 42]
>>> print(sorted(G.edges()))
[(1, 2)]
"""
# Copyright (C) 2004-2015 by
#   Aric Hagberg
#   Dan Schult
#   Pieter Swart
# All rights reserved.
# BSD license.
#
# Add platform dependent shared library path to sys.path
#
from __future__ import absolute_import

import sys
if sys.version_info[:2] < (2, 7):
    m = "Python 2.7 or later is required for NetworkX (%d.%d detected)."
    raise ImportError(m % sys.version_info[:2])
del sys

# Release data
from networkx import release

__author__ = '%s <%s>\n%s <%s>\n%s <%s>' % \
    (release.authors['Hagberg'] + release.authors['Schult'] +
     release.authors['Swart'])
__license__ = release.license
__date__ = release.date
__version__ = release.version

__bibtex__ = """@inproceedings{hagberg-2008-exploring,
author = {Aric A. Hagberg and Daniel A. Schult and Pieter J. Swart},
title = {Exploring network structure, dynamics, and function using {NetworkX}},
year = {2008},
month = Aug,
urlpdf = {http://math.lanl.gov/~hagberg/Papers/hagberg-2008-exploring.pdf},
booktitle = {Proceedings of the 7th Python in Science Conference (SciPy2008)},
editors = {G\"{a}el Varoquaux, Travis Vaught, and Jarrod Millman},
address = {Pasadena, CA USA},
pages = {11--15}
}"""

# These imports are order-dependent
from networkx.exception import *
import networkx.external
import networkx.utils
import networkx.classes
from networkx.classes import *
import networkx.convert
from networkx.convert import *
import networkx.convert_matrix
from networkx.convert_matrix import *
import networkx.relabel
from networkx.relabel import *
import networkx.generators
from networkx.generators import *
import networkx.readwrite
from networkx.readwrite import *
# Need to test with SciPy, when available
import networkx.algorithms
from networkx.algorithms import *
import networkx.linalg
from networkx.linalg import *
from networkx.tests.test import run as test
import networkx.drawing
from networkx.drawing import *

# networkx/drawing/nx_pydot.py
"""
*****
Pydot
*****

Import and export NetworkX graphs in Graphviz dot format using pydotplus.

Either this module or nx_agraph can be used to interface with graphviz.
See Also -------- PyDotPlus: https://github.com/carlos-jenkins/pydotplus Graphviz: http://www.research.att.com/sw/tools/graphviz/ DOT Language: http://www.graphviz.org/doc/info/lang.html """ # Author: Aric Hagberg (aric.hagberg@gmail.com) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import importlib from networkx.utils import open_file, make_str import networkx as nx __all__ = ['write_dot', 'read_dot', 'graphviz_layout', 'pydot_layout', 'to_pydot', 'from_pydot'] # 2.x/3.x compatibility try: basestring except NameError: basestring = str @open_file(1, mode='w') def write_dot(G, path): """Write NetworkX graph G to Graphviz dot format on path. Path can be a string or a file handle. """ P=to_pydot(G) path.write(P.to_string()) return @open_file(0, mode='r') def read_dot(path): """Return a NetworkX MultiGraph or MultiDiGraph from a dot file on path. Parameters ---------- path : filename or file handle Returns ------- G : NetworkX multigraph A MultiGraph or MultiDiGraph. Notes ----- Use G = nx.Graph(read_dot(path)) to return a Graph instead of a MultiGraph. """ import pydotplus data = path.read() P = pydotplus.graph_from_dot_data(data) return from_pydot(P) def from_pydot(P): """Return a NetworkX graph from a Pydot graph. Parameters ---------- P : Pydot graph A graph created with Pydot Returns ------- G : NetworkX multigraph A MultiGraph or MultiDiGraph. 
Examples -------- >>> K5 = nx.complete_graph(5) >>> A = nx.nx_pydot.to_pydot(K5) >>> G = nx.nx_pydot.from_pydot(A) # return MultiGraph # make a Graph instead of MultiGraph >>> G = nx.Graph(nx.nx_pydot.from_pydot(A)) """ if P.get_strict(None): # pydot bug: get_strict() shouldn't take argument multiedges=False else: multiedges=True if P.get_type()=='graph': # undirected if multiedges: N = nx.MultiGraph() else: N = nx.Graph() else: if multiedges: N = nx.MultiDiGraph() else: N = nx.DiGraph() # assign defaults name=P.get_name().strip('"') if name != '': N.name = name # add nodes, attributes to N.node_attr for p in P.get_node_list(): n=p.get_name().strip('"') if n in ('node','graph','edge'): continue N.add_node(n,**p.get_attributes()) # add edges for e in P.get_edge_list(): u=e.get_source() v=e.get_destination() attr=e.get_attributes() s=[] d=[] if isinstance(u, basestring): s.append(u.strip('"')) else: for unodes in u['nodes']: s.append(unodes.strip('"')) if isinstance(v, basestring): d.append(v.strip('"')) else: for vnodes in v['nodes']: d.append(vnodes.strip('"')) for source_node in s: for destination_node in d: N.add_edge(source_node,destination_node,**attr) # add default attributes for graph, nodes, edges pattr = P.get_attributes() if pattr: N.graph['graph'] = pattr try: N.graph['node']=P.get_node_defaults()[0] except:# IndexError,TypeError: pass #N.graph['node']={} try: N.graph['edge']=P.get_edge_defaults()[0] except:# IndexError,TypeError: pass #N.graph['edge']={} return N def to_pydot(N, strict=True): """Return a pydot graph from a NetworkX graph N. 
Parameters ---------- N : NetworkX graph A graph created with NetworkX Examples -------- >>> K5 = nx.complete_graph(5) >>> P = nx.nx_pydot.to_pydot(K5) Notes ----- """ import pydotplus # set Graphviz graph type if N.is_directed(): graph_type='digraph' else: graph_type='graph' strict=N.number_of_selfloops()==0 and not N.is_multigraph() name = N.name graph_defaults=N.graph.get('graph',{}) if name is '': P = pydotplus.Dot('', graph_type=graph_type, strict=strict, **graph_defaults) else: P = pydotplus.Dot('"%s"'%name, graph_type=graph_type, strict=strict, **graph_defaults) try: P.set_node_defaults(**N.graph['node']) except KeyError: pass try: P.set_edge_defaults(**N.graph['edge']) except KeyError: pass for n,nodedata in N.nodes_iter(data=True): str_nodedata=dict((k,make_str(v)) for k,v in nodedata.items()) p=pydotplus.Node(make_str(n),**str_nodedata) P.add_node(p) if N.is_multigraph(): for u,v,key,edgedata in N.edges_iter(data=True,keys=True): str_edgedata=dict((k,make_str(v)) for k,v in edgedata.items()) edge=pydotplus.Edge(make_str(u), make_str(v), key=make_str(key), **str_edgedata) P.add_edge(edge) else: for u,v,edgedata in N.edges_iter(data=True): str_edgedata=dict((k,make_str(v)) for k,v in edgedata.items()) edge=pydotplus.Edge(make_str(u),make_str(v),**str_edgedata) P.add_edge(edge) return P def pydot_from_networkx(N): """Create a Pydot graph from a NetworkX graph.""" from warnings import warn warn('pydot_from_networkx is replaced by to_pydot', DeprecationWarning) return to_pydot(N) def networkx_from_pydot(D, create_using=None): """Create a NetworkX graph from a Pydot graph.""" from warnings import warn warn('networkx_from_pydot is replaced by from_pydot', DeprecationWarning) return from_pydot(D) def graphviz_layout(G,prog='neato',root=None, **kwds): """Create node positions using Pydot and Graphviz. Returns a dictionary of positions keyed by node. 
    Examples
    --------
    >>> G = nx.complete_graph(4)
    >>> pos = nx.nx_pydot.graphviz_layout(G)
    >>> pos = nx.nx_pydot.graphviz_layout(G, prog='dot')

    Notes
    -----
    This is a wrapper for pydot_layout.
    """
    return pydot_layout(G=G, prog=prog, root=root, **kwds)


def pydot_layout(G, prog='neato', root=None, **kwds):
    """Create node positions using Pydot and Graphviz.

    Returns a dictionary of positions keyed by node.

    Examples
    --------
    >>> G = nx.complete_graph(4)
    >>> pos = nx.nx_pydot.pydot_layout(G)
    >>> pos = nx.nx_pydot.pydot_layout(G, prog='dot')
    """
    import pydotplus
    P = to_pydot(G)
    if root is not None:
        P.set("root", make_str(root))
    D = P.create_dot(prog=prog)
    if D == "":  # no data returned
        print("Graphviz layout with %s failed" % (prog))
        print()
        print("To debug what happened try:")
        print("P = to_pydot(G)")
        print("P.write_dot(\"file.dot\")")
        print("And then run %s on file.dot" % (prog))
        return
    Q = pydotplus.graph_from_dot_data(D)
    node_pos = {}
    for n in G.nodes():
        pydot_node = pydotplus.Node(make_str(n)).get_name()
        node = Q.get_node(pydot_node)
        if isinstance(node, list):
            node = node[0]
        pos = node.get_pos()[1:-1]  # strip leading and trailing double quotes
        if pos is not None:
            xx, yy = pos.split(",")
            node_pos[n] = (float(xx), float(yy))
    return node_pos


# fixture for nose tests
def setup_module(module):
    from nose import SkipTest
    try:
        import pydotplus
    except ImportError:
        raise SkipTest("pydotplus not available")

# networkx/drawing/nx_pylab.py
"""
**********
Matplotlib
**********

Draw networks with matplotlib.

See Also
--------
matplotlib:     http://matplotlib.org/
pygraphviz:     http://pygraphviz.github.io/
"""
# Author: Aric Hagberg (hagberg@lanl.gov)
# Copyright (C) 2004-2016 by
#   Aric Hagberg
#   Dan Schult
#   Pieter Swart
# All rights reserved.
# BSD license.
import networkx as nx from networkx.drawing.layout import shell_layout,\ circular_layout,spectral_layout,spring_layout,random_layout __all__ = ['draw', 'draw_networkx', 'draw_networkx_nodes', 'draw_networkx_edges', 'draw_networkx_labels', 'draw_networkx_edge_labels', 'draw_circular', 'draw_random', 'draw_spectral', 'draw_spring', 'draw_shell', 'draw_graphviz'] def draw(G, pos=None, ax=None, hold=None, **kwds): """Draw the graph G with Matplotlib. Draw the graph as a simple representation with no node labels or edge labels and using the full Matplotlib figure area and no axis labels by default. See draw_networkx() for more full-featured drawing that allows title, axis labels etc. Parameters ---------- G : graph A networkx graph pos : dictionary, optional A dictionary with nodes as keys and positions as values. If not specified a spring layout positioning will be computed. See networkx.layout for functions that compute node positions. ax : Matplotlib Axes object, optional Draw the graph in specified Matplotlib axes. hold : bool, optional Set the Matplotlib hold state. If True subsequent draw commands will be added to the current axes. kwds : optional keywords See networkx.draw_networkx() for a description of optional keywords. Examples -------- >>> G=nx.dodecahedral_graph() >>> nx.draw(G) >>> nx.draw(G,pos=nx.spring_layout(G)) # use spring layout See Also -------- draw_networkx() draw_networkx_nodes() draw_networkx_edges() draw_networkx_labels() draw_networkx_edge_labels() Notes ----- This function has the same name as pylab.draw and pyplot.draw so beware when using >>> from networkx import * since you might overwrite the pylab.draw function. 
With pyplot use >>> import matplotlib.pyplot as plt >>> import networkx as nx >>> G=nx.dodecahedral_graph() >>> nx.draw(G) # networkx draw() >>> plt.draw() # pyplot draw() Also see the NetworkX drawing examples at http://networkx.github.io/documentation/latest/gallery.html """ try: import matplotlib.pyplot as plt except ImportError: raise ImportError("Matplotlib required for draw()") except RuntimeError: print("Matplotlib unable to open display") raise if ax is None: cf = plt.gcf() else: cf = ax.get_figure() cf.set_facecolor('w') if ax is None: if cf._axstack() is None: ax = cf.add_axes((0, 0, 1, 1)) else: ax = cf.gca() if 'with_labels' not in kwds: kwds['with_labels'] = 'labels' in kwds b = plt.ishold() # allow callers to override the hold state by passing hold=True|False h = kwds.pop('hold', None) if h is not None: plt.hold(h) try: draw_networkx(G, pos=pos, ax=ax, **kwds) ax.set_axis_off() plt.draw_if_interactive() except: plt.hold(b) raise plt.hold(b) return def draw_networkx(G, pos=None, arrows=True, with_labels=True, **kwds): """Draw the graph G using Matplotlib. Draw the graph with Matplotlib with options for node positions, labeling, titles, and many other drawing features. See draw() for simple drawing without labels or axes. Parameters ---------- G : graph A networkx graph pos : dictionary, optional A dictionary with nodes as keys and positions as values. If not specified a spring layout positioning will be computed. See networkx.layout for functions that compute node positions. arrows : bool, optional (default=True) For directed graphs, if True draw arrowheads. with_labels : bool, optional (default=True) Set to True to draw labels on the nodes. ax : Matplotlib Axes object, optional Draw the graph in the specified Matplotlib axes. nodelist : list, optional (default G.nodes()) Draw only specified nodes edgelist : list, optional (default=G.edges()) Draw only specified edges node_size : scalar or array, optional (default=300) Size of nodes. 
If an array is specified it must be the same length as nodelist. node_color : color string, or array of floats, (default='r') Node color. Can be a single color format string, or a sequence of colors with the same length as nodelist. If numeric values are specified they will be mapped to colors using the cmap and vmin,vmax parameters. See matplotlib.scatter for more details. node_shape : string, optional (default='o') The shape of the node. Specification is as matplotlib.scatter marker, one of 'so^>v>> G=nx.dodecahedral_graph() >>> nx.draw(G) >>> nx.draw(G,pos=nx.spring_layout(G)) # use spring layout >>> import matplotlib.pyplot as plt >>> limits=plt.axis('off') # turn of axis Also see the NetworkX drawing examples at http://networkx.github.io/documentation/latest/gallery.html See Also -------- draw() draw_networkx_nodes() draw_networkx_edges() draw_networkx_labels() draw_networkx_edge_labels() """ try: import matplotlib.pyplot as plt except ImportError: raise ImportError("Matplotlib required for draw()") except RuntimeError: print("Matplotlib unable to open display") raise if pos is None: pos = nx.drawing.spring_layout(G) # default to spring layout node_collection = draw_networkx_nodes(G, pos, **kwds) edge_collection = draw_networkx_edges(G, pos, arrows=arrows, **kwds) if with_labels: draw_networkx_labels(G, pos, **kwds) plt.draw_if_interactive() def draw_networkx_nodes(G, pos, nodelist=None, node_size=300, node_color='r', node_shape='o', alpha=1.0, cmap=None, vmin=None, vmax=None, ax=None, linewidths=None, label=None, **kwds): """Draw the nodes of the graph G. This draws only the nodes of the graph G. Parameters ---------- G : graph A networkx graph pos : dictionary A dictionary with nodes as keys and positions as values. Positions should be sequences of length 2. ax : Matplotlib Axes object, optional Draw the graph in the specified Matplotlib axes. 
nodelist : list, optional Draw only specified nodes (default G.nodes()) node_size : scalar or array Size of nodes (default=300). If an array is specified it must be the same length as nodelist. node_color : color string, or array of floats Node color. Can be a single color format string (default='r'), or a sequence of colors with the same length as nodelist. If numeric values are specified they will be mapped to colors using the cmap and vmin,vmax parameters. See matplotlib.scatter for more details. node_shape : string The shape of the node. Specification is as matplotlib.scatter marker, one of 'so^>v>> G=nx.dodecahedral_graph() >>> nodes=nx.draw_networkx_nodes(G,pos=nx.spring_layout(G)) Also see the NetworkX drawing examples at http://networkx.github.io/documentation/latest/gallery.html See Also -------- draw() draw_networkx() draw_networkx_edges() draw_networkx_labels() draw_networkx_edge_labels() """ try: import matplotlib.pyplot as plt import numpy except ImportError: raise ImportError("Matplotlib required for draw()") except RuntimeError: print("Matplotlib unable to open display") raise if ax is None: ax = plt.gca() if nodelist is None: nodelist = G.nodes() if not nodelist or len(nodelist) == 0: # empty nodelist, no drawing return None try: xy = numpy.asarray([pos[v] for v in nodelist]) except KeyError as e: raise nx.NetworkXError('Node %s has no position.'%e) except ValueError: raise nx.NetworkXError('Bad value in node positions.') node_collection = ax.scatter(xy[:, 0], xy[:, 1], s=node_size, c=node_color, marker=node_shape, cmap=cmap, vmin=vmin, vmax=vmax, alpha=alpha, linewidths=linewidths, label=label) node_collection.set_zorder(2) return node_collection def draw_networkx_edges(G, pos, edgelist=None, width=1.0, edge_color='k', style='solid', alpha=1.0, edge_cmap=None, edge_vmin=None, edge_vmax=None, ax=None, arrows=True, label=None, **kwds): """Draw the edges of the graph G. This draws only the edges of the graph G. 
Parameters ---------- G : graph A networkx graph pos : dictionary A dictionary with nodes as keys and positions as values. Positions should be sequences of length 2. edgelist : collection of edge tuples Draw only specified edges(default=G.edges()) width : float, or array of floats Line width of edges (default=1.0) edge_color : color string, or array of floats Edge color. Can be a single color format string (default='r'), or a sequence of colors with the same length as edgelist. If numeric values are specified they will be mapped to colors using the edge_cmap and edge_vmin,edge_vmax parameters. style : string Edge line style (default='solid') (solid|dashed|dotted,dashdot) alpha : float The edge transparency (default=1.0) edge_ cmap : Matplotlib colormap Colormap for mapping intensities of edges (default=None) edge_vmin,edge_vmax : floats Minimum and maximum for edge colormap scaling (default=None) ax : Matplotlib Axes object, optional Draw the graph in the specified Matplotlib axes. arrows : bool, optional (default=True) For directed graphs, if True draw arrowheads. label : [None| string] Label for legend Returns ------- matplotlib.collection.LineCollection `LineCollection` of the edges Notes ----- For directed graphs, "arrows" (actually just thicker stubs) are drawn at the head end. Arrows can be turned off with keyword arrows=False. Yes, it is ugly but drawing proper arrows with Matplotlib this way is tricky. 
Examples -------- >>> G=nx.dodecahedral_graph() >>> edges=nx.draw_networkx_edges(G,pos=nx.spring_layout(G)) Also see the NetworkX drawing examples at http://networkx.github.io/documentation/latest/gallery.html See Also -------- draw() draw_networkx() draw_networkx_nodes() draw_networkx_labels() draw_networkx_edge_labels() """ try: import matplotlib import matplotlib.pyplot as plt import matplotlib.cbook as cb from matplotlib.colors import colorConverter, Colormap from matplotlib.collections import LineCollection import numpy except ImportError: raise ImportError("Matplotlib required for draw()") except RuntimeError: print("Matplotlib unable to open display") raise if ax is None: ax = plt.gca() if edgelist is None: edgelist = G.edges() if not edgelist or len(edgelist) == 0: # no edges! return None # set edge positions edge_pos = numpy.asarray([(pos[e[0]], pos[e[1]]) for e in edgelist]) if not cb.iterable(width): lw = (width,) else: lw = width if not cb.is_string_like(edge_color) \ and cb.iterable(edge_color) \ and len(edge_color) == len(edge_pos): if numpy.alltrue([cb.is_string_like(c) for c in edge_color]): # (should check ALL elements) # list of color letters such as ['k','r','k',...] 
edge_colors = tuple([colorConverter.to_rgba(c, alpha) for c in edge_color]) elif numpy.alltrue([not cb.is_string_like(c) for c in edge_color]): # If color specs are given as (rgb) or (rgba) tuples, we're OK if numpy.alltrue([cb.iterable(c) and len(c) in (3, 4) for c in edge_color]): edge_colors = tuple(edge_color) else: # numbers (which are going to be mapped with a colormap) edge_colors = None else: raise ValueError('edge_color must consist of either color names or numbers') else: if cb.is_string_like(edge_color) or len(edge_color) == 1: edge_colors = (colorConverter.to_rgba(edge_color, alpha), ) else: raise ValueError('edge_color must be a single color or list of exactly m colors where m is the number or edges') edge_collection = LineCollection(edge_pos, colors=edge_colors, linewidths=lw, antialiaseds=(1,), linestyle=style, transOffset = ax.transData, ) edge_collection.set_zorder(1) # edges go behind nodes edge_collection.set_label(label) ax.add_collection(edge_collection) # Note: there was a bug in mpl regarding the handling of alpha values for # each line in a LineCollection. It was fixed in matplotlib in r7184 and # r7189 (June 6 2009). We should then not set the alpha value globally, # since the user can instead provide per-edge alphas now. Only set it # globally if provided as a scalar. 
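The branching above distinguishes three shapes of `edge_color`: a single spec, a sequence of color names, or a sequence of numbers to be fed through a colormap. A minimal stand-in making that classification explicit (`classify_edge_colors` is a hypothetical helper, not part of NetworkX or Matplotlib):

```python
# Classify an edge_color argument the way draw_networkx_edges does:
# one color spec, per-edge named colors, or numbers for a colormap.
def classify_edge_colors(edge_color):
    if isinstance(edge_color, str) or len(edge_color) == 1:
        return 'single-color'          # broadcast to every edge
    if all(isinstance(c, str) for c in edge_color):
        return 'named-per-edge'        # e.g. ['k', 'r', 'k', ...]
    if not any(isinstance(c, str) for c in edge_color):
        return 'numeric-colormap'      # mapped via edge_cmap/edge_vmin/vmax
    raise ValueError('edge_color must consist of either color names or numbers')


kinds = [classify_edge_colors(c)
         for c in ('k', ['k', 'r', 'b'], [0.2, 0.5, 0.9])]
```

In the numeric case the real code sets `edge_colors = None` and later hands the raw numbers to the `LineCollection` via `set_array`, so the colormap does the mapping.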
if cb.is_numlike(alpha): edge_collection.set_alpha(alpha) if edge_colors is None: if edge_cmap is not None: assert(isinstance(edge_cmap, Colormap)) edge_collection.set_array(numpy.asarray(edge_color)) edge_collection.set_cmap(edge_cmap) if edge_vmin is not None or edge_vmax is not None: edge_collection.set_clim(edge_vmin, edge_vmax) else: edge_collection.autoscale() arrow_collection = None if G.is_directed() and arrows: # a directed graph hack # draw thick line segments at head end of edge # waiting for someone else to implement arrows that will work arrow_colors = edge_colors a_pos = [] p = 1.0-0.25 # make head segment 25 percent of edge length for src, dst in edge_pos: x1, y1 = src x2, y2 = dst dx = x2-x1 # x offset dy = y2-y1 # y offset d = numpy.sqrt(float(dx**2 + dy**2)) # length of edge if d == 0: # source and target at same position continue if dx == 0: # vertical edge xa = x2 ya = dy*p+y1 if dy == 0: # horizontal edge ya = y2 xa = dx*p+x1 else: theta = numpy.arctan2(dy, dx) xa = p*d*numpy.cos(theta)+x1 ya = p*d*numpy.sin(theta)+y1 a_pos.append(((xa, ya), (x2, y2))) arrow_collection = LineCollection(a_pos, colors=arrow_colors, linewidths=[4*ww for ww in lw], antialiaseds=(1,), transOffset = ax.transData, ) arrow_collection.set_zorder(1) # edges go behind nodes arrow_collection.set_label(label) ax.add_collection(arrow_collection) # update view minx = numpy.amin(numpy.ravel(edge_pos[:, :, 0])) maxx = numpy.amax(numpy.ravel(edge_pos[:, :, 0])) miny = numpy.amin(numpy.ravel(edge_pos[:, :, 1])) maxy = numpy.amax(numpy.ravel(edge_pos[:, :, 1])) w = maxx-minx h = maxy-miny padx, pady = 0.05*w, 0.05*h corners = (minx-padx, miny-pady), (maxx+padx, maxy+pady) ax.update_datalim(corners) ax.autoscale_view() # if arrow_collection: return edge_collection def draw_networkx_labels(G, pos, labels=None, font_size=12, font_color='k', font_family='sans-serif', font_weight='normal', alpha=1.0, bbox=None, ax=None, **kwds): """Draw node labels on the graph G. 
Parameters ---------- G : graph A networkx graph pos : dictionary A dictionary with nodes as keys and positions as values. Positions should be sequences of length 2. labels : dictionary, optional (default=None) Node labels in a dictionary keyed by node of text labels font_size : int Font size for text labels (default=12) font_color : string Font color string (default='k' black) font_family : string Font family (default='sans-serif') font_weight : string Font weight (default='normal') alpha : float The text transparency (default=1.0) ax : Matplotlib Axes object, optional Draw the graph in the specified Matplotlib axes. Returns ------- dict `dict` of labels keyed on the nodes Examples -------- >>> G=nx.dodecahedral_graph() >>> labels=nx.draw_networkx_labels(G,pos=nx.spring_layout(G)) Also see the NetworkX drawing examples at http://networkx.github.io/documentation/latest/gallery.html See Also -------- draw() draw_networkx() draw_networkx_nodes() draw_networkx_edges() draw_networkx_edge_labels() """ try: import matplotlib.pyplot as plt import matplotlib.cbook as cb except ImportError: raise ImportError("Matplotlib required for draw()") except RuntimeError: print("Matplotlib unable to open display") raise if ax is None: ax = plt.gca() if labels is None: labels = dict((n, n) for n in G.nodes()) # set optional alignment horizontalalignment = kwds.get('horizontalalignment', 'center') verticalalignment = kwds.get('verticalalignment', 'center') text_items = {} # there is no text collection so we'll fake one for n, label in labels.items(): (x, y) = pos[n] if not cb.is_string_like(label): label = str(label) # this will cause "1" and 1 to be labeled the same t = ax.text(x, y, label, size=font_size, color=font_color, family=font_family, weight=font_weight, horizontalalignment=horizontalalignment, verticalalignment=verticalalignment, transform=ax.transData, bbox=bbox, clip_on=True, ) text_items[n] = t return text_items def draw_networkx_edge_labels(G, pos, edge_labels=None, 
label_pos=0.5, font_size=10, font_color='k', font_family='sans-serif', font_weight='normal', alpha=1.0, bbox=None, ax=None, rotate=True, **kwds): """Draw edge labels. Parameters ---------- G : graph A networkx graph pos : dictionary A dictionary with nodes as keys and positions as values. Positions should be sequences of length 2. ax : Matplotlib Axes object, optional Draw the graph in the specified Matplotlib axes. alpha : float The text transparency (default=1.0) edge_labels : dictionary Edge labels in a dictionary keyed by edge two-tuple of text labels (default=None). Only labels for the keys in the dictionary are drawn. label_pos : float Position of edge label along edge (0=head, 0.5=center, 1=tail) font_size : int Font size for text labels (default=12) font_color : string Font color string (default='k' black) font_weight : string Font weight (default='normal') font_family : string Font family (default='sans-serif') bbox : Matplotlib bbox Specify text box shape and colors. clip_on : bool Turn on clipping at axis boundaries (default=True) Returns ------- dict `dict` of labels keyed on the edges Examples -------- >>> G=nx.dodecahedral_graph() >>> edge_labels=nx.draw_networkx_edge_labels(G,pos=nx.spring_layout(G)) Also see the NetworkX drawing examples at http://networkx.github.io/documentation/latest/gallery.html See Also -------- draw() draw_networkx() draw_networkx_nodes() draw_networkx_edges() draw_networkx_labels() """ try: import matplotlib.pyplot as plt import matplotlib.cbook as cb import numpy except ImportError: raise ImportError("Matplotlib required for draw()") except RuntimeError: print("Matplotlib unable to open display") raise if ax is None: ax = plt.gca() if edge_labels is None: labels = dict(((u, v), d) for u, v, d in G.edges(data=True)) else: labels = edge_labels text_items = {} for (n1, n2), label in labels.items(): (x1, y1) = pos[n1] (x2, y2) = pos[n2] (x, y) = (x1 * label_pos + x2 * (1.0 - label_pos), y1 * label_pos + y2 * (1.0 - label_pos)) 
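The `(x, y)` anchor computed at the end of the loop body above is a linear interpolation between the two endpoint positions; `label_pos=0.5` is the midpoint, and `label_pos=1.0` sits on the first endpoint. A minimal sketch (hypothetical helper name `edge_label_xy`):

```python
def edge_label_xy(p1, p2, label_pos=0.5):
    """Interpolated label anchor used by draw_networkx_edge_labels."""
    (x1, y1), (x2, y2) = p1, p2
    return (x1 * label_pos + x2 * (1.0 - label_pos),
            y1 * label_pos + y2 * (1.0 - label_pos))
```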
if rotate: angle = numpy.arctan2(y2-y1, x2-x1)/(2.0*numpy.pi)*360 # degrees # make label orientation "right-side-up" if angle > 90: angle -= 180 if angle < - 90: angle += 180 # transform data coordinate angle to screen coordinate angle xy = numpy.array((x, y)) trans_angle = ax.transData.transform_angles(numpy.array((angle,)), xy.reshape((1, 2)))[0] else: trans_angle = 0.0 # use default box of white with white border if bbox is None: bbox = dict(boxstyle='round', ec=(1.0, 1.0, 1.0), fc=(1.0, 1.0, 1.0), ) if not cb.is_string_like(label): label = str(label) # this will cause "1" and 1 to be labeled the same # set optional alignment horizontalalignment = kwds.get('horizontalalignment', 'center') verticalalignment = kwds.get('verticalalignment', 'center') t = ax.text(x, y, label, size=font_size, color=font_color, family=font_family, weight=font_weight, horizontalalignment=horizontalalignment, verticalalignment=verticalalignment, rotation=trans_angle, transform=ax.transData, bbox=bbox, zorder=1, clip_on=True, ) text_items[(n1, n2)] = t return text_items def draw_circular(G, **kwargs): """Draw the graph G with a circular layout. Parameters ---------- G : graph A networkx graph kwargs : optional keywords See networkx.draw_networkx() for a description of optional keywords, with the exception of the pos parameter which is not used by this function. """ draw(G, circular_layout(G), **kwargs) def draw_random(G, **kwargs): """Draw the graph G with a random layout. Parameters ---------- G : graph A networkx graph kwargs : optional keywords See networkx.draw_networkx() for a description of optional keywords, with the exception of the pos parameter which is not used by this function. """ draw(G, random_layout(G), **kwargs) def draw_spectral(G, **kwargs): """Draw the graph G with a spectral layout. 
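The `rotate=True` branch converts the edge direction to degrees and folds it into (-90, 90] so the label text never reads upside-down. That normalization, isolated as a hypothetical `label_angle_deg` (the screen-space `transform_angles` correction is omitted here):

```python
import numpy as np

def label_angle_deg(p1, p2):
    """Data-space rotation applied to an edge label, folded so the
    text reads left-to-right."""
    (x1, y1), (x2, y2) = p1, p2
    angle = np.arctan2(y2 - y1, x2 - x1) / (2.0 * np.pi) * 360  # degrees
    # make label orientation "right-side-up"
    if angle > 90:
        angle -= 180
    if angle < -90:
        angle += 180
    return angle
```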
Parameters ---------- G : graph A networkx graph kwargs : optional keywords See networkx.draw_networkx() for a description of optional keywords, with the exception of the pos parameter which is not used by this function. """ draw(G, spectral_layout(G), **kwargs) def draw_spring(G, **kwargs): """Draw the graph G with a spring layout. Parameters ---------- G : graph A networkx graph kwargs : optional keywords See networkx.draw_networkx() for a description of optional keywords, with the exception of the pos parameter which is not used by this function. """ draw(G, spring_layout(G), **kwargs) def draw_shell(G, **kwargs): """Draw networkx graph with shell layout. Parameters ---------- G : graph A networkx graph kwargs : optional keywords See networkx.draw_networkx() for a description of optional keywords, with the exception of the pos parameter which is not used by this function. """ nlist = kwargs.get('nlist', None) if nlist is not None: del(kwargs['nlist']) draw(G, shell_layout(G, nlist=nlist), **kwargs) def draw_graphviz(G, prog="neato", **kwargs): """Draw networkx graph with graphviz layout. Parameters ---------- G : graph A networkx graph prog : string, optional Name of Graphviz layout program kwargs : optional keywords See networkx.draw_networkx() for a description of optional keywords. """ pos = nx.drawing.graphviz_layout(G, prog) draw(G, pos, **kwargs) def draw_nx(G, pos, **kwds): """For backward compatibility; use draw or draw_networkx.""" draw(G, pos, **kwds) # fixture for nose tests def setup_module(module): from nose import SkipTest try: import matplotlib as mpl mpl.use('PS', warn=False) import matplotlib.pyplot as plt except: raise SkipTest("matplotlib not available") networkx-1.11/networkx/drawing/__init__.py0000644000175000017500000000021012653165361020621 0ustar aricaric00000000000000# graph drawing and interface to graphviz from .layout import * from .nx_pylab import * from . import nx_agraph from . 
import nx_pydot networkx-1.11/networkx/drawing/layout.py0000644000175000017500000004335612653165361020421 0ustar aricaric00000000000000""" ****** Layout ****** Node positioning algorithms for graph drawing. The default scales and centering for these layouts are typically squares with side [0, 1] or [0, scale]. The two circular layout routines (circular_layout and shell_layout) have size [-1, 1] or [-scale, scale]. """ # Authors: Aric Hagberg , # Dan Schult # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import collections import networkx as nx __all__ = ['circular_layout', 'random_layout', 'shell_layout', 'spring_layout', 'spectral_layout', 'fruchterman_reingold_layout'] def random_layout(G, dim=2, scale=1., center=None): """Position nodes uniformly at random. For every node, a position is generated by choosing each of dim coordinates uniformly at random on the default interval [0.0, 1.0), or on an interval of length `scale` centered at `center`. NumPy (http://scipy.org) is required for this function. Parameters ---------- G : NetworkX graph or list of nodes A position will be assigned to every node in G. dim : int Dimension of layout. scale : float (default 1) Scale factor for positions center : array-like (default scale*0.5 in each dim) Coordinate around which to center the layout. Returns ------- pos : dict A dictionary of positions keyed by node Examples -------- >>> G = nx.lollipop_graph(4, 3) >>> pos = nx.random_layout(G) """ import numpy as np shape = (len(G), dim) pos = np.random.random(shape) * scale if center is not None: pos += np.asarray(center) - 0.5 * scale return dict(zip(G, pos)) def circular_layout(G, dim=2, scale=1., center=None): """Position nodes on a circle. Parameters ---------- G : NetworkX graph or list of nodes dim : int Dimension of layout, currently only dim=2 is supported scale : float (default 1) Scale factor for positions, i.e. radius of circle. 
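`random_layout` draws each coordinate uniformly in `[0, scale)` and, when `center` is given, shifts the box to be centred there. A sketch of that arithmetic (using the modern `numpy.random.Generator` API for reproducibility; the source itself calls `np.random.random`):

```python
import numpy as np

def random_layout_arr(n, dim=2, scale=1.0, center=None, seed=None):
    """Uniform positions in a box of side `scale` centred at `center`."""
    rng = np.random.default_rng(seed)
    pos = rng.random((n, dim)) * scale          # box [0, scale) per axis
    if center is not None:
        pos += np.asarray(center) - 0.5 * scale  # recentre the box
    return pos
```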
center : array-like (default origin) Coordinate around which to center the layout. Returns ------- dict : A dictionary of positions keyed by node Examples -------- >>> G=nx.path_graph(4) >>> pos=nx.circular_layout(G) Notes ----- This algorithm currently only works in two dimensions and does not try to minimize edge crossings. """ import numpy as np if len(G) == 0: return {} twopi = 2.0*np.pi theta = np.arange(0, twopi, twopi/len(G)) pos = np.column_stack([np.cos(theta), np.sin(theta)]) * scale if center is not None: pos += np.asarray(center) return dict(zip(G, pos)) def shell_layout(G, nlist=None, dim=2, scale=1., center=None): """Position nodes in concentric circles. Parameters ---------- G : NetworkX graph or list of nodes nlist : list of lists List of node lists for each shell. dim : int Dimension of layout, currently only dim=2 is supported scale : float (default 1) Scale factor for positions, i.e.radius of largest shell center : array-like (default origin) Coordinate around which to center the layout. Returns ------- dict : A dictionary of positions keyed by node Examples -------- >>> G = nx.path_graph(4) >>> shells = [[0], [1,2,3]] >>> pos = nx.shell_layout(G, shells) Notes ----- This algorithm currently only works in two dimensions and does not try to minimize edge crossings. 
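The circular placement itself is two lines of numpy: evenly spaced angles, then a cosine/sine column stack scaled to the requested radius. Isolated as a hypothetical `circular_positions`:

```python
import numpy as np

def circular_positions(n, scale=1.0):
    """Evenly spaced points on a circle of radius `scale`,
    as in circular_layout."""
    theta = np.arange(0, 2.0 * np.pi, 2.0 * np.pi / n)
    return np.column_stack([np.cos(theta), np.sin(theta)]) * scale
```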
""" import numpy as np if len(G) == 0: return {} if nlist is None: # draw the whole graph in one shell nlist = [list(G)] numb_shells = len(nlist) if len(nlist[0]) == 1: # single node at center radius = 0.0 numb_shells -= 1 else: # else start at r=1 radius = 1.0 # distance between shells gap = (scale / numb_shells) if numb_shells else scale radius *= gap npos={} twopi = 2.0*np.pi for nodes in nlist: theta = np.arange(0, twopi, twopi/len(nodes)) pos = np.column_stack([np.cos(theta), np.sin(theta)]) * radius npos.update(zip(nodes, pos)) radius += gap if center is not None: center = np.asarray(center) for n,p in npos.items(): npos[n] = p + center return npos def fruchterman_reingold_layout(G, dim=2, k=None, pos=None, fixed=None, iterations=50, weight='weight', scale=1.0, center=None): """Position nodes using Fruchterman-Reingold force-directed algorithm. Parameters ---------- G : NetworkX graph dim : int Dimension of layout k : float (default=None) Optimal distance between nodes. If None the distance is set to 1/sqrt(n) where n is the number of nodes. Increase this value to move nodes farther apart. pos : dict or None optional (default=None) Initial positions for nodes as a dictionary with node as keys and values as a list or tuple. If None, then use random initial positions. fixed : list or None optional (default=None) Nodes to keep fixed at initial position. If any nodes are fixed, the scale and center features are not used. iterations : int optional (default=50) Number of iterations of spring-force relaxation weight : string or None optional (default='weight') The edge attribute that holds the numerical value used for the effective spring constant. If None, edge weights are 1. scale : float (default=1.0) Scale factor for positions. The nodes are positioned in a box of size `scale` in each dim centered at `center`. center : array-like (default scale/2 in each dim) Coordinate around which to center the layout. 
Returns ------- dict : A dictionary of positions keyed by node Examples -------- >>> G=nx.path_graph(4) >>> pos=nx.spring_layout(G) # this function has two names: # spring_layout and fruchterman_reingold_layout >>> pos=nx.fruchterman_reingold_layout(G) """ import numpy as np if len(G) == 0: return {} if fixed is not None: nfixed = dict(zip(G, range(len(G)))) fixed = np.asarray([nfixed[v] for v in fixed]) if pos is None: msg = "Keyword pos must be specified if any nodes are fixed" raise ValueError(msg) if pos is not None: # Determine size of existing domain to adjust initial positions pos_coords = np.array(list(pos.values())) min_coords = pos_coords.min(0) domain_size = pos_coords.max(0) - min_coords shape = (len(G), dim) pos_arr = np.random.random(shape) * domain_size + min_coords for i,n in enumerate(G): if n in pos: pos_arr[i] = np.asarray(pos[n]) else: pos_arr=None if k is None and fixed is not None: # Adjust k for domains larger than 1x1 k=domain_size.max()/np.sqrt(len(G)) try: # Sparse matrix if len(G) < 500: # sparse solver for large graphs raise ValueError A = nx.to_scipy_sparse_matrix(G, weight=weight, dtype='f') pos = _sparse_fruchterman_reingold(A,dim,k,pos_arr,fixed,iterations) except: A = nx.to_numpy_matrix(G, weight=weight) pos = _fruchterman_reingold(A, dim, k, pos_arr, fixed, iterations) if fixed is None: pos = _rescale_layout(pos, scale) if center is not None: pos += np.asarray(center) - 0.5 * scale return dict(zip(G,pos)) spring_layout=fruchterman_reingold_layout def _fruchterman_reingold(A,dim=2,k=None,pos=None,fixed=None,iterations=50): # Position nodes in adjacency matrix A using Fruchterman-Reingold # Entry point for NetworkX graph is fruchterman_reingold_layout() import numpy as np try: nnodes,_=A.shape except AttributeError: raise nx.NetworkXError( "fruchterman_reingold() takes an adjacency matrix as input") A=np.asarray(A) # make sure we have an array instead of a matrix if pos is None: # random initial positions 
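Two defaults drive the force-directed routines below: the spring constant `k` falls back to `1/sqrt(n)`, and the initial "temperature" is 10% of the larger side of the position domain, cooled linearly so the last step has size `dt`. A sketch of that schedule (hypothetical `fr_schedule`):

```python
import numpy as np

def fr_schedule(nnodes, pos, iterations=50):
    """Default spring constant and cooling schedule used by the
    Fruchterman-Reingold kernels."""
    k = np.sqrt(1.0 / nnodes)                        # optimal node spacing
    # initial temperature: 10% of the larger side of the position domain
    t = max(np.ptp(pos[:, 0]), np.ptp(pos[:, 1])) * 0.1
    dt = t / float(iterations + 1)                   # linear cool-down step
    return k, t, dt
```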
pos=np.asarray(np.random.random((nnodes,dim)),dtype=A.dtype) else: # make sure positions are of same type as matrix pos=pos.astype(A.dtype) # optimal distance between nodes if k is None: k=np.sqrt(1.0/nnodes) # the initial "temperature" is about .1 of domain area (=1x1) # this is the largest step allowed in the dynamics. # Calculate domain in case our fixed positions are bigger than 1x1 t = max(max(pos.T[0]) - min(pos.T[0]), max(pos.T[1]) - min(pos.T[1]))*0.1 # simple cooling scheme. # linearly step down by dt on each iteration so last iteration is size dt. dt=t/float(iterations+1) delta = np.zeros((pos.shape[0],pos.shape[0],pos.shape[1]),dtype=A.dtype) # the inscrutable (but fast) version # this is still O(V^2) # could use multilevel methods to speed this up significantly for iteration in range(iterations): # matrix of difference between points for i in range(pos.shape[1]): delta[:,:,i]= pos[:,i,None]-pos[:,i] # distance between points distance=np.sqrt((delta**2).sum(axis=-1)) # enforce minimum distance of 0.01 distance=np.where(distance<0.01,0.01,distance) # displacement "force" displacement=np.transpose(np.transpose(delta)*\ (k*k/distance**2-A*distance/k))\ .sum(axis=1) # update positions length=np.sqrt((displacement**2).sum(axis=1)) length=np.where(length<0.01,0.01,length) delta_pos=np.transpose(np.transpose(displacement)*t/length) if fixed is not None: # don't change positions of fixed nodes delta_pos[fixed]=0.0 pos+=delta_pos # cool temperature t-=dt if fixed is None: pos = _rescale_layout(pos) return pos def _sparse_fruchterman_reingold(A, dim=2, k=None, pos=None, fixed=None, iterations=50): # Position nodes in adjacency matrix A using Fruchterman-Reingold # Entry point for NetworkX graph is fruchterman_reingold_layout() # Sparse version import numpy as np try: nnodes,_=A.shape except AttributeError: raise nx.NetworkXError( "fruchterman_reingold() takes an adjacency matrix as input") try: from scipy.sparse import spdiags,coo_matrix except ImportError: raise 
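The "inscrutable (but fast)" dense kernel above computes all pairwise displacement vectors, clamps distances at 0.01, and combines a `k**2/d**2` repulsion between every pair with an `A*d/k` attraction along edges. One update step, written with explicit broadcasting rather than the source's nested transposes (a sketch, hypothetical `fr_step`):

```python
import numpy as np

def fr_step(A, pos, k, t):
    """One Fruchterman-Reingold position update on adjacency matrix A."""
    delta = pos[:, None, :] - pos[None, :, :]             # pairwise differences
    distance = np.sqrt((delta**2).sum(axis=-1))
    distance = np.where(distance < 0.01, 0.01, distance)  # avoid div-by-zero
    # repulsion k^2/d^2 between all pairs, attraction A*d/k along edges
    force = k * k / distance**2 - A * distance / k
    displacement = (delta * force[:, :, None]).sum(axis=1)
    length = np.sqrt((displacement**2).sum(axis=1))
    length = np.where(length < 0.01, 0.01, length)        # cap step at t
    return pos + displacement * (t / length)[:, None]
```

Two disconnected nodes simply repel: each moves distance `t` directly away from the other.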
ImportError("_sparse_fruchterman_reingold() scipy numpy: http://scipy.org/ ") # make sure we have a LIst of Lists representation try: A=A.tolil() except: A=(coo_matrix(A)).tolil() if pos is None: # random initial positions pos=np.asarray(np.random.random((nnodes,dim)),dtype=A.dtype) else: # make sure positions are of same type as matrix pos=pos.astype(A.dtype) # no fixed nodes if fixed is None: fixed=[] # optimal distance between nodes if k is None: k=np.sqrt(1.0/nnodes) # the initial "temperature" is about .1 of domain area (=1x1) # this is the largest step allowed in the dynamics. # Calculate domain in case our fixed positions are bigger than 1x1 t = max(max(pos.T[0]) - min(pos.T[0]), max(pos.T[1]) - min(pos.T[1]))*0.1 # simple cooling scheme. # linearly step down by dt on each iteration so last iteration is size dt. dt=t/float(iterations+1) displacement=np.zeros((dim,nnodes)) for iteration in range(iterations): displacement*=0 # loop over rows for i in range(A.shape[0]): if i in fixed: continue # difference between this row's node position and all others delta=(pos[i]-pos).T # distance between points distance=np.sqrt((delta**2).sum(axis=0)) # enforce minimum distance of 0.01 distance=np.where(distance<0.01,0.01,distance) # the adjacency matrix row Ai=np.asarray(A.getrowview(i).toarray()) # displacement "force" displacement[:,i]+=\ (delta*(k*k/distance**2-Ai*distance/k)).sum(axis=1) # update positions length=np.sqrt((displacement**2).sum(axis=0)) length=np.where(length<0.01,0.01,length) pos+=(displacement*t/length).T # cool temperature t-=dt if fixed is None: pos = _rescale_layout(pos) return pos def spectral_layout(G, dim=2, weight='weight', scale=1., center=None): """Position nodes using the eigenvectors of the graph Laplacian. Parameters ---------- G : NetworkX graph or list of nodes dim : int Dimension of layout weight : string or None optional (default='weight') The edge attribute that holds the numerical value used for the edge weight. 
If None, then all edge weights are 1. scale : float optional (default 1) Scale factor for positions, i.e. nodes placed in a box with side [0, scale] or centered on `center` if provided. center : array-like (default scale/2 in each dim) Coordinate around which to center the layout. Returns ------- dict : A dictionary of positions keyed by node Examples -------- >>> G=nx.path_graph(4) >>> pos=nx.spectral_layout(G) Notes ----- Directed graphs will be considered as undirected graphs when positioning the nodes. For larger graphs (>500 nodes) this will use the SciPy sparse eigenvalue solver (ARPACK). """ # handle some special cases that break the eigensolvers import numpy as np if len(G) <= 2: if len(G) == 0: return {} elif len(G) == 1: if center is not None: pos = np.asarray(center) else: pos = np.ones((1,dim)) * scale * 0.5 else: #len(G) == 2 pos = np.array([np.zeros(dim), np.ones(dim) * scale]) if center is not None: pos += np.asarray(center) - scale * 0.5 return dict(zip(G,pos)) try: # Sparse matrix if len(G)< 500: # dense solver is faster for small graphs raise ValueError A = nx.to_scipy_sparse_matrix(G, weight=weight, dtype='d') # Symmetrize directed graphs if G.is_directed(): A = A + np.transpose(A) pos = _sparse_spectral(A,dim) except (ImportError, ValueError): # Dense matrix A = nx.to_numpy_matrix(G, weight=weight) # Symmetrize directed graphs if G.is_directed(): A = A + np.transpose(A) pos = _spectral(A, dim) pos = _rescale_layout(pos, scale) if center is not None: pos += np.asarray(center) - 0.5 * scale return dict(zip(G,pos)) def _spectral(A, dim=2): # Input adjacency matrix A # Uses dense eigenvalue solver from numpy try: import numpy as np except ImportError: raise ImportError("spectral_layout() requires numpy: http://scipy.org/ ") try: nnodes,_=A.shape except AttributeError: raise nx.NetworkXError(\ "spectral() takes an adjacency matrix as input") # form Laplacian matrix # make sure we have an array instead of a matrix A=np.asarray(A) 
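The dense spectral path below forms the combinatorial Laplacian `L = D - A`, solves the eigenproblem, and keeps the eigenvectors for the `dim` smallest nonzero eigenvalues (index 0 is the zero eigenvalue). A sketch using `numpy.linalg.eigh` — an assumption here, since `L` is symmetric, while the source calls the general `eig`:

```python
import numpy as np

def spectral_coords(A, dim=2):
    """Layout coordinates from the graph Laplacian, as in _spectral."""
    A = np.asarray(A, dtype=float)
    D = np.diag(A.sum(axis=1))                     # degree matrix
    L = D - A                                      # combinatorial Laplacian
    eigenvalues, eigenvectors = np.linalg.eigh(L)  # symmetric eigensolver
    index = np.argsort(eigenvalues)[1:dim + 1]     # skip the zero eigenvalue
    return np.real(eigenvectors[:, index])
```

On a path graph the first kept eigenvector (the Fiedler vector) is monotone along the path, which is why this layout "unrolls" chains.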
    I=np.identity(nnodes,dtype=A.dtype)
    D=I*np.sum(A,axis=1) # diagonal of degrees
    L=D-A
    eigenvalues,eigenvectors=np.linalg.eig(L)
    # sort and keep smallest nonzero
    index=np.argsort(eigenvalues)[1:dim+1] # 0 index is zero eigenvalue
    return np.real(eigenvectors[:,index])


def _sparse_spectral(A,dim=2):
    # Input adjacency matrix A
    # Uses sparse eigenvalue solver from scipy
    # Could use multilevel methods here, see Koren "On spectral graph drawing"
    try:
        import numpy as np
        from scipy.sparse import spdiags
    except ImportError:
        raise ImportError("_sparse_spectral() requires scipy & numpy: "
                          "http://scipy.org/")
    try:
        from scipy.sparse.linalg.eigen import eigsh
    except ImportError:
        # scipy <0.9.0 names eigsh differently
        from scipy.sparse.linalg import eigen_symmetric as eigsh
    try:
        nnodes,_=A.shape
    except AttributeError:
        raise nx.NetworkXError(\
            "sparse_spectral() takes an adjacency matrix as input")

    # form Laplacian matrix
    data=np.asarray(A.sum(axis=1).T)
    D=spdiags(data,0,nnodes,nnodes)
    L=D-A

    k=dim+1
    # number of Lanczos vectors for ARPACK solver. What is the right scaling?
ncv=max(2*k+1,int(np.sqrt(nnodes))) # return smallest k eigenvalues and eigenvectors eigenvalues,eigenvectors=eigsh(L,k,which='SM',ncv=ncv) index=np.argsort(eigenvalues)[1:k] # 0 index is zero eigenvalue return np.real(eigenvectors[:,index]) def _rescale_layout(pos, scale=1.): # rescale to [0, scale) in each axis # Find max length over all dimensions maxlim=0 for i in range(pos.shape[1]): pos[:,i] -= pos[:,i].min() # shift min to zero maxlim = max(maxlim, pos[:,i].max()) if maxlim > 0: for i in range(pos.shape[1]): pos[:,i] *= scale / maxlim return pos # fixture for nose tests def setup_module(module): from nose import SkipTest try: import numpy except: raise SkipTest("NumPy not available") try: import scipy except: raise SkipTest("SciPy not available") networkx-1.11/networkx/drawing/tests/0000755000175000017500000000000012653231454017656 5ustar aricaric00000000000000networkx-1.11/networkx/drawing/tests/test_pydot.py0000644000175000017500000000272212653165361022434 0ustar aricaric00000000000000""" Unit tests for pydot drawing functions. 
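`_rescale_layout` shifts each axis so its minimum is zero, then divides every axis by the single largest extent, so the layout fits in `[0, scale]` without distorting the aspect ratio. A non-mutating sketch of the same transform (the source modifies `pos` in place):

```python
import numpy as np

def rescale_layout(pos, scale=1.0):
    """Shift/scale positions into [0, scale], as in _rescale_layout."""
    pos = pos - pos.min(axis=0)    # shift per-axis minima to zero
    maxlim = pos.max()             # largest extent over all axes
    if maxlim > 0:
        pos = pos * (scale / maxlim)   # uniform scale preserves aspect ratio
    return pos
```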
""" import os import tempfile from nose import SkipTest from nose.tools import assert_true import networkx as nx from networkx.testing import assert_graphs_equal class TestPydot(object): @classmethod def setupClass(cls): global pydotplus try: import pydotplus except ImportError: raise SkipTest('pydotplus not available.') def pydot_checks(self, G): G.add_edge('A','B') G.add_edge('A','C') G.add_edge('B','C') G.add_edge('A','D') G.add_node('E') P = nx.nx_pydot.to_pydot(G) G2 = G.__class__(nx.nx_pydot.from_pydot(P)) assert_graphs_equal(G, G2) fname = tempfile.mktemp() assert_true( P.write_raw(fname) ) Pin = pydotplus.graph_from_dot_file(fname) n1 = sorted([p.get_name() for p in P.get_node_list()]) n2 = sorted([p.get_name() for p in Pin.get_node_list()]) assert_true( n1 == n2 ) e1=[(e.get_source(),e.get_destination()) for e in P.get_edge_list()] e2=[(e.get_source(),e.get_destination()) for e in Pin.get_edge_list()] assert_true( sorted(e1)==sorted(e2) ) Hin = nx.nx_pydot.read_dot(fname) Hin = G.__class__(Hin) assert_graphs_equal(G, Hin) # os.unlink(fname) def testUndirected(self): self.pydot_checks(nx.Graph()) def testDirected(self): self.pydot_checks(nx.DiGraph()) networkx-1.11/networkx/drawing/tests/test_layout.py0000644000175000017500000001402212637544500022604 0ustar aricaric00000000000000"""Unit tests for layout functions.""" import sys from nose import SkipTest from nose.tools import assert_equal import networkx as nx class TestLayout(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global numpy try: import numpy except ImportError: raise SkipTest('numpy not available.') def setUp(self): self.Gi=nx.grid_2d_graph(5,5) self.Gs=nx.Graph() self.Gs.add_path('abcdef') self.bigG=nx.grid_2d_graph(25,25) #bigger than 500 nodes for sparse def test_smoke_int(self): G=self.Gi vpos=nx.random_layout(G) vpos=nx.circular_layout(G) vpos=nx.spring_layout(G) vpos=nx.fruchterman_reingold_layout(G) 
vpos=nx.spectral_layout(G) vpos=nx.spectral_layout(self.bigG) vpos=nx.shell_layout(G) def test_smoke_string(self): G = self.Gs vpos = nx.random_layout(G) vpos = nx.circular_layout(G) vpos = nx.spring_layout(G) vpos = nx.fruchterman_reingold_layout(G) vpos = nx.spectral_layout(G) vpos = nx.shell_layout(G) def test_empty_graph(self): G=nx.Graph() vpos = nx.random_layout(G) vpos = nx.circular_layout(G) vpos = nx.spring_layout(G) vpos = nx.fruchterman_reingold_layout(G) vpos = nx.shell_layout(G) vpos = nx.spectral_layout(G) # center arg vpos = nx.random_layout(G, scale=2, center=(4,5)) vpos = nx.circular_layout(G, scale=2, center=(4,5)) vpos = nx.spring_layout(G, scale=2, center=(4,5)) vpos = nx.shell_layout(G, scale=2, center=(4,5)) vpos = nx.spectral_layout(G, scale=2, center=(4,5)) def test_single_node(self): G = nx.Graph() G.add_node(0) vpos = nx.random_layout(G) vpos = nx.circular_layout(G) vpos = nx.spring_layout(G) vpos = nx.fruchterman_reingold_layout(G) vpos = nx.shell_layout(G) vpos = nx.spectral_layout(G) # center arg vpos = nx.random_layout(G, scale=2, center=(4,5)) vpos = nx.circular_layout(G, scale=2, center=(4,5)) vpos = nx.spring_layout(G, scale=2, center=(4,5)) vpos = nx.shell_layout(G, scale=2, center=(4,5)) vpos = nx.spectral_layout(G, scale=2, center=(4,5)) def check_scale_and_center(self, pos, scale, center): center = numpy.array(center) low = center - 0.5 * scale hi = center + 0.5 * scale vpos = numpy.array(list(pos.values())) length = vpos.max(0) - vpos.min(0) assert (length <= scale).all() assert (vpos >= low).all() assert (vpos <= hi).all() def test_scale_and_center_arg(self): G = nx.complete_graph(9) G.add_node(9) vpos = nx.random_layout(G, scale=2, center=(4,5)) self.check_scale_and_center(vpos, scale=2, center=(4,5)) vpos = nx.spring_layout(G, scale=2, center=(4,5)) self.check_scale_and_center(vpos, scale=2, center=(4,5)) vpos = nx.spectral_layout(G, scale=2, center=(4,5)) self.check_scale_and_center(vpos, scale=2, center=(4,5)) # circular 
can have twice as big length vpos = nx.circular_layout(G, scale=2, center=(4,5)) self.check_scale_and_center(vpos, scale=2*2, center=(4,5)) vpos = nx.shell_layout(G, scale=2, center=(4,5)) self.check_scale_and_center(vpos, scale=2*2, center=(4,5)) # check default center and scale vpos = nx.random_layout(G) self.check_scale_and_center(vpos, scale=1, center=(0.5,0.5)) vpos = nx.spring_layout(G) self.check_scale_and_center(vpos, scale=1, center=(0.5,0.5)) vpos = nx.spectral_layout(G) self.check_scale_and_center(vpos, scale=1, center=(0.5,0.5)) vpos = nx.circular_layout(G) self.check_scale_and_center(vpos, scale=2, center=(0,0)) vpos = nx.shell_layout(G) self.check_scale_and_center(vpos, scale=2, center=(0,0)) def test_shell_layout(self): G = nx.complete_graph(9) shells=[[0], [1,2,3,5], [4,6,7,8]] vpos = nx.shell_layout(G, nlist=shells) vpos = nx.shell_layout(G, nlist=shells, scale=2, center=(3,4)) shells=[[0,1,2,3,5], [4,6,7,8]] vpos = nx.shell_layout(G, nlist=shells) vpos = nx.shell_layout(G, nlist=shells, scale=2, center=(3,4)) def test_spring_args(self): G = nx.complete_graph(9) vpos = nx.spring_layout(G, dim=3) assert_equal(vpos[0].shape, (3,)) vpos = nx.spring_layout(G, fixed=[0,1], pos={1:(0,0), 2:(1,1)}) vpos = nx.spring_layout(G, k=2, fixed=[0,1], pos={1:(0,0), 2:(1,1)}) vpos = nx.spring_layout(G, scale=3, center=(2,5)) vpos = nx.spring_layout(G, scale=3) vpos = nx.spring_layout(G, center=(2,5)) def test_spectral_for_small_graphs(self): G = nx.Graph() vpos = nx.spectral_layout(G) vpos = nx.spectral_layout(G, center=(2,3)) G.add_node(0) vpos = nx.spectral_layout(G) vpos = nx.spectral_layout(G, center=(2,3)) G.add_node(1) vpos = nx.spectral_layout(G) vpos = nx.spectral_layout(G, center=(2,3)) # 3 nodes should allow eigensolvers to work G.add_node(2) vpos = nx.spectral_layout(G) vpos = nx.spectral_layout(G, center=(2,3)) def test_adjacency_interface_numpy(self): A=nx.to_numpy_matrix(self.Gs) pos=nx.drawing.layout._fruchterman_reingold(A) 
pos=nx.drawing.layout._fruchterman_reingold(A,dim=3) assert_equal(pos.shape,(6,3)) def test_adjacency_interface_scipy(self): try: import scipy except ImportError: raise SkipTest('scipy not available.') A=nx.to_scipy_sparse_matrix(self.Gs,dtype='d') pos=nx.drawing.layout._sparse_fruchterman_reingold(A) pos=nx.drawing.layout._sparse_spectral(A) pos=nx.drawing.layout._sparse_fruchterman_reingold(A,dim=3) assert_equal(pos.shape,(6,3)) networkx-1.11/networkx/drawing/tests/test_pylab.py0000644000175000017500000000216112637544450022403 0ustar aricaric00000000000000""" Unit tests for matplotlib drawing functions. """ import os from nose import SkipTest import networkx as nx class TestPylab(object): @classmethod def setupClass(cls): global plt try: import matplotlib as mpl mpl.use('PS',warn=False) import matplotlib.pyplot as plt plt.rcParams['text.usetex'] = False except ImportError: raise SkipTest('matplotlib not available.') except RuntimeError: raise SkipTest('matplotlib not available.') def setUp(self): self.G=nx.barbell_graph(5,10) def test_draw(self): try: N=self.G nx.draw_spring(N) plt.savefig("test.ps") nx.draw_random(N) plt.savefig("test.ps") nx.draw_circular(N) plt.savefig("test.ps") nx.draw_spectral(N) plt.savefig("test.ps") nx.draw_spring(N.to_directed()) plt.savefig("test.ps") finally: try: os.unlink('test.ps') except OSError: pass networkx-1.11/networkx/drawing/tests/test_agraph.py0000644000175000017500000000361012653165361022534 0ustar aricaric00000000000000"""Unit tests for PyGraphviz intefaace. 
""" import os import tempfile from nose import SkipTest from nose.tools import assert_true,assert_equal from networkx.testing import assert_edges_equal, assert_nodes_equal import networkx as nx class TestAGraph(object): @classmethod def setupClass(cls): global pygraphviz try: import pygraphviz except ImportError: raise SkipTest('PyGraphviz not available.') def build_graph(self, G): G.add_edge('A','B') G.add_edge('A','C') G.add_edge('A','C') G.add_edge('B','C') G.add_edge('A','D') G.add_node('E') return G def assert_equal(self, G1, G2): assert_nodes_equal(G1.nodes(),G2.nodes()) assert_edges_equal(G1.edges(),G2.edges()) def agraph_checks(self, G): G = self.build_graph(G) A = nx.nx_agraph.to_agraph(G) H = nx.nx_agraph.from_agraph(A) self.assert_equal(G, H) fname=tempfile.mktemp() nx.drawing.nx_agraph.write_dot(H,fname) Hin = nx.nx_agraph.read_dot(fname) os.unlink(fname) self.assert_equal(H,Hin) (fd,fname)=tempfile.mkstemp() fh=open(fname,'w') nx.drawing.nx_agraph.write_dot(H,fh) fh.close() fh=open(fname,'r') Hin = nx.nx_agraph.read_dot(fh) fh.close() os.unlink(fname) self.assert_equal(H,Hin) def test_from_agraph_name(self): G = nx.Graph(name='test') A = nx.nx_agraph.to_agraph(G) H = nx.nx_agraph.from_agraph(A) assert_equal(G.name,'test') def testUndirected(self): self.agraph_checks(nx.Graph()) def testDirected(self): self.agraph_checks(nx.DiGraph()) def testMultiUndirected(self): self.agraph_checks(nx.MultiGraph()) def testMultiDirected(self): self.agraph_checks(nx.MultiDiGraph()) networkx-1.11/networkx/drawing/nx_agraph.py0000644000175000017500000003164112653165361021045 0ustar aricaric00000000000000""" *************** Graphviz AGraph *************** Interface to pygraphviz AGraph class. 
Examples -------- >>> G = nx.complete_graph(5) >>> A = nx.nx_agraph.to_agraph(G) >>> H = nx.nx_agraph.from_agraph(A) See Also -------- Pygraphviz: http://pygraphviz.github.io/ """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import os import sys import tempfile import networkx as nx __all__ = ['from_agraph', 'to_agraph', 'write_dot', 'read_dot', 'graphviz_layout', 'pygraphviz_layout', 'view_pygraphviz'] def from_agraph(A,create_using=None): """Return a NetworkX Graph or DiGraph from a PyGraphviz graph. Parameters ---------- A : PyGraphviz AGraph A graph created with PyGraphviz create_using : NetworkX graph class instance The output is created using the given graph class instance Examples -------- >>> K5 = nx.complete_graph(5) >>> A = nx.nx_agraph.to_agraph(K5) >>> G = nx.nx_agraph.from_agraph(A) >>> G = nx.nx_agraph.from_agraph(A) Notes ----- The Graph G will have a dictionary G.graph_attr containing the default graphviz attributes for graphs, nodes and edges. Default node attributes will be in the dictionary G.node_attr which is keyed by node. Edge attributes will be returned as edge data in G. With edge_attr=False the edge data will be the Graphviz edge weight attribute or the value 1 if no edge weight attribute is found. 
""" if create_using is None: if A.is_directed(): if A.is_strict(): create_using=nx.DiGraph() else: create_using=nx.MultiDiGraph() else: if A.is_strict(): create_using=nx.Graph() else: create_using=nx.MultiGraph() # assign defaults N=nx.empty_graph(0,create_using) N.name='' if A.name is not None: N.name=A.name # add nodes, attributes to N.node_attr for n in A.nodes(): str_attr=dict((str(k),v) for k,v in n.attr.items()) N.add_node(str(n),**str_attr) # add edges, assign edge data as dictionary of attributes for e in A.edges(): u,v=str(e[0]),str(e[1]) attr=dict(e.attr) str_attr=dict((str(k),v) for k,v in attr.items()) if not N.is_multigraph(): if e.name is not None: str_attr['key']=e.name N.add_edge(u,v,**str_attr) else: N.add_edge(u,v,key=e.name,**str_attr) # add default attributes for graph, nodes, and edges # hang them on N.graph_attr N.graph['graph']=dict(A.graph_attr) N.graph['node']=dict(A.node_attr) N.graph['edge']=dict(A.edge_attr) return N def to_agraph(N): """Return a pygraphviz graph from a NetworkX graph N. Parameters ---------- N : NetworkX graph A graph created with NetworkX Examples -------- >>> K5 = nx.complete_graph(5) >>> A = nx.nx_agraph.to_agraph(K5) Notes ----- If N has an dict N.graph_attr an attempt will be made first to copy properties attached to the graph (see from_agraph) and then updated with the calling arguments if any. 
    """
    try:
        import pygraphviz
    except ImportError:
        raise ImportError('requires pygraphviz '
                          'http://pygraphviz.github.io/')
    directed = N.is_directed()
    strict = N.number_of_selfloops() == 0 and not N.is_multigraph()
    A = pygraphviz.AGraph(name=N.name, strict=strict, directed=directed)

    # default graph attributes
    A.graph_attr.update(N.graph.get('graph', {}))
    A.node_attr.update(N.graph.get('node', {}))
    A.edge_attr.update(N.graph.get('edge', {}))

    # add nodes
    for n, nodedata in N.nodes(data=True):
        A.add_node(n, **nodedata)

    # loop over edges
    if N.is_multigraph():
        for u, v, key, edgedata in N.edges_iter(data=True, keys=True):
            str_edgedata = dict((k, str(v)) for k, v in edgedata.items())
            A.add_edge(u, v, key=str(key), **str_edgedata)
    else:
        for u, v, edgedata in N.edges_iter(data=True):
            str_edgedata = dict((k, str(v)) for k, v in edgedata.items())
            A.add_edge(u, v, **str_edgedata)

    return A


def write_dot(G, path):
    """Write NetworkX graph G to Graphviz dot format on path.

    Parameters
    ----------
    G : graph
        A networkx graph.
    path : filename
        Filename or file handle to write.
    """
    try:
        import pygraphviz
    except ImportError:
        raise ImportError('requires pygraphviz '
                          'http://pygraphviz.github.io/')
    A = to_agraph(G)
    A.write(path)
    A.clear()
    return


def read_dot(path):
    """Return a NetworkX graph from a dot file on path.

    Parameters
    ----------
    path : file or string
        File name or file handle to read.
    """
    try:
        import pygraphviz
    except ImportError:
        raise ImportError('read_dot() requires pygraphviz '
                          'http://pygraphviz.github.io/')
    A = pygraphviz.AGraph(file=path)
    return from_agraph(A)


def graphviz_layout(G, prog='neato', root=None, args=''):
    """Create node positions for G using Graphviz.

    Parameters
    ----------
    G : NetworkX graph
        A graph created with NetworkX.
    prog : string
        Name of Graphviz layout program.
    root : string, optional
        Root node for twopi layout.
    args : string, optional
        Extra arguments to Graphviz layout program.

    Returns
    -------
    dict
        Dictionary of (x, y) positions keyed by node.
    Examples
    --------
    >>> G = nx.petersen_graph()
    >>> pos = nx.nx_agraph.graphviz_layout(G)
    >>> pos = nx.nx_agraph.graphviz_layout(G, prog='dot')

    Notes
    -----
    This is a wrapper for pygraphviz_layout.
    """
    return pygraphviz_layout(G, prog=prog, root=root, args=args)


def pygraphviz_layout(G, prog='neato', root=None, args=''):
    """Create node positions for G using Graphviz.

    Parameters
    ----------
    G : NetworkX graph
        A graph created with NetworkX.
    prog : string
        Name of Graphviz layout program.
    root : string, optional
        Root node for twopi layout.
    args : string, optional
        Extra arguments to Graphviz layout program.

    Returns
    -------
    dict
        Dictionary of (x, y) positions keyed by node.

    Examples
    --------
    >>> G = nx.petersen_graph()
    >>> pos = nx.nx_agraph.graphviz_layout(G)
    >>> pos = nx.nx_agraph.graphviz_layout(G, prog='dot')
    """
    try:
        import pygraphviz
    except ImportError:
        raise ImportError('requires pygraphviz '
                          'http://pygraphviz.github.io/')
    if root is not None:
        args += " -Groot=%s" % root
    A = to_agraph(G)
    A.layout(prog=prog, args=args)
    node_pos = {}
    for n in G:
        node = pygraphviz.Node(A, n)
        try:
            xx, yy = node.attr["pos"].split(',')
            node_pos[n] = (float(xx), float(yy))
        except Exception:
            print("no position for node", n)
            node_pos[n] = (0.0, 0.0)
    return node_pos


@nx.utils.open_file(5, 'w')
def view_pygraphviz(G, edgelabel=None, prog='dot', args='',
                    suffix='', path=None):
    """Views the graph G using the specified layout algorithm.

    Parameters
    ----------
    G : NetworkX graph
        The graph to draw.
    edgelabel : str, callable, None
        If a string, then it specifies the edge attribute to be displayed
        on the edge labels.  If a callable, then it is called for each
        edge and it should return the string to be displayed on the edges.
        The function signature of `edgelabel` should be edgelabel(data),
        where `data` is the edge attribute dictionary.
    prog : string
        Name of Graphviz layout program.
    args : str
        Additional arguments to pass to the Graphviz layout program.
    suffix : str
        If `path` is None, we save to a temporary file.
The value of `suffix` will appear at the tail end of the temporary filename. path : str, None The filename used to save the image. If None, save to a temporary file. File formats are the same as those from pygraphviz.agraph.draw. Returns ------- path : str The filename of the generated image. A : PyGraphviz graph The PyGraphviz graph instance used to generate the image. Notes ----- If this function is called in succession too quickly, sometimes the image is not displayed. So you might consider time.sleep(.5) between calls if you experience problems. """ if not len(G): raise nx.NetworkXException("An empty graph cannot be drawn.") import pygraphviz # If we are providing default values for graphviz, these must be set # before any nodes or edges are added to the PyGraphviz graph object. # The reason for this is that default values only affect incoming objects. # If you change the default values after the objects have been added, # then they inherit no value and are set only if explicitly set. # to_agraph() uses these values. attrs = ['edge', 'node', 'graph'] for attr in attrs: if attr not in G.graph: G.graph[attr] = {} # These are the default values. edge_attrs = {'fontsize': '10'} node_attrs = {'style': 'filled', 'fillcolor': '#0000FF40', 'height': '0.75', 'width': '0.75', 'shape': 'circle'} graph_attrs = {} def update_attrs(which, attrs): # Update graph attributes. Return list of those which were added. added = [] for k,v in attrs.items(): if k not in G.graph[which]: G.graph[which][k] = v added.append(k) def clean_attrs(which, added): # Remove added attributes for attr in added: del G.graph[which][attr] if not G.graph[which]: del G.graph[which] # Update all default values update_attrs('edge', edge_attrs) update_attrs('node', node_attrs) update_attrs('graph', graph_attrs) # Convert to agraph, so we inherit default values A = to_agraph(G) # Remove the default values we added to the original graph. 
clean_attrs('edge', edge_attrs) clean_attrs('node', node_attrs) clean_attrs('graph', graph_attrs) # If the user passed in an edgelabel, we update the labels for all edges. if edgelabel is not None: if not hasattr(edgelabel, '__call__'): def func(data): return ''.join([" ", str(data[edgelabel]), " "]) else: func = edgelabel # update all the edge labels if G.is_multigraph(): for u,v,key,data in G.edges_iter(keys=True, data=True): # PyGraphviz doesn't convert the key to a string. See #339 edge = A.get_edge(u,v,str(key)) edge.attr['label'] = str(func(data)) else: for u,v,data in G.edges_iter(data=True): edge = A.get_edge(u,v) edge.attr['label'] = str(func(data)) if path is None: ext = 'png' if suffix: suffix = '_%s.%s' % (suffix, ext) else: suffix = '.%s' % (ext,) path = tempfile.NamedTemporaryFile(suffix=suffix, delete=False) else: # Assume the decorator worked and it is a file-object. pass display_pygraphviz(A, path=path, prog=prog, args=args) return path.name, A def display_pygraphviz(graph, path, format=None, prog=None, args=''): """Internal function to display a graph in OS dependent manner. Parameters ---------- graph : PyGraphviz graph A PyGraphviz AGraph instance. path : file object An already opened file object that will be closed. format : str, None An attempt is made to guess the output format based on the extension of the filename. If that fails, the value of `format` is used. prog : string Name of Graphviz layout program. args : str Additional arguments to pass to the Graphviz layout program. Notes ----- If this function is called in succession too quickly, sometimes the image is not displayed. So you might consider time.sleep(.5) between calls if you experience problems. """ if format is None: filename = path.name format = os.path.splitext(filename)[1].lower()[1:] if not format: # Let the draw() function use its default format = None # Save to a file and display in the default viewer. # We must close the file before viewing it. 
graph.draw(path, format, prog, args) path.close() nx.utils.default_opener(filename) # fixture for nose tests def setup_module(module): from nose import SkipTest try: import pygraphviz except: raise SkipTest("pygraphviz not available") networkx-1.11/networkx/utils/0000755000175000017500000000000012653231454016221 5ustar aricaric00000000000000networkx-1.11/networkx/utils/heaps.py0000644000175000017500000002554712637544450017715 0ustar aricaric00000000000000""" Min-heaps. """ __author__ = """ysitu """ # Copyright (C) 2014 ysitu # All rights reserved. # BSD license. from heapq import heappop, heappush from itertools import count import networkx as nx __all__ = ['MinHeap', 'PairingHeap', 'BinaryHeap'] class MinHeap(object): """Base class for min-heaps. A MinHeap stores a collection of key-value pairs ordered by their values. It supports querying the minimum pair, inserting a new pair, decreasing the value in an existing pair and deleting the minimum pair. """ class _Item(object): """Used by subclassess to represent a key-value pair. """ __slots__ = ('key', 'value') def __init__(self, key, value): self.key = key self.value = value def __repr__(self): return repr((self.key, self.value)) def __init__(self): """Initialize a new min-heap. """ self._dict = {} def min(self): """Query the minimum key-value pair. Returns ------- key, value : tuple The key-value pair with the minimum value in the heap. Raises ------ NetworkXError If the heap is empty. """ raise NotImplementedError def pop(self): """Delete the minimum pair in the heap. Returns ------- key, value : tuple The key-value pair with the minimum value in the heap. Raises ------ NetworkXError If the heap is empty. """ raise NotImplementedError def get(self, key, default=None): """Return the value associated with a key. Parameters ---------- key : hashable object The key to be looked up. default : object Default value to return if the key is not present in the heap. Default value: None. Returns ------- value : object. 
            The value associated with the key.
        """
        raise NotImplementedError

    def insert(self, key, value, allow_increase=False):
        """Insert a new key-value pair or modify the value in an existing
        pair.

        Parameters
        ----------
        key : hashable object
            The key.

        value : object comparable with existing values.
            The value.

        allow_increase : bool
            Whether the value is allowed to increase. If False, attempts to
            increase an existing value have no effect. Default value: False.

        Returns
        -------
        decreased : bool
            True if a pair is inserted or the existing value is decreased.
        """
        raise NotImplementedError

    def __nonzero__(self):
        """Return whether the heap is empty.
        """
        return bool(self._dict)

    def __bool__(self):
        """Return whether the heap is empty.
        """
        return bool(self._dict)

    def __len__(self):
        """Return the number of key-value pairs in the heap.
        """
        return len(self._dict)

    def __contains__(self, key):
        """Return whether a key exists in the heap.

        Parameters
        ----------
        key : any hashable object.
            The key to be looked up.
        """
        return key in self._dict


def _inherit_doc(cls):
    """Decorator for inheriting docstrings from base classes.
    """
    def func(fn):
        fn.__doc__ = cls.__dict__[fn.__name__].__doc__
        return fn
    return func


class PairingHeap(MinHeap):
    """A pairing heap.
    """

    class _Node(MinHeap._Item):
        """A node in a pairing heap.

        A tree in a pairing heap is stored using the left-child, right-sibling
        representation.
        """
        __slots__ = ('left', 'next', 'prev', 'parent')

        def __init__(self, key, value):
            super(PairingHeap._Node, self).__init__(key, value)
            # The leftmost child.
            self.left = None
            # The next sibling.
            self.next = None
            # The previous sibling.
            self.prev = None
            # The parent.
            self.parent = None

    def __init__(self):
        """Initialize a pairing heap.
""" super(PairingHeap, self).__init__() self._root = None @_inherit_doc(MinHeap) def min(self): if self._root is None: raise nx.NetworkXError('heap is empty.') return (self._root.key, self._root.value) @_inherit_doc(MinHeap) def pop(self): if self._root is None: raise nx.NetworkXError('heap is empty.') min_node = self._root self._root = self._merge_children(self._root) del self._dict[min_node.key] return (min_node.key, min_node.value) @_inherit_doc(MinHeap) def get(self, key, default=None): node = self._dict.get(key) return node.value if node is not None else default @_inherit_doc(MinHeap) def insert(self, key, value, allow_increase=False): node = self._dict.get(key) root = self._root if node is not None: if value < node.value: node.value = value if node is not root and value < node.parent.value: self._cut(node) self._root = self._link(root, node) return True elif allow_increase and value > node.value: node.value = value child = self._merge_children(node) # Nonstandard step: Link the merged subtree with the root. See # below for the standard step. if child is not None: self._root = self._link(self._root, child) # Standard step: Perform a decrease followed by a pop as if the # value were the smallest in the heap. Then insert the new # value into the heap. # if node is not root: # self._cut(node) # if child is not None: # root = self._link(root, child) # self._root = self._link(root, node) # else: # self._root = (self._link(node, child) # if child is not None else node) return False else: # Insert a new key. node = self._Node(key, value) self._dict[key] = node self._root = self._link(root, node) if root is not None else node return True def _link(self, root, other): """Link two nodes, making the one with the smaller value the parent of the other. 
""" if other.value < root.value: root, other = other, root next = root.left other.next = next if next is not None: next.prev = other other.prev = None root.left = other other.parent = root return root def _merge_children(self, root): """Merge the subtrees of the root using the standard two-pass method. The resulting subtree is detached from the root. """ node = root.left root.left = None if node is not None: link = self._link # Pass 1: Merge pairs of consecutive subtrees from left to right. # At the end of the pass, only the prev pointers of the resulting # subtrees have meaningful values. The other pointers will be fixed # in pass 2. prev = None while True: next = node.next if next is None: node.prev = prev break next_next = next.next node = link(node, next) node.prev = prev prev = node if next_next is None: break node = next_next # Pass 2: Successively merge the subtrees produced by pass 1 from # right to left with the rightmost one. prev = node.prev while prev is not None: prev_prev = prev.prev node = link(prev, node) prev = prev_prev # Now node can become the new root. Its has no parent nor siblings. node.prev = None node.next = None node.parent = None return node def _cut(self, node): """Cut a node from its parent. """ prev = node.prev next = node.next if prev is not None: prev.next = next else: node.parent.left = next node.prev = None if next is not None: next.prev = prev node.next = None node.parent = None class BinaryHeap(MinHeap): """A binary heap. """ def __init__(self): """Initialize a binary heap. """ super(BinaryHeap, self).__init__() self._heap = [] self._count = count() @_inherit_doc(MinHeap) def min(self): dict = self._dict if not dict: raise nx.NetworkXError('heap is empty') heap = self._heap pop = heappop # Repeatedly remove stale key-value pairs until a up-to-date one is # met. 
while True: value, _, key = heap[0] if key in dict and value == dict[key]: break pop(heap) return (key, value) @_inherit_doc(MinHeap) def pop(self): dict = self._dict if not dict: raise nx.NetworkXError('heap is empty') heap = self._heap pop = heappop # Repeatedly remove stale key-value pairs until a up-to-date one is # met. while True: value, _, key = heap[0] pop(heap) if key in dict and value == dict[key]: break del dict[key] return (key, value) @_inherit_doc(MinHeap) def get(self, key, default=None): return self._dict.get(key, default) @_inherit_doc(MinHeap) def insert(self, key, value, allow_increase=False): dict = self._dict if key in dict: old_value = dict[key] if value < old_value or (allow_increase and value > old_value): # Since there is no way to efficiently obtain the location of a # key-value pair in the heap, insert a new pair even if ones # with the same key may already be present. Deem the old ones # as stale and skip them when the minimum pair is queried. dict[key] = value heappush(self._heap, (value, next(self._count), key)) return value < old_value return False else: dict[key] = value heappush(self._heap, (value, next(self._count), key)) return True networkx-1.11/networkx/utils/__init__.py0000644000175000017500000000042012637544450020333 0ustar aricaric00000000000000from networkx.utils.misc import * from networkx.utils.decorators import * from networkx.utils.random_sequence import * from networkx.utils.union_find import * from networkx.utils.rcm import * from networkx.utils.heaps import * from networkx.utils.contextmanagers import * networkx-1.11/networkx/utils/rcm.py0000644000175000017500000001141112637544500017353 0ustar aricaric00000000000000""" Cuthill-McKee ordering of graph nodes to produce sparse matrices """ # Copyright (C) 2011-2014 by # Aric Hagberg # All rights reserved. # BSD license. 
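The `BinaryHeap` in heaps.py above sidesteps decrease-key by pushing a fresh `(value, counter, key)` triple into the `heapq` list and treating entries that no longer match the backing dict as stale, skipping them on `min`/`pop`. A minimal standalone sketch of that lazy-deletion idea follows; the class and method names here are illustrative, not part of the library.

```python
import heapq
from itertools import count


class LazyMinHeap(object):
    """Dict-backed min-heap: stale heap entries are skipped on pop."""

    def __init__(self):
        self._dict = {}        # key -> current value (the source of truth)
        self._heap = []        # (value, tiebreak, key) entries, possibly stale
        self._count = count()  # monotonic tiebreak for equal values

    def insert(self, key, value):
        # Push unconditionally when the value improves; the old heap
        # entry for this key becomes stale rather than being removed.
        if key not in self._dict or value < self._dict[key]:
            self._dict[key] = value
            heapq.heappush(self._heap, (value, next(self._count), key))
            return True
        return False

    def pop(self):
        # Discard entries that no longer match the dict before returning.
        while True:
            value, _, key = heapq.heappop(self._heap)
            if key in self._dict and value == self._dict[key]:
                del self._dict[key]
                return key, value


h = LazyMinHeap()
h.insert('a', 5)
h.insert('b', 3)
h.insert('a', 1)   # decrease-key: the (5, _, 'a') entry is now stale
print(h.pop())     # ('a', 1)
print(h.pop())     # ('b', 3)
```

The trade-off is memory for simplicity: stale entries linger in the list until they surface, but every operation stays O(log n) amortized without any index-tracking bookkeeping.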
from collections import deque
from operator import itemgetter

import networkx as nx

__author__ = """\n""".join(['Aric Hagberg '])
__all__ = ['cuthill_mckee_ordering',
           'reverse_cuthill_mckee_ordering']


def cuthill_mckee_ordering(G, heuristic=None):
    """Generate an ordering (permutation) of the graph nodes to make
    a sparse matrix.

    Uses the Cuthill-McKee heuristic (based on breadth-first search) [1]_.

    Parameters
    ----------
    G : graph
        A NetworkX graph

    heuristic : function, optional
        Function to choose starting node for RCM algorithm.  If None
        a node from a pseudo-peripheral pair is used.  A user-defined
        function can be supplied that takes a graph object and returns
        a single node.

    Returns
    -------
    nodes : generator
        Generator of nodes in Cuthill-McKee ordering.

    Examples
    --------
    >>> from networkx.utils import cuthill_mckee_ordering
    >>> G = nx.path_graph(4)
    >>> rcm = list(cuthill_mckee_ordering(G))
    >>> A = nx.adjacency_matrix(G, nodelist=rcm) # doctest: +SKIP

    Smallest degree node as heuristic function:

    >>> def smallest_degree(G):
    ...     return min(G, key=G.degree)
    >>> rcm = list(cuthill_mckee_ordering(G, heuristic=smallest_degree))

    See Also
    --------
    reverse_cuthill_mckee_ordering

    Notes
    -----
    The optimal solution to the bandwidth reduction problem is
    NP-complete [2]_.

    References
    ----------
    .. [1] E. Cuthill and J. McKee.
       Reducing the bandwidth of sparse symmetric matrices,
       In Proc. 24th Nat. Conf. ACM, pages 157-172, 1969.
       http://doi.acm.org/10.1145/800195.805928
    .. [2] Steven S. Skiena. 1997. The Algorithm Design Manual.
       Springer-Verlag New York, Inc., New York, NY, USA.
    """
    for c in nx.connected_components(G):
        for n in connected_cuthill_mckee_ordering(G.subgraph(c), heuristic):
            yield n


def reverse_cuthill_mckee_ordering(G, heuristic=None):
    """Generate an ordering (permutation) of the graph nodes to make
    a sparse matrix.

    Uses the reverse Cuthill-McKee heuristic (based on breadth-first
    search) [1]_.
    Parameters
    ----------
    G : graph
        A NetworkX graph

    heuristic : function, optional
        Function to choose starting node for RCM algorithm.  If None
        a node from a pseudo-peripheral pair is used.  A user-defined
        function can be supplied that takes a graph object and returns
        a single node.

    Returns
    -------
    nodes : generator
        Generator of nodes in reverse Cuthill-McKee ordering.

    Examples
    --------
    >>> from networkx.utils import reverse_cuthill_mckee_ordering
    >>> G = nx.path_graph(4)
    >>> rcm = list(reverse_cuthill_mckee_ordering(G))
    >>> A = nx.adjacency_matrix(G, nodelist=rcm) # doctest: +SKIP

    Smallest degree node as heuristic function:

    >>> def smallest_degree(G):
    ...     return min(G, key=G.degree)
    >>> rcm = list(reverse_cuthill_mckee_ordering(G, heuristic=smallest_degree))

    See Also
    --------
    cuthill_mckee_ordering

    Notes
    -----
    The optimal solution to the bandwidth reduction problem is
    NP-complete [2]_.

    References
    ----------
    .. [1] E. Cuthill and J. McKee.
       Reducing the bandwidth of sparse symmetric matrices,
       In Proc. 24th Nat. Conf. ACM, pages 157-172, 1969.
       http://doi.acm.org/10.1145/800195.805928
    .. [2] Steven S. Skiena. 1997. The Algorithm Design Manual.
       Springer-Verlag New York, Inc., New York, NY, USA.
    """
    return reversed(list(cuthill_mckee_ordering(G, heuristic=heuristic)))


def connected_cuthill_mckee_ordering(G, heuristic=None):
    # the cuthill mckee algorithm for connected graphs
    if heuristic is None:
        start = pseudo_peripheral_node(G)
    else:
        start = heuristic(G)
    visited = {start}
    queue = deque([start])
    while queue:
        parent = queue.popleft()
        yield parent
        nd = sorted(G.degree(set(G[parent]) - visited).items(),
                    key=itemgetter(1))
        children = [n for n, d in nd]
        visited.update(children)
        queue.extend(children)


def pseudo_peripheral_node(G):
    # helper for cuthill-mckee to find a node in a "pseudo peripheral pair"
    # to use as good starting node
    u = next(G.nodes_iter())
    lp = 0
    v = u
    while True:
        spl = nx.shortest_path_length(G, v)
        l = max(spl.values())
        if l <= lp:
            break
        lp = l
        farthest = (n for n, dist in spl.items() if dist == l)
        v, deg = min(G.degree(farthest).items(), key=itemgetter(1))
    return v
networkx-1.11/networkx/utils/tests/0000755000175000017500000000000012653231455017364 5ustar aricaric00000000000000networkx-1.11/networkx/utils/tests/test_unionfind.py0000644000175000017500000000064712637544450023001 0ustar aricaric00000000000000from nose.tools import *
import networkx as nx


def test_unionfind():
    # Fixed by: 2cddd5958689bdecdcd89b91ac9aaf6ce0e4f6b8
    # Previously (in 2.x), the UnionFind class could handle mixed types.
    # But in Python 3.x, this causes a TypeError such as:
    # TypeError: unorderable types: str() > int()
    #
    # Now we just make sure that no exception is raised.
x = nx.utils.UnionFind() x.union(0, 'a') networkx-1.11/networkx/utils/tests/test_contextmanager.py0000644000175000017500000000060312637544450024017 0ustar aricaric00000000000000from __future__ import absolute_import from nose.tools import * import networkx as nx def test_reversed(): G = nx.DiGraph() G.add_edge('A', 'B') # no exception with nx.utils.reversed(G): pass assert_true('B' in G['A']) # exception try: with nx.utils.reversed(G): raise Exception except: assert_true('B' in G['A']) networkx-1.11/networkx/utils/tests/test.txt0000644000175000017500000000002612637544450021106 0ustar aricaric00000000000000Blah... BLAH BLAH!!!! networkx-1.11/networkx/utils/tests/test_heaps.py0000644000175000017500000000761312637544450022110 0ustar aricaric00000000000000from nose.tools import * import networkx as nx from networkx.utils import * class X(object): def __eq__(self, other): raise self is other def __ne__(self, other): raise self is not other def __lt__(self, other): raise TypeError('cannot compare') def __le__(self, other): raise TypeError('cannot compare') def __ge__(self, other): raise TypeError('cannot compare') def __gt__(self, other): raise TypeError('cannot compare') def __hash__(self): return hash(id(self)) x = X() data = [# min should not invent an element. ('min', nx.NetworkXError), # Popping an empty heap should fail. ('pop', nx.NetworkXError), # Getting nonexisting elements should return None. ('get', 0, None), ('get', x, None), ('get', None, None), # Inserting a new key should succeed. ('insert', x, 1, True), ('get', x, 1), ('min', (x, 1)), # min should not pop the top element. ('min', (x, 1)), # Inserting a new key of different type should succeed. ('insert', 1, -2.0, True), # int and float values should interop. ('min', (1, -2.0)), # pop removes minimum-valued element. ('insert', 3, -10 ** 100, True), ('insert', 4, 5, True), ('pop', (3, -10 ** 100)), ('pop', (1, -2.0)), # Decrease-insert should succeed. 
('insert', 4, -50, True), ('insert', 4, -60, False, True), # Decrease-insert should not create duplicate keys. ('pop', (4, -60)), ('pop', (x, 1)), # Popping all elements should empty the heap. ('min', nx.NetworkXError), ('pop', nx.NetworkXError), # Non-value-changing insert should fail. ('insert', x, 0, True), ('insert', x, 0, False, False), ('min', (x, 0)), ('insert', x, 0, True, False), ('min', (x, 0)), # Failed insert should not create duplicate keys. ('pop', (x, 0)), ('pop', nx.NetworkXError), # Increase-insert should succeed when allowed. ('insert', None, 0, True), ('insert', 2, -1, True), ('min', (2, -1)), ('insert', 2, 1, True, False), ('min', (None, 0)), # Increase-insert should fail when disallowed. ('insert', None, 2, False, False), ('min', (None, 0)), # Failed increase-insert should not create duplicate keys. ('pop', (None, 0)), ('pop', (2, 1)), ('min', nx.NetworkXError), ('pop', nx.NetworkXError)] def _test_heap_class(cls, *args, **kwargs): heap = cls(*args, **kwargs) # Basic behavioral test for op in data: if op[-1] is not nx.NetworkXError: assert_equal(op[-1], getattr(heap, op[0])(*op[1:-1])) else: assert_raises(op[-1], getattr(heap, op[0]), *op[1:-1]) # Coverage test. 
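Each tuple in the `data` list above encodes one heap operation: the first element is a method name, the middle elements are arguments, and the last is either the expected return value or an exception class. The driver dispatches with `getattr(heap, op[0])(*op[1:-1])`. A self-contained illustration of that table-driven dispatch pattern, run here against a plain list instead of a heap so it needs no networkx:

```python
# Operation table: (method_name, *args, expected_result_or_exception).
ops = [
    ('append', 1, None),       # list.append returns None
    ('append', 2, None),
    ('pop', 2),                # pop() returns the last element
    ('index', 5, ValueError),  # expected failure: 5 is not present
]

target = []
for op in ops:
    name, args, expected = op[0], op[1:-1], op[-1]
    if isinstance(expected, type) and issubclass(expected, Exception):
        # The last element names an exception class: the call must raise it.
        try:
            getattr(target, name)(*args)
            raise AssertionError('expected %s' % expected.__name__)
        except expected:
            pass
    else:
        # Otherwise the call must return exactly the expected value.
        result = getattr(target, name)(*args)
        assert result == expected, (op, result)
print(target)  # [1]
```

Keeping the expectations as data makes it cheap to run the same behavioral script against every heap implementation, which is exactly how `_test_heap_class` exercises both `PairingHeap` and `BinaryHeap`.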
for i in range(99, -1, -1): assert_true(heap.insert(i, i)) for i in range(50): assert_equal(heap.pop(), (i, i)) for i in range(100): assert_equal(heap.insert(i, i), i < 50) for i in range(100): assert_false(heap.insert(i, i + 1)) for i in range(50): assert_equal(heap.pop(), (i, i)) for i in range(100): assert_equal(heap.insert(i, i + 1), i < 50) for i in range(49): assert_equal(heap.pop(), (i, i + 1)) assert_equal(sorted([heap.pop(), heap.pop()]), [(49, 50), (50, 50)]) for i in range(51, 100): assert_false(heap.insert(i, i + 1, True)) for i in range(51, 70): assert_equal(heap.pop(), (i, i + 1)) for i in range(100): assert_true(heap.insert(i, i)) for i in range(100): assert_equal(heap.pop(), (i, i)) assert_raises(nx.NetworkXError, heap.pop) def test_PairingHeap(): _test_heap_class(PairingHeap) def test_BinaryHeap(): _test_heap_class(BinaryHeap) networkx-1.11/networkx/utils/tests/test_random_sequence.py0000644000175000017500000000175512637544450024161 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from networkx.utils import uniform_sequence,powerlaw_sequence,\ create_degree_sequence,zipf_rv,zipf_sequence,random_weighted_sample,\ weighted_choice import networkx.utils def test_degree_sequences(): seq=create_degree_sequence(10,uniform_sequence) assert_equal(len(seq), 10) seq=create_degree_sequence(10,powerlaw_sequence) assert_equal(len(seq), 10) def test_zipf_rv(): r = zipf_rv(2.3) assert_true(type(r),int) assert_raises(ValueError,zipf_rv,0.5) assert_raises(ValueError,zipf_rv,2,xmin=0) def test_zipf_sequence(): s = zipf_sequence(10) assert_equal(len(s),10) def test_random_weighted_sample(): mapping={'a':10,'b':20} s = random_weighted_sample(mapping,2) assert_equal(sorted(s),sorted(mapping.keys())) assert_raises(ValueError,random_weighted_sample,mapping,3) def test_random_weighted_choice(): mapping={'a':10,'b':0} c = weighted_choice(mapping) assert_equal(c,'a') 
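`test_random_weighted_choice` above depends on a zero-weight item never being selected. A cumulative-weight implementation with that property can be sketched as follows; this is only an illustration of the technique, not the library's own code (networkx's `weighted_choice` lives in `networkx/utils/random_sequence.py`).

```python
import random


def weighted_choice(mapping):
    """Pick a key with probability proportional to its weight."""
    total = sum(mapping.values())
    # random.random() is in [0, 1), so r < total always holds and a
    # zero-weight item can never absorb any of the probability mass.
    r = random.random() * total
    upto = 0
    for key, weight in mapping.items():
        upto += weight
        if r < upto:
            return key


print(weighted_choice({'a': 10, 'b': 0}))  # always 'a'
```

Because `r` is drawn from the half-open interval `[0, total)`, the loop is guaranteed to return before running off the end, and any key with weight 0 contributes nothing to `upto`, so it can never satisfy `r < upto` first.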
networkx-1.11/networkx/utils/tests/test_decorators.py0000644000175000017500000000774212637544450023160 0ustar aricaric00000000000000import tempfile import os from nose.tools import * import networkx as nx from networkx.utils.decorators import open_file,not_implemented_for def test_not_implemented_decorator(): @not_implemented_for('directed') def test1(G): pass test1(nx.Graph()) @raises(KeyError) def test_not_implemented_decorator_key(): @not_implemented_for('foo') def test1(G): pass test1(nx.Graph()) @raises(nx.NetworkXNotImplemented) def test_not_implemented_decorator_raise(): @not_implemented_for('graph') def test1(G): pass test1(nx.Graph()) class TestOpenFileDecorator(object): def setUp(self): self.text = ['Blah... ', 'BLAH ', 'BLAH!!!!'] self.fobj = tempfile.NamedTemporaryFile('wb+', delete=False) self.name = self.fobj.name def write(self, path): for text in self.text: path.write(text.encode('ascii')) @open_file(1, 'r') def read(self, path): return path.readlines()[0] @staticmethod @open_file(0, 'wb') def writer_arg0(path): path.write('demo'.encode('ascii')) @open_file(1, 'wb+') def writer_arg1(self, path): self.write(path) @open_file(2, 'wb') def writer_arg2default(self, x, path=None): if path is None: fh = tempfile.NamedTemporaryFile('wb+', delete=False) close_fh = True else: fh = path close_fh = False try: self.write(fh) finally: if close_fh: fh.close() @open_file(4, 'wb') def writer_arg4default(self, x, y, other='hello', path=None, **kwargs): if path is None: fh = tempfile.NamedTemporaryFile('wb+', delete=False) close_fh = True else: fh = path close_fh = False try: self.write(fh) finally: if close_fh: fh.close() @open_file('path', 'wb') def writer_kwarg(self, **kwargs): path = kwargs.get('path', None) if path is None: fh = tempfile.NamedTemporaryFile('wb+', delete=False) close_fh = True else: fh = path close_fh = False try: self.write(fh) finally: if close_fh: fh.close() def test_writer_arg0_str(self): self.writer_arg0(self.name) def 
test_writer_arg0_fobj(self): self.writer_arg0(self.fobj) def test_writer_arg1_str(self): self.writer_arg1(self.name) assert_equal( self.read(self.name), ''.join(self.text) ) def test_writer_arg1_fobj(self): self.writer_arg1(self.fobj) assert_false(self.fobj.closed) self.fobj.close() assert_equal( self.read(self.name), ''.join(self.text) ) def test_writer_arg2default_str(self): self.writer_arg2default(0, path=None) self.writer_arg2default(0, path=self.name) assert_equal( self.read(self.name), ''.join(self.text) ) def test_writer_arg2default_fobj(self): self.writer_arg2default(0, path=self.fobj) assert_false(self.fobj.closed) self.fobj.close() assert_equal( self.read(self.name), ''.join(self.text) ) def test_writer_arg2default_fobj(self): self.writer_arg2default(0, path=None) def test_writer_arg4default_fobj(self): self.writer_arg4default(0, 1, dog='dog', other='other2') self.writer_arg4default(0, 1, dog='dog', other='other2', path=self.name) assert_equal( self.read(self.name), ''.join(self.text) ) def test_writer_kwarg_str(self): self.writer_kwarg(path=self.name) assert_equal( self.read(self.name), ''.join(self.text) ) def test_writer_kwarg_fobj(self): self.writer_kwarg(path=self.fobj) self.fobj.close() assert_equal( self.read(self.name), ''.join(self.text) ) def test_writer_kwarg_fobj(self): self.writer_kwarg(path=None) def tearDown(self): self.fobj.close() networkx-1.11/networkx/utils/tests/test_rcm.py0000644000175000017500000000243112637544500021556 0ustar aricaric00000000000000from nose.tools import * from networkx.utils import reverse_cuthill_mckee_ordering import networkx as nx def test_reverse_cuthill_mckee(): # example graph from # http://www.boost.org/doc/libs/1_37_0/libs/graph/example/cuthill_mckee_ordering.cpp G = nx.Graph([(0, 3), (0, 5), (1, 2), (1, 4), (1, 6), (1, 9), (2, 3), (2, 4), (3, 5), (3, 8), (4, 6), (5, 6), (5, 7), (6, 7)]) rcm = list(reverse_cuthill_mckee_ordering(G)) assert_true(rcm in [[0, 8, 5, 7, 3, 6, 2, 4, 1, 9], [0, 8, 5, 7, 3, 6, 4, 2, 
1, 9]]) def test_rcm_alternate_heuristic(): # example from G = nx.Graph([(0, 0), (0, 4), (1, 1), (1, 2), (1, 5), (1, 7), (2, 2), (2, 4), (3, 3), (3, 6), (4, 4), (5, 5), (5, 7), (6, 6), (7, 7)]) answers = [[6, 3, 5, 7, 1, 2, 4, 0], [6, 3, 7, 5, 1, 2, 4, 0]] def smallest_degree(G): node, deg = min(G.degree().items(), key=lambda x: x[1]) return node rcm = list(reverse_cuthill_mckee_ordering(G, heuristic=smallest_degree)) assert_true(rcm in answers) networkx-1.11/networkx/utils/tests/test_misc.py0000644000175000017500000000665712637544500021746 0ustar aricaric00000000000000# -*- encoding: utf-8 -*- from nose.tools import * from nose import SkipTest import networkx as nx from networkx.utils import * def test_is_string_like(): assert_true(is_string_like("aaaa")) assert_false(is_string_like(None)) assert_false(is_string_like(123)) def test_iterable(): assert_false(iterable(None)) assert_false(iterable(10)) assert_true(iterable([1,2,3])) assert_true(iterable((1,2,3))) assert_true(iterable({1:"A",2:"X"})) assert_true(iterable("ABC")) def test_graph_iterable(): K=nx.complete_graph(10) assert_true(iterable(K)) assert_true(iterable(K.nodes_iter())) assert_true(iterable(K.edges_iter())) def test_is_list_of_ints(): assert_true(is_list_of_ints([1,2,3,42])) assert_false(is_list_of_ints([1,2,3,"kermit"])) def test_random_number_distribution(): # smoke test only z=uniform_sequence(20) z=powerlaw_sequence(20,exponent=2.5) z=pareto_sequence(20,exponent=1.5) z=discrete_sequence(20,distribution=[0,0,0,0,1,1,1,1,2,2,3]) def test_make_str_with_bytes(): import sys PY2 = sys.version_info[0] == 2 x = "qualité" y = make_str(x) if PY2: assert_true(isinstance(y, unicode)) # Since file encoding is utf-8, the é will be two bytes. 
assert_true(len(y) == 8) else: assert_true(isinstance(y, str)) assert_true(len(y) == 7) def test_make_str_with_unicode(): import sys PY2 = sys.version_info[0] == 2 if PY2: x = unicode("qualité", encoding='utf-8') y = make_str(x) assert_true(isinstance(y, unicode)) assert_true(len(y) == 7) else: x = "qualité" y = make_str(x) assert_true(isinstance(y, str)) assert_true(len(y) == 7) class TestNumpyArray(object): @classmethod def setupClass(cls): global numpy global assert_allclose try: import numpy from numpy.testing import assert_allclose except ImportError: raise SkipTest('NumPy not available.') def test_dict_to_numpy_array1(self): d = {'a':1,'b':2} a = dict_to_numpy_array1(d, mapping={'a':0, 'b':1}) assert_allclose(a, numpy.array([1,2])) a = dict_to_numpy_array1(d, mapping={'b':0, 'a':1}) assert_allclose(a, numpy.array([2,1])) a = dict_to_numpy_array1(d) assert_allclose(a.sum(), 3) def test_dict_to_numpy_array2(self): d = {'a': {'a':1,'b':2}, 'b': {'a':10,'b':20}} mapping = {'a':1, 'b': 0} a = dict_to_numpy_array2(d, mapping=mapping) assert_allclose(a, numpy.array([[20,10],[2,1]])) a = dict_to_numpy_array2(d) assert_allclose(a.sum(), 33) def test_dict_to_numpy_array_a(self): d = {'a': {'a':1,'b':2}, 'b': {'a':10,'b':20}} mapping = {'a':0, 'b': 1} a = dict_to_numpy_array(d, mapping=mapping) assert_allclose(a, numpy.array([[1,2],[10,20]])) mapping = {'a':1, 'b': 0} a = dict_to_numpy_array(d, mapping=mapping) assert_allclose(a, numpy.array([[20,10],[2,1]])) a = dict_to_numpy_array2(d) assert_allclose(a.sum(), 33) def test_dict_to_numpy_array_b(self): d = {'a':1,'b':2} mapping = {'a': 0, 'b': 1} a = dict_to_numpy_array(d, mapping=mapping) assert_allclose(a, numpy.array([1,2])) a = dict_to_numpy_array1(d) assert_allclose(a.sum(), 3) networkx-1.11/networkx/utils/union_find.py0000644000175000017500000000473312637544450020737 0ustar aricaric00000000000000""" Union-find data structure. 
""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. class UnionFind: """Union-find data structure. Each unionFind instance X maintains a family of disjoint sets of hashable objects, supporting the following two methods: - X[item] returns a name for the set containing the given item. Each set is named by an arbitrarily-chosen one of its members; as long as the set remains unchanged it will keep the same name. If the item is not yet part of a set in X, a new singleton set is created for it. - X.union(item1, item2, ...) merges the sets containing each item into a single larger set. If any item is not yet part of a set in X, it is added to X as one of the members of the merged set. Union-find data structure. Based on Josiah Carlson's code, http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/215912 with significant additional changes by D. Eppstein. http://www.ics.uci.edu/~eppstein/PADS/UnionFind.py """ def __init__(self): """Create a new empty union-find structure.""" self.weights = {} self.parents = {} def __getitem__(self, object): """Find and return the name of the set containing the object.""" # check for previously unknown object if object not in self.parents: self.parents[object] = object self.weights[object] = 1 return object # find path of objects leading to the root path = [object] root = self.parents[object] while root != path[-1]: path.append(root) root = self.parents[root] # compress the path and return for ancestor in path: self.parents[ancestor] = root return root def __iter__(self): """Iterate through all items ever found or unioned by this structure. """ return iter(self.parents) def union(self, *objects): """Find the sets containing the objects and merge them all.""" roots = [self[x] for x in objects] # Find the heaviest root according to its weight. 
heaviest = max(roots, key=lambda r: self.weights[r]) for r in roots: if r != heaviest: self.weights[heaviest] += self.weights[r] self.parents[r] = heaviest networkx-1.11/networkx/utils/misc.py0000644000175000017500000001211112637544500017523 0ustar aricaric00000000000000""" Miscellaneous Helpers for NetworkX. These are not imported into the base networkx namespace but can be accessed, for example, as >>> import networkx >>> networkx.utils.is_string_like('spam') True """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import sys import uuid # itertools.accumulate is only available on Python 3.2 or later. # # Once support for Python versions less than 3.2 is dropped, this code should # be removed. try: from itertools import accumulate except ImportError: import operator # The code for this function is from the Python 3.5 documentation, # distributed under the PSF license: # def accumulate(iterable, func=operator.add): it = iter(iterable) try: total = next(it) except StopIteration: return yield total for element in it: total = func(total, element) yield total import networkx as nx __author__ = '\n'.join(['Aric Hagberg (hagberg@lanl.gov)', 'Dan Schult(dschult@colgate.edu)', 'Ben Edwards(bedwards@cs.unm.edu)']) ### some cookbook stuff # used in deciding whether something is a bunch of nodes, edges, etc. # see G.add_nodes and others in Graph Class in networkx/base.py def is_string_like(obj): # from John Hunter, types-free version """Check if obj is string.""" try: obj + '' except (TypeError, ValueError): return False return True def iterable(obj): """ Return True if obj is iterable with a well-defined len().""" if hasattr(obj,"__iter__"): return True try: len(obj) except: return False return True def flatten(obj, result=None): """ Return flattened version of (possibly nested) iterable object. 
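The `UnionFind` class above pairs union-by-weight with path compression. A minimal standalone sketch (a hypothetical `TinyUnionFind`, not the networkx class itself) exercises the same two operations:

```python
# Minimal union-find with path compression and union-by-weight,
# mirroring the UnionFind class above (standalone sketch).
class TinyUnionFind:
    def __init__(self):
        self.parents = {}
        self.weights = {}

    def __getitem__(self, obj):
        # Auto-create a singleton set for a previously unknown object.
        if obj not in self.parents:
            self.parents[obj] = obj
            self.weights[obj] = 1
            return obj
        # Walk up to the root, then compress the path.
        path = [obj]
        root = self.parents[obj]
        while root != path[-1]:
            path.append(root)
            root = self.parents[root]
        for ancestor in path:
            self.parents[ancestor] = root
        return root

    def union(self, *objects):
        roots = [self[x] for x in objects]
        heaviest = max(roots, key=lambda r: self.weights[r])
        for r in roots:
            if r != heaviest:
                self.weights[heaviest] += self.weights[r]
                self.parents[r] = heaviest

uf = TinyUnionFind()
uf.union('a', 'b')
uf.union('b', 'c')
same = uf['a'] == uf['c']      # a, b, c now share one root
separate = uf['a'] != uf['d']  # d is its own singleton
```

Path compression keeps later lookups near-constant: after `uf['a']`, every node on the walked path points directly at the root.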
""" if not iterable(obj) or is_string_like(obj): return obj if result is None: result = [] for item in obj: if not iterable(item) or is_string_like(item): result.append(item) else: flatten(item, result) return obj.__class__(result) def is_list_of_ints( intlist ): """ Return True if list is a list of ints. """ if not isinstance(intlist,list): return False for i in intlist: if not isinstance(i,int): return False return True PY2 = sys.version_info[0] == 2 if PY2: def make_str(x): """Return the string representation of t.""" if isinstance(x, unicode): return x else: # Note, this will not work unless x is ascii-encoded. # That is good, since we should be working with unicode anyway. # Essentially, unless we are reading a file, we demand that users # convert any encoded strings to unicode before using the library. # # Also, the str() is necessary to convert integers, etc. # unicode(3) works, but unicode(3, 'unicode-escape') wants a buffer. # return unicode(str(x), 'unicode-escape') else: def make_str(x): """Return the string representation of t.""" return str(x) def generate_unique_node(): """ Generate a unique node label.""" return str(uuid.uuid1()) def default_opener(filename): """Opens `filename` using system's default program. Parameters ---------- filename : str The path of the file to be opened. """ from subprocess import call cmds = {'darwin': ['open'], 'linux2': ['xdg-open'], 'win32': ['cmd.exe', '/C', 'start', '']} cmd = cmds[sys.platform] + [filename] call(cmd) def dict_to_numpy_array(d,mapping=None): """Convert a dictionary of dictionaries to a numpy array with optional mapping.""" try: return dict_to_numpy_array2(d, mapping) except (AttributeError, TypeError): # AttributeError is when no mapping was provided and v.keys() fails. # TypeError is when a mapping was provided and d[k1][k2] fails. return dict_to_numpy_array1(d,mapping) def dict_to_numpy_array2(d,mapping=None): """Convert a dictionary of dictionaries to a 2d numpy array with optional mapping. 
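The `flatten` helper above treats strings as atoms and descends into every other iterable, preserving the outermost container's type. A standalone re-sketch (hypothetical `flatten_sketch`, approximating the `iterable`/`is_string_like` checks with `hasattr`/`isinstance`) makes the behaviour concrete:

```python
# Standalone sketch of the flatten() helper above: strings count as
# atoms, any other iterable is descended into recursively, and the
# outermost container's type is preserved on return.
def flatten_sketch(obj, result=None):
    def is_atom(x):
        return isinstance(x, str) or not hasattr(x, '__iter__')
    if is_atom(obj):
        return obj
    if result is None:
        result = []
    for item in obj:
        if is_atom(item):
            result.append(item)
        else:
            # The recursive call shares `result`, so only mutation matters.
            flatten_sketch(item, result)
    return type(obj)(result)

# A tuple in, a tuple out; the nested tuple and list are flattened away.
flat = flatten_sketch((1, (2, (3, 'abc')), [4, 5]))
```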
""" import numpy if mapping is None: s=set(d.keys()) for k,v in d.items(): s.update(v.keys()) mapping=dict(zip(s,range(len(s)))) n=len(mapping) a = numpy.zeros((n, n)) for k1, i in mapping.items(): for k2, j in mapping.items(): try: a[i,j]=d[k1][k2] except KeyError: pass return a def dict_to_numpy_array1(d,mapping=None): """Convert a dictionary of numbers to a 1d numpy array with optional mapping. """ import numpy if mapping is None: s = set(d.keys()) mapping = dict(zip(s,range(len(s)))) n = len(mapping) a = numpy.zeros(n) for k1, i in mapping.items(): a[i] = d[k1] return a networkx-1.11/networkx/utils/decorators.py0000644000175000017500000001646512637544450020751 0ustar aricaric00000000000000import sys from collections import defaultdict from os.path import splitext import networkx as nx from decorator import decorator from networkx.utils import is_string_like __all__ = [ 'not_implemented_for', 'open_file', ] def not_implemented_for(*graph_types): """Decorator to mark algorithms as not implemented Parameters ---------- graph_types : container of strings Entries must be one of 'directed','undirected', 'multigraph', 'graph'. Returns ------- _require : function The decorated function. Raises ------ NetworkXNotImplemented If the graph matches all of the specified graph types. Notes ----- Multiple types are joined logically with "and". For "or" use multiple @not_implemented_for() lines.
Examples -------- Decorate functions like this:: @not_implemented_for('directed') def sp_function(): pass @not_implemented_for('directed','multigraph') def sp_np_function(): pass """ @decorator def _not_implemented_for(f,*args,**kwargs): graph = args[0] terms = {'directed':graph.is_directed(), 'undirected':not graph.is_directed(), 'multigraph':graph.is_multigraph(), 'graph':not graph.is_multigraph()} match = True try: for t in graph_types: match = match and terms[t] except KeyError: raise KeyError('use one or more of directed, undirected, multigraph, graph') if match: raise nx.NetworkXNotImplemented('not implemented for %s type' % ' '.join(graph_types)) else: return f(*args,**kwargs) return _not_implemented_for def _open_gz(path, mode): import gzip return gzip.open(path,mode=mode) def _open_bz2(path, mode): import bz2 return bz2.BZ2File(path,mode=mode) # To handle new extensions, define a function accepting a `path` and `mode`. # Then add the extension to _dispatch_dict. _dispatch_dict = defaultdict(lambda : open) _dispatch_dict['.gz'] = _open_gz _dispatch_dict['.bz2'] = _open_bz2 _dispatch_dict['.gzip'] = _open_gz def open_file(path_arg, mode='r'): """Decorator to ensure clean opening and closing of files. Parameters ---------- path_arg : int Location of the path argument in args. Even if the argument is a named positional argument (with a default value), you must specify its index as a positional argument. mode : str String for opening mode. Returns ------- _open_file : function Function which cleanly executes the io.
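The `_dispatch_dict` pattern above — a `defaultdict` whose factory falls back to the built-in `open` for unknown extensions — is easy to exercise standalone. `smart_open` here is a hypothetical stand-in for the dispatch that `open_file` performs internally:

```python
# Sketch of the extension-dispatch pattern used above: unknown
# extensions fall back to the built-in open() via defaultdict.
import gzip
import os
import tempfile
from collections import defaultdict
from os.path import splitext

_openers = defaultdict(lambda: open)
_openers['.gz'] = gzip.open

def smart_open(path, mode='rb'):
    ext = splitext(path)[1]
    return _openers[ext](path, mode)

# Round-trip through a gzip file, chosen purely by its extension.
fd, name = tempfile.mkstemp(suffix='.gz')
os.close(fd)
with smart_open(name, 'wb') as f:
    f.write(b'payload')
with smart_open(name, 'rb') as f:
    data = f.read()
os.remove(name)
```

Adding support for a new compression format is then a one-line registration in `_openers`.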
Examples -------- Decorate functions like this:: @open_file(0,'r') def read_function(pathname): pass @open_file(1,'w') def write_function(G,pathname): pass @open_file(1,'w') def write_function(G, pathname='graph.dot') pass @open_file('path', 'w+') def another_function(arg, **kwargs): path = kwargs['path'] pass """ # Note that this decorator solves the problem when a path argument is # specified as a string, but it does not handle the situation when the # function wants to accept a default of None (and then handle it). # Here is an example: # # @open_file('path') # def some_function(arg1, arg2, path=None): # if path is None: # fobj = tempfile.NamedTemporaryFile(delete=False) # close_fobj = True # else: # # `path` could have been a string or file object or something # # similar. In any event, the decorator has given us a file object # # and it will close it for us, if it should. # fobj = path # close_fobj = False # # try: # fobj.write('blah') # finally: # if close_fobj: # fobj.close() # # Normally, we'd want to use "with" to ensure that fobj gets closed. # However, recall that the decorator will make `path` a file object for # us, and using "with" would undesirably close that file object. Instead, # you use a try block, as shown above. When we exit the function, fobj will # be closed, if it should be, by the decorator. @decorator def _open_file(func, *args, **kwargs): # Note that since we have used @decorator, *args, and **kwargs have # already been resolved to match the function signature of func. This # means default values have been propagated. For example, the function # func(x, y, a=1, b=2, **kwargs) if called as func(0,1,b=5,c=10) would # have args=(0,1,1,5) and kwargs={'c':10}. # First we parse the arguments of the decorator. The path_arg could # be an positional argument or a keyword argument. 
Even if it is an optional argument with a default value, @decorator has already resolved it into `args`. try: # path_arg is a required positional argument # This works precisely because we are using @decorator path = args[path_arg] except TypeError: # path_arg is a keyword argument. It is "required" in the sense # that it must exist, according to the decorator specification. # It can exist in `kwargs` by a developer-specified default value # or it could have been explicitly set by the user. try: path = kwargs[path_arg] except KeyError: # Could not find the keyword. Thus, no default was specified # in the function signature and the user did not provide it. msg = 'Missing required keyword argument: {0}' raise nx.NetworkXError(msg.format(path_arg)) else: is_kwarg = True except IndexError: # A "required" argument was missing. This can only happen if # the decorator of the function was incorrectly specified. # So this probably is not a user error, but a developer error. msg = "path_arg of open_file decorator is incorrect" raise nx.NetworkXError(msg) else: is_kwarg = False # Now we have the path_arg. There are two types of input to consider: # 1) string representing a path that should be opened # 2) an already opened file object if is_string_like(path): ext = splitext(path)[1] fobj = _dispatch_dict[ext](path, mode=mode) close_fobj = True elif hasattr(path, 'read'): # path is already a file-like object fobj = path close_fobj = False else: # could be None, in which case the algorithm will deal with it fobj = path close_fobj = False # Insert file object into args or kwargs. if is_kwarg: new_args = args kwargs[path_arg] = fobj else: # args is a tuple, so we must convert to list before modifying it. new_args = list(args) new_args[path_arg] = fobj # Finally, we call the original function, making sure to close the fobj.
try: result = func(*new_args, **kwargs) finally: if close_fobj: fobj.close() return result return _open_file networkx-1.11/networkx/utils/contextmanagers.py0000644000175000017500000000104512637544450022002 0ustar aricaric00000000000000from __future__ import absolute_import from contextlib import contextmanager __all__ = [ 'reversed', ] @contextmanager def reversed(G): """A context manager for temporarily reversing a directed graph in place. This is a no-op for undirected graphs. Parameters ---------- G : graph A NetworkX graph. """ directed = G.is_directed() if directed: G.reverse(copy=False) try: yield finally: if directed: # Reverse the reverse. G.reverse(copy=False) networkx-1.11/networkx/utils/random_sequence.py0000644000175000017500000001443412637544500021752 0ustar aricaric00000000000000""" Utilities for generating random numbers, random sequences, and random selections. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import random import sys import networkx as nx __author__ = '\n'.join(['Aric Hagberg (hagberg@lanl.gov)', 'Dan Schult(dschult@colgate.edu)', 'Ben Edwards(bedwards@cs.unm.edu)']) import warnings as _warnings def create_degree_sequence(n, sfunction=None, max_tries=50, **kwds): _warnings.warn("create_degree_sequence() is deprecated", DeprecationWarning) """ Attempt to create a valid degree sequence of length n using specified function sfunction(n,**kwds). Parameters ---------- n : int Length of degree sequence = number of nodes sfunction: function Function which returns a list of n real or integer values. Called as "sfunction(n,**kwds)". max_tries: int Max number of attempts at creating valid degree sequence. Notes ----- Repeatedly create a degree sequence by calling sfunction(n,**kwds) until achieving a valid degree sequence. If unsuccessful after max_tries attempts, raise an exception. For examples of sfunctions that return sequences of random numbers, see networkx.Utils. 
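The retry loop described above for `create_degree_sequence` can be sketched standalone. `retry_degree_sequence` is hypothetical, and the acceptance test is simplified to the even-sum parity condition rather than the full `nx.is_valid_degree_sequence` check:

```python
# Sketch of the retry loop above: draw a candidate sequence, round and
# clamp it into [0, n], and accept the first one whose degree sum is
# even (a necessary condition for a degree sequence; the real code
# uses nx.is_valid_degree_sequence instead).
import random

def retry_degree_sequence(n, sfunction, max_tries=50):
    for _ in range(max_tries):
        trial = sfunction(n)
        seq = [min(n, max(int(round(s)), 0)) for s in trial]
        if sum(seq) % 2 == 0:  # simplified acceptance test
            return seq
    raise ValueError("Exceeded max (%d) attempts at a valid sequence." % max_tries)

random.seed(1)
seq = retry_degree_sequence(10, lambda n: [random.uniform(0, n) for _ in range(n)])
```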
Examples -------- >>> from networkx.utils import uniform_sequence, create_degree_sequence >>> seq=create_degree_sequence(10,uniform_sequence) """ tries=0 max_deg=n while tries < max_tries: trialseq=sfunction(n,**kwds) # round to integer values in the range [0,max_deg] seq=[min(max_deg, max( int(round(s)),0 )) for s in trialseq] # if graphical return, else throw away and try again if nx.is_valid_degree_sequence(seq): return seq tries+=1 raise nx.NetworkXError(\ "Exceeded max (%d) attempts at a valid sequence."%max_tries) # The same helpers for choosing random sequences from distributions # use Python's random module # http://www.python.org/doc/current/lib/module-random.html def pareto_sequence(n,exponent=1.0): """ Return sample sequence of length n from a Pareto distribution. """ return [random.paretovariate(exponent) for i in range(n)] def powerlaw_sequence(n,exponent=2.0): """ Return sample sequence of length n from a power law distribution. """ return [random.paretovariate(exponent-1) for i in range(n)] def zipf_rv(alpha, xmin=1, seed=None): r"""Return a random value chosen from the Zipf distribution. The return value is an integer drawn from the probability distribution .. math:: p(x)=\frac{x^{-\alpha}}{\zeta(\alpha,x_{min})}, where :math:`\zeta(\alpha,x_{min})` is the Hurwitz zeta function. Parameters ---------- alpha : float Exponent value of the distribution xmin : int Minimum value seed : int Seed value for random number generator Returns ------- x : int Random value from Zipf distribution Raises ------ ValueError If xmin < 1 or alpha <= 1 Notes ----- The rejection algorithm generates random values for the power-law distribution in uniformly bounded expected time dependent on parameters. See [1] for details on its operation. Examples -------- >>> nx.zipf_rv(alpha=2, xmin=3, seed=42) # doctest: +SKIP References ---------- .. [1] Luc Devroye, Non-Uniform Random Variate Generation, Springer-Verlag, New York, 1986.
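The rejection sampler described above (Devroye's method) can be sketched standalone. `zipf_draw` is a hypothetical re-derivation of the same accept/reject step, not the networkx function itself:

```python
# Standalone sketch of the rejection sampler for the Zipf/zeta
# distribution described above (Devroye's method): draw a candidate
# from the continuous power-law envelope, then accept or reject.
import random

def zipf_draw(alpha, xmin=1, rng=random):
    a1 = alpha - 1.0
    b = 2 ** a1
    while True:
        u = 1.0 - rng.random()              # u in (0, 1]
        v = rng.random()                    # v in [0, 1)
        x = int(xmin * u ** -(1.0 / a1))    # candidate, always >= xmin
        t = (1.0 + 1.0 / x) ** a1
        if v * x * (t - 1.0) / (b - 1.0) <= t / b:
            return x

random.seed(42)
draws = [zipf_draw(2.5, xmin=3) for _ in range(100)]
```

Because the envelope dominates the target density uniformly, the expected number of rejections per draw is bounded regardless of `alpha` and `xmin`.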
""" if xmin < 1: raise ValueError("xmin < 1") if alpha <= 1: raise ValueError("a <= 1.0") if not seed is None: random.seed(seed) a1 = alpha - 1.0 b = 2**a1 while True: u = 1.0 - random.random() # u in (0,1] v = random.random() # v in [0,1) x = int(xmin*u**-(1.0/a1)) t = (1.0+(1.0/x))**a1 if v*x*(t-1.0)/(b-1.0) <= t/b: break return x def zipf_sequence(n, alpha=2.0, xmin=1): """Return a sample sequence of length n from a Zipf distribution with exponent parameter alpha and minimum value xmin. See Also -------- zipf_rv """ return [ zipf_rv(alpha,xmin) for _ in range(n)] def uniform_sequence(n): """ Return sample sequence of length n from a uniform distribution. """ return [ random.uniform(0,n) for i in range(n)] def cumulative_distribution(distribution): """Return normalized cumulative distribution from discrete distribution.""" cdf=[] cdf.append(0.0) psum=float(sum(distribution)) for i in range(0,len(distribution)): cdf.append(cdf[i]+distribution[i]/psum) return cdf def discrete_sequence(n, distribution=None, cdistribution=None): """ Return sample sequence of length n from a given discrete distribution or discrete cumulative distribution. One of the following must be specified. distribution = histogram of values, will be normalized cdistribution = normalized discrete cumulative distribution """ import bisect if cdistribution is not None: cdf=cdistribution elif distribution is not None: cdf=cumulative_distribution(distribution) else: raise nx.NetworkXError( "discrete_sequence: distribution or cdistribution missing") # get a uniform random number inputseq=[random.random() for i in range(n)] # choose from CDF seq=[bisect.bisect_left(cdf,s)-1 for s in inputseq] return seq def random_weighted_sample(mapping, k): """Return k items without replacement from a weighted sample. The input is a dictionary of items with weights as values. 
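The roulette-wheel selection backing `weighted_choice` and `random_weighted_sample` above can be sketched standalone (`roulette` is a hypothetical name):

```python
# Roulette-wheel selection over a weight dict: scale a uniform draw by
# the total weight, then walk the items subtracting weights until the
# draw falls below zero.
import random

def roulette(weights, rng=random):
    rnd = rng.random() * sum(weights.values())
    for item, w in weights.items():
        rnd -= w
        if rnd < 0:
            return item

random.seed(7)
weights = {'a': 1.0, 'b': 2.0, 'c': 7.0}
picks = [roulette(weights) for _ in range(1000)]
```

Each item is selected with probability proportional to its weight, so `'c'` dominates the sample here.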
""" if k > len(mapping): raise ValueError("sample larger than population") sample = set() while len(sample) < k: sample.add(weighted_choice(mapping)) return list(sample) def weighted_choice(mapping): """Return a single element from a weighted sample. The input is a dictionary of items with weights as values. """ # use roulette method rnd = random.random() * sum(mapping.values()) for k, w in mapping.items(): rnd -= w if rnd < 0: return k networkx-1.11/networkx/classes/0000755000175000017500000000000012653231454016516 5ustar aricaric00000000000000networkx-1.11/networkx/classes/function.py0000644000175000017500000004045312637544500020724 0ustar aricaric00000000000000"""Functional interface to graph methods and assorted utilities. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. # import networkx as nx from networkx.utils import not_implemented_for import itertools __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) __all__ = ['nodes', 'edges', 'degree', 'degree_histogram', 'neighbors', 'number_of_nodes', 'number_of_edges', 'density', 'nodes_iter', 'edges_iter', 'is_directed','info', 'freeze','is_frozen','subgraph','create_empty_copy', 'set_node_attributes','get_node_attributes', 'set_edge_attributes','get_edge_attributes', 'all_neighbors','non_neighbors', 'non_edges', 'common_neighbors', 'is_weighted','is_negatively_weighted', 'is_empty'] def nodes(G): """Return a copy of the graph nodes in a list.""" return G.nodes() def nodes_iter(G): """Return an iterator over the graph nodes.""" return G.nodes_iter() def edges(G,nbunch=None): """Return list of edges incident to nodes in nbunch. Return all edges if nbunch is unspecified or nbunch=None. For digraphs, edges=out_edges """ return G.edges(nbunch) def edges_iter(G,nbunch=None): """Return iterator over edges incident to nodes in nbunch. 
Return all edges if nbunch is unspecified or nbunch=None. For digraphs, edges=out_edges """ return G.edges_iter(nbunch) def degree(G,nbunch=None,weight=None): """Return degree of single node or of nbunch of nodes. If nbunch is omitted, then return degrees of *all* nodes. """ return G.degree(nbunch,weight) def neighbors(G,n): """Return a list of nodes connected to node n. """ return G.neighbors(n) def number_of_nodes(G): """Return the number of nodes in the graph.""" return G.number_of_nodes() def number_of_edges(G): """Return the number of edges in the graph. """ return G.number_of_edges() def density(G): r"""Return the density of a graph. The density for undirected graphs is .. math:: d = \frac{2m}{n(n-1)}, and for directed graphs is .. math:: d = \frac{m}{n(n-1)}, where `n` is the number of nodes and `m` is the number of edges in `G`. Notes ----- The density is 0 for a graph without edges and 1 for a complete graph. The density of multigraphs can be higher than 1. Self loops are counted in the total number of edges so graphs with self loops can have density higher than 1. """ n=number_of_nodes(G) m=number_of_edges(G) if m==0 or n <= 1: d=0.0 else: if G.is_directed(): d=m/float(n*(n-1)) else: d = m*2.0/float(n*(n-1)) return d def degree_histogram(G): """Return a list of the frequency of each degree value. Parameters ---------- G : Networkx graph A graph Returns ------- hist : list A list of frequencies of degrees. The degree values are the index in the list.
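The counting step inside `degree_histogram` above reduces to a one-pass bucket count over the degree values; a standalone sketch (hypothetical `histogram`):

```python
# Standalone sketch of degree_histogram's counting step: bins are
# degree values, entries are how many nodes have that degree.
def histogram(degrees):
    freq = [0] * (max(degrees) + 1)
    for d in degrees:
        freq[d] += 1
    return freq

# Degrees of a path graph on four nodes are 1, 2, 2, 1:
# no node of degree 0, two nodes each of degree 1 and 2.
hist = histogram([1, 2, 2, 1])
```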
Notes ----- Note: the bins are width one, hence len(list) can be large (Order(number_of_edges)) """ degseq=list(G.degree().values()) dmax=max(degseq)+1 freq= [ 0 for d in range(dmax) ] for d in degseq: freq[d] += 1 return freq def is_directed(G): """ Return True if graph is directed.""" return G.is_directed() def frozen(*args): """Dummy method for raising errors when trying to modify frozen graphs""" raise nx.NetworkXError("Frozen graph can't be modified") def freeze(G): """Modify graph to prevent further change by adding or removing nodes or edges. Node and edge data can still be modified. Parameters ---------- G : graph A NetworkX graph Examples -------- >>> G=nx.Graph() >>> G.add_path([0,1,2,3]) >>> G=nx.freeze(G) >>> try: ... G.add_edge(4,5) ... except nx.NetworkXError as e: ... print(str(e)) Frozen graph can't be modified Notes ----- To "unfreeze" a graph you must make a copy by creating a new graph object: >>> graph = nx.path_graph(4) >>> frozen_graph = nx.freeze(graph) >>> unfrozen_graph = nx.Graph(frozen_graph) >>> nx.is_frozen(unfrozen_graph) False See Also -------- is_frozen """ G.add_node=frozen G.add_nodes_from=frozen G.remove_node=frozen G.remove_nodes_from=frozen G.add_edge=frozen G.add_edges_from=frozen G.remove_edge=frozen G.remove_edges_from=frozen G.clear=frozen G.frozen=True return G def is_frozen(G): """Return True if graph is frozen. Parameters ---------- G : graph A NetworkX graph See Also -------- freeze """ try: return G.frozen except AttributeError: return False def subgraph(G, nbunch): """Return the subgraph induced on nodes in nbunch. Parameters ---------- G : graph A NetworkX graph nbunch : list, iterable A container of nodes that will be iterated through once (thus it should be an iterator or be iterable). Each element of the container should be a valid node type: any hashable type except None. If nbunch is None, return all edges data in the graph. Nodes in nbunch that are not in the graph will be (quietly) ignored. 
Notes ----- subgraph(G) calls G.subgraph() """ return G.subgraph(nbunch) def create_empty_copy(G,with_nodes=True): """Return a copy of the graph G with all of the edges removed. Parameters ---------- G : graph A NetworkX graph with_nodes : bool (default=True) Include nodes. Notes ----- Graph, node, and edge data is not propagated to the new graph. """ H=G.__class__() if with_nodes: H.add_nodes_from(G) return H def info(G, n=None): """Print short summary of information for the graph G or the node n. Parameters ---------- G : Networkx graph A graph n : node (any hashable) A node in the graph G """ info='' # append this all to a string if n is None: info+="Name: %s\n"%G.name type_name = [type(G).__name__] info+="Type: %s\n"%",".join(type_name) info+="Number of nodes: %d\n"%G.number_of_nodes() info+="Number of edges: %d\n"%G.number_of_edges() nnodes=G.number_of_nodes() if len(G) > 0: if G.is_directed(): info+="Average in degree: %8.4f\n"%\ (sum(G.in_degree().values())/float(nnodes)) info+="Average out degree: %8.4f"%\ (sum(G.out_degree().values())/float(nnodes)) else: s=sum(G.degree().values()) info+="Average degree: %8.4f"%\ (float(s)/float(nnodes)) else: if n not in G: raise nx.NetworkXError("node %s not in graph"%(n,)) info+="Node % s has the following properties:\n"%n info+="Degree: %d\n"%G.degree(n) info+="Neighbors: " info+=' '.join(str(nbr) for nbr in G.neighbors(n)) return info def set_node_attributes(G, name, values): """Set node attributes from dictionary of nodes and values Parameters ---------- G : NetworkX Graph name : string Attribute name values: dict Dictionary of attribute values keyed by node. If `values` is not a dictionary, then it is treated as a single attribute value that is then applied to every node in `G`. 
Examples -------- >>> G = nx.path_graph(3) >>> bb = nx.betweenness_centrality(G) >>> nx.set_node_attributes(G, 'betweenness', bb) >>> G.node[1]['betweenness'] 1.0 """ try: values.items except AttributeError: # Treat `value` as the attribute value for each node. values = dict(zip(G.nodes(), [values] * len(G))) for node, value in values.items(): G.node[node][name] = value def get_node_attributes(G, name): """Get node attributes from graph Parameters ---------- G : NetworkX Graph name : string Attribute name Returns ------- Dictionary of attributes keyed by node. Examples -------- >>> G=nx.Graph() >>> G.add_nodes_from([1,2,3],color='red') >>> color=nx.get_node_attributes(G,'color') >>> color[1] 'red' """ return dict( (n,d[name]) for n,d in G.node.items() if name in d) def set_edge_attributes(G, name, values): """Set edge attributes from dictionary of edge tuples and values. Parameters ---------- G : NetworkX Graph name : string Attribute name values : dict Dictionary of attribute values keyed by edge (tuple). For multigraphs, the keys tuples must be of the form (u, v, key). For non-multigraphs, the keys must be tuples of the form (u, v). If `values` is not a dictionary, then it is treated as a single attribute value that is then applied to every edge in `G`. Examples -------- >>> G = nx.path_graph(3) >>> bb = nx.edge_betweenness_centrality(G, normalized=False) >>> nx.set_edge_attributes(G, 'betweenness', bb) >>> G[1][2]['betweenness'] 2.0 """ try: values.items except AttributeError: # Treat `value` as the attribute value for each edge. 
if G.is_multigraph(): edges = G.edges(keys=True) else: edges = G.edges() values = dict(zip(edges, [values] * len(edges))) if G.is_multigraph(): for (u, v, key), value in values.items(): G[u][v][key][name] = value else: for (u, v), value in values.items(): G[u][v][name] = value def get_edge_attributes(G, name): """Get edge attributes from graph Parameters ---------- G : NetworkX Graph name : string Attribute name Returns ------- Dictionary of attributes keyed by edge. For (di)graphs, the keys are 2-tuples of the form: (u,v). For multi(di)graphs, the keys are 3-tuples of the form: (u, v, key). Examples -------- >>> G=nx.Graph() >>> G.add_path([1,2,3],color='red') >>> color=nx.get_edge_attributes(G,'color') >>> color[(1,2)] 'red' """ if G.is_multigraph(): edges = G.edges(keys=True, data=True) else: edges = G.edges(data=True) return dict( (x[:-1], x[-1][name]) for x in edges if name in x[-1] ) def all_neighbors(graph, node): """ Returns all of the neighbors of a node in the graph. If the graph is directed returns predecessors as well as successors. Parameters ---------- graph : NetworkX graph Graph to find neighbors. node : node The node whose neighbors will be returned. Returns ------- neighbors : iterator Iterator of neighbors """ if graph.is_directed(): values = itertools.chain.from_iterable([graph.predecessors_iter(node), graph.successors_iter(node)]) else: values = graph.neighbors_iter(node) return values def non_neighbors(graph, node): """Returns the non-neighbors of the node in the graph. Parameters ---------- graph : NetworkX graph Graph to find neighbors. node : node The node whose neighbors will be returned. Returns ------- non_neighbors : iterator Iterator of nodes in the graph that are not neighbors of the node. """ nbors = set(neighbors(graph, node)) | set([node]) return (nnode for nnode in graph if nnode not in nbors) def non_edges(graph): """Returns the non-existent edges in the graph. Parameters ---------- graph : NetworkX graph. 
Graph to find non-existent edges. Returns ------- non_edges : iterator Iterator of edges that are not in the graph. """ if graph.is_directed(): for u in graph.nodes_iter(): for v in non_neighbors(graph, u): yield (u, v) else: nodes = set(graph) while nodes: u = nodes.pop() for v in nodes - set(graph[u]): yield (u, v) @not_implemented_for('directed') def common_neighbors(G, u, v): """Return the common neighbors of two nodes in a graph. Parameters ---------- G : graph A NetworkX undirected graph. u, v : nodes Nodes in the graph. Returns ------- cnbors : iterator Iterator of common neighbors of u and v in the graph. Raises ------ NetworkXError If u or v is not a node in the graph. Examples -------- >>> G = nx.complete_graph(5) >>> sorted(nx.common_neighbors(G, 0, 1)) [2, 3, 4] """ if u not in G: raise nx.NetworkXError('u is not in the graph.') if v not in G: raise nx.NetworkXError('v is not in the graph.') # Return a generator explicitly instead of yielding so that the above # checks are executed eagerly. return (w for w in G[u] if w in G[v] and w not in (u, v)) def is_weighted(G, edge=None, weight='weight'): """Returns ``True`` if ``G`` has weighted edges. Parameters ---------- G : graph A NetworkX graph. edge : tuple, optional A 2-tuple specifying the only edge in ``G`` that will be tested. If ``None``, then every edge in ``G`` is tested. weight: string, optional The attribute name used to query for edge weights. Returns ------- bool A boolean signifying if ``G``, or the specified edge, is weighted. Raises ------ NetworkXError If the specified edge does not exist. 
Examples -------- >>> G = nx.path_graph(4) >>> nx.is_weighted(G) False >>> nx.is_weighted(G, (2, 3)) False >>> G = nx.DiGraph() >>> G.add_edge(1, 2, weight=1) >>> nx.is_weighted(G) True """ if edge is not None: data = G.get_edge_data(*edge) if data is None: msg = 'Edge {!r} does not exist.'.format(edge) raise nx.NetworkXError(msg) return weight in data if is_empty(G): # Special handling required since: all([]) == True return False return all(weight in data for u, v, data in G.edges(data=True)) def is_negatively_weighted(G, edge=None, weight='weight'): """Returns ``True`` if ``G`` has negatively weighted edges. Parameters ---------- G : graph A NetworkX graph. edge : tuple, optional A 2-tuple specifying the only edge in ``G`` that will be tested. If ``None``, then every edge in ``G`` is tested. weight: string, optional The attribute name used to query for edge weights. Returns ------- bool A boolean signifying if ``G``, or the specified edge, is negatively weighted. Raises ------ NetworkXError If the specified edge does not exist. Examples -------- >>> G=nx.Graph() >>> G.add_edges_from([(1, 3), (2, 4), (2, 6)]) >>> G.add_edge(1, 2, weight=4) >>> nx.is_negatively_weighted(G, (1, 2)) False >>> G[2][4]['weight'] = -2 >>> nx.is_negatively_weighted(G) True >>> G = nx.DiGraph() >>> G.add_weighted_edges_from([('0', '3', 3), ('0', '1', -5), ('1', '0', -2)]) >>> nx.is_negatively_weighted(G) True """ if edge is not None: data = G.get_edge_data(*edge) if data is None: msg = 'Edge {!r} does not exist.'.format(edge) raise nx.NetworkXError(msg) return weight in data and data[weight] < 0 return any(weight in data and data[weight] < 0 for u, v, data in G.edges(data=True)) def is_empty(G): """Returns ``True`` if ``G`` has no edges. Parameters ---------- G : graph A NetworkX graph. Returns ------- bool ``True`` if ``G`` has no edges, and ``False`` otherwise. Notes ----- An empty graph can have nodes but not edges. The empty graph with zero nodes is known as the null graph. 
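The nodes-versus-edges distinction described above can be sketched with a short usage example (illustrative only, not part of the original docstring; assumes ``networkx`` is importable as ``nx``):

```python
import networkx as nx

# A graph with nodes but no edges still counts as "empty".
G = nx.Graph()
G.add_nodes_from([1, 2, 3])
empty_with_nodes = nx.is_empty(G)   # True: nodes alone do not matter

# A single edge is enough to make the graph non-empty.
G.add_edge(1, 2)
empty_after_edge = nx.is_empty(G)   # False
```

This is exactly the `all([]) == True` corner case the implementation guards against: with no edges at all, `is_empty` must still answer correctly.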
This is an O(n) operation where n is the number of nodes in the graph. """ return not any(G.adj.values()) networkx-1.11/networkx/classes/digraph.py0000644000175000017500000012625312637544500020520 0ustar aricaric00000000000000"""Base class for directed graphs.""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from copy import deepcopy import networkx as nx from networkx.classes.graph import Graph from networkx.exception import NetworkXError import networkx.convert as convert __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) class DiGraph(Graph): """ Base class for directed graphs. A DiGraph stores nodes and edges with optional data, or attributes. DiGraphs hold directed edges. Self loops are allowed but multiple (parallel) edges are not. Nodes can be arbitrary (hashable) Python objects with optional key/value attributes. Edges are represented as links between nodes with optional key/value attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- Graph MultiGraph MultiDiGraph Examples -------- Create an empty graph structure (a "null graph") with no nodes and no edges. >>> G = nx.DiGraph() G can be grown in several ways. **Nodes:** Add one node at a time: >>> G.add_node(1) Add the nodes from any container (a list, dict, set or even the lines from a file or the nodes from another graph). 
>>> G.add_nodes_from([2,3]) >>> G.add_nodes_from(range(100,110)) >>> H=nx.Graph() >>> H.add_path([0,1,2,3,4,5,6,7,8,9]) >>> G.add_nodes_from(H) In addition to strings and integers any hashable Python object (except None) can represent a node, e.g. a customized node object, or even another Graph. >>> G.add_node(H) **Edges:** G can also be grown by adding edges. Add one edge, >>> G.add_edge(1, 2) a list of edges, >>> G.add_edges_from([(1,2),(1,3)]) or a collection of edges, >>> G.add_edges_from(H.edges()) If some edges connect nodes not yet in the graph, the nodes are added automatically. There are no errors when adding nodes or edges that already exist. **Attributes:** Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). By default these are empty, but can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named graph, node and edge respectively. >>> G = nx.DiGraph(day="Friday") >>> G.graph {'day': 'Friday'} Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> del G.node[1]['room'] # remove attribute >>> G.nodes(data=True) [(1, {'time': '5pm'}), (3, {'time': '2pm'})] Warning: adding a node to G.node does not add it to the graph. Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2]['weight'] = 4.7 >>> G.edge[1][2]['weight'] = 4 **Shortcuts:** Many common graph features allow python syntax to speed reporting. 
>>> 1 in G # check if node in graph True >>> [n for n in G if n<3] # iterate through nodes [1, 2] >>> len(G) # number of nodes in graph 5 The fastest way to traverse all edges of a graph is via adjacency_iter(), but the edges() method is often more convenient. >>> for n,nbrsdict in G.adjacency_iter(): ... for nbr,eattr in nbrsdict.items(): ... if 'weight' in eattr: ... (n,nbr,eattr['weight']) (1, 2, 4) (2, 3, 8) >>> G.edges(data='weight') [(1, 2, 4), (2, 3, 8), (3, 4, None), (4, 5, None)] **Reporting:** Simple graph information is obtained using methods. Iterator versions of many reporting methods exist for efficiency. Methods exist for reporting nodes(), edges(), neighbors() and degree() as well as the number of nodes and edges. For details on these and other miscellaneous methods, see below. **Subclasses (Advanced):** The Graph class uses a dict-of-dict-of-dict data structure. The outer dict (node_dict) holds adjacency lists keyed by node. The next dict (adjlist) represents the adjacency list and holds edge data keyed by neighbor. The inner dict (edge_attr) represents the edge data and holds edge attribute values keyed by attribute names. Each of these three dicts can be replaced by a user defined dict-like object. In general, the dict-like features should be maintained but extra features can be added. To replace one of the dicts create a new graph class by changing the class(!) variable holding the factory for that dict-like structure. The variable names are node_dict_factory, adjlist_dict_factory and edge_attr_dict_factory. node_dict_factory : function, optional (default: dict) Factory function to be used to create the outer-most dict in the data structure that holds adjacency lists keyed by node. It should require no arguments and return a dict-like object. adjlist_dict_factory : function, optional (default: dict) Factory function to be used to create the adjacency list dict which holds edge data keyed by neighbor. 
It should require no arguments and return a dict-like object. edge_attr_dict_factory : function, optional (default: dict) Factory function to be used to create the edge attribute dict which holds attribute values keyed by attribute name. It should require no arguments and return a dict-like object. Examples -------- Create a graph object that tracks the order nodes are added. >>> from collections import OrderedDict >>> class OrderedNodeGraph(nx.Graph): ... node_dict_factory=OrderedDict >>> G=OrderedNodeGraph() >>> G.add_nodes_from( (2,1) ) >>> G.nodes() [2, 1] >>> G.add_edges_from( ((2,2), (2,1), (1,1)) ) >>> G.edges() [(2, 1), (2, 2), (1, 1)] Create a graph object that tracks the order nodes are added and for each node track the order that neighbors are added. >>> class OrderedGraph(nx.Graph): ... node_dict_factory = OrderedDict ... adjlist_dict_factory = OrderedDict >>> G = OrderedGraph() >>> G.add_nodes_from( (2,1) ) >>> G.nodes() [2, 1] >>> G.add_edges_from( ((2,2), (2,1), (1,1)) ) >>> G.edges() [(2, 2), (2, 1), (1, 1)] Create a low-memory graph class that effectively disallows edge attributes by using a single attribute dict for all edges. This reduces the memory used, but you lose edge attributes. >>> class ThinGraph(nx.Graph): ... all_edge_dict = {'weight': 1} ... def single_edge_dict(self): ... return self.all_edge_dict ... edge_attr_dict_factory = single_edge_dict >>> G = ThinGraph() >>> G.add_edge(2,1) >>> G.edges(data=True) [(1, 2, {'weight': 1})] >>> G.add_edge(2,2) >>> G[2][1] is G[2][2] True """ def __init__(self, data=None, **attr): """Initialize a graph with edges, name, graph attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph.
name : string, optional (default='') An optional name for the graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- convert Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G = nx.Graph(name='my graph') >>> e = [(1,2),(2,3),(3,4)] # list of edges >>> G = nx.Graph(e) Arbitrary graph attribute pairs (key=value) may be assigned >>> G=nx.Graph(e, day="Friday") >>> G.graph {'day': 'Friday'} """ self.node_dict_factory = ndf = self.node_dict_factory self.adjlist_dict_factory = self.adjlist_dict_factory self.edge_attr_dict_factory = self.edge_attr_dict_factory self.graph = {} # dictionary for graph attributes self.node = ndf() # dictionary for node attributes # We store two adjacency lists: # the predecessors of node n are stored in the dict self.pred # the successors of node n are stored in the dict self.succ=self.adj self.adj = ndf() # empty adjacency dictionary self.pred = ndf() # predecessor self.succ = self.adj # successor # attempt to load graph with data if data is not None: convert.to_networkx_graph(data,create_using=self) # load graph attributes (must be after convert) self.graph.update(attr) self.edge=self.adj def add_node(self, n, attr_dict=None, **attr): """Add a single node n and update node attributes. Parameters ---------- n : node A node can be any hashable Python object except None. attr_dict : dictionary, optional (default= no attributes) Dictionary of node attributes. Key/value pairs will update existing data associated with the node. attr : keyword arguments, optional Set or change attributes using key=value. 
See Also -------- add_nodes_from Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_node(1) >>> G.add_node('Hello') >>> K3 = nx.Graph([(0,1),(1,2),(2,0)]) >>> G.add_node(K3) >>> G.number_of_nodes() 3 Use keywords set/change node attributes: >>> G.add_node(1,size=10) >>> G.add_node(3,weight=0.4,UTM=('13S',382871,3972649)) Notes ----- A hashable object is one that can be used as a key in a Python dictionary. This includes strings, numbers, tuples of strings and numbers, etc. On many platforms hashable items also include mutables such as NetworkX Graphs, though one should be careful that the hash doesn't change on mutables. """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dictionary.") if n not in self.succ: self.succ[n] = self.adjlist_dict_factory() self.pred[n] = self.adjlist_dict_factory() self.node[n] = attr_dict else: # update attr even if node already exists self.node[n].update(attr_dict) def add_nodes_from(self, nodes, **attr): """Add multiple nodes. Parameters ---------- nodes : iterable container A container of nodes (list, dict, set, etc.). OR A container of (node, attribute dict) tuples. Node attributes are updated using the attribute dict. attr : keyword arguments, optional (default= no attributes) Update attributes for all nodes in nodes. Node attributes specified in nodes as a tuple take precedence over attributes specified generally. See Also -------- add_node Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_nodes_from('Hello') >>> K3 = nx.Graph([(0,1),(1,2),(2,0)]) >>> G.add_nodes_from(K3) >>> sorted(G.nodes(),key=str) [0, 1, 2, 'H', 'e', 'l', 'o'] Use keywords to update specific node attributes for every node. 
>>> G.add_nodes_from([1,2], size=10) >>> G.add_nodes_from([3,4], weight=0.4) Use (node, attrdict) tuples to update attributes for specific nodes. >>> G.add_nodes_from([(1,dict(size=11)), (2,{'color':'blue'})]) >>> G.node[1]['size'] 11 >>> H = nx.Graph() >>> H.add_nodes_from(G.nodes(data=True)) >>> H.node[1]['size'] 11 """ for n in nodes: # keep all this inside try/except because # CPython throws TypeError on n not in self.succ, # while pre-2.7.5 ironpython throws on self.succ[n] try: if n not in self.succ: self.succ[n] = self.adjlist_dict_factory() self.pred[n] = self.adjlist_dict_factory() self.node[n] = attr.copy() else: self.node[n].update(attr) except TypeError: nn,ndict = n if nn not in self.succ: self.succ[nn] = self.adjlist_dict_factory() self.pred[nn] = self.adjlist_dict_factory() newdict = attr.copy() newdict.update(ndict) self.node[nn] = newdict else: olddict = self.node[nn] olddict.update(attr) olddict.update(ndict) def remove_node(self, n): """Remove node n. Removes the node n and all adjacent edges. Attempting to remove a non-existent node will raise an exception. Parameters ---------- n : node A node in the graph Raises ------- NetworkXError If n is not in the graph. See Also -------- remove_nodes_from Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2]) >>> G.edges() [(0, 1), (1, 2)] >>> G.remove_node(1) >>> G.edges() [] """ try: nbrs=self.succ[n] del self.node[n] except KeyError: # NetworkXError if n not in self raise NetworkXError("The node %s is not in the digraph."%(n,)) for u in nbrs: del self.pred[u][n] # remove all edges n-u in digraph del self.succ[n] # remove node from succ for u in self.pred[n]: del self.succ[u][n] # remove all edges n-u in digraph del self.pred[n] # remove node from pred def remove_nodes_from(self, nbunch): """Remove multiple nodes. Parameters ---------- nodes : iterable container A container of nodes (list, dict, set, etc.). 
If a node in the container is not in the graph it is silently ignored. See Also -------- remove_node Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2]) >>> e = G.nodes() >>> e [0, 1, 2] >>> G.remove_nodes_from(e) >>> G.nodes() [] """ for n in nbunch: try: succs=self.succ[n] del self.node[n] for u in succs: del self.pred[u][n] # remove all edges n-u in digraph del self.succ[n] # now remove node for u in self.pred[n]: del self.succ[u][n] # remove all edges n-u in digraph del self.pred[n] # now remove node except KeyError: pass # silent failure on remove def add_edge(self, u, v, attr_dict=None, **attr): """Add an edge between u and v. The nodes u and v will be automatically added if they are not already in the graph. Edge attributes can be specified with keywords or by providing a dictionary with key/value pairs. See examples below. Parameters ---------- u, v : nodes Nodes can be, for example, strings or numbers. Nodes must be hashable (and not None) Python objects. attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with the edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edges_from : add a collection of edges Notes ----- Adding an edge that already exists updates the edge data. Many NetworkX algorithms designed for weighted graphs use as the edge weight a numerical value assigned to a keyword which by default is 'weight'. 
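As a concrete illustration of the default ``'weight'`` convention mentioned above (a sketch, not part of the original docstring; assumes ``networkx`` is importable as ``nx``):

```python
import networkx as nx

G = nx.Graph()
G.add_edge('a', 'b', weight=1)
G.add_edge('b', 'c', weight=1)
G.add_edge('a', 'c', weight=5)

# Dijkstra reads the 'weight' attribute by default, so the two-hop
# route a-b-c (total weight 2) beats the direct edge a-c (weight 5).
path = nx.shortest_path(G, 'a', 'c', weight='weight')  # ['a', 'b', 'c']
```

Any other attribute name can be used instead by passing it as the ``weight`` argument to the algorithm in question.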
Examples -------- The following all add the edge e=(1,2) to graph G: >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> e = (1,2) >>> G.add_edge(1, 2) # explicit two-node form >>> G.add_edge(*e) # single edge as tuple of two nodes >>> G.add_edges_from( [(1,2)] ) # add edges from iterable container Associate data to edges using keywords: >>> G.add_edge(1, 2, weight=3) >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7) """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dictionary.") # add nodes if u not in self.succ: self.succ[u]= self.adjlist_dict_factory() self.pred[u]= self.adjlist_dict_factory() self.node[u] = {} if v not in self.succ: self.succ[v]= self.adjlist_dict_factory() self.pred[v]= self.adjlist_dict_factory() self.node[v] = {} # add the edge datadict=self.adj[u].get(v,self.edge_attr_dict_factory()) datadict.update(attr_dict) self.succ[u][v]=datadict self.pred[v][u]=datadict def add_edges_from(self, ebunch, attr_dict=None, **attr): """Add all the edges in ebunch. Parameters ---------- ebunch : container of edges Each edge given in the container will be added to the graph. The edges must be given as 2-tuples (u,v) or 3-tuples (u,v,d) where d is a dictionary containing edge data. attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with each edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edge : add a single edge add_weighted_edges_from : convenient way to add weighted edges Notes ----- Adding the same edge twice has no effect but any edge data will be updated when each duplicate edge is added. Edge attributes specified in edges take precedence over attributes specified generally.
Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edges_from([(0,1),(1,2)]) # using a list of edge tuples >>> e = zip(range(0,3),range(1,4)) >>> G.add_edges_from(e) # Add the path graph 0-1-2-3 Associate data to edges >>> G.add_edges_from([(1,2),(2,3)], weight=3) >>> G.add_edges_from([(3,4),(1,4)], label='WN2898') """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dict.") # process ebunch for e in ebunch: ne = len(e) if ne==3: u,v,dd = e assert hasattr(dd,"update") elif ne==2: u,v = e dd = {} else: raise NetworkXError(\ "Edge tuple %s must be a 2-tuple or 3-tuple."%(e,)) if u not in self.succ: self.succ[u] = self.adjlist_dict_factory() self.pred[u] = self.adjlist_dict_factory() self.node[u] = {} if v not in self.succ: self.succ[v] = self.adjlist_dict_factory() self.pred[v] = self.adjlist_dict_factory() self.node[v] = {} datadict=self.adj[u].get(v,self.edge_attr_dict_factory()) datadict.update(attr_dict) datadict.update(dd) self.succ[u][v] = datadict self.pred[v][u] = datadict def remove_edge(self, u, v): """Remove the edge between u and v. Parameters ---------- u, v : nodes Remove the edge between nodes u and v. Raises ------ NetworkXError If there is not an edge between u and v. See Also -------- remove_edges_from : remove a collection of edges Examples -------- >>> G = nx.Graph() # or DiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.remove_edge(0,1) >>> e = (1,2) >>> G.remove_edge(*e) # unpacks e from an edge tuple >>> e = (2,3,{'weight':7}) # an edge with attribute data >>> G.remove_edge(*e[:2]) # select first part of edge tuple """ try: del self.succ[u][v] del self.pred[v][u] except KeyError: raise NetworkXError("The edge %s-%s not in graph."%(u,v)) def remove_edges_from(self, ebunch): """Remove all edges specified in ebunch. 
Parameters ---------- ebunch: list or container of edge tuples Each edge given in the list or container will be removed from the graph. The edges can be: - 2-tuples (u,v) edge between u and v. - 3-tuples (u,v,k) where k is ignored. See Also -------- remove_edge : remove a single edge Notes ----- Will fail silently if an edge in ebunch is not in the graph. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> ebunch=[(1,2),(2,3)] >>> G.remove_edges_from(ebunch) """ for e in ebunch: (u,v)=e[:2] # ignore edge data if u in self.succ and v in self.succ[u]: del self.succ[u][v] del self.pred[v][u] def has_successor(self, u, v): """Return True if node u has successor v. This is true if graph has the edge u->v. """ return (u in self.succ and v in self.succ[u]) def has_predecessor(self, u, v): """Return True if node u has predecessor v. This is true if graph has the edge u<-v. """ return (u in self.pred and v in self.pred[u]) def successors_iter(self,n): """Return an iterator over successor nodes of n. neighbors_iter() and successors_iter() are the same. """ try: return iter(self.succ[n]) except KeyError: raise NetworkXError("The node %s is not in the digraph."%(n,)) def predecessors_iter(self,n): """Return an iterator over predecessor nodes of n.""" try: return iter(self.pred[n]) except KeyError: raise NetworkXError("The node %s is not in the digraph."%(n,)) def successors(self, n): """Return a list of successor nodes of n. neighbors() and successors() are the same function. """ return list(self.successors_iter(n)) def predecessors(self, n): """Return a list of predecessor nodes of n.""" return list(self.predecessors_iter(n)) # digraph definitions neighbors = successors neighbors_iter = successors_iter def edges_iter(self, nbunch=None, data=False, default=None): """Return an iterator over the edges. Edges are returned as tuples with optional data in the order (node, neighbor, data). 
Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : string or bool, optional (default=False) The edge attribute returned in 3-tuple (u,v,ddict[data]). If True, return edge attribute dict in 3-tuple (u,v,ddict). If False, return 2-tuple (u,v). default : value, optional (default=None) Value used for edges that don't have the requested attribute. Only relevant if data is not True or False. Returns ------- edge_iter : iterator An iterator of (u,v) or (u,v,d) tuples of edges. See Also -------- edges : return a list of edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. Examples -------- >>> G = nx.DiGraph() # or MultiDiGraph, etc >>> G.add_path([0,1,2]) >>> G.add_edge(2,3,weight=5) >>> [e for e in G.edges_iter()] [(0, 1), (1, 2), (2, 3)] >>> list(G.edges_iter(data=True)) # default data is {} (empty dict) [(0, 1, {}), (1, 2, {}), (2, 3, {'weight': 5})] >>> list(G.edges_iter(data='weight', default=1)) [(0, 1, 1), (1, 2, 1), (2, 3, 5)] >>> list(G.edges_iter([0,2])) [(0, 1), (2, 3)] >>> list(G.edges_iter(0)) [(0, 1)] """ if nbunch is None: nodes_nbrs=self.adj.items() else: nodes_nbrs=((n,self.adj[n]) for n in self.nbunch_iter(nbunch)) if data is True: for n,nbrs in nodes_nbrs: for nbr,ddict in nbrs.items(): yield (n,nbr,ddict) elif data is not False: for n,nbrs in nodes_nbrs: for nbr,ddict in nbrs.items(): d=ddict[data] if data in ddict else default yield (n,nbr,d) else: for n,nbrs in nodes_nbrs: for nbr in nbrs: yield (n,nbr) # alias out_edges to edges out_edges_iter=edges_iter out_edges=Graph.edges def in_edges_iter(self, nbunch=None, data=False): """Return an iterator over the incoming edges. Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once.
data : bool, optional (default=False) If True, return edge attribute dict in 3-tuple (u,v,data). Returns ------- in_edge_iter : iterator An iterator of (u,v) or (u,v,d) tuples of incoming edges. See Also -------- edges_iter : return an iterator of edges """ if nbunch is None: nodes_nbrs=self.pred.items() else: nodes_nbrs=((n,self.pred[n]) for n in self.nbunch_iter(nbunch)) if data: for n,nbrs in nodes_nbrs: for nbr,data in nbrs.items(): yield (nbr,n,data) else: for n,nbrs in nodes_nbrs: for nbr in nbrs: yield (nbr,n) def in_edges(self, nbunch=None, data=False): """Return a list of the incoming edges. See Also -------- edges : return a list of edges """ return list(self.in_edges_iter(nbunch, data)) def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree). The node degree is the number of edges adjacent to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). 
See Also -------- degree, in_degree, out_degree, in_degree_iter, out_degree_iter Examples -------- >>> G = nx.DiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs=( (n,succs,self.pred[n]) for n,succs in self.succ.items()) else: nodes_nbrs=( (n,self.succ[n],self.pred[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n,succ,pred in nodes_nbrs: yield (n,len(succ)+len(pred)) else: # edge weighted graph - degree is sum of edge weights for n,succ,pred in nodes_nbrs: yield (n, sum((succ[nbr].get(weight,1) for nbr in succ))+ sum((pred[nbr].get(weight,1) for nbr in pred))) def in_degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, in-degree). The node in-degree is the number of edges pointing in to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, in-degree). 
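The weighted versus unweighted in-degree described above can be sketched as follows (illustrative only; this uses the ``in_degree`` accessor, which accepts a single node, rather than the iterator):

```python
import networkx as nx

G = nx.DiGraph()
G.add_edge(1, 3, weight=2)
G.add_edge(2, 3, weight=5)

# With weight=None each incoming edge counts as 1.
unweighted = G.in_degree(3)                 # 2 incoming edges

# With a weight key, the in-degree is the sum of incoming edge weights.
weighted = G.in_degree(3, weight='weight')  # 2 + 5 = 7
```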
See Also -------- degree, in_degree, out_degree, out_degree_iter Examples -------- >>> G = nx.DiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.in_degree_iter(0)) # node 0 with degree 0 [(0, 0)] >>> list(G.in_degree_iter([0,1])) [(0, 0), (1, 1)] """ if nbunch is None: nodes_nbrs=self.pred.items() else: nodes_nbrs=((n,self.pred[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: yield (n,len(nbrs)) else: # edge weighted graph - degree is sum of edge weights for n,nbrs in nodes_nbrs: yield (n, sum(data.get(weight,1) for data in nbrs.values())) def out_degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, out-degree). The node out-degree is the number of edges pointing out of the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, out-degree). See Also -------- degree, in_degree, out_degree, in_degree_iter Examples -------- >>> G = nx.DiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.out_degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.out_degree_iter([0,1])) [(0, 1), (1, 1)] """ if nbunch is None: nodes_nbrs=self.succ.items() else: nodes_nbrs=((n,self.succ[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: yield (n,len(nbrs)) else: # edge weighted graph - degree is sum of edge weights for n,nbrs in nodes_nbrs: yield (n, sum(data.get(weight,1) for data in nbrs.values())) def in_degree(self, nbunch=None, weight=None): """Return the in-degree of a node or nodes. The node in-degree is the number of edges pointing in to the node. 
Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd : dictionary, or number A dictionary with nodes as keys and in-degree as values or a number if a single node is specified. See Also -------- degree, out_degree, in_degree_iter Examples -------- >>> G = nx.DiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.in_degree(0) 0 >>> G.in_degree([0,1]) {0: 0, 1: 1} >>> list(G.in_degree([0,1]).values()) [0, 1] """ if nbunch in self: # return a single node return next(self.in_degree_iter(nbunch,weight))[1] else: # return a dict return dict(self.in_degree_iter(nbunch,weight)) def out_degree(self, nbunch=None, weight=None): """Return the out-degree of a node or nodes. The node out-degree is the number of edges pointing out of the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd : dictionary, or number A dictionary with nodes as keys and out-degree as values or a number if a single node is specified. 
Examples -------- >>> G = nx.DiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.out_degree(0) 1 >>> G.out_degree([0,1]) {0: 1, 1: 1} >>> list(G.out_degree([0,1]).values()) [1, 1] """ if nbunch in self: # return a single node return next(self.out_degree_iter(nbunch,weight))[1] else: # return a dict return dict(self.out_degree_iter(nbunch,weight)) def clear(self): """Remove all nodes and edges from the graph. This also removes the name, and all graph, node, and edge attributes. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.clear() >>> G.nodes() [] >>> G.edges() [] """ self.succ.clear() self.pred.clear() self.node.clear() self.graph.clear() def is_multigraph(self): """Return True if graph is a multigraph, False otherwise.""" return False def is_directed(self): """Return True if graph is directed, False otherwise.""" return True def to_directed(self): """Return a directed copy of the graph. Returns ------- G : DiGraph A deepcopy of the graph. Notes ----- This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar D=DiGraph(G) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1), (1, 0)] If already directed, return a (deep) copy >>> G = nx.DiGraph() # or MultiDiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1)] """ return deepcopy(self) def to_undirected(self, reciprocal=False): """Return an undirected representation of the digraph. Parameters ---------- reciprocal : bool (optional) If True only keep edges that appear in both directions in the original digraph. 
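The effect of ``reciprocal`` described above can be sketched like this (illustrative, not part of the original docstring; assumes ``networkx`` is importable as ``nx``):

```python
import networkx as nx

# 1 <-> 2 is reciprocal; 2 -> 3 exists in one direction only.
G = nx.DiGraph([(1, 2), (2, 1), (2, 3)])

H_all = G.to_undirected()
H_recip = G.to_undirected(reciprocal=True)

edges_all = sorted(H_all.edges())      # [(1, 2), (2, 3)]
edges_recip = sorted(H_recip.edges())  # [(1, 2)]: only the mutual pair survives
```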
Returns ------- G : Graph An undirected graph with the same name and nodes and with edge (u,v,data) if either (u,v,data) or (v,u,data) is in the digraph. If both edges exist in the digraph and their edge data is different, only one edge is created with an arbitrary choice of which edge data to use. You must check and correct for this manually if desired. Notes ----- If edges in both directions (u,v) and (v,u) exist in the graph, attributes for the new undirected edge will be a combination of the attributes of the directed edges. The edge data is updated in the (arbitrary) order that the edges are encountered. For more customized control of the edge attributes use add_edge(). This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar G=DiGraph(D) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. Warning: If you have subclassed DiGraph to use dict-like objects in the data structure, those changes do not transfer to the Graph created by this method. """ H=Graph() H.name=self.name H.add_nodes_from(self) if reciprocal is True: H.add_edges_from( (u,v,deepcopy(d)) for u,nbrs in self.adjacency_iter() for v,d in nbrs.items() if v in self.pred[u]) else: H.add_edges_from( (u,v,deepcopy(d)) for u,nbrs in self.adjacency_iter() for v,d in nbrs.items() ) H.graph=deepcopy(self.graph) H.node=deepcopy(self.node) return H def reverse(self, copy=True): """Return the reverse of the graph. The reverse is a graph with the same nodes and edges but with the directions of the edges reversed. Parameters ---------- copy : bool, optional (default=True) If True, return a new DiGraph holding the reversed edges. If False, the reverse is computed in place on the original graph (this changes the original graph). 
""" if copy: H = self.__class__(name="Reverse of (%s)"%self.name) H.add_nodes_from(self) H.add_edges_from( (v,u,deepcopy(d)) for u,v,d in self.edges(data=True) ) H.graph=deepcopy(self.graph) H.node=deepcopy(self.node) else: self.pred,self.succ=self.succ,self.pred self.adj=self.succ H=self return H def subgraph(self, nbunch): """Return the subgraph induced on nodes in nbunch. The induced subgraph of the graph contains the nodes in nbunch and the edges between those nodes. Parameters ---------- nbunch : list, iterable A container of nodes which will be iterated through once. Returns ------- G : Graph A subgraph of the graph with the same edge attributes. Notes ----- The graph, edge or node attributes just point to the original graph. So changes to the node or edge structure will not be reflected in the original graph while changes to the attributes will. To create a subgraph with its own copy of the edge/node attributes use: nx.Graph(G.subgraph(nbunch)) If edge attributes are containers, a deep copy can be obtained using: G.subgraph(nbunch).copy() For an inplace reduction of a graph to a subgraph you can remove nodes: G.remove_nodes_from([ n in G if n not in set(nbunch)]) Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> H = G.subgraph([0,1,2]) >>> H.edges() [(0, 1), (1, 2)] """ bunch = self.nbunch_iter(nbunch) # create new graph and copy subgraph into it H = self.__class__() # copy node and attribute dictionaries for n in bunch: H.node[n]=self.node[n] # namespace shortcuts for speed H_succ=H.succ H_pred=H.pred self_succ=self.succ # add nodes for n in H: H_succ[n]=H.adjlist_dict_factory() H_pred[n]=H.adjlist_dict_factory() # add edges for u in H_succ: Hnbrs=H_succ[u] for v,datadict in self_succ[u].items(): if v in H_succ: # add both representations of edge: u-v and v-u Hnbrs[v]=datadict H_pred[v][u]=datadict H.graph=self.graph return H 
networkx-1.11/networkx/classes/multigraph.py0000644000175000017500000010714112637544500021251 0ustar aricaric00000000000000"""Base class for MultiGraph.""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from copy import deepcopy import networkx as nx from networkx.classes.graph import Graph from networkx import NetworkXError __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) class MultiGraph(Graph): """ An undirected graph class that can store multiedges. Multiedges are multiple edges between two nodes. Each edge can hold optional data or attributes. A MultiGraph holds undirected edges. Self loops are allowed. Nodes can be arbitrary (hashable) Python objects with optional key/value attributes. Edges are represented as links between nodes with optional key/value attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- Graph DiGraph MultiDiGraph Examples -------- Create an empty graph structure (a "null graph") with no nodes and no edges. >>> G = nx.MultiGraph() G can be grown in several ways. **Nodes:** Add one node at a time: >>> G.add_node(1) Add the nodes from any container (a list, dict, set or even the lines from a file or the nodes from another graph). >>> G.add_nodes_from([2,3]) >>> G.add_nodes_from(range(100,110)) >>> H=nx.Graph() >>> H.add_path([0,1,2,3,4,5,6,7,8,9]) >>> G.add_nodes_from(H) In addition to strings and integers any hashable Python object (except None) can represent a node, e.g. 
a customized node object, or even another Graph. >>> G.add_node(H) **Edges:** G can also be grown by adding edges. Add one edge, >>> G.add_edge(1, 2) a list of edges, >>> G.add_edges_from([(1,2),(1,3)]) or a collection of edges, >>> G.add_edges_from(H.edges()) If some edges connect nodes not yet in the graph, the nodes are added automatically. If an edge already exists, an additional edge is created and stored using a key to identify the edge. By default the key is the lowest unused integer. >>> G.add_edges_from([(4,5,dict(route=282)), (4,5,dict(route=37))]) >>> G[4] {3: {0: {}}, 5: {0: {}, 1: {'route': 282}, 2: {'route': 37}}} **Attributes:** Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). By default these are empty, but can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named graph, node and edge respectively. >>> G = nx.MultiGraph(day="Friday") >>> G.graph {'day': 'Friday'} Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> del G.node[1]['room'] # remove attribute >>> G.nodes(data=True) [(1, {'time': '5pm'}), (3, {'time': '2pm'})] Warning: adding a node to G.node does not add it to the graph. Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2][0]['weight'] = 4.7 >>> G.edge[1][2][0]['weight'] = 4 **Shortcuts:** Many common graph features allow python syntax to speed reporting. >>> 1 in G # check if node in graph True >>> [n for n in G if n<3] # iterate through nodes [1, 2] >>> len(G) # number of nodes in graph 5 >>> G[1] # adjacency dict keyed by neighbor to edge attributes ... 
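The "lowest unused integer" key rule mentioned above can be sketched as a small helper. This mirrors the allocation loop used by MultiGraph.add_edge (start at `len(keydict)` and walk forward), but it is an illustrative stand-alone function, not the class method itself:

```python
def next_key(keydict):
    # Mirrors the default key allocation: start at len(keydict) and
    # advance until an integer not already used as a key is found.
    key = len(keydict)
    while key in keydict:
        key += 1
    return key

print(next_key({}))                       # 0
print(next_key({0: {}, 1: {'route': 282}}))  # 2
```

Note that because the search starts at `len(keydict)`, a gap left by a removed key is not reused: `next_key({0: {}, 2: {}})` returns 3, not 1.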
# Note: you should not change this dict manually! {2: {0: {'weight': 4}, 1: {'color': 'blue'}}} The fastest way to traverse all edges of a graph is via adjacency_iter(), but the edges() method is often more convenient. >>> for n,nbrsdict in G.adjacency_iter(): ... for nbr,keydict in nbrsdict.items(): ... for key,eattr in keydict.items(): ... if 'weight' in eattr: ... (n,nbr,key,eattr['weight']) (1, 2, 0, 4) (2, 1, 0, 4) (2, 3, 0, 8) (3, 2, 0, 8) >>> G.edges(data='weight', keys=True) [(1, 2, 0, 4), (1, 2, 1, None), (2, 3, 0, 8), (3, 4, 0, None), (4, 5, 0, None)] **Reporting:** Simple graph information is obtained using methods. Iterator versions of many reporting methods exist for efficiency. Methods exist for reporting nodes(), edges(), neighbors() and degree() as well as the number of nodes and edges. For details on these and other miscellaneous methods, see below. **Subclasses (Advanced):** The MultiGraph class uses a dict-of-dict-of-dict-of-dict data structure. The outer dict (node_dict) holds adjacency lists keyed by node. The next dict (adjlist) represents the adjacency list and holds edge_key dicts keyed by neighbor. The edge_key dict holds each edge_attr dict keyed by edge key. The inner dict (edge_attr) represents the edge data and holds edge attribute values keyed by attribute names. Each of these four dicts in the dict-of-dict-of-dict-of-dict structure can be replaced by a user defined dict-like object. In general, the dict-like features should be maintained but extra features can be added. To replace one of the dicts create a new graph class by changing the class(!) variable holding the factory for that dict-like structure. The variable names are node_dict_factory, adjlist_dict_factory, edge_key_dict_factory and edge_attr_dict_factory. node_dict_factory : function, (default: dict) Factory function to be used to create the outer-most dict in the data structure that holds adjacency lists keyed by node. 
It should require no arguments and return a dict-like object. adjlist_dict_factory : function, (default: dict) Factory function to be used to create the adjacency list dict which holds multiedge key dicts keyed by neighbor. It should require no arguments and return a dict-like object. edge_key_dict_factory : function, (default: dict) Factory function to be used to create the edge key dict which holds edge data keyed by edge key. It should require no arguments and return a dict-like object. edge_attr_dict_factory : function, (default: dict) Factory function to be used to create the edge attribute dict which holds attribute values keyed by attribute name. It should require no arguments and return a dict-like object. Examples -------- Create a multigraph object that tracks the order nodes are added. >>> from collections import OrderedDict >>> class OrderedGraph(nx.MultiGraph): ... node_dict_factory = OrderedDict >>> G = OrderedGraph() >>> G.add_nodes_from( (2,1) ) >>> G.nodes() [2, 1] >>> G.add_edges_from( ((2,2), (2,1), (2,1), (1,1)) ) >>> G.edges() [(2, 1), (2, 1), (2, 2), (1, 1)] Create a multigraph object that tracks the order nodes are added and for each node tracks the order that neighbors are added and for each neighbor tracks the order that multiedges are added. >>> class OrderedGraph(nx.MultiGraph): ... node_dict_factory = OrderedDict ... adjlist_dict_factory = OrderedDict ... 
edge_key_dict_factory = OrderedDict >>> G = OrderedGraph() >>> G.add_nodes_from( (2,1) ) >>> G.nodes() [2, 1] >>> G.add_edges_from( ((2,2), (2,1,2,{'weight':0.1}), (2,1,1,{'weight':0.2}), (1,1)) ) >>> G.edges(keys=True) [(2, 2, 0), (2, 1, 2), (2, 1, 1), (1, 1, 0)] """ # node_dict_factory=dict # already assigned in Graph # adjlist_dict_factory=dict edge_key_dict_factory = dict # edge_attr_dict_factory=dict def __init__(self, data=None, **attr): self.edge_key_dict_factory = self.edge_key_dict_factory Graph.__init__(self, data, **attr) def add_edge(self, u, v, key=None, attr_dict=None, **attr): """Add an edge between u and v. The nodes u and v will be automatically added if they are not already in the graph. Edge attributes can be specified with keywords or by providing a dictionary with key/value pairs. See examples below. Parameters ---------- u, v : nodes Nodes can be, for example, strings or numbers. Nodes must be hashable (and not None) Python objects. key : hashable identifier, optional (default=lowest unused integer) Used to distinguish multiedges between a pair of nodes. attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with the edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edges_from : add a collection of edges Notes ----- To replace/update edge data, use the optional key argument to identify a unique edge. Otherwise a new edge will be created. NetworkX algorithms designed for weighted graphs cannot use multigraphs directly because it is not clear how to handle multiedge weights. Convert to Graph using edge attribute 'weight' to enable weighted graph algorithms. 
Examples -------- The following all add the edge e=(1,2) to graph G: >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> e = (1,2) >>> G.add_edge(1, 2) # explicit two-node form >>> G.add_edge(*e) # single edge as tuple of two nodes >>> G.add_edges_from( [(1,2)] ) # add edges from iterable container Associate data to edges using keywords: >>> G.add_edge(1, 2, weight=3) >>> G.add_edge(1, 2, key=0, weight=4) # update data for key=0 >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7) """ # set up attribute dict if attr_dict is None: attr_dict = attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError( "The attr_dict argument must be a dictionary.") # add nodes if u not in self.adj: self.adj[u] = self.adjlist_dict_factory() self.node[u] = {} if v not in self.adj: self.adj[v] = self.adjlist_dict_factory() self.node[v] = {} if v in self.adj[u]: keydict = self.adj[u][v] if key is None: # find a unique integer key # other methods might be better here? key = len(keydict) while key in keydict: key += 1 datadict = keydict.get(key, self.edge_attr_dict_factory()) datadict.update(attr_dict) keydict[key] = datadict else: # selfloops work this way without special treatment if key is None: key = 0 datadict = self.edge_attr_dict_factory() datadict.update(attr_dict) keydict = self.edge_key_dict_factory() keydict[key] = datadict self.adj[u][v] = keydict self.adj[v][u] = keydict def add_edges_from(self, ebunch, attr_dict=None, **attr): """Add all the edges in ebunch. Parameters ---------- ebunch : container of edges Each edge given in the container will be added to the graph. The edges can be: - 2-tuples (u,v) or - 3-tuples (u,v,d) for an edge attribute dict d, or - 4-tuples (u,v,k,d) for an edge identified by key k attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with each edge. 
attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edge : add a single edge add_weighted_edges_from : convenient way to add weighted edges Notes ----- Adding the same edge twice has no effect but any edge data will be updated when each duplicate edge is added. Edge attributes specified in edges take precedence over attributes specified generally. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edges_from([(0,1),(1,2)]) # using a list of edge tuples >>> e = zip(range(0,3),range(1,4)) >>> G.add_edges_from(e) # Add the path graph 0-1-2-3 Associate data to edges >>> G.add_edges_from([(1,2),(2,3)], weight=3) >>> G.add_edges_from([(3,4),(1,4)], label='WN2898') """ # set up attribute dict if attr_dict is None: attr_dict = attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError( "The attr_dict argument must be a dictionary.") # process ebunch for e in ebunch: ne = len(e) if ne == 4: u, v, key, dd = e elif ne == 3: u, v, dd = e key = None elif ne == 2: u, v = e dd = {} key = None else: raise NetworkXError( "Edge tuple %s must be a 2-tuple, 3-tuple or 4-tuple." % (e,)) ddd = {} ddd.update(attr_dict) ddd.update(dd) self.add_edge(u, v, key, ddd) def remove_edge(self, u, v, key=None): """Remove an edge between u and v. Parameters ---------- u, v : nodes Remove an edge between nodes u and v. key : hashable identifier, optional (default=None) Used to distinguish multiple edges between a pair of nodes. If None remove a single (arbitrary) edge between u and v. Raises ------ NetworkXError If there is not an edge between u and v, or if there is no edge with the specified key. 
See Also -------- remove_edges_from : remove a collection of edges Examples -------- >>> G = nx.MultiGraph() >>> G.add_path([0,1,2,3]) >>> G.remove_edge(0,1) >>> e = (1,2) >>> G.remove_edge(*e) # unpacks e from an edge tuple For multiple edges >>> G = nx.MultiGraph() # or MultiDiGraph, etc >>> G.add_edges_from([(1,2),(1,2),(1,2)]) >>> G.remove_edge(1,2) # remove a single (arbitrary) edge For edges with keys >>> G = nx.MultiGraph() # or MultiDiGraph, etc >>> G.add_edge(1,2,key='first') >>> G.add_edge(1,2,key='second') >>> G.remove_edge(1,2,key='second') """ try: d = self.adj[u][v] except (KeyError): raise NetworkXError( "The edge %s-%s is not in the graph." % (u, v)) # remove the edge with specified data if key is None: d.popitem() else: try: del d[key] except (KeyError): raise NetworkXError( "The edge %s-%s with key %s is not in the graph." % ( u, v, key)) if len(d) == 0: # remove the key entries if last edge del self.adj[u][v] if u!=v: # check for selfloop del self.adj[v][u] def remove_edges_from(self, ebunch): """Remove all edges specified in ebunch. Parameters ---------- ebunch: list or container of edge tuples Each edge given in the list or container will be removed from the graph. The edges can be: - 2-tuples (u,v) All edges between u and v are removed. - 3-tuples (u,v,key) The edge identified by key is removed. - 4-tuples (u,v,key,data) where data is ignored. See Also -------- remove_edge : remove a single edge Notes ----- Will fail silently if an edge in ebunch is not in the graph. 
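The removal logic in remove_edge() above can be sketched on a plain dict-of-dict-of-dict multigraph adjacency (a stand-in for the class internals, with an invented two-node graph): key=None pops an arbitrary parallel edge, a given key removes that specific edge, and the whole entry for the node pair is dropped once the last parallel edge is gone.

```python
# Parallel edges 1-2 under keys 0 and 'second'; both adjacency rows
# share the same keydict, as in the real data structure.
keydict = {0: {}, 'second': {}}
adj = {1: {2: keydict}, 2: {1: keydict}}

def remove_edge(u, v, key=None):
    d = adj[u][v]
    if key is None:
        d.popitem()          # drop an arbitrary parallel edge
    else:
        del d[key]           # drop the specific keyed edge
    if not d:                # last parallel edge removed: clean up
        del adj[u][v]
        if u != v:           # a self-loop has only one entry
            del adj[v][u]

remove_edge(1, 2, key='second')  # key 0 remains
remove_edge(1, 2)                # last edge gone; both rows cleaned up
```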
Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> ebunch=[(1,2),(2,3)] >>> G.remove_edges_from(ebunch) Removing multiple copies of edges >>> G = nx.MultiGraph() >>> G.add_edges_from([(1,2),(1,2),(1,2)]) >>> G.remove_edges_from([(1,2),(1,2)]) >>> G.edges() [(1, 2)] >>> G.remove_edges_from([(1,2),(1,2)]) # silently ignore extra copy >>> G.edges() # now empty graph [] """ for e in ebunch: try: self.remove_edge(*e[:3]) except NetworkXError: pass def has_edge(self, u, v, key=None): """Return True if the graph has an edge between nodes u and v. Parameters ---------- u, v : nodes Nodes can be, for example, strings or numbers. key : hashable identifier, optional (default=None) If specified return True only if the edge with key is found. Returns ------- edge_ind : bool True if edge is in the graph, False otherwise. Examples -------- Can be called either using two nodes u,v, an edge tuple (u,v), or an edge tuple (u,v,key). >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.has_edge(0,1) # using two nodes True >>> e = (0,1) >>> G.has_edge(*e) # e is a 2-tuple (u,v) True >>> G.add_edge(0,1,key='a') >>> G.has_edge(0,1,key='a') # specify key True >>> e=(0,1,'a') >>> G.has_edge(*e) # e is a 3-tuple (u,v,'a') True The following syntaxes are equivalent: >>> G.has_edge(0,1) True >>> 1 in G[0] # though this gives KeyError if 0 not in G True """ try: if key is None: return v in self.adj[u] else: return key in self.adj[u][v] except KeyError: return False def edges(self, nbunch=None, data=False, keys=False, default=None): """Return a list of edges. Edges are returned as tuples with optional data and keys in the order (node, neighbor, key, data). Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : bool, optional (default=False) Return two tuples (u,v) (False) or three-tuples (u,v,data) (True). 
keys : bool, optional (default=False) Return two tuples (u,v) (False) or three-tuples (u,v,key) (True). Returns ------- edge_list: list of edge tuples Edges that are adjacent to any node in nbunch, or a list of all edges if nbunch is not specified. See Also -------- edges_iter : return an iterator over the edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2]) >>> G.add_edge(2,3,weight=5) >>> G.edges() [(0, 1), (1, 2), (2, 3)] >>> G.edges(data=True) # default edge data is {} (empty dictionary) [(0, 1, {}), (1, 2, {}), (2, 3, {'weight': 5})] >>> list(G.edges_iter(data='weight', default=1)) [(0, 1, 1), (1, 2, 1), (2, 3, 5)] >>> G.edges(keys=True) # default keys are integers [(0, 1, 0), (1, 2, 0), (2, 3, 0)] >>> G.edges(data=True,keys=True) # default keys are integers [(0, 1, 0, {}), (1, 2, 0, {}), (2, 3, 0, {'weight': 5})] >>> list(G.edges(data='weight',default=1,keys=True)) [(0, 1, 0, 1), (1, 2, 0, 1), (2, 3, 0, 5)] >>> G.edges([0,3]) [(0, 1), (3, 2)] >>> G.edges(0) [(0, 1)] """ return list(self.edges_iter(nbunch, data, keys, default)) def edges_iter(self, nbunch=None, data=False, keys=False, default=None): """Return an iterator over the edges. Edges are returned as tuples with optional data and keys in the order (node, neighbor, key, data). Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : string or bool, optional (default=False) The edge attribute returned in 3-tuple (u,v,ddict[data]). If True, return edge attribute dict in 3-tuple (u,v,ddict). If False, return 2-tuple (u,v). default : value, optional (default=None) Value used for edges that don't have the requested attribute. Only relevant if data is not True or False. keys : bool, optional (default=False) If True, return edge keys with each edge. 
Returns ------- edge_iter : iterator An iterator of (u,v), (u,v,d) or (u,v,key,d) tuples of edges. See Also -------- edges : return a list of edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2]) >>> G.add_edge(2,3,weight=5) >>> [e for e in G.edges_iter()] [(0, 1), (1, 2), (2, 3)] >>> list(G.edges_iter(data=True)) # default data is {} (empty dict) [(0, 1, {}), (1, 2, {}), (2, 3, {'weight': 5})] >>> list(G.edges_iter(data='weight', default=1)) [(0, 1, 1), (1, 2, 1), (2, 3, 5)] >>> list(G.edges(keys=True)) # default keys are integers [(0, 1, 0), (1, 2, 0), (2, 3, 0)] >>> list(G.edges(data=True,keys=True)) # default keys are integers [(0, 1, 0, {}), (1, 2, 0, {}), (2, 3, 0, {'weight': 5})] >>> list(G.edges(data='weight',default=1,keys=True)) [(0, 1, 0, 1), (1, 2, 0, 1), (2, 3, 0, 5)] >>> list(G.edges_iter([0,3])) [(0, 1), (3, 2)] >>> list(G.edges_iter(0)) [(0, 1)] """ seen = {} # helper dict to keep track of multiply stored edges if nbunch is None: nodes_nbrs = self.adj.items() else: nodes_nbrs = ((n, self.adj[n]) for n in self.nbunch_iter(nbunch)) if data is True: for n, nbrs in nodes_nbrs: for nbr, keydict in nbrs.items(): if nbr not in seen: for key, ddict in keydict.items(): yield (n, nbr, key, ddict) if keys else (n, nbr, ddict) seen[n] = 1 elif data is not False: for n, nbrs in nodes_nbrs: for nbr, keydict in nbrs.items(): if nbr not in seen: for key, ddict in keydict.items(): d = ddict[data] if data in ddict else default yield (n, nbr, key, d) if keys else (n, nbr, d) seen[n] = 1 else: for n, nbrs in nodes_nbrs: for nbr, keydict in nbrs.items(): if nbr not in seen: for key in keydict: yield (n, nbr, key) if keys else (n, nbr) seen[n] = 1 del seen def get_edge_data(self, u, v, key=None, default=None): """Return the attribute dictionary associated with edge (u,v). 
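The `seen` bookkeeping used by edges_iter() above deserves a small illustration: in an undirected multigraph every edge is stored under both endpoints, so a node is marked seen once its adjacency row has been emitted, and later rows skip neighbors already reported. A plain-dict sketch (the three-node graph is invented for illustration):

```python
# Stand-in adjacency: edge 0-1 (key 0) and two parallel edges 1-2.
adj = {0: {1: {0: {}}},
       1: {0: {0: {}}, 2: {0: {}, 1: {}}},
       2: {1: {0: {}, 1: {}}}}

def iter_edges(adj):
    seen = {}
    for n, nbrs in adj.items():
        for nbr, keydict in nbrs.items():
            if nbr not in seen:          # skip rows already reported
                for key in keydict:
                    yield (n, nbr, key)
        seen[n] = 1                      # everything incident to n emitted

print(list(iter_edges(adj)))  # [(0, 1, 0), (1, 2, 0), (1, 2, 1)]
```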
Parameters ---------- u, v : nodes default : any Python object (default=None) Value to return if the edge (u,v) is not found. key : hashable identifier, optional (default=None) Return data only for the edge with specified key. Returns ------- edge_dict : dictionary The edge attribute dictionary. Notes ----- It is faster to use G[u][v][key]. >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_edge(0,1,key='a',weight=7) >>> G[0][1]['a'] # key='a' {'weight': 7} Warning: Assigning G[u][v][key] corrupts the graph data structure. But it is safe to assign attributes to that dictionary, >>> G[0][1]['a']['weight'] = 10 >>> G[0][1]['a']['weight'] 10 >>> G[1][0]['a']['weight'] 10 Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.get_edge_data(0,1) {0: {}} >>> e = (0,1) >>> G.get_edge_data(*e) # tuple form {0: {}} >>> G.get_edge_data('a','b',default=0) # edge not in graph, return 0 0 """ try: if key is None: return self.adj[u][v] else: return self.adj[u][v][key] except KeyError: return default def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree). The node degree is the number of edges adjacent to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). 
See Also -------- degree Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs = self.adj.items() else: nodes_nbrs = ((n, self.adj[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n, nbrs in nodes_nbrs: deg = sum([len(data) for data in nbrs.values()]) yield (n, deg + (n in nbrs and len(nbrs[n]))) else: # edge weighted graph - degree is sum of nbr edge weights for n, nbrs in nodes_nbrs: deg = sum([d.get(weight, 1) for data in nbrs.values() for d in data.values()]) if n in nbrs: deg += sum([d.get(weight, 1) for key, d in nbrs[n].items()]) yield (n, deg) def is_multigraph(self): """Return True if graph is a multigraph, False otherwise.""" return True def is_directed(self): """Return True if graph is directed, False otherwise.""" return False def to_directed(self): """Return a directed representation of the graph. Returns ------- G : MultiDiGraph A directed graph with the same name, same nodes, and with each edge (u,v,data) replaced by two directed edges (u,v,data) and (v,u,data). Notes ----- This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar D=DiGraph(G) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. Warning: If you have subclassed MultiGraph to use dict-like objects in the data structure, those changes do not transfer to the MultiDiGraph created by this method. 
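The multigraph degree rule implemented in degree_iter() above — every parallel edge counts, and a self-loop contributes twice — can be sketched on a plain dict-of-dict-of-dict adjacency (an invented stand-in, unweighted case only):

```python
# Node 1 has a self-loop and two parallel edges to node 2.
adj = {1: {1: {0: {}}, 2: {0: {}, 1: {}}},
       2: {1: {0: {}, 1: {}}}}

def degree(n):
    nbrs = adj[n]
    deg = sum(len(keydict) for keydict in nbrs.values())
    if n in nbrs:            # self-loop edges count twice
        deg += len(nbrs[n])
    return deg

print(degree(1))  # 2 parallel edges + self-loop counted twice = 4
```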
Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1), (1, 0)] If already directed, return a (deep) copy >>> G = nx.DiGraph() # or MultiDiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1)] """ from networkx.classes.multidigraph import MultiDiGraph G = MultiDiGraph() G.add_nodes_from(self) G.add_edges_from((u, v, key, deepcopy(datadict)) for u, nbrs in self.adjacency_iter() for v, keydict in nbrs.items() for key, datadict in keydict.items()) G.graph = deepcopy(self.graph) G.node = deepcopy(self.node) return G def selfloop_edges(self, data=False, keys=False, default=None): """Return a list of selfloop edges. A selfloop edge has the same node at both ends. Parameters ---------- data : bool, optional (default=False) Return selfloop edges as two tuples (u,v) (data=False) or three-tuples (u,v,datadict) (data=True) or three-tuples (u,v,datavalue) (data='attrname') default : value, optional (default=None) Value used for edges that don't have the requested attribute. Only relevant if data is not True or False. keys : bool, optional (default=False) If True, return edge keys with each edge. Returns ------- edgelist : list of edge tuples A list of all selfloop edges. 
See Also -------- nodes_with_selfloops, number_of_selfloops Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_edge(1,1) >>> G.add_edge(1,2) >>> G.selfloop_edges() [(1, 1)] >>> G.selfloop_edges(data=True) [(1, 1, {})] >>> G.selfloop_edges(keys=True) [(1, 1, 0)] >>> G.selfloop_edges(keys=True, data=True) [(1, 1, 0, {})] """ if data is True: if keys: return [(n, n, k, d) for n, nbrs in self.adj.items() if n in nbrs for k, d in nbrs[n].items()] else: return [(n, n, d) for n, nbrs in self.adj.items() if n in nbrs for d in nbrs[n].values()] elif data is not False: if keys: return [(n, n, k, d.get(data, default)) for n, nbrs in self.adj.items() if n in nbrs for k, d in nbrs[n].items()] else: return [(n, n, d.get(data, default)) for n, nbrs in self.adj.items() if n in nbrs for d in nbrs[n].values()] else: if keys: return [(n, n, k) for n, nbrs in self.adj.items() if n in nbrs for k in nbrs[n].keys()] else: return [(n, n) for n, nbrs in self.adj.items() if n in nbrs for d in nbrs[n].values()] def number_of_edges(self, u=None, v=None): """Return the number of edges between two nodes. Parameters ---------- u, v : nodes, optional (default=all edges) If u and v are specified, return the number of edges between u and v. Otherwise return the total number of all edges. Returns ------- nedges : int The number of edges in the graph. If nodes u and v are specified return the number of edges between those nodes. See Also -------- size Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.number_of_edges() 3 >>> G.number_of_edges(0,1) 1 >>> e = (0,1) >>> G.number_of_edges(*e) 1 """ if u is None: return self.size() try: edgedata = self.adj[u][v] except KeyError: return 0 # no such edge return len(edgedata) def subgraph(self, nbunch): """Return the subgraph induced on nodes in nbunch. The induced subgraph of the graph contains the nodes in nbunch and the edges between those nodes. 
Parameters ---------- nbunch : list, iterable A container of nodes which will be iterated through once. Returns ------- G : Graph A subgraph of the graph with the same edge attributes. Notes ----- The graph, edge or node attributes just point to the original graph. So changes to the node or edge structure will not be reflected in the original graph while changes to the attributes will. To create a subgraph with its own copy of the edge/node attributes use: nx.Graph(G.subgraph(nbunch)) If edge attributes are containers, a deep copy can be obtained using: G.subgraph(nbunch).copy() For an inplace reduction of a graph to a subgraph you can remove nodes: G.remove_nodes_from([ n in G if n not in set(nbunch)]) Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> H = G.subgraph([0,1,2]) >>> H.edges() [(0, 1), (1, 2)] """ bunch = self.nbunch_iter(nbunch) # create new graph and copy subgraph into it H = self.__class__() # copy node and attribute dictionaries for n in bunch: H.node[n] = self.node[n] # namespace shortcuts for speed H_adj = H.adj self_adj = self.adj # add nodes and edges (undirected method) for n in H: Hnbrs = H.adjlist_dict_factory() H_adj[n] = Hnbrs for nbr, edgedict in self_adj[n].items(): if nbr in H_adj: # add both representations of edge: n-nbr and nbr-n # they share the same edgedict ed = edgedict.copy() Hnbrs[nbr] = ed H_adj[nbr][n] = ed H.graph = self.graph return H networkx-1.11/networkx/classes/__init__.py0000644000175000017500000000026012637544450020632 0ustar aricaric00000000000000from .graph import Graph from .digraph import DiGraph from .multigraph import MultiGraph from .multidigraph import MultiDiGraph from .ordered import * from .function import * networkx-1.11/networkx/classes/graph.py0000644000175000017500000016401512653165361020203 0ustar aricaric00000000000000"""Base class for undirected graphs. 
The Graph class allows any hashable object as a node and can associate key/value attribute pairs with each undirected edge. Self-loops are allowed but multiple edges are not (see MultiGraph). For directed graphs see DiGraph and MultiDiGraph. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from copy import deepcopy import networkx as nx from networkx.exception import NetworkXError import networkx.convert as convert __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) class Graph(object): """ Base class for undirected graphs. A Graph stores nodes and edges with optional data, or attributes. Graphs hold undirected edges. Self loops are allowed but multiple (parallel) edges are not. Nodes can be arbitrary (hashable) Python objects with optional key/value attributes. Edges are represented as links between nodes with optional key/value attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- DiGraph MultiGraph MultiDiGraph Examples -------- Create an empty graph structure (a "null graph") with no nodes and no edges. >>> G = nx.Graph() G can be grown in several ways. **Nodes:** Add one node at a time: >>> G.add_node(1) Add the nodes from any container (a list, dict, set or even the lines from a file or the nodes from another graph). 
>>> G.add_nodes_from([2,3]) >>> G.add_nodes_from(range(100,110)) >>> H=nx.Graph() >>> H.add_path([0,1,2,3,4,5,6,7,8,9]) >>> G.add_nodes_from(H) In addition to strings and integers any hashable Python object (except None) can represent a node, e.g. a customized node object, or even another Graph. >>> G.add_node(H) **Edges:** G can also be grown by adding edges. Add one edge, >>> G.add_edge(1, 2) a list of edges, >>> G.add_edges_from([(1,2),(1,3)]) or a collection of edges, >>> G.add_edges_from(H.edges()) If some edges connect nodes not yet in the graph, the nodes are added automatically. There are no errors when adding nodes or edges that already exist. **Attributes:** Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). By default these are empty, but can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named graph, node and edge respectively. >>> G = nx.Graph(day="Friday") >>> G.graph {'day': 'Friday'} Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> del G.node[1]['room'] # remove attribute >>> G.nodes(data=True) [(1, {'time': '5pm'}), (3, {'time': '2pm'})] Warning: adding a node to G.node does not add it to the graph. Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2]['weight'] = 4.7 >>> G.edge[1][2]['weight'] = 4 **Shortcuts:** Many common graph features allow python syntax to speed reporting. 
>>> 1 in G # check if node in graph True >>> [n for n in G if n<3] # iterate through nodes [1, 2] >>> len(G) # number of nodes in graph 5 The fastest way to traverse all edges of a graph is via adjacency_iter(), but the edges() method is often more convenient. >>> for n,nbrsdict in G.adjacency_iter(): ... for nbr,eattr in nbrsdict.items(): ... if 'weight' in eattr: ... (n,nbr,eattr['weight']) (1, 2, 4) (2, 1, 4) (2, 3, 8) (3, 2, 8) >>> G.edges(data='weight') [(1, 2, 4), (2, 3, 8), (3, 4, None), (4, 5, None)] **Reporting:** Simple graph information is obtained using methods. Iterator versions of many reporting methods exist for efficiency. Methods exist for reporting nodes(), edges(), neighbors() and degree() as well as the number of nodes and edges. For details on these and other miscellaneous methods, see below. **Subclasses (Advanced):** The Graph class uses a dict-of-dict-of-dict data structure. The outer dict (node_dict) holds adjacency lists keyed by node. The next dict (adjlist) represents the adjacency list and holds edge data keyed by neighbor. The inner dict (edge_attr) represents the edge data and holds edge attribute values keyed by attribute names. Each of these three dicts can be replaced by a user defined dict-like object. In general, the dict-like features should be maintained but extra features can be added. To replace one of the dicts create a new graph class by changing the class(!) variable holding the factory for that dict-like structure. The variable names are node_dict_factory, adjlist_dict_factory and edge_attr_dict_factory. node_dict_factory : function, (default: dict) Factory function to be used to create the outer-most dict in the data structure that holds adjacency lists keyed by node. It should require no arguments and return a dict-like object. adjlist_dict_factory : function, (default: dict) Factory function to be used to create the adjacency list dict which holds edge data keyed by neighbor. 
        It should require no arguments and return a dict-like object.

    edge_attr_dict_factory : function, (default: dict)
        Factory function to be used to create the edge attribute
        dict which holds attribute values keyed by attribute name.
        It should require no arguments and return a dict-like object.

    Examples
    --------
    Create a graph object that tracks the order nodes are added.

    >>> from collections import OrderedDict
    >>> class OrderedNodeGraph(nx.Graph):
    ...     node_dict_factory = OrderedDict
    >>> G = OrderedNodeGraph()
    >>> G.add_nodes_from( (2,1) )
    >>> G.nodes()
    [2, 1]
    >>> G.add_edges_from( ((2,2), (2,1), (1,1)) )
    >>> G.edges()
    [(2, 1), (2, 2), (1, 1)]

    Create a graph object that tracks the order nodes are added
    and for each node track the order that neighbors are added.

    >>> class OrderedGraph(nx.Graph):
    ...     node_dict_factory = OrderedDict
    ...     adjlist_dict_factory = OrderedDict
    >>> G = OrderedGraph()
    >>> G.add_nodes_from( (2,1) )
    >>> G.nodes()
    [2, 1]
    >>> G.add_edges_from( ((2,2), (2,1), (1,1)) )
    >>> G.edges()
    [(2, 2), (2, 1), (1, 1)]

    Create a low memory graph class that effectively disallows edge
    attributes by using a single attribute dict for all edges.
    This reduces the memory used, but you lose edge attributes.

    >>> class ThinGraph(nx.Graph):
    ...     all_edge_dict = {'weight': 1}
    ...     def single_edge_dict(self):
    ...         return self.all_edge_dict
    ...     edge_attr_dict_factory = single_edge_dict
    >>> G = ThinGraph()
    >>> G.add_edge(2,1)
    >>> G.edges(data=True)
    [(1, 2, {'weight': 1})]
    >>> G.add_edge(2,2)
    >>> G[2][1] is G[2][2]
    True
    """
    node_dict_factory = dict
    adjlist_dict_factory = dict
    edge_attr_dict_factory = dict

    def __init__(self, data=None, **attr):
        """Initialize a graph with edges, name, graph attributes.

        Parameters
        ----------
        data : input graph
            Data to initialize graph.  If data=None (default) an empty
            graph is created.  The data can be an edge list, or any
            NetworkX graph object.
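The factory hooks described above amount to class-level attributes consulted whenever a new inner dict is needed. A minimal sketch with a hypothetical `MiniGraph` class (not part of NetworkX) showing how swapping `node_dict_factory` changes the container used for the outer adjacency dict:

```python
from collections import OrderedDict

class MiniGraph:
    # Class(!) variables, exactly like the real factories described above.
    node_dict_factory = dict
    adjlist_dict_factory = dict

    def __init__(self):
        self.adj = self.node_dict_factory()     # outer dict, keyed by node

    def add_edge(self, u, v):
        for n in (u, v):
            if n not in self.adj:
                self.adj[n] = self.adjlist_dict_factory()
        # store both directions; they share one (empty) attribute dict
        self.adj[u][v] = self.adj[v][u] = {}

class OrderedMiniGraph(MiniGraph):
    node_dict_factory = OrderedDict   # nodes now iterate in insertion order

g = OrderedMiniGraph()
g.add_edge(2, 1)
g.add_edge(2, 3)
print(list(g.adj))  # [2, 1, 3]
```

Note that on Python 3.7+ a plain `dict` already preserves insertion order, so the effect of `OrderedDict` here is mainly its extra methods and explicit intent.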
If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. name : string, optional (default='') An optional name for the graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- convert Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G = nx.Graph(name='my graph') >>> e = [(1,2),(2,3),(3,4)] # list of edges >>> G = nx.Graph(e) Arbitrary graph attribute pairs (key=value) may be assigned >>> G=nx.Graph(e, day="Friday") >>> G.graph {'day': 'Friday'} """ self.node_dict_factory = ndf = self.node_dict_factory self.adjlist_dict_factory = self.adjlist_dict_factory self.edge_attr_dict_factory = self.edge_attr_dict_factory self.graph = {} # dictionary for graph attributes self.node = ndf() # empty node attribute dict self.adj = ndf() # empty adjacency dict # attempt to load graph with data if data is not None: convert.to_networkx_graph(data, create_using=self) # load graph attributes (must be after convert) self.graph.update(attr) self.edge = self.adj @property def name(self): return self.graph.get('name', '') @name.setter def name(self, s): self.graph['name'] = s def __str__(self): """Return the graph name. Returns ------- name : string The name of the graph. Examples -------- >>> G = nx.Graph(name='foo') >>> str(G) 'foo' """ return self.name def __iter__(self): """Iterate over the nodes. Use the expression 'for n in G'. Returns ------- niter : iterator An iterator over all nodes in the graph. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) """ return iter(self.node) def __contains__(self, n): """Return True if n is a node, False otherwise. Use the expression 'n in G'. 
Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> 1 in G True """ try: return n in self.node except TypeError: return False def __len__(self): """Return the number of nodes. Use the expression 'len(G)'. Returns ------- nnodes : int The number of nodes in the graph. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> len(G) 4 """ return len(self.node) def __getitem__(self, n): """Return a dict of neighbors of node n. Use the expression 'G[n]'. Parameters ---------- n : node A node in the graph. Returns ------- adj_dict : dictionary The adjacency dictionary for nodes connected to n. Notes ----- G[n] is similar to G.neighbors(n) but the internal data dictionary is returned instead of a list. Assigning G[n] will corrupt the internal graph data structure. Use G[n] for reading data only. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G[0] {1: {}} """ return self.adj[n] def add_node(self, n, attr_dict=None, **attr): """Add a single node n and update node attributes. Parameters ---------- n : node A node can be any hashable Python object except None. attr_dict : dictionary, optional (default= no attributes) Dictionary of node attributes. Key/value pairs will update existing data associated with the node. attr : keyword arguments, optional Set or change attributes using key=value. See Also -------- add_nodes_from Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_node(1) >>> G.add_node('Hello') >>> K3 = nx.Graph([(0,1),(1,2),(2,0)]) >>> G.add_node(K3) >>> G.number_of_nodes() 3 Use keywords set/change node attributes: >>> G.add_node(1,size=10) >>> G.add_node(3,weight=0.4,UTM=('13S',382871,3972649)) Notes ----- A hashable object is one that can be used as a key in a Python dictionary. This includes strings, numbers, tuples of strings and numbers, etc. 
On many platforms hashable items also include mutables such as NetworkX Graphs, though one should be careful that the hash doesn't change on mutables. """ # set up attribute dict if attr_dict is None: attr_dict = attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError( "The attr_dict argument must be a dictionary.") if n not in self.node: self.adj[n] = self.adjlist_dict_factory() self.node[n] = attr_dict else: # update attr even if node already exists self.node[n].update(attr_dict) def add_nodes_from(self, nodes, **attr): """Add multiple nodes. Parameters ---------- nodes : iterable container A container of nodes (list, dict, set, etc.). OR A container of (node, attribute dict) tuples. Node attributes are updated using the attribute dict. attr : keyword arguments, optional (default= no attributes) Update attributes for all nodes in nodes. Node attributes specified in nodes as a tuple take precedence over attributes specified generally. See Also -------- add_node Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_nodes_from('Hello') >>> K3 = nx.Graph([(0,1),(1,2),(2,0)]) >>> G.add_nodes_from(K3) >>> sorted(G.nodes(),key=str) [0, 1, 2, 'H', 'e', 'l', 'o'] Use keywords to update specific node attributes for every node. >>> G.add_nodes_from([1,2], size=10) >>> G.add_nodes_from([3,4], weight=0.4) Use (node, attrdict) tuples to update attributes for specific nodes. 
        >>> G.add_nodes_from([(1,dict(size=11)), (2,{'color':'blue'})])
        >>> G.node[1]['size']
        11
        >>> H = nx.Graph()
        >>> H.add_nodes_from(G.nodes(data=True))
        >>> H.node[1]['size']
        11
        """
        for n in nodes:
            # keep all this inside try/except because
            # CPython throws TypeError on "n not in self.node",
            # while pre-2.7.5 IronPython throws on "self.node[n]"
            try:
                if n not in self.node:
                    self.adj[n] = self.adjlist_dict_factory()
                    self.node[n] = attr.copy()
                else:
                    self.node[n].update(attr)
            except TypeError:
                nn, ndict = n
                if nn not in self.node:
                    self.adj[nn] = self.adjlist_dict_factory()
                    newdict = attr.copy()
                    newdict.update(ndict)
                    self.node[nn] = newdict
                else:
                    olddict = self.node[nn]
                    olddict.update(attr)
                    olddict.update(ndict)

    def remove_node(self, n):
        """Remove node n.

        Removes the node n and all adjacent edges.
        Attempting to remove a non-existent node will raise an exception.

        Parameters
        ----------
        n : node
            A node in the graph

        Raises
        ------
        NetworkXError
            If n is not in the graph.

        See Also
        --------
        remove_nodes_from

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> G.edges()
        [(0, 1), (1, 2)]
        >>> G.remove_node(1)
        >>> G.edges()
        []
        """
        adj = self.adj
        try:
            nbrs = list(adj[n].keys())  # keys handles self-loops (allow mutation later)
            del self.node[n]
        except KeyError:  # NetworkXError if n not in self
            raise NetworkXError("The node %s is not in the graph." % (n,))
        for u in nbrs:
            del adj[u][n]   # remove all edges n-u in graph
        del adj[n]          # now remove node

    def remove_nodes_from(self, nodes):
        """Remove multiple nodes.

        Parameters
        ----------
        nodes : iterable container
            A container of nodes (list, dict, set, etc.).  If a node
            in the container is not in the graph it is silently ignored.
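The removal logic above must delete both stored directions of every incident edge, and it snapshots the neighbor keys first so the dicts can be mutated inside the loop. A sketch of the same idea on a bare dict-of-dicts (hypothetical `remove_node` helper, not the class method):

```python
# Path 0-1-2 plus a self-loop at node 1, as a plain undirected
# adjacency dict: each edge appears under both endpoints.
adj = {0: {1: {}}, 1: {0: {}, 1: {}, 2: {}}, 2: {1: {}}}

def remove_node(adj, n):
    nbrs = list(adj[n])        # copy the keys: we mutate these dicts below
    for u in nbrs:
        del adj[u][n]          # u == n covers the self-loop entry
    del adj[n]                 # finally drop the node's own row

remove_node(adj, 1)
print(adj)  # {0: {}, 2: {}}
```

Iterating over `adj[n]` directly while deleting from it would raise a RuntimeError, which is why the `list(...)` copy matters.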
        See Also
        --------
        remove_node

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> e = G.nodes()
        >>> e
        [0, 1, 2]
        >>> G.remove_nodes_from(e)
        >>> G.nodes()
        []
        """
        adj = self.adj
        for n in nodes:
            try:
                del self.node[n]
                for u in list(adj[n].keys()):   # keys() handles self-loops
                    del adj[u][n]               # (allows mutation of dict in loop)
                del adj[n]
            except KeyError:
                pass

    def nodes_iter(self, data=False):
        """Return an iterator over the nodes.

        Parameters
        ----------
        data : boolean, optional (default=False)
            If False the iterator returns nodes.  If True
            return a two-tuple of node and node data dictionary.

        Returns
        -------
        niter : iterator
            An iterator over nodes.  If data=True the iterator gives
            two-tuples containing (node, node data dictionary).

        Notes
        -----
        If the node data is not required it is simpler and equivalent
        to use the expression 'for n in G'.

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> [d for n,d in G.nodes_iter(data=True)]
        [{}, {}, {}]
        """
        if data:
            return iter(self.node.items())
        return iter(self.node)

    def nodes(self, data=False):
        """Return a list of the nodes in the graph.

        Parameters
        ----------
        data : boolean, optional (default=False)
            If False return a list of nodes.  If True return a
            two-tuple of node and node data dictionary.

        Returns
        -------
        nlist : list
            A list of nodes.  If data=True a list of two-tuples containing
            (node, node data dictionary).

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> G.nodes()
        [0, 1, 2]
        >>> G.add_node(1, time='5pm')
        >>> G.nodes(data=True)
        [(0, {}), (1, {'time': '5pm'}), (2, {})]
        """
        return list(self.nodes_iter(data=data))

    def number_of_nodes(self):
        """Return the number of nodes in the graph.

        Returns
        -------
        nnodes : int
            The number of nodes in the graph.
See Also -------- order, __len__ which are identical Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2]) >>> len(G) 3 """ return len(self.node) def order(self): """Return the number of nodes in the graph. Returns ------- nnodes : int The number of nodes in the graph. See Also -------- number_of_nodes, __len__ which are identical """ return len(self.node) def has_node(self, n): """Return True if the graph contains the node n. Parameters ---------- n : node Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2]) >>> G.has_node(0) True It is more readable and simpler to use >>> 0 in G True """ try: return n in self.node except TypeError: return False def add_edge(self, u, v, attr_dict=None, **attr): """Add an edge between u and v. The nodes u and v will be automatically added if they are not already in the graph. Edge attributes can be specified with keywords or by providing a dictionary with key/value pairs. See examples below. Parameters ---------- u, v : nodes Nodes can be, for example, strings or numbers. Nodes must be hashable (and not None) Python objects. attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with the edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edges_from : add a collection of edges Notes ----- Adding an edge that already exists updates the edge data. Many NetworkX algorithms designed for weighted graphs use as the edge weight a numerical value assigned to a keyword which by default is 'weight'. 
        Examples
        --------
        The following all add the edge e=(1,2) to graph G:

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> e = (1,2)
        >>> G.add_edge(1, 2)            # explicit two-node form
        >>> G.add_edge(*e)              # single edge as tuple of two nodes
        >>> G.add_edges_from( [(1,2)] ) # add edges from iterable container

        Associate data to edges using keywords:

        >>> G.add_edge(1, 2, weight=3)
        >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7)
        """
        # set up attribute dictionary
        if attr_dict is None:
            attr_dict = attr
        else:
            try:
                attr_dict.update(attr)
            except AttributeError:
                raise NetworkXError(
                    "The attr_dict argument must be a dictionary.")
        # add nodes
        if u not in self.node:
            self.adj[u] = self.adjlist_dict_factory()
            self.node[u] = {}
        if v not in self.node:
            self.adj[v] = self.adjlist_dict_factory()
            self.node[v] = {}
        # add the edge
        datadict = self.adj[u].get(v, self.edge_attr_dict_factory())
        datadict.update(attr_dict)
        self.adj[u][v] = datadict
        self.adj[v][u] = datadict

    def add_edges_from(self, ebunch, attr_dict=None, **attr):
        """Add all the edges in ebunch.

        Parameters
        ----------
        ebunch : container of edges
            Each edge given in the container will be added to the
            graph. The edges must be given as 2-tuples (u,v) or
            3-tuples (u,v,d) where d is a dictionary containing edge data.
        attr_dict : dictionary, optional (default= no attributes)
            Dictionary of edge attributes.  Key/value pairs will
            update existing data associated with each edge.
        attr : keyword arguments, optional
            Edge data (or labels or objects) can be assigned using
            keyword arguments.

        See Also
        --------
        add_edge : add a single edge
        add_weighted_edges_from : convenient way to add weighted edges

        Notes
        -----
        Adding the same edge twice has no effect but any edge data
        will be updated when each duplicate edge is added.

        Edge attributes specified in edges take precedence
        over attributes specified generally.
        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edges_from([(0,1),(1,2)]) # using a list of edge tuples
        >>> e = zip(range(0,3),range(1,4))
        >>> G.add_edges_from(e) # Add the path graph 0-1-2-3

        Associate data to edges

        >>> G.add_edges_from([(1,2),(2,3)], weight=3)
        >>> G.add_edges_from([(3,4),(1,4)], label='WN2898')
        """
        # set up attribute dict
        if attr_dict is None:
            attr_dict = attr
        else:
            try:
                attr_dict.update(attr)
            except AttributeError:
                raise NetworkXError(
                    "The attr_dict argument must be a dictionary.")
        # process ebunch
        for e in ebunch:
            ne = len(e)
            if ne == 3:
                u, v, dd = e
            elif ne == 2:
                u, v = e
                dd = {}  # doesn't need edge_attr_dict_factory
            else:
                raise NetworkXError(
                    "Edge tuple %s must be a 2-tuple or 3-tuple." % (e,))
            if u not in self.node:
                self.adj[u] = self.adjlist_dict_factory()
                self.node[u] = {}
            if v not in self.node:
                self.adj[v] = self.adjlist_dict_factory()
                self.node[v] = {}
            datadict = self.adj[u].get(v, self.edge_attr_dict_factory())
            datadict.update(attr_dict)
            datadict.update(dd)
            self.adj[u][v] = datadict
            self.adj[v][u] = datadict

    def add_weighted_edges_from(self, ebunch, weight='weight', **attr):
        """Add all the edges in ebunch as weighted edges with
        specified weights.

        Parameters
        ----------
        ebunch : container of edges
            Each edge given in the list or container will be added
            to the graph. The edges must be given as 3-tuples (u,v,w)
            where w is a number.
        weight : string, optional (default= 'weight')
            The attribute name for the edge weights to be added.
        attr : keyword arguments, optional (default= no attributes)
            Edge attributes to add/update for all edges.

        See Also
        --------
        add_edge : add a single edge
        add_edges_from : add multiple edges

        Notes
        -----
        Adding the same edge twice for Graph/DiGraph simply updates
        the edge data.  For MultiGraph/MultiDiGraph, duplicate edges
        are stored.
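The `add_weighted_edges_from` body shown below this docstring is a thin wrapper: it rewraps each `(u, v, w)` 3-tuple as `(u, v, {weight: w})` and delegates to `add_edges_from`. A sketch of just that rewrapping step (hypothetical `as_weighted` helper for illustration):

```python
ebunch = [(0, 1, 3.0), (1, 2, 7.5)]

def as_weighted(ebunch, weight='weight'):
    """Rewrap (u, v, w) tuples into (u, v, {weight: w}) dict form."""
    return [(u, v, {weight: d}) for u, v, d in ebunch]

print(as_weighted(ebunch))
# [(0, 1, {'weight': 3.0}), (1, 2, {'weight': 7.5})]
print(as_weighted(ebunch, weight='cost'))
# [(0, 1, {'cost': 3.0}), (1, 2, {'cost': 7.5})]
```

Passing a different `weight` name is how the same edges can carry several independent weight attributes.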
Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_weighted_edges_from([(0,1,3.0),(1,2,7.5)]) """ self.add_edges_from(((u, v, {weight: d}) for u, v, d in ebunch), **attr) def remove_edge(self, u, v): """Remove the edge between u and v. Parameters ---------- u, v : nodes Remove the edge between nodes u and v. Raises ------ NetworkXError If there is not an edge between u and v. See Also -------- remove_edges_from : remove a collection of edges Examples -------- >>> G = nx.Graph() # or DiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.remove_edge(0,1) >>> e = (1,2) >>> G.remove_edge(*e) # unpacks e from an edge tuple >>> e = (2,3,{'weight':7}) # an edge with attribute data >>> G.remove_edge(*e[:2]) # select first part of edge tuple """ try: del self.adj[u][v] if u != v: # self-loop needs only one entry removed del self.adj[v][u] except KeyError: raise NetworkXError("The edge %s-%s is not in the graph" % (u, v)) def remove_edges_from(self, ebunch): """Remove all edges specified in ebunch. Parameters ---------- ebunch: list or container of edge tuples Each edge given in the list or container will be removed from the graph. The edges can be: - 2-tuples (u,v) edge between u and v. - 3-tuples (u,v,k) where k is ignored. See Also -------- remove_edge : remove a single edge Notes ----- Will fail silently if an edge in ebunch is not in the graph. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> ebunch=[(1,2),(2,3)] >>> G.remove_edges_from(ebunch) """ adj = self.adj for e in ebunch: u, v = e[:2] # ignore edge data if present if u in adj and v in adj[u]: del adj[u][v] if u != v: # self loop needs only one entry removed del adj[v][u] def has_edge(self, u, v): """Return True if the edge (u,v) is in the graph. Parameters ---------- u, v : nodes Nodes can be, for example, strings or numbers. Nodes must be hashable (and not None) Python objects. 
Returns ------- edge_ind : bool True if edge is in the graph, False otherwise. Examples -------- Can be called either using two nodes u,v or edge tuple (u,v) >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.has_edge(0,1) # using two nodes True >>> e = (0,1) >>> G.has_edge(*e) # e is a 2-tuple (u,v) True >>> e = (0,1,{'weight':7}) >>> G.has_edge(*e[:2]) # e is a 3-tuple (u,v,data_dictionary) True The following syntax are all equivalent: >>> G.has_edge(0,1) True >>> 1 in G[0] # though this gives KeyError if 0 not in G True """ try: return v in self.adj[u] except KeyError: return False def neighbors(self, n): """Return a list of the nodes connected to the node n. Parameters ---------- n : node A node in the graph Returns ------- nlist : list A list of nodes that are adjacent to n. Raises ------ NetworkXError If the node n is not in the graph. Notes ----- It is usually more convenient (and faster) to access the adjacency dictionary as G[n]: >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edge('a','b',weight=7) >>> G['a'] {'b': {'weight': 7}} Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.neighbors(0) [1] """ try: return list(self.adj[n]) except KeyError: raise NetworkXError("The node %s is not in the graph." % (n,)) def neighbors_iter(self, n): """Return an iterator over all neighbors of node n. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> [n for n in G.neighbors_iter(0)] [1] Notes ----- It is faster to use the idiom "in G[0]", e.g. >>> G = nx.path_graph(4) >>> [n for n in G[0]] [1] """ try: return iter(self.adj[n]) except KeyError: raise NetworkXError("The node %s is not in the graph." % (n,)) def edges(self, nbunch=None, data=False, default=None): """Return a list of edges. Edges are returned as tuples with optional data in the order (node, neighbor, data). 
        Parameters
        ----------
        nbunch : iterable container, optional (default= all nodes)
            A container of nodes.  The container will be iterated
            through once.
        data : string or bool, optional (default=False)
            The edge attribute returned in 3-tuple (u,v,ddict[data]).
            If True, return edge attribute dict in 3-tuple (u,v,ddict).
            If False, return 2-tuple (u,v).
        default : value, optional (default=None)
            Value used for edges that don't have the requested attribute.
            Only relevant if data is not True or False.

        Returns
        -------
        edge_list: list of edge tuples
            Edges that are adjacent to any node in nbunch, or a list
            of all edges if nbunch is not specified.

        See Also
        --------
        edges_iter : return an iterator over the edges

        Notes
        -----
        Nodes in nbunch that are not in the graph will be (quietly) ignored.
        For directed graphs this returns the out-edges.

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> G.add_edge(2,3,weight=5)
        >>> G.edges()
        [(0, 1), (1, 2), (2, 3)]
        >>> G.edges(data=True) # default edge data is {} (empty dictionary)
        [(0, 1, {}), (1, 2, {}), (2, 3, {'weight': 5})]
        >>> list(G.edges_iter(data='weight', default=1))
        [(0, 1, 1), (1, 2, 1), (2, 3, 5)]
        >>> G.edges([0,3])
        [(0, 1), (3, 2)]
        >>> G.edges(0)
        [(0, 1)]
        """
        return list(self.edges_iter(nbunch, data, default))

    def edges_iter(self, nbunch=None, data=False, default=None):
        """Return an iterator over the edges.

        Edges are returned as tuples with optional data
        in the order (node, neighbor, data).

        Parameters
        ----------
        nbunch : iterable container, optional (default= all nodes)
            A container of nodes.  The container will be iterated
            through once.
        data : string or bool, optional (default=False)
            The edge attribute returned in 3-tuple (u,v,ddict[data]).
            If True, return edge attribute dict in 3-tuple (u,v,ddict).
            If False, return 2-tuple (u,v).
        default : value, optional (default=None)
            Value used for edges that don't have the requested attribute.
            Only relevant if data is not True or False.
Returns ------- edge_iter : iterator An iterator of (u,v) or (u,v,d) tuples of edges. See Also -------- edges : return a list of edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1,2]) >>> G.add_edge(2,3,weight=5) >>> [e for e in G.edges_iter()] [(0, 1), (1, 2), (2, 3)] >>> list(G.edges_iter(data=True)) # default data is {} (empty dict) [(0, 1, {}), (1, 2, {}), (2, 3, {'weight': 5})] >>> list(G.edges_iter(data='weight', default=1)) [(0, 1, 1), (1, 2, 1), (2, 3, 5)] >>> list(G.edges_iter([0,3])) [(0, 1), (3, 2)] >>> list(G.edges_iter(0)) [(0, 1)] """ seen = {} # helper dict to keep track of multiply stored edges if nbunch is None: nodes_nbrs = self.adj.items() else: nodes_nbrs = ((n, self.adj[n]) for n in self.nbunch_iter(nbunch)) if data is True: for n, nbrs in nodes_nbrs: for nbr, ddict in nbrs.items(): if nbr not in seen: yield (n, nbr, ddict) seen[n] = 1 elif data is not False: for n, nbrs in nodes_nbrs: for nbr, ddict in nbrs.items(): if nbr not in seen: d = ddict[data] if data in ddict else default yield (n, nbr, d) seen[n] = 1 else: # data is False for n, nbrs in nodes_nbrs: for nbr in nbrs: if nbr not in seen: yield (n, nbr) seen[n] = 1 del seen def get_edge_data(self, u, v, default=None): """Return the attribute dictionary associated with edge (u,v). Parameters ---------- u, v : nodes default: any Python object (default=None) Value to return if the edge (u,v) is not found. Returns ------- edge_dict : dictionary The edge attribute dictionary. Notes ----- It is faster to use G[u][v]. >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G[0][1] {} Warning: Assigning G[u][v] corrupts the graph data structure. 
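The `seen` bookkeeping in the `edges_iter` body above exists because each undirected edge is stored twice (once under each endpoint); marking a node after its row is processed prevents double reporting. A stand-in sketch on a plain adjacency dict (hypothetical `edges` generator, not the class method):

```python
# Path 0-1-2 with one weighted edge, stored in both directions.
adj = {0: {1: {}}, 1: {0: {}, 2: {'weight': 5}}, 2: {1: {'weight': 5}}}

def edges(adj, data=False, default=None):
    seen = {}
    for n, nbrs in adj.items():
        for nbr, ddict in nbrs.items():
            if nbr not in seen:                  # nbr's own row not yet reported
                if data is True:
                    yield (n, nbr, ddict)        # whole attribute dict
                elif data is not False:
                    yield (n, nbr, ddict.get(data, default))
                else:
                    yield (n, nbr)
        seen[n] = 1                              # every edge at n is now reported

print(list(edges(adj)))                            # [(0, 1), (1, 2)]
print(list(edges(adj, data='weight', default=1)))  # [(0, 1, 1), (1, 2, 5)]
```

Note the three-way branch on `data` mirrors the method's True / attribute-name / False cases.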
But it is safe to assign attributes to that dictionary, >>> G[0][1]['weight'] = 7 >>> G[0][1]['weight'] 7 >>> G[1][0]['weight'] 7 Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.get_edge_data(0,1) # default edge data is {} {} >>> e = (0,1) >>> G.get_edge_data(*e) # tuple form {} >>> G.get_edge_data('a','b',default=0) # edge not in graph, return 0 0 """ try: return self.adj[u][v] except KeyError: return default def adjacency_list(self): """Return an adjacency list representation of the graph. The output adjacency list is in the order of G.nodes(). For directed graphs, only outgoing adjacencies are included. Returns ------- adj_list : lists of lists The adjacency structure of the graph as a list of lists. See Also -------- adjacency_iter Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.adjacency_list() # in order given by G.nodes() [[1], [0, 2], [1, 3], [2]] """ return list(map(list, iter(self.adj.values()))) def adjacency_iter(self): """Return an iterator of (node, adjacency dict) tuples for all nodes. This is the fastest way to look at every edge. For directed graphs, only outgoing adjacencies are included. Returns ------- adj_iter : iterator An iterator of (node, adjacency dictionary) for all nodes in the graph. See Also -------- adjacency_list Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> [(n,nbrdict) for n,nbrdict in G.adjacency_iter()] [(0, {1: {}}), (1, {0: {}, 2: {}}), (2, {1: {}, 3: {}}), (3, {2: {}})] """ return iter(self.adj.items()) def degree(self, nbunch=None, weight=None): """Return the degree of a node or nodes. The node degree is the number of edges adjacent to that node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. 
weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd : dictionary, or number A dictionary with nodes as keys and degree as values or a number if a single node is specified. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.degree(0) 1 >>> G.degree([0,1]) {0: 1, 1: 2} >>> list(G.degree([0,1]).values()) [1, 2] """ if nbunch in self: # return a single node return next(self.degree_iter(nbunch, weight))[1] else: # return a dict return dict(self.degree_iter(nbunch, weight)) def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree). The node degree is the number of edges adjacent to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). 
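The degree computation shown below this docstring counts a self-loop twice: once via `len(nbrs)` (or the weight sum, which already includes the loop) and once more via the `(n in nbrs)` correction. A sketch on a plain adjacency dict (hypothetical `degree` helper, not the class method):

```python
# Node 0 has a self-loop of weight 2 and an edge of weight 3 to node 1.
adj = {
    0: {0: {'weight': 2}, 1: {'weight': 3}},
    1: {0: {'weight': 3}},
}

def degree(adj, n, weight=None):
    nbrs = adj[n]
    if weight is None:
        # self-loop contributes 2 in total: one from len, one from the test
        return len(nbrs) + (n in nbrs)
    # weighted: the sum already counts the loop once; add it once more
    return (sum(d.get(weight, 1) for d in nbrs.values())
            + (nbrs[n].get(weight, 1) if n in nbrs else 0))

print(degree(adj, 0))             # 3  (edge + self-loop counted twice)
print(degree(adj, 0, 'weight'))   # 7  (3 + 2*2)
```

Missing weight attributes fall back to 1, matching the convention stated in the docstring.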
See Also -------- degree Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs = self.adj.items() else: nodes_nbrs = ((n, self.adj[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n, nbrs in nodes_nbrs: yield (n, len(nbrs) + (n in nbrs)) # return tuple (n,degree) else: # edge weighted graph - degree is sum of nbr edge weights for n, nbrs in nodes_nbrs: yield (n, sum((nbrs[nbr].get(weight, 1) for nbr in nbrs)) + (n in nbrs and nbrs[n].get(weight, 1))) def clear(self): """Remove all nodes and edges from the graph. This also removes the name, and all graph, node, and edge attributes. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.clear() >>> G.nodes() [] >>> G.edges() [] """ self.name = '' self.adj.clear() self.node.clear() self.graph.clear() def copy(self): """Return a copy of the graph. Returns ------- G : Graph A copy of the graph. See Also -------- to_directed: return a directed copy of the graph. Notes ----- This makes a complete copy of the graph including all of the node or edge attributes. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> H = G.copy() """ return deepcopy(self) def is_multigraph(self): """Return True if graph is a multigraph, False otherwise.""" return False def is_directed(self): """Return True if graph is directed, False otherwise.""" return False def to_directed(self): """Return a directed representation of the graph. Returns ------- G : DiGraph A directed graph with the same name, same nodes, and with each edge (u,v,data) replaced by two directed edges (u,v,data) and (v,u,data). 
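The edge-doubling that `to_directed` performs can be sketched over plain dicts (hypothetical data, not the method itself): the one shared data dict of an undirected edge becomes two independent deep copies, one per direction.

```python
from copy import deepcopy

# Undirected edge 0-1: both endpoints share ONE data dict, as in Graph.adj
d = {'w': 5}
undirected = {0: {1: d}, 1: {0: d}}

directed = {u: {} for u in undirected}
for u, nbrs in undirected.items():
    for v, data in nbrs.items():
        # each direction gets its own deep copy of the edge data
        directed[u][v] = deepcopy(data)
```

After this, mutating the (0, 1) data no longer affects the (1, 0) data, which is the "deepcopy" behavior the docstring contrasts with the shallow `DiGraph(G)` constructor.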
Notes ----- This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar D=DiGraph(G) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. Warning: If you have subclassed Graph to use dict-like objects in the data structure, those changes do not transfer to the DiGraph created by this method. Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1), (1, 0)] If already directed, return a (deep) copy >>> G = nx.DiGraph() # or MultiDiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1)] """ from networkx import DiGraph G = DiGraph() G.name = self.name G.add_nodes_from(self) G.add_edges_from(((u, v, deepcopy(data)) for u, nbrs in self.adjacency_iter() for v, data in nbrs.items())) G.graph = deepcopy(self.graph) G.node = deepcopy(self.node) return G def to_undirected(self): """Return an undirected copy of the graph. Returns ------- G : Graph/MultiGraph A deepcopy of the graph. See Also -------- copy, add_edge, add_edges_from Notes ----- This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar G=DiGraph(D) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1), (1, 0)] >>> G2 = H.to_undirected() >>> G2.edges() [(0, 1)] """ return deepcopy(self) def subgraph(self, nbunch): """Return the subgraph induced on nodes in nbunch. The induced subgraph of the graph contains the nodes in nbunch and the edges between those nodes. 
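The induced-subgraph semantics just described can be sketched with plain dicts (illustrative data, not the actual `subgraph` method): only edges with both endpoints in `nbunch` survive, and the surviving edge data dicts are shared with the original graph.

```python
# Path graph 0-1-2-3 in dict-of-dicts form
adj = {0: {1: {}}, 1: {0: {}, 2: {}}, 2: {1: {}, 3: {}}, 3: {2: {}}}
keep = {0, 1, 2}

sub = {n: {} for n in keep}
for n in keep:
    for nbr, d in adj[n].items():
        if nbr in keep:
            sub[n][nbr] = d  # edge data dict is SHARED with the original
```

The edge 2-3 is dropped because node 3 is outside `keep`; attribute changes made through `sub` would show up in `adj`, exactly as the Notes section warns.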
Parameters ---------- nbunch : list, iterable A container of nodes which will be iterated through once. Returns ------- G : Graph A subgraph of the graph with the same edge attributes. Notes ----- The graph, edge or node attributes just point to the original graph. So changes to the node or edge structure will not be reflected in the original graph while changes to the attributes will. To create a subgraph with its own copy of the edge/node attributes use: nx.Graph(G.subgraph(nbunch)) If edge attributes are containers, a deep copy can be obtained using: G.subgraph(nbunch).copy() For an inplace reduction of a graph to a subgraph you can remove nodes: G.remove_nodes_from([ n in G if n not in set(nbunch)]) Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> H = G.subgraph([0,1,2]) >>> H.edges() [(0, 1), (1, 2)] """ bunch = self.nbunch_iter(nbunch) # create new graph and copy subgraph into it H = self.__class__() # copy node and attribute dictionaries for n in bunch: H.node[n] = self.node[n] # namespace shortcuts for speed H_adj = H.adj self_adj = self.adj # add nodes and edges (undirected method) for n in H.node: Hnbrs = H.adjlist_dict_factory() H_adj[n] = Hnbrs for nbr, d in self_adj[n].items(): if nbr in H_adj: # add both representations of edge: n-nbr and nbr-n Hnbrs[nbr] = d H_adj[nbr][n] = d H.graph = self.graph return H def nodes_with_selfloops(self): """Return a list of nodes with self loops. A node with a self loop has an edge with both ends adjacent to that node. Returns ------- nodelist : list A list of nodes with self loops. See Also -------- selfloop_edges, number_of_selfloops Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edge(1,1) >>> G.add_edge(1,2) >>> G.nodes_with_selfloops() [1] """ return [n for n, nbrs in self.adj.items() if n in nbrs] def selfloop_edges(self, data=False, default=None): """Return a list of selfloop edges. 
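The self-loop test used by both methods above reduces to one membership check, sketched here over a hypothetical adjacency dict: a node has a self-loop exactly when it appears in its own neighbor dict.

```python
# Node 1 carries a self-loop and an edge to node 2
adj = {1: {1: {'w': 3}, 2: {}}, 2: {1: {}}}

loop_nodes = [n for n, nbrs in adj.items() if n in nbrs]
loop_edges = [(n, n) for n, nbrs in adj.items() if n in nbrs]

print(loop_nodes)  # [1]
print(loop_edges)  # [(1, 1)]
```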
A selfloop edge has the same node at both ends. Parameters ---------- data : string or bool, optional (default=False) Return selfloop edges as two-tuples (u,v) (data=False) or three-tuples (u,v,datadict) (data=True) or three-tuples (u,v,datavalue) (data='attrname') default : value, optional (default=None) Value used for edges that don't have the requested attribute. Only relevant if data is not True or False. Returns ------- edgelist : list of edge tuples A list of all selfloop edges. See Also -------- nodes_with_selfloops, number_of_selfloops Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edge(1,1) >>> G.add_edge(1,2) >>> G.selfloop_edges() [(1, 1)] >>> G.selfloop_edges(data=True) [(1, 1, {})] """ if data is True: return [(n, n, nbrs[n]) for n, nbrs in self.adj.items() if n in nbrs] elif data is not False: return [(n, n, nbrs[n].get(data, default)) for n, nbrs in self.adj.items() if n in nbrs] else: return [(n, n) for n, nbrs in self.adj.items() if n in nbrs] def number_of_selfloops(self): """Return the number of selfloop edges. A selfloop edge has the same node at both ends. Returns ------- nloops : int The number of selfloops. See Also -------- nodes_with_selfloops, selfloop_edges Examples -------- >>> G=nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edge(1,1) >>> G.add_edge(1,2) >>> G.number_of_selfloops() 1 """ return len(self.selfloop_edges()) def size(self, weight=None): """Return the number of edges. Parameters ---------- weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. Returns ------- nedges : int The number of edges or sum of edge weights in the graph.
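Because each undirected edge is stored once per endpoint, `size` amounts to half the degree sum, as a plain-dict sketch shows (illustrative data, self-loops ignored here for simplicity):

```python
adj = {'a': {'b': {'weight': 2}},
       'b': {'a': {'weight': 2}, 'c': {'weight': 4}},
       'c': {'b': {'weight': 4}}}

# each undirected edge appears from both endpoints, so halve the sums
n_edges = sum(len(nbrs) for nbrs in adj.values()) // 2
total_weight = sum(d.get('weight', 1)
                   for nbrs in adj.values() for d in nbrs.values()) / 2

print(n_edges)       # 2
print(total_weight)  # 6.0
```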
See Also -------- number_of_edges Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.size() 3 >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edge('a','b',weight=2) >>> G.add_edge('b','c',weight=4) >>> G.size() 2 >>> G.size(weight='weight') 6.0 """ s = sum(self.degree(weight=weight).values()) / 2 if weight is None: return int(s) else: return float(s) def number_of_edges(self, u=None, v=None): """Return the number of edges between two nodes. Parameters ---------- u, v : nodes, optional (default=all edges) If u and v are specified, return the number of edges between u and v. Otherwise return the total number of all edges. Returns ------- nedges : int The number of edges in the graph. If nodes u and v are specified return the number of edges between those nodes. See Also -------- size Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.number_of_edges() 3 >>> G.number_of_edges(0,1) 1 >>> e = (0,1) >>> G.number_of_edges(*e) 1 """ if u is None: return int(self.size()) if v in self.adj[u]: return 1 else: return 0 def add_star(self, nodes, **attr): """Add a star. The first node in nodes is the middle of the star. It is connected to all other nodes. Parameters ---------- nodes : iterable container A container of nodes. attr : keyword arguments, optional (default= no attributes) Attributes to add to every edge in star. See Also -------- add_path, add_cycle Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_star([0,1,2,3]) >>> G.add_star([10,11,12],weight=2) """ nlist = list(nodes) v = nlist[0] edges = ((v, n) for n in nlist[1:]) self.add_edges_from(edges, **attr) def add_path(self, nodes, **attr): """Add a path. Parameters ---------- nodes : iterable container A container of nodes. A path will be constructed from the nodes (in order) and added to the graph. 
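The edge lists that `add_star` and `add_path` (and `add_cycle`, below) build before calling `add_edges_from` can be shown directly:

```python
nodes = [0, 1, 2, 3]

path_edges = list(zip(nodes[:-1], nodes[1:]))          # consecutive pairs
star_edges = [(nodes[0], n) for n in nodes[1:]]        # hub is the first node
cycle_edges = list(zip(nodes, nodes[1:] + nodes[:1]))  # path plus closing edge

print(path_edges)   # [(0, 1), (1, 2), (2, 3)]
print(star_edges)   # [(0, 1), (0, 2), (0, 3)]
print(cycle_edges)  # [(0, 1), (1, 2), (2, 3), (3, 0)]
```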
attr : keyword arguments, optional (default= no attributes) Attributes to add to every edge in path. See Also -------- add_star, add_cycle Examples -------- >>> G=nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.add_path([10,11,12],weight=7) """ nlist = list(nodes) edges = zip(nlist[:-1], nlist[1:]) self.add_edges_from(edges, **attr) def add_cycle(self, nodes, **attr): """Add a cycle. Parameters ---------- nodes: iterable container A container of nodes. A cycle will be constructed from the nodes (in order) and added to the graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to every edge in cycle. See Also -------- add_path, add_star Examples -------- >>> G=nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_cycle([0,1,2,3]) >>> G.add_cycle([10,11,12],weight=7) """ nlist = list(nodes) edges = zip(nlist, nlist[1:] + [nlist[0]]) self.add_edges_from(edges, **attr) def nbunch_iter(self, nbunch=None): """Return an iterator of nodes contained in nbunch that are also in the graph. The nodes in nbunch are checked for membership in the graph and if not are silently ignored. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. Returns ------- niter : iterator An iterator over nodes in nbunch that are also in the graph. If nbunch is None, iterate over all nodes in the graph. Raises ------ NetworkXError If nbunch is not a node or sequence of nodes. If a node in nbunch is not hashable. See Also -------- Graph.__iter__ Notes ----- When nbunch is an iterator, the returned iterator yields values directly from nbunch, becoming exhausted when nbunch is exhausted. To test whether nbunch is a single node, one can use "if nbunch in self:", even after processing with this routine. If nbunch is not a node or a (possibly empty) sequence/iterator or None, a NetworkXError is raised.
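The three-way nbunch dispatch described above (None, single node, sequence) can be sketched as a standalone helper over a plain adjacency dict; note that testing a list for dict membership raises `TypeError` because lists are unhashable, which is how a sequence is distinguished from a lone node.

```python
def nbunch_iter(adj, nbunch=None):
    if nbunch is None:
        return iter(adj)            # all nodes
    try:
        single = nbunch in adj      # a lone node?
    except TypeError:               # unhashable (e.g. a list): a sequence
        single = False
    if single:
        return iter([nbunch])
    return (n for n in nbunch if n in adj)  # unknown nodes silently skipped

adj = {0: {}, 1: {}, 2: {}}
print(sorted(nbunch_iter(adj)))       # [0, 1, 2]
print(list(nbunch_iter(adj, 1)))      # [1]
print(list(nbunch_iter(adj, [0, 99])))  # [0]
```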
Also, if any object in nbunch is not hashable, a NetworkXError is raised. """ if nbunch is None: # include all nodes via iterator bunch = iter(self.adj.keys()) elif nbunch in self: # if nbunch is a single node bunch = iter([nbunch]) else: # if nbunch is a sequence of nodes def bunch_iter(nlist, adj): try: for n in nlist: if n in adj: yield n except TypeError as e: message = e.args[0] # capture error for non-sequence/iterator nbunch. if 'iter' in message: raise NetworkXError( "nbunch is not a node or a sequence of nodes.") # capture error for unhashable node. elif 'hashable' in message: raise NetworkXError( "Node %s in the sequence nbunch is not a valid node."%n) else: raise bunch = bunch_iter(nbunch, self.adj) return bunch networkx-1.11/networkx/classes/tests/0000755000175000017500000000000012653231454017660 5ustar aricaric00000000000000networkx-1.11/networkx/classes/tests/test_multidigraph.py0000644000175000017500000003136712637544500023775 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx from test_multigraph import BaseMultiGraphTester, TestMultiGraph class BaseMultiDiGraphTester(BaseMultiGraphTester): def test_edges(self): G=self.K3 assert_equal(sorted(G.edges()),[(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.edges(0)),[(0,1),(0,2)]) assert_raises((KeyError,networkx.NetworkXError), G.edges,-1) def test_edges_data(self): G=self.K3 assert_equal(sorted(G.edges(data=True)), [(0,1,{}),(0,2,{}),(1,0,{}),(1,2,{}),(2,0,{}),(2,1,{})]) assert_equal(sorted(G.edges(0,data=True)),[(0,1,{}),(0,2,{})]) assert_raises((KeyError,networkx.NetworkXError), G.neighbors,-1) def test_edges_iter(self): G=self.K3 assert_equal(sorted(G.edges_iter()), [(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.edges_iter(0)),[(0,1),(0,2)]) G.add_edge(0,1) assert_equal(sorted(G.edges_iter()), [(0,1),(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) def test_out_edges(self): G=self.K3 assert_equal(sorted(G.out_edges()), 
[(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.out_edges(0)),[(0,1),(0,2)]) assert_raises((KeyError,networkx.NetworkXError), G.out_edges,-1) assert_equal(sorted(G.out_edges(0,keys=True)),[(0,1,0),(0,2,0)]) def test_out_edges_iter(self): G=self.K3 assert_equal(sorted(G.out_edges_iter()), [(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.out_edges_iter(0)),[(0,1),(0,2)]) G.add_edge(0,1,2) assert_equal(sorted(G.out_edges_iter()), [(0,1),(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) def test_in_edges(self): G=self.K3 assert_equal(sorted(G.in_edges()), [(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.in_edges(0)),[(1,0),(2,0)]) assert_raises((KeyError,networkx.NetworkXError), G.in_edges,-1) G.add_edge(0,1,2) assert_equal(sorted(G.in_edges()), [(0,1),(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.in_edges(0,keys=True)),[(1,0,0),(2,0,0)]) def test_in_edges_iter(self): G=self.K3 assert_equal(sorted(G.in_edges_iter()), [(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.in_edges_iter(0)),[(1,0),(2,0)]) G.add_edge(0,1,2) assert_equal(sorted(G.in_edges_iter()), [(0,1),(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.in_edges_iter(data=True,keys=False)), [(0,1,{}),(0,1,{}),(0,2,{}),(1,0,{}),(1,2,{}), (2,0,{}),(2,1,{})]) def is_shallow(self,H,G): # graph assert_equal(G.graph['foo'],H.graph['foo']) G.graph['foo'].append(1) assert_equal(G.graph['foo'],H.graph['foo']) # node assert_equal(G.node[0]['foo'],H.node[0]['foo']) G.node[0]['foo'].append(1) assert_equal(G.node[0]['foo'],H.node[0]['foo']) # edge assert_equal(G[1][2][0]['foo'],H[1][2][0]['foo']) G[1][2][0]['foo'].append(1) assert_equal(G[1][2][0]['foo'],H[1][2][0]['foo']) def is_deep(self,H,G): # graph assert_equal(G.graph['foo'],H.graph['foo']) G.graph['foo'].append(1) assert_not_equal(G.graph['foo'],H.graph['foo']) # node assert_equal(G.node[0]['foo'],H.node[0]['foo']) G.node[0]['foo'].append(1) assert_not_equal(G.node[0]['foo'],H.node[0]['foo']) # edge 
assert_equal(G[1][2][0]['foo'],H[1][2][0]['foo']) G[1][2][0]['foo'].append(1) assert_not_equal(G[1][2][0]['foo'],H[1][2][0]['foo']) def test_to_undirected(self): # MultiDiGraph -> MultiGraph changes number of edges so it is # not a copy operation... use is_shallow, not is_shallow_copy G=self.K3 self.add_attributes(G) H=networkx.MultiGraph(G) self.is_shallow(H,G) H=G.to_undirected() self.is_deep(H,G) def test_has_successor(self): G=self.K3 assert_equal(G.has_successor(0,1),True) assert_equal(G.has_successor(0,-1),False) def test_successors(self): G=self.K3 assert_equal(sorted(G.successors(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.successors,-1) def test_successors_iter(self): G=self.K3 assert_equal(sorted(G.successors_iter(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.successors_iter,-1) def test_has_predecessor(self): G=self.K3 assert_equal(G.has_predecessor(0,1),True) assert_equal(G.has_predecessor(0,-1),False) def test_predecessors(self): G=self.K3 assert_equal(sorted(G.predecessors(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.predecessors,-1) def test_predecessors_iter(self): G=self.K3 assert_equal(sorted(G.predecessors_iter(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.predecessors_iter,-1) def test_degree(self): G=self.K3 assert_equal(list(G.degree().values()),[4,4,4]) assert_equal(G.degree(),{0:4,1:4,2:4}) assert_equal(G.degree(0),4) assert_equal(G.degree([0]),{0:4}) assert_equal(G.degree(iter([0])),{0:4}) assert_raises((KeyError,networkx.NetworkXError), G.degree,-1) def test_degree_iter(self): G=self.K3 assert_equal(list(G.degree_iter()),[(0,4),(1,4),(2,4)]) assert_equal(dict(G.degree_iter()),{0:4,1:4,2:4}) assert_equal(list(G.degree_iter(0)),[(0,4)]) assert_equal(list(G.degree_iter(iter([0]))),[(0,4)]) G.add_edge(0,1,weight=0.3,other=1.2) assert_equal(list(G.degree_iter(weight='weight')),[(0,4.3),(1,4.3),(2,4)]) assert_equal(list(G.degree_iter(weight='other')),[(0,5.2),(1,5.2),(2,4)]) def 
test_in_degree(self): G=self.K3 assert_equal(list(G.in_degree().values()),[2,2,2]) assert_equal(G.in_degree(),{0:2,1:2,2:2}) assert_equal(G.in_degree(0),2) assert_equal(G.in_degree([0]),{0:2}) assert_equal(G.in_degree(iter([0])),{0:2}) assert_raises((KeyError,networkx.NetworkXError), G.in_degree,-1) def test_in_degree_iter(self): G=self.K3 assert_equal(list(G.in_degree_iter()),[(0,2),(1,2),(2,2)]) assert_equal(dict(G.in_degree_iter()),{0:2,1:2,2:2}) assert_equal(list(G.in_degree_iter(0)),[(0,2)]) assert_equal(list(G.in_degree_iter(iter([0]))),[(0,2)]) assert_equal(list(G.in_degree_iter(0,weight='weight')),[(0,2)]) def test_out_degree(self): G=self.K3 assert_equal(list(G.out_degree().values()),[2,2,2]) assert_equal(G.out_degree(),{0:2,1:2,2:2}) assert_equal(G.out_degree(0),2) assert_equal(G.out_degree([0]),{0:2}) assert_equal(G.out_degree(iter([0])),{0:2}) assert_raises((KeyError,networkx.NetworkXError), G.out_degree,-1) def test_out_degree_iter(self): G=self.K3 assert_equal(list(G.out_degree_iter()),[(0,2),(1,2),(2,2)]) assert_equal(dict(G.out_degree_iter()),{0:2,1:2,2:2}) assert_equal(list(G.out_degree_iter(0)),[(0,2)]) assert_equal(list(G.out_degree_iter(iter([0]))),[(0,2)]) assert_equal(list(G.out_degree_iter(0,weight='weight')),[(0,2)]) def test_size(self): G=self.K3 assert_equal(G.size(),6) assert_equal(G.number_of_edges(),6) G.add_edge(0,1,weight=0.3,other=1.2) assert_equal(G.size(weight='weight'),6.3) assert_equal(G.size(weight='other'),7.2) def test_to_undirected_reciprocal(self): G=self.Graph() G.add_edge(1,2) assert_true(G.to_undirected().has_edge(1,2)) assert_false(G.to_undirected(reciprocal=True).has_edge(1,2)) G.add_edge(2,1) assert_true(G.to_undirected(reciprocal=True).has_edge(1,2)) def test_reverse_copy(self): G=networkx.MultiDiGraph([(0,1),(0,1)]) R=G.reverse() assert_equal(sorted(R.edges()),[(1,0),(1,0)]) R.remove_edge(1,0) assert_equal(sorted(R.edges()),[(1,0)]) assert_equal(sorted(G.edges()),[(0,1),(0,1)]) def test_reverse_nocopy(self): 
G=networkx.MultiDiGraph([(0,1),(0,1)]) R=G.reverse(copy=False) assert_equal(sorted(R.edges()),[(1,0),(1,0)]) R.remove_edge(1,0) assert_equal(sorted(R.edges()),[(1,0)]) assert_equal(sorted(G.edges()),[(1,0)]) class TestMultiDiGraph(BaseMultiDiGraphTester,TestMultiGraph): def setUp(self): self.Graph=networkx.MultiDiGraph # build K3 self.k3edges=[(0, 1), (0, 2), (1, 2)] self.k3nodes=[0, 1, 2] self.K3=self.Graph() self.K3.adj={0:{},1:{},2:{}} self.K3.succ=self.K3.adj self.K3.pred={0:{},1:{},2:{}} for u in self.k3nodes: for v in self.k3nodes: if u==v: continue d={0:{}} self.K3.succ[u][v]=d self.K3.pred[v][u]=d self.K3.adj=self.K3.succ self.K3.edge=self.K3.adj self.K3.node={} self.K3.node[0]={} self.K3.node[1]={} self.K3.node[2]={} def test_add_edge(self): G=self.Graph() G.add_edge(0,1) assert_equal(G.adj,{0: {1: {0:{}}}, 1: {}}) assert_equal(G.succ,{0: {1: {0:{}}}, 1: {}}) assert_equal(G.pred,{0: {}, 1: {0:{0:{}}}}) G=self.Graph() G.add_edge(*(0,1)) assert_equal(G.adj,{0: {1: {0:{}}}, 1: {}}) assert_equal(G.succ,{0: {1: {0:{}}}, 1: {}}) assert_equal(G.pred,{0: {}, 1: {0:{0:{}}}}) def test_add_edges_from(self): G=self.Graph() G.add_edges_from([(0,1),(0,1,{'weight':3})]) assert_equal(G.adj,{0: {1: {0:{},1:{'weight':3}}}, 1: {}}) assert_equal(G.succ,{0: {1: {0:{},1:{'weight':3}}}, 1: {}}) assert_equal(G.pred,{0: {}, 1: {0:{0:{},1:{'weight':3}}}}) G.add_edges_from([(0,1),(0,1,{'weight':3})],weight=2) assert_equal(G.succ,{0: {1: {0:{}, 1:{'weight':3}, 2:{'weight':2}, 3:{'weight':3}}}, 1: {}}) assert_equal(G.pred,{0: {}, 1: {0:{0:{},1:{'weight':3}, 2:{'weight':2}, 3:{'weight':3}}}}) assert_raises(networkx.NetworkXError, G.add_edges_from,[(0,)]) # too few in tuple assert_raises(networkx.NetworkXError, G.add_edges_from,[(0,1,2,3,4)]) # too many in tuple assert_raises(TypeError, G.add_edges_from,[0]) # not a tuple def test_remove_edge(self): G=self.K3 G.remove_edge(0,1) assert_equal(G.succ,{0:{2:{0:{}}}, 1:{0:{0:{}},2:{0:{}}}, 2:{0:{0:{}},1:{0:{}}}}) 
assert_equal(G.pred,{0:{1:{0:{}}, 2:{0:{}}}, 1:{2:{0:{}}}, 2:{0:{0:{}},1:{0:{}}}}) assert_raises((KeyError,networkx.NetworkXError), G.remove_edge,-1,0) assert_raises((KeyError,networkx.NetworkXError), G.remove_edge,0,2, key=1) def test_remove_multiedge(self): G=self.K3 G.add_edge(0,1,key='parallel edge') G.remove_edge(0,1,key='parallel edge') assert_equal(G.adj,{0: {1: {0:{}}, 2: {0:{}}}, 1: {0: {0:{}}, 2: {0:{}}}, 2: {0: {0:{}}, 1: {0:{}}}}) assert_equal(G.succ,{0: {1: {0:{}}, 2: {0:{}}}, 1: {0: {0:{}}, 2: {0:{}}}, 2: {0: {0:{}}, 1: {0:{}}}}) assert_equal(G.pred,{0:{1: {0:{}},2:{0:{}}}, 1:{0:{0:{}},2:{0:{}}}, 2:{0:{0:{}},1:{0:{}}}}) G.remove_edge(0,1) assert_equal(G.succ,{0:{2:{0:{}}}, 1:{0:{0:{}},2:{0:{}}}, 2:{0:{0:{}},1:{0:{}}}}) assert_equal(G.pred,{0:{1:{0:{}}, 2:{0:{}}}, 1:{2:{0:{}}}, 2:{0:{0:{}},1:{0:{}}}}) assert_raises((KeyError,networkx.NetworkXError), G.remove_edge,-1,0) def test_remove_edges_from(self): G=self.K3 G.remove_edges_from([(0,1)]) assert_equal(G.succ,{0:{2:{0:{}}}, 1:{0:{0:{}},2:{0:{}}}, 2:{0:{0:{}},1:{0:{}}}}) assert_equal(G.pred,{0:{1:{0:{}}, 2:{0:{}}}, 1:{2:{0:{}}}, 2:{0:{0:{}},1:{0:{}}}}) G.remove_edges_from([(0,0)]) # silent fail networkx-1.11/networkx/classes/tests/timingclasses.py0000644000175000017500000045000312637544500023102 0ustar aricaric00000000000000"""Base classes to benchmark timing of base class methods. The idea is to use these classes to compare timing with the current classes to see if adding features has slowed any methods. The classes are named TimingGraph, TimingDiGraph, TimingMultiGraph and TimingMultiDiGraph """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
from copy import deepcopy import networkx as nx from networkx.exception import NetworkXError import networkx.convert as convert __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) class TimingGraph(object): """ Base class for undirected graphs. A Graph stores nodes and edges with optional data, or attributes. Graphs hold undirected edges. Self loops are allowed but multiple (parallel) edges are not. Nodes can be arbitrary (hashable) Python objects with optional key/value attributes. Edges are represented as links between nodes with optional key/value attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- DiGraph MultiGraph MultiDiGraph Examples -------- Create an empty graph structure (a "null graph") with no nodes and no edges. >>> G = nx.Graph() G can be grown in several ways. **Nodes:** Add one node at a time: >>> G.add_node(1) Add the nodes from any container (a list, dict, set or even the lines from a file or the nodes from another graph). >>> G.add_nodes_from([2,3]) >>> G.add_nodes_from(range(100,110)) >>> H=nx.Graph() >>> H.add_path([0,1,2,3,4,5,6,7,8,9]) >>> G.add_nodes_from(H) In addition to strings and integers any hashable Python object (except None) can represent a node, e.g. a customized node object, or even another Graph. >>> G.add_node(H) **Edges:** G can also be grown by adding edges. 
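The edge-growing behavior the docstring goes on to describe (endpoints added automatically, one data dict shared by both directions of an undirected edge) can be sketched with module-level dicts standing in for `self.node` and `self.adj` (a simplified illustration, not the class's `add_edge`):

```python
adj, node = {}, {}

def add_edge(u, v, **attr):
    for n in (u, v):
        if n not in node:          # endpoints are added automatically
            node[n] = {}
            adj[n] = {}
    data = adj[u].get(v, {})       # update existing edge data, if any
    data.update(attr)
    adj[u][v] = data
    adj[v][u] = data               # ONE shared dict per undirected edge

add_edge(1, 2, weight=4.7)
print(adj[1][2])  # {'weight': 4.7}
```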
Add one edge, >>> G.add_edge(1, 2) a list of edges, >>> G.add_edges_from([(1,2),(1,3)]) or a collection of edges, >>> G.add_edges_from(H.edges()) If some edges connect nodes not yet in the graph, the nodes are added automatically. There are no errors when adding nodes or edges that already exist. **Attributes:** Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). By default these are empty, but can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named graph, node and edge respectively. >>> G = nx.Graph(day="Friday") >>> G.graph {'day': 'Friday'} Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> del G.node[1]['room'] # remove attribute >>> G.nodes(data=True) [(1, {'time': '5pm'}), (3, {'time': '2pm'})] Warning: adding a node to G.node does not add it to the graph. Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2]['weight'] = 4.7 >>> G.edge[1][2]['weight'] = 4 **Shortcuts:** Many common graph features allow python syntax to speed reporting. >>> 1 in G # check if node in graph True >>> [n for n in G if n<3] # iterate through nodes [1, 2] >>> len(G) # number of nodes in graph 5 The fastest way to traverse all edges of a graph is via adjacency_iter(), but the edges() method is often more convenient. >>> for n,nbrsdict in G.adjacency_iter(): ... for nbr,eattr in nbrsdict.items(): ... if 'weight' in eattr: ... 
(n,nbr,eattr['weight']) (1, 2, 4) (2, 1, 4) (2, 3, 8) (3, 2, 8) >>> [ (u,v,edata['weight']) for u,v,edata in G.edges(data=True) if 'weight' in edata ] [(1, 2, 4), (2, 3, 8)] **Reporting:** Simple graph information is obtained using methods. Iterator versions of many reporting methods exist for efficiency. Methods exist for reporting nodes(), edges(), neighbors() and degree() as well as the number of nodes and edges. For details on these and other miscellaneous methods, see below. """ def __init__(self, data=None, **attr): """Initialize a graph with edges, name, graph attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. name : string, optional (default='') An optional name for the graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- convert Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G = nx.Graph(name='my graph') >>> e = [(1,2),(2,3),(3,4)] # list of edges >>> G = nx.Graph(e) Arbitrary graph attribute pairs (key=value) may be assigned >>> G=nx.Graph(e, day="Friday") >>> G.graph {'day': 'Friday'} """ self.graph = {} # dictionary for graph attributes self.node = {} # empty node dict (created before convert) self.adj = {} # empty adjacency dict # attempt to load graph with data if data is not None: convert.to_networkx_graph(data,create_using=self) # load graph attributes (must be after convert) self.graph.update(attr) self.edge = self.adj @property def name(self): return self.graph.get('name','') @name.setter def name(self, s): self.graph['name']=s def __str__(self): """Return the graph name. Returns ------- name : string The name of the graph. 
Examples -------- >>> G = nx.Graph(name='foo') >>> str(G) 'foo' """ return self.name def __iter__(self): """Iterate over the nodes. Use the expression 'for n in G'. Returns ------- niter : iterator An iterator over all nodes in the graph. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) """ return iter(self.node) def __contains__(self,n): """Return True if n is a node, False otherwise. Use the expression 'n in G'. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> 1 in G True """ try: return n in self.node except TypeError: return False def __len__(self): """Return the number of nodes. Use the expression 'len(G)'. Returns ------- nnodes : int The number of nodes in the graph. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> len(G) 4 """ return len(self.node) def __getitem__(self, n): """Return a dict of neighbors of node n. Use the expression 'G[n]'. Parameters ---------- n : node A node in the graph. Returns ------- adj_dict : dictionary The adjacency dictionary for nodes connected to n. Notes ----- G[n] is similar to G.neighbors(n) but the internal data dictionary is returned instead of a list. Assigning G[n] will corrupt the internal graph data structure. Use G[n] for reading data only. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G[0] {1: {}} """ return self.adj[n] def add_node(self, n, attr_dict=None, **attr): """Add a single node n and update node attributes. Parameters ---------- n : node A node can be any hashable Python object except None. attr_dict : dictionary, optional (default= no attributes) Dictionary of node attributes. Key/value pairs will update existing data associated with the node. attr : keyword arguments, optional Set or change attributes using key=value. 
See Also -------- add_nodes_from Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_node(1) >>> G.add_node('Hello') >>> K3 = nx.Graph([(0,1),(1,2),(2,0)]) >>> G.add_node(K3) >>> G.number_of_nodes() 3 Use keywords set/change node attributes: >>> G.add_node(1,size=10) >>> G.add_node(3,weight=0.4,UTM=('13S',382871,3972649)) Notes ----- A hashable object is one that can be used as a key in a Python dictionary. This includes strings, numbers, tuples of strings and numbers, etc. On many platforms hashable items also include mutables such as NetworkX Graphs, though one should be careful that the hash doesn't change on mutables. """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dictionary.") if n not in self.node: self.adj[n] = {} self.node[n] = attr_dict else: # update attr even if node already exists self.node[n].update(attr_dict) def add_nodes_from(self, nodes, **attr): """Add multiple nodes. Parameters ---------- nodes : iterable container A container of nodes (list, dict, set, etc.). OR A container of (node, attribute dict) tuples. Node attributes are updated using the attribute dict. attr : keyword arguments, optional (default= no attributes) Update attributes for all nodes in nodes. Node attributes specified in nodes as a tuple take precedence over attributes specified generally. See Also -------- add_node Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_nodes_from('Hello') >>> K3 = nx.Graph([(0,1),(1,2),(2,0)]) >>> G.add_nodes_from(K3) >>> sorted(G.nodes(),key=str) [0, 1, 2, 'H', 'e', 'l', 'o'] Use keywords to update specific node attributes for every node. >>> G.add_nodes_from([1,2], size=10) >>> G.add_nodes_from([3,4], weight=0.4) Use (node, attrdict) tuples to update attributes for specific nodes. 
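The tuple-handling trick in `add_nodes_from` shown above can be sketched in isolation (hypothetical helper, plain dicts for `self.node`/`self.adj`): a `(node, attrdict)` pair is unhashable because it contains a dict, so the membership test raises `TypeError`, and the per-node dict then overrides the keyword attributes.

```python
node, adj = {}, {}

def add_nodes_from(nodes, **attr):
    for n in nodes:
        try:
            known = n in node          # TypeError for a (node, attrdict) pair
        except TypeError:
            nn, ndict = n              # unpack the (node, attribute dict) pair
            d = attr.copy()
            d.update(ndict)            # per-node attributes win over keywords
            node.setdefault(nn, {}).update(d)
            adj.setdefault(nn, {})
            continue
        if known:
            node[n].update(attr)
        else:
            node[n] = attr.copy()
            adj[n] = {}

add_nodes_from([1, 2], size=10)
add_nodes_from([(1, {'size': 11}), (3, {'color': 'blue'})])
```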
        >>> G.add_nodes_from([(1, dict(size=11)), (2, {'color':'blue'})])
        >>> G.node[1]['size']
        11
        >>> H = nx.Graph()
        >>> H.add_nodes_from(G.nodes(data=True))
        >>> H.node[1]['size']
        11

        """
        for n in nodes:
            try:
                if n not in self.node:
                    self.adj[n] = {}
                    self.node[n] = attr.copy()
                else:
                    self.node[n].update(attr)
            except TypeError:
                nn, ndict = n
                if nn not in self.node:
                    self.adj[nn] = {}
                    newdict = attr.copy()
                    newdict.update(ndict)
                    self.node[nn] = newdict
                else:
                    olddict = self.node[nn]
                    olddict.update(attr)
                    olddict.update(ndict)

    def remove_node(self, n):
        """Remove node n.

        Removes the node n and all adjacent edges.
        Attempting to remove a non-existent node will raise an exception.

        Parameters
        ----------
        n : node
           A node in the graph

        Raises
        ------
        NetworkXError
           If n is not in the graph.

        See Also
        --------
        remove_nodes_from

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> G.edges()
        [(0, 1), (1, 2)]
        >>> G.remove_node(1)
        >>> G.edges()
        []

        """
        adj = self.adj
        try:
            nbrs = list(adj[n].keys())  # keys handles self-loops
                                        # (allows mutation later)
            del self.node[n]
        except KeyError:  # NetworkXError if n not in self
            raise NetworkXError("The node %s is not in the graph." % (n,))
        for u in nbrs:
            del adj[u][n]  # remove all edges n-u in graph
        del adj[n]         # now remove node

    def remove_nodes_from(self, nodes):
        """Remove multiple nodes.

        Parameters
        ----------
        nodes : iterable container
            A container of nodes (list, dict, set, etc.).  If a node
            in the container is not in the graph it is silently
            ignored.
        See Also
        --------
        remove_node

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> e = G.nodes()
        >>> e
        [0, 1, 2]
        >>> G.remove_nodes_from(e)
        >>> G.nodes()
        []

        """
        adj = self.adj
        for n in nodes:
            try:
                del self.node[n]
                for u in list(adj[n].keys()):  # keys() handles self-loops
                    del adj[u][n]              # (allows mutation of dict in loop)
                del adj[n]
            except KeyError:
                pass

    def nodes_iter(self, data=False):
        """Return an iterator over the nodes.

        Parameters
        ----------
        data : boolean, optional (default=False)
               If False the iterator returns nodes.  If True
               return a two-tuple of node and node data dictionary.

        Returns
        -------
        niter : iterator
            An iterator over nodes.  If data=True the iterator gives
            two-tuples containing (node, node data dictionary).

        Notes
        -----
        If the node data is not required it is simpler and equivalent
        to use the expression 'for n in G'.

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> [d for n,d in G.nodes_iter(data=True)]
        [{}, {}, {}]
        """
        if data:
            return iter(self.node.items())
        return iter(self.node)

    def nodes(self, data=False):
        """Return a list of the nodes in the graph.

        Parameters
        ----------
        data : boolean, optional (default=False)
               If False return a list of nodes.  If True return a
               two-tuple of node and node data dictionary.

        Returns
        -------
        nlist : list
            A list of nodes.  If data=True a list of two-tuples containing
            (node, node data dictionary).

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> G.nodes()
        [0, 1, 2]
        >>> G.add_node(1, time='5pm')
        >>> G.nodes(data=True)
        [(0, {}), (1, {'time': '5pm'}), (2, {})]
        """
        return list(self.nodes_iter(data=data))

    def number_of_nodes(self):
        """Return the number of nodes in the graph.

        Returns
        -------
        nnodes : int
            The number of nodes in the graph.
        See Also
        --------
        order, __len__  which are identical

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> len(G)
        3
        """
        return len(self.node)

    def order(self):
        """Return the number of nodes in the graph.

        Returns
        -------
        nnodes : int
            The number of nodes in the graph.

        See Also
        --------
        number_of_nodes, __len__  which are identical
        """
        return len(self.node)

    def has_node(self, n):
        """Return True if the graph contains the node n.

        Parameters
        ----------
        n : node

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2])
        >>> G.has_node(0)
        True

        It is more readable and simpler to use

        >>> 0 in G
        True
        """
        try:
            return n in self.node
        except TypeError:
            return False

    def add_edge(self, u, v, attr_dict=None, **attr):
        """Add an edge between u and v.

        The nodes u and v will be automatically added if they are
        not already in the graph.

        Edge attributes can be specified with keywords or by providing
        a dictionary with key/value pairs.  See examples below.

        Parameters
        ----------
        u,v : nodes
            Nodes can be, for example, strings or numbers.
            Nodes must be hashable (and not None) Python objects.
        attr_dict : dictionary, optional (default= no attributes)
            Dictionary of edge attributes.  Key/value pairs will
            update existing data associated with the edge.
        attr : keyword arguments, optional
            Edge data (or labels or objects) can be assigned using
            keyword arguments.

        See Also
        --------
        add_edges_from : add a collection of edges

        Notes
        -----
        Adding an edge that already exists updates the edge data.

        Many NetworkX algorithms designed for weighted graphs use
        as the edge weight a numerical value assigned to a keyword
        which by default is 'weight'.
        Examples
        --------
        The following all add the edge e=(1,2) to graph G:

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> e = (1,2)
        >>> G.add_edge(1, 2)           # explicit two-node form
        >>> G.add_edge(*e)             # single edge as tuple of two nodes
        >>> G.add_edges_from( [(1,2)] )  # add edges from iterable container

        Associate data to edges using keywords:

        >>> G.add_edge(1, 2, weight=3)
        >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7)
        """
        # set up attribute dictionary
        if attr_dict is None:
            attr_dict = attr
        else:
            try:
                attr_dict.update(attr)
            except AttributeError:
                raise NetworkXError(
                    "The attr_dict argument must be a dictionary.")
        # add nodes
        if u not in self.node:
            self.adj[u] = {}
            self.node[u] = {}
        if v not in self.node:
            self.adj[v] = {}
            self.node[v] = {}
        # add the edge
        datadict = self.adj[u].get(v, {})
        datadict.update(attr_dict)
        self.adj[u][v] = datadict
        self.adj[v][u] = datadict

    def add_edges_from(self, ebunch, attr_dict=None, **attr):
        """Add all the edges in ebunch.

        Parameters
        ----------
        ebunch : container of edges
            Each edge given in the container will be added to the
            graph. The edges must be given as 2-tuples (u,v) or
            3-tuples (u,v,d) where d is a dictionary containing edge data.
        attr_dict : dictionary, optional (default= no attributes)
            Dictionary of edge attributes.  Key/value pairs will
            update existing data associated with each edge.
        attr : keyword arguments, optional
            Edge data (or labels or objects) can be assigned using
            keyword arguments.

        See Also
        --------
        add_edge : add a single edge
        add_weighted_edges_from : convenient way to add weighted edges

        Notes
        -----
        Adding the same edge twice has no effect but any edge data
        will be updated when each duplicate edge is added.

        Edge attributes specified in edges as a tuple take precedence
        over attributes specified generally.
        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edges_from([(0,1),(1,2)])  # using a list of edge tuples
        >>> e = zip(range(0,3), range(1,4))
        >>> G.add_edges_from(e)  # Add the path graph 0-1-2-3

        Associate data to edges

        >>> G.add_edges_from([(1,2),(2,3)], weight=3)
        >>> G.add_edges_from([(3,4),(1,4)], label='WN2898')
        """
        # set up attribute dict
        if attr_dict is None:
            attr_dict = attr
        else:
            try:
                attr_dict.update(attr)
            except AttributeError:
                raise NetworkXError(
                    "The attr_dict argument must be a dictionary.")
        # process ebunch
        for e in ebunch:
            ne = len(e)
            if ne == 3:
                u, v, dd = e
            elif ne == 2:
                u, v = e
                dd = {}
            else:
                raise NetworkXError(
                    "Edge tuple %s must be a 2-tuple or 3-tuple." % (e,))
            if u not in self.node:
                self.adj[u] = {}
                self.node[u] = {}
            if v not in self.node:
                self.adj[v] = {}
                self.node[v] = {}
            datadict = self.adj[u].get(v, {})
            datadict.update(attr_dict)
            datadict.update(dd)
            self.adj[u][v] = datadict
            self.adj[v][u] = datadict

    def add_weighted_edges_from(self, ebunch, weight='weight', **attr):
        """Add all the edges in ebunch as weighted edges with specified
        weights.

        Parameters
        ----------
        ebunch : container of edges
            Each edge given in the list or container will be added
            to the graph. The edges must be given as 3-tuples (u,v,w)
            where w is a number.
        weight : string, optional (default= 'weight')
            The attribute name for the edge weights to be added.
        attr : keyword arguments, optional (default= no attributes)
            Edge attributes to add/update for all edges.

        See Also
        --------
        add_edge : add a single edge
        add_edges_from : add multiple edges

        Notes
        -----
        Adding the same edge twice for Graph/DiGraph simply updates
        the edge data.  For MultiGraph/MultiDiGraph, duplicate edges
        are stored.

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_weighted_edges_from([(0,1,3.0),(1,2,7.5)])
        """
        self.add_edges_from(((u, v, {weight: d}) for u, v, d in ebunch),
                            **attr)

    def remove_edge(self, u, v):
        """Remove the edge between u and v.
        Parameters
        ----------
        u,v : nodes
            Remove the edge between nodes u and v.

        Raises
        ------
        NetworkXError
            If there is not an edge between u and v.

        See Also
        --------
        remove_edges_from : remove a collection of edges

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.remove_edge(0,1)
        >>> e = (1,2)
        >>> G.remove_edge(*e)  # unpacks e from an edge tuple
        >>> e = (2,3,{'weight':7})  # an edge with attribute data
        >>> G.remove_edge(*e[:2])  # select first part of edge tuple
        """
        try:
            del self.adj[u][v]
            if u != v:  # self-loop needs only one entry removed
                del self.adj[v][u]
        except KeyError:
            raise NetworkXError("The edge %s-%s is not in the graph" % (u, v))

    def remove_edges_from(self, ebunch):
        """Remove all edges specified in ebunch.

        Parameters
        ----------
        ebunch : list or container of edge tuples
            Each edge given in the list or container will be removed
            from the graph. The edges can be:

                - 2-tuples (u,v) edge between u and v.
                - 3-tuples (u,v,k) where k is ignored.

        See Also
        --------
        remove_edge : remove a single edge

        Notes
        -----
        Will fail silently if an edge in ebunch is not in the graph.

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> ebunch = [(1,2),(2,3)]
        >>> G.remove_edges_from(ebunch)
        """
        adj = self.adj
        for e in ebunch:
            u, v = e[:2]  # ignore edge data if present
            if u in adj and v in adj[u]:
                del adj[u][v]
                if u != v:  # self-loop needs only one entry removed
                    del adj[v][u]

    def has_edge(self, u, v):
        """Return True if the edge (u,v) is in the graph.

        Parameters
        ----------
        u,v : nodes
            Nodes can be, for example, strings or numbers.
            Nodes must be hashable (and not None) Python objects.

        Returns
        -------
        edge_ind : bool
            True if edge is in the graph, False otherwise.
        Examples
        --------
        Can be called either using two nodes u,v or edge tuple (u,v)

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.has_edge(0,1)  # using two nodes
        True
        >>> e = (0,1)
        >>> G.has_edge(*e)  # e is a 2-tuple (u,v)
        True
        >>> e = (0,1,{'weight':7})
        >>> G.has_edge(*e[:2])  # e is a 3-tuple (u,v,data_dictionary)
        True

        The following expressions are all equivalent:

        >>> G.has_edge(0,1)
        True
        >>> 1 in G[0]  # though this gives KeyError if 0 not in G
        True

        """
        try:
            return v in self.adj[u]
        except KeyError:
            return False

    def neighbors(self, n):
        """Return a list of the nodes connected to the node n.

        Parameters
        ----------
        n : node
           A node in the graph

        Returns
        -------
        nlist : list
            A list of nodes that are adjacent to n.

        Raises
        ------
        NetworkXError
            If the node n is not in the graph.

        Notes
        -----
        It is usually more convenient (and faster) to access the
        adjacency dictionary as G[n]:

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edge('a','b',weight=7)
        >>> G['a']
        {'b': {'weight': 7}}

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.neighbors(0)
        [1]
        """
        try:
            return list(self.adj[n])
        except KeyError:
            raise NetworkXError("The node %s is not in the graph." % (n,))

    def neighbors_iter(self, n):
        """Return an iterator over all neighbors of node n.

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> [n for n in G.neighbors_iter(0)]
        [1]

        Notes
        -----
        It is faster to use the idiom "in G[0]", e.g.

        >>> G = nx.path_graph(4)
        >>> [n for n in G[0]]
        [1]
        """
        try:
            return iter(self.adj[n])
        except KeyError:
            raise NetworkXError("The node %s is not in the graph." % (n,))

    def edges(self, nbunch=None, data=False):
        """Return a list of edges.

        Edges are returned as tuples with optional data
        in the order (node, neighbor, data).

        Parameters
        ----------
        nbunch : iterable container, optional (default= all nodes)
            A container of nodes.
            The container will be iterated through once.
        data : bool, optional (default=False)
            Return two tuples (u,v) (False) or three-tuples (u,v,data) (True).

        Returns
        -------
        edge_list : list of edge tuples
            Edges that are adjacent to any node in nbunch, or a list
            of all edges if nbunch is not specified.

        See Also
        --------
        edges_iter : return an iterator over the edges

        Notes
        -----
        Nodes in nbunch that are not in the graph will be (quietly) ignored.
        For directed graphs this returns the out-edges.

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.edges()
        [(0, 1), (1, 2), (2, 3)]
        >>> G.edges(data=True)  # default edge data is {} (empty dictionary)
        [(0, 1, {}), (1, 2, {}), (2, 3, {})]
        >>> G.edges([0,3])
        [(0, 1), (3, 2)]
        >>> G.edges(0)
        [(0, 1)]

        """
        return list(self.edges_iter(nbunch, data))

    def edges_iter(self, nbunch=None, data=False):
        """Return an iterator over the edges.

        Edges are returned as tuples with optional data
        in the order (node, neighbor, data).

        Parameters
        ----------
        nbunch : iterable container, optional (default= all nodes)
            A container of nodes.  The container will be iterated
            through once.
        data : bool, optional (default=False)
            If True, return edge attribute dict in 3-tuple (u,v,data).

        Returns
        -------
        edge_iter : iterator
            An iterator of (u,v) or (u,v,d) tuples of edges.

        See Also
        --------
        edges : return a list of edges

        Notes
        -----
        Nodes in nbunch that are not in the graph will be (quietly) ignored.
        For directed graphs this returns the out-edges.
        Examples
        --------
        >>> G = nx.Graph()   # or MultiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> [e for e in G.edges_iter()]
        [(0, 1), (1, 2), (2, 3)]
        >>> list(G.edges_iter(data=True))  # default data is {} (empty dict)
        [(0, 1, {}), (1, 2, {}), (2, 3, {})]
        >>> list(G.edges_iter([0,3]))
        [(0, 1), (3, 2)]
        >>> list(G.edges_iter(0))
        [(0, 1)]

        """
        seen = {}  # helper dict to keep track of multiply stored edges
        if nbunch is None:
            nodes_nbrs = self.adj.items()
        else:
            nodes_nbrs = ((n, self.adj[n]) for n in self.nbunch_iter(nbunch))
        if data:
            for n, nbrs in nodes_nbrs:
                for nbr, data in nbrs.items():
                    if nbr not in seen:
                        yield (n, nbr, data)
                seen[n] = 1
        else:
            for n, nbrs in nodes_nbrs:
                for nbr in nbrs:
                    if nbr not in seen:
                        yield (n, nbr)
                seen[n] = 1
        del seen

    def get_edge_data(self, u, v, default=None):
        """Return the attribute dictionary associated with edge (u,v).

        Parameters
        ----------
        u,v : nodes
        default : any Python object (default=None)
            Value to return if the edge (u,v) is not found.

        Returns
        -------
        edge_dict : dictionary
            The edge attribute dictionary.

        Notes
        -----
        It is faster to use G[u][v].

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G[0][1]
        {}

        Warning: Assigning G[u][v] corrupts the graph data structure.
        But it is safe to assign attributes to that dictionary,

        >>> G[0][1]['weight'] = 7
        >>> G[0][1]['weight']
        7
        >>> G[1][0]['weight']
        7

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.get_edge_data(0,1)  # default edge data is {}
        {}
        >>> e = (0,1)
        >>> G.get_edge_data(*e)  # tuple form
        {}
        >>> G.get_edge_data('a','b',default=0)  # edge not in graph, return 0
        0
        """
        try:
            return self.adj[u][v]
        except KeyError:
            return default

    def adjacency_list(self):
        """Return an adjacency list representation of the graph.

        The output adjacency list is in the order of G.nodes().
        For directed graphs, only outgoing adjacencies are included.
        Returns
        -------
        adj_list : lists of lists
            The adjacency structure of the graph as a list of lists.

        See Also
        --------
        adjacency_iter

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.adjacency_list()  # in order given by G.nodes()
        [[1], [0, 2], [1, 3], [2]]
        """
        return list(map(list, iter(self.adj.values())))

    def adjacency_iter(self):
        """Return an iterator of (node, adjacency dict) tuples for all nodes.

        This is the fastest way to look at every edge.
        For directed graphs, only outgoing adjacencies are included.

        Returns
        -------
        adj_iter : iterator
           An iterator of (node, adjacency dictionary) for all nodes in
           the graph.

        See Also
        --------
        adjacency_list

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> [(n,nbrdict) for n,nbrdict in G.adjacency_iter()]
        [(0, {1: {}}), (1, {0: {}, 2: {}}), (2, {1: {}, 3: {}}), (3, {2: {}})]

        """
        return iter(self.adj.items())

    def degree(self, nbunch=None, weight=None):
        """Return the degree of a node or nodes.

        The node degree is the number of edges adjacent to that node.

        Parameters
        ----------
        nbunch : iterable container, optional (default=all nodes)
            A container of nodes.  The container will be iterated
            through once.
        weight : string or None, optional (default=None)
           The edge attribute that holds the numerical value used
           as a weight.  If None, then each edge has weight 1.
           The degree is the sum of the edge weights adjacent to the node.

        Returns
        -------
        nd : dictionary, or number
            A dictionary with nodes as keys and degree as values or
            a number if a single node is specified.
        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.degree(0)
        1
        >>> G.degree([0,1])
        {0: 1, 1: 2}
        >>> list(G.degree([0,1]).values())
        [1, 2]

        """
        if nbunch in self:  # return a single node
            return next(self.degree_iter(nbunch, weight))[1]
        else:               # return a dict
            return dict(self.degree_iter(nbunch, weight))

    def degree_iter(self, nbunch=None, weight=None):
        """Return an iterator for (node, degree).

        The node degree is the number of edges adjacent to the node.

        Parameters
        ----------
        nbunch : iterable container, optional (default=all nodes)
            A container of nodes.  The container will be iterated
            through once.
        weight : string or None, optional (default=None)
           The edge attribute that holds the numerical value used
           as a weight.  If None, then each edge has weight 1.
           The degree is the sum of the edge weights adjacent to the node.

        Returns
        -------
        nd_iter : an iterator
            The iterator returns two-tuples of (node, degree).

        See Also
        --------
        degree

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> list(G.degree_iter(0))  # node 0 with degree 1
        [(0, 1)]
        >>> list(G.degree_iter([0,1]))
        [(0, 1), (1, 2)]

        """
        if nbunch is None:
            nodes_nbrs = self.adj.items()
        else:
            nodes_nbrs = ((n, self.adj[n]) for n in self.nbunch_iter(nbunch))

        if weight is None:
            for n, nbrs in nodes_nbrs:
                yield (n, len(nbrs) + (n in nbrs))  # return tuple (n,degree)
        else:
            # edge weighted graph - degree is sum of nbr edge weights
            for n, nbrs in nodes_nbrs:
                yield (n, sum((nbrs[nbr].get(weight, 1) for nbr in nbrs)) +
                          (n in nbrs and nbrs[n].get(weight, 1)))

    def clear(self):
        """Remove all nodes and edges from the graph.

        This also removes the name, and all graph, node, and edge attributes.
        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.clear()
        >>> G.nodes()
        []
        >>> G.edges()
        []

        """
        self.name = ''
        self.adj.clear()
        self.node.clear()
        self.graph.clear()

    def copy(self):
        """Return a copy of the graph.

        Returns
        -------
        G : Graph
            A copy of the graph.

        See Also
        --------
        to_directed : return a directed copy of the graph.

        Notes
        -----
        This makes a complete copy of the graph including all of the
        node or edge attributes.

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> H = G.copy()

        """
        return deepcopy(self)

    def is_multigraph(self):
        """Return True if graph is a multigraph, False otherwise."""
        return False

    def is_directed(self):
        """Return True if graph is directed, False otherwise."""
        return False

    def to_directed(self):
        """Return a directed representation of the graph.

        Returns
        -------
        G : DiGraph
            A directed graph with the same name, same nodes, and with
            each edge (u,v,data) replaced by two directed edges
            (u,v,data) and (v,u,data).

        Notes
        -----
        This returns a "deepcopy" of the edge, node, and
        graph attributes which attempts to completely copy
        all of the data and references.

        This is in contrast to the similar D=DiGraph(G) which returns a
        shallow copy of the data.

        See the Python copy module for more information on shallow
        and deep copies, http://docs.python.org/library/copy.html.
        Examples
        --------
        >>> G = nx.Graph()   # or MultiGraph, etc
        >>> G.add_path([0,1])
        >>> H = G.to_directed()
        >>> H.edges()
        [(0, 1), (1, 0)]

        If already directed, return a (deep) copy

        >>> G = nx.DiGraph()   # or MultiDiGraph, etc
        >>> G.add_path([0,1])
        >>> H = G.to_directed()
        >>> H.edges()
        [(0, 1)]
        """
        from networkx import DiGraph
        G = DiGraph()
        G.name = self.name
        G.add_nodes_from(self)
        G.add_edges_from(((u, v, deepcopy(data))
                          for u, nbrs in self.adjacency_iter()
                          for v, data in nbrs.items()))
        G.graph = deepcopy(self.graph)
        G.node = deepcopy(self.node)
        return G

    def to_undirected(self):
        """Return an undirected copy of the graph.

        Returns
        -------
        G : Graph/MultiGraph
            A deepcopy of the graph.

        See Also
        --------
        copy, add_edge, add_edges_from

        Notes
        -----
        This returns a "deepcopy" of the edge, node, and
        graph attributes which attempts to completely copy
        all of the data and references.

        This is in contrast to the similar G=DiGraph(D) which returns a
        shallow copy of the data.

        See the Python copy module for more information on shallow
        and deep copies, http://docs.python.org/library/copy.html.

        Examples
        --------
        >>> G = nx.Graph()   # or MultiGraph, etc
        >>> G.add_path([0,1])
        >>> H = G.to_directed()
        >>> H.edges()
        [(0, 1), (1, 0)]
        >>> G2 = H.to_undirected()
        >>> G2.edges()
        [(0, 1)]
        """
        return deepcopy(self)

    def subgraph(self, nbunch):
        """Return the subgraph induced on nodes in nbunch.

        The induced subgraph of the graph contains the nodes in nbunch
        and the edges between those nodes.

        Parameters
        ----------
        nbunch : list, iterable
            A container of nodes which will be iterated through once.

        Returns
        -------
        G : Graph
            A subgraph of the graph with the same edge attributes.

        Notes
        -----
        The graph, edge or node attributes just point to the original graph.
        So changes to the node or edge structure will not be reflected in
        the original graph while changes to the attributes will.
        To create a subgraph with its own copy of the edge/node attributes use:
        nx.Graph(G.subgraph(nbunch))

        If edge attributes are containers, a deep copy can be obtained using:
        G.subgraph(nbunch).copy()

        For an inplace reduction of a graph to a subgraph you can remove nodes:
        G.remove_nodes_from([n for n in G if n not in set(nbunch)])

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> H = G.subgraph([0,1,2])
        >>> H.edges()
        [(0, 1), (1, 2)]
        """
        bunch = self.nbunch_iter(nbunch)
        # create new graph and copy subgraph into it
        H = self.__class__()
        # copy node and attribute dictionaries
        for n in bunch:
            H.node[n] = self.node[n]
        # namespace shortcuts for speed
        H_adj = H.adj
        self_adj = self.adj
        # add nodes and edges (undirected method)
        for n in H.node:
            Hnbrs = {}
            H_adj[n] = Hnbrs
            for nbr, d in self_adj[n].items():
                if nbr in H_adj:
                    # add both representations of edge: n-nbr and nbr-n
                    Hnbrs[nbr] = d
                    H_adj[nbr][n] = d
        H.graph = self.graph
        return H

    def nodes_with_selfloops(self):
        """Return a list of nodes with self loops.

        A node with a self loop has an edge with both ends adjacent
        to that node.

        Returns
        -------
        nodelist : list
            A list of nodes with self loops.

        See Also
        --------
        selfloop_edges, number_of_selfloops

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edge(1,1)
        >>> G.add_edge(1,2)
        >>> G.nodes_with_selfloops()
        [1]
        """
        return [n for n, nbrs in self.adj.items() if n in nbrs]

    def selfloop_edges(self, data=False):
        """Return a list of selfloop edges.

        A selfloop edge has the same node at both ends.

        Parameters
        ----------
        data : bool, optional (default=False)
            Return selfloop edges as two tuples (u,v) (data=False)
            or three-tuples (u,v,data) (data=True)

        Returns
        -------
        edgelist : list of edge tuples
            A list of all selfloop edges.
        See Also
        --------
        nodes_with_selfloops, number_of_selfloops

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edge(1,1)
        >>> G.add_edge(1,2)
        >>> G.selfloop_edges()
        [(1, 1)]
        >>> G.selfloop_edges(data=True)
        [(1, 1, {})]
        """
        if data:
            return [(n, n, nbrs[n])
                    for n, nbrs in self.adj.items() if n in nbrs]
        else:
            return [(n, n)
                    for n, nbrs in self.adj.items() if n in nbrs]

    def number_of_selfloops(self):
        """Return the number of selfloop edges.

        A selfloop edge has the same node at both ends.

        Returns
        -------
        nloops : int
            The number of selfloops.

        See Also
        --------
        nodes_with_selfloops, selfloop_edges

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edge(1,1)
        >>> G.add_edge(1,2)
        >>> G.number_of_selfloops()
        1
        """
        return len(self.selfloop_edges())

    def size(self, weight=None):
        """Return the number of edges.

        Parameters
        ----------
        weight : string or None, optional (default=None)
           The edge attribute that holds the numerical value used
           as a weight.  If None, then each edge has weight 1.

        Returns
        -------
        nedges : int
            The number of edges or sum of edge weights in the graph.

        See Also
        --------
        number_of_edges

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.size()
        3

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_edge('a','b',weight=2)
        >>> G.add_edge('b','c',weight=4)
        >>> G.size()
        2
        >>> G.size(weight='weight')
        6.0
        """
        s = sum(self.degree(weight=weight).values()) / 2
        if weight is None:
            return int(s)
        else:
            return float(s)

    def number_of_edges(self, u=None, v=None):
        """Return the number of edges between two nodes.

        Parameters
        ----------
        u,v : nodes, optional (default=all edges)
            If u and v are specified, return the number of edges between
            u and v. Otherwise return the total number of all edges.

        Returns
        -------
        nedges : int
            The number of edges in the graph.  If nodes u and v are
            specified return the number of edges between those nodes.
        See Also
        --------
        size

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.number_of_edges()
        3
        >>> G.number_of_edges(0,1)
        1
        >>> e = (0,1)
        >>> G.number_of_edges(*e)
        1
        """
        if u is None:
            return int(self.size())
        if v in self.adj[u]:
            return 1
        else:
            return 0

    def add_star(self, nodes, **attr):
        """Add a star.

        The first node in nodes is the middle of the star.  It is connected
        to all other nodes.

        Parameters
        ----------
        nodes : iterable container
            A container of nodes.
        attr : keyword arguments, optional (default= no attributes)
            Attributes to add to every edge in star.

        See Also
        --------
        add_path, add_cycle

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_star([0,1,2,3])
        >>> G.add_star([10,11,12], weight=2)

        """
        nlist = list(nodes)
        v = nlist[0]
        edges = ((v, n) for n in nlist[1:])
        self.add_edges_from(edges, **attr)

    def add_path(self, nodes, **attr):
        """Add a path.

        Parameters
        ----------
        nodes : iterable container
            A container of nodes.  A path will be constructed from
            the nodes (in order) and added to the graph.
        attr : keyword arguments, optional (default= no attributes)
            Attributes to add to every edge in path.

        See Also
        --------
        add_star, add_cycle

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> G.add_path([10,11,12], weight=7)

        """
        nlist = list(nodes)
        edges = zip(nlist[:-1], nlist[1:])
        self.add_edges_from(edges, **attr)

    def add_cycle(self, nodes, **attr):
        """Add a cycle.

        Parameters
        ----------
        nodes : iterable container
            A container of nodes.  A cycle will be constructed from
            the nodes (in order) and added to the graph.
        attr : keyword arguments, optional (default= no attributes)
            Attributes to add to every edge in cycle.
        See Also
        --------
        add_path, add_star

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_cycle([0,1,2,3])
        >>> G.add_cycle([10,11,12], weight=7)

        """
        nlist = list(nodes)
        edges = zip(nlist, nlist[1:] + [nlist[0]])
        self.add_edges_from(edges, **attr)

    def nbunch_iter(self, nbunch=None):
        """Return an iterator of nodes contained in nbunch that are
        also in the graph.

        The nodes in nbunch are checked for membership in the graph
        and if not are silently ignored.

        Parameters
        ----------
        nbunch : iterable container, optional (default=all nodes)
            A container of nodes.  The container will be iterated
            through once.

        Returns
        -------
        niter : iterator
            An iterator over nodes in nbunch that are also in the graph.
            If nbunch is None, iterate over all nodes in the graph.

        Raises
        ------
        NetworkXError
            If nbunch is not a node or sequence of nodes.
            If a node in nbunch is not hashable.

        See Also
        --------
        Graph.__iter__

        Notes
        -----
        When nbunch is an iterator, the returned iterator yields values
        directly from nbunch, becoming exhausted when nbunch is exhausted.

        To test whether nbunch is a single node, one can use
        "if nbunch in self:", even after processing with this routine.

        If nbunch is not a node or a (possibly empty) sequence/iterator
        or None, a NetworkXError is raised.  Also, if any object in
        nbunch is not hashable, a NetworkXError is raised.
        """
        if nbunch is None:    # include all nodes via iterator
            bunch = iter(self.adj.keys())
        elif nbunch in self:  # if nbunch is a single node
            bunch = iter([nbunch])
        else:                 # if nbunch is a sequence of nodes
            def bunch_iter(nlist, adj):
                try:
                    for n in nlist:
                        if n in adj:
                            yield n
                except TypeError as e:
                    message = e.args[0]
                    # capture error for non-sequence/iterator nbunch.
                    if 'iter' in message:
                        raise NetworkXError(
                            "nbunch is not a node or a sequence of nodes.")
                    # capture error for unhashable node.
                    elif 'hashable' in message:
                        raise NetworkXError(
                            "Node %s in the sequence nbunch is not a valid node." % n)
                    else:
                        raise
            bunch = bunch_iter(nbunch, self.adj)
        return bunch


class TimingDiGraph(TimingGraph):
    """
    Base class for directed graphs.

    A DiGraph stores nodes and edges with optional data, or attributes.

    DiGraphs hold directed edges.  Self loops are allowed but multiple
    (parallel) edges are not.

    Nodes can be arbitrary (hashable) Python objects with optional
    key/value attributes.

    Edges are represented as links between nodes with optional
    key/value attributes.

    Parameters
    ----------
    data : input graph
        Data to initialize graph.  If data=None (default) an empty
        graph is created.  The data can be an edge list, or any
        NetworkX graph object.  If the corresponding optional Python
        packages are installed the data can also be a NumPy matrix
        or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph.
    attr : keyword arguments, optional (default= no attributes)
        Attributes to add to graph as key=value pairs.

    See Also
    --------
    Graph
    MultiGraph
    MultiDiGraph

    Examples
    --------
    Create an empty graph structure (a "null graph") with no nodes and
    no edges.

    >>> G = nx.DiGraph()

    G can be grown in several ways.

    **Nodes:**

    Add one node at a time:

    >>> G.add_node(1)

    Add the nodes from any container (a list, dict, set or
    even the lines from a file or the nodes from another graph).

    >>> G.add_nodes_from([2,3])
    >>> G.add_nodes_from(range(100,110))
    >>> H = nx.Graph()
    >>> H.add_path([0,1,2,3,4,5,6,7,8,9])
    >>> G.add_nodes_from(H)

    In addition to strings and integers any hashable Python object
    (except None) can represent a node, e.g. a customized node object,
    or even another Graph.

    >>> G.add_node(H)

    **Edges:**

    G can also be grown by adding edges.

    Add one edge,

    >>> G.add_edge(1, 2)

    a list of edges,

    >>> G.add_edges_from([(1,2),(1,3)])

    or a collection of edges,

    >>> G.add_edges_from(H.edges())

    If some edges connect nodes not yet in the graph, the nodes
    are added automatically.
There are no errors when adding nodes or edges that already exist. **Attributes:** Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). By default these are empty, but can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named graph, node and edge respectively. >>> G = nx.DiGraph(day="Friday") >>> G.graph {'day': 'Friday'} Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> del G.node[1]['room'] # remove attribute >>> G.nodes(data=True) [(1, {'time': '5pm'}), (3, {'time': '2pm'})] Warning: adding a node to G.node does not add it to the graph. Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2]['weight'] = 4.7 >>> G.edge[1][2]['weight'] = 4 **Shortcuts:** Many common graph features allow python syntax to speed reporting. >>> 1 in G # check if node in graph True >>> [n for n in G if n<3] # iterate through nodes [1, 2] >>> len(G) # number of nodes in graph 5 The fastest way to traverse all edges of a graph is via adjacency_iter(), but the edges() method is often more convenient. >>> for n,nbrsdict in G.adjacency_iter(): ... for nbr,eattr in nbrsdict.items(): ... if 'weight' in eattr: ... (n,nbr,eattr['weight']) (1, 2, 4) (2, 3, 8) >>> [ (u,v,edata['weight']) for u,v,edata in G.edges(data=True) if 'weight' in edata ] [(1, 2, 4), (2, 3, 8)] **Reporting:** Simple graph information is obtained using methods. Iterator versions of many reporting methods exist for efficiency. Methods exist for reporting nodes(), edges(), neighbors() and degree() as well as the number of nodes and edges. 
For details on these and other miscellaneous methods, see below. """ def __init__(self, data=None, **attr): """Initialize a graph with edges, name, graph attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. name : string, optional (default='') An optional name for the graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- convert Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G = nx.Graph(name='my graph') >>> e = [(1,2),(2,3),(3,4)] # list of edges >>> G = nx.Graph(e) Arbitrary graph attribute pairs (key=value) may be assigned >>> G=nx.Graph(e, day="Friday") >>> G.graph {'day': 'Friday'} """ self.graph = {} # dictionary for graph attributes self.node = {} # dictionary for node attributes # We store two adjacency lists: # the predecessors of node n are stored in the dict self.pred # the successors of node n are stored in the dict self.succ=self.adj self.adj = {} # empty adjacency dictionary self.pred = {} # predecessor self.succ = self.adj # successor # attempt to load graph with data if data is not None: convert.to_networkx_graph(data,create_using=self) # load graph attributes (must be after convert) self.graph.update(attr) self.edge=self.adj def add_node(self, n, attr_dict=None, **attr): """Add a single node n and update node attributes. Parameters ---------- n : node A node can be any hashable Python object except None. attr_dict : dictionary, optional (default= no attributes) Dictionary of node attributes. Key/value pairs will update existing data associated with the node. attr : keyword arguments, optional Set or change attributes using key=value. 
See Also -------- add_nodes_from Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_node(1) >>> G.add_node('Hello') >>> K3 = nx.Graph([(0,1),(1,2),(2,0)]) >>> G.add_node(K3) >>> G.number_of_nodes() 3 Use keywords set/change node attributes: >>> G.add_node(1,size=10) >>> G.add_node(3,weight=0.4,UTM=('13S',382871,3972649)) Notes ----- A hashable object is one that can be used as a key in a Python dictionary. This includes strings, numbers, tuples of strings and numbers, etc. On many platforms hashable items also include mutables such as NetworkX Graphs, though one should be careful that the hash doesn't change on mutables. """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dictionary.") if n not in self.succ: self.succ[n] = {} self.pred[n] = {} self.node[n] = attr_dict else: # update attr even if node already exists self.node[n].update(attr_dict) def add_nodes_from(self, nodes, **attr): """Add multiple nodes. Parameters ---------- nodes : iterable container A container of nodes (list, dict, set, etc.). OR A container of (node, attribute dict) tuples. Node attributes are updated using the attribute dict. attr : keyword arguments, optional (default= no attributes) Update attributes for all nodes in nodes. Node attributes specified in nodes as a tuple take precedence over attributes specified generally. See Also -------- add_node Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_nodes_from('Hello') >>> K3 = nx.Graph([(0,1),(1,2),(2,0)]) >>> G.add_nodes_from(K3) >>> sorted(G.nodes(),key=str) [0, 1, 2, 'H', 'e', 'l', 'o'] Use keywords to update specific node attributes for every node. >>> G.add_nodes_from([1,2], size=10) >>> G.add_nodes_from([3,4], weight=0.4) Use (node, attrdict) tuples to update attributes for specific nodes. 
>>> G.add_nodes_from([(1,dict(size=11)), (2,{'color':'blue'})]) >>> G.node[1]['size'] 11 >>> H = nx.Graph() >>> H.add_nodes_from(G.nodes(data=True)) >>> H.node[1]['size'] 11 """ for n in nodes: try: if n not in self.succ: self.succ[n] = {} self.pred[n] = {} self.node[n] = attr.copy() else: self.node[n].update(attr) except TypeError: nn,ndict = n if nn not in self.succ: self.succ[nn] = {} self.pred[nn] = {} newdict = attr.copy() newdict.update(ndict) self.node[nn] = newdict else: olddict = self.node[nn] olddict.update(attr) olddict.update(ndict) def remove_node(self, n): """Remove node n. Removes the node n and all adjacent edges. Attempting to remove a non-existent node will raise an exception. Parameters ---------- n : node A node in the graph Raises ------- NetworkXError If n is not in the graph. See Also -------- remove_nodes_from Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2]) >>> G.edges() [(0, 1), (1, 2)] >>> G.remove_node(1) >>> G.edges() [] """ try: nbrs=self.succ[n] del self.node[n] except KeyError: # NetworkXError if n not in self raise NetworkXError("The node %s is not in the digraph."%(n,)) for u in nbrs: del self.pred[u][n] # remove all edges n-u in digraph del self.succ[n] # remove node from succ for u in self.pred[n]: del self.succ[u][n] # remove all edges n-u in digraph del self.pred[n] # remove node from pred def remove_nodes_from(self, nbunch): """Remove multiple nodes. Parameters ---------- nodes : iterable container A container of nodes (list, dict, set, etc.). If a node in the container is not in the graph it is silently ignored. 
See Also -------- remove_node Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2]) >>> e = G.nodes() >>> e [0, 1, 2] >>> G.remove_nodes_from(e) >>> G.nodes() [] """ for n in nbunch: try: succs=self.succ[n] del self.node[n] for u in succs: del self.pred[u][n] # remove all edges n-u in digraph del self.succ[n] # now remove node for u in self.pred[n]: del self.succ[u][n] # remove all edges n-u in digraph del self.pred[n] # now remove node except KeyError: pass # silent failure on remove def add_edge(self, u, v, attr_dict=None, **attr): """Add an edge between u and v. The nodes u and v will be automatically added if they are not already in the graph. Edge attributes can be specified with keywords or by providing a dictionary with key/value pairs. See examples below. Parameters ---------- u,v : nodes Nodes can be, for example, strings or numbers. Nodes must be hashable (and not None) Python objects. attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with the edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edges_from : add a collection of edges Notes ----- Adding an edge that already exists updates the edge data. Many NetworkX algorithms designed for weighted graphs use as the edge weight a numerical value assigned to a keyword which by default is 'weight'. 
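The bookkeeping that add_edge performs above, storing one shared attribute dict under both succ[u][v] and pred[v][u], can be sketched with plain dicts. The free-standing add_edge and the succ/pred/node dicts here are illustrative stand-ins for the class attributes, not the NetworkX API.

```python
def add_edge(succ, pred, node, u, v, **attr):
    """Insert a directed edge u->v, creating endpoints on demand.

    The same dict object is stored in both succ and pred, so updating
    the attributes through one view is visible through the other.
    """
    for n in (u, v):
        if n not in succ:
            succ[n], pred[n], node[n] = {}, {}, {}
    datadict = succ[u].get(v, {})   # reuse existing data on duplicate edges
    datadict.update(attr)
    succ[u][v] = datadict
    pred[v][u] = datadict

succ, pred, node = {}, {}, {}
add_edge(succ, pred, node, 1, 2, weight=3)
succ[1][2]['weight'] = 7            # visible through pred[2][1] as well
```

The shared-dict design is what makes edge attribute updates cheap: there is exactly one attribute dict per edge, reachable from either endpoint.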
        Examples
        --------
        The following all add the edge e=(1,2) to graph G:

        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> e = (1,2)
        >>> G.add_edge(1, 2)           # explicit two-node form
        >>> G.add_edge(*e)             # single edge as tuple of two nodes
        >>> G.add_edges_from( [(1,2)] ) # add edges from iterable container

        Associate data to edges using keywords:

        >>> G.add_edge(1, 2, weight=3)
        >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7)
        """
        # set up attribute dict
        if attr_dict is None:
            attr_dict=attr
        else:
            try:
                attr_dict.update(attr)
            except AttributeError:
                raise NetworkXError(
                    "The attr_dict argument must be a dictionary.")
        # add nodes
        if u not in self.succ:
            self.succ[u]={}
            self.pred[u]={}
            self.node[u] = {}
        if v not in self.succ:
            self.succ[v]={}
            self.pred[v]={}
            self.node[v] = {}
        # add the edge
        datadict=self.adj[u].get(v,{})
        datadict.update(attr_dict)
        self.succ[u][v]=datadict
        self.pred[v][u]=datadict

    def add_edges_from(self, ebunch, attr_dict=None, **attr):
        """Add all the edges in ebunch.

        Parameters
        ----------
        ebunch : container of edges
            Each edge given in the container will be added to the
            graph. The edges must be given as 2-tuples (u,v) or
            3-tuples (u,v,d) where d is a dictionary containing edge data.
        attr_dict : dictionary, optional (default= no attributes)
            Dictionary of edge attributes.  Key/value pairs will
            update existing data associated with each edge.
        attr : keyword arguments, optional
            Edge data (or labels or objects) can be
            assigned using keyword arguments.

        See Also
        --------
        add_edge : add a single edge
        add_weighted_edges_from : convenient way to add weighted edges

        Notes
        -----
        Adding the same edge twice has no effect but any edge data
        will be updated when each duplicate edge is added.

        Edge attributes specified in edges as a tuple take precedence
        over attributes specified generally.
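The per-element dispatch that add_edges_from applies to each member of ebunch (2-tuple vs 3-tuple) can be sketched on its own. normalize_edge is a hypothetical helper written for illustration, not a NetworkX function.

```python
def normalize_edge(e):
    """Return (u, v, data) for a 2-tuple or 3-tuple edge specification."""
    if len(e) == 3:
        u, v, dd = e
        if not hasattr(dd, "update"):
            raise ValueError("edge data %r is not a dict-like object" % (dd,))
    elif len(e) == 2:
        u, v = e
        dd = {}                          # no per-edge data supplied
    else:
        raise ValueError(
            "Edge tuple %s must be a 2-tuple or 3-tuple." % (e,))
    return u, v, dd
```

After normalization, the per-edge dict dd is applied on top of any attr_dict keywords, which is why tuple-supplied attributes take precedence over the general ones.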
Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edges_from([(0,1),(1,2)]) # using a list of edge tuples >>> e = zip(range(0,3),range(1,4)) >>> G.add_edges_from(e) # Add the path graph 0-1-2-3 Associate data to edges >>> G.add_edges_from([(1,2),(2,3)], weight=3) >>> G.add_edges_from([(3,4),(1,4)], label='WN2898') """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dict.") # process ebunch for e in ebunch: ne = len(e) if ne==3: u,v,dd = e assert hasattr(dd,"update") elif ne==2: u,v = e dd = {} else: raise NetworkXError(\ "Edge tuple %s must be a 2-tuple or 3-tuple."%(e,)) if u not in self.succ: self.succ[u] = {} self.pred[u] = {} self.node[u] = {} if v not in self.succ: self.succ[v] = {} self.pred[v] = {} self.node[v] = {} datadict=self.adj[u].get(v,{}) datadict.update(attr_dict) datadict.update(dd) self.succ[u][v] = datadict self.pred[v][u] = datadict def remove_edge(self, u, v): """Remove the edge between u and v. Parameters ---------- u,v: nodes Remove the edge between nodes u and v. Raises ------ NetworkXError If there is not an edge between u and v. See Also -------- remove_edges_from : remove a collection of edges Examples -------- >>> G = nx.Graph() # or DiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.remove_edge(0,1) >>> e = (1,2) >>> G.remove_edge(*e) # unpacks e from an edge tuple >>> e = (2,3,{'weight':7}) # an edge with attribute data >>> G.remove_edge(*e[:2]) # select first part of edge tuple """ try: del self.succ[u][v] del self.pred[v][u] except KeyError: raise NetworkXError("The edge %s-%s not in graph."%(u,v)) def remove_edges_from(self, ebunch): """Remove all edges specified in ebunch. Parameters ---------- ebunch: list or container of edge tuples Each edge given in the list or container will be removed from the graph. The edges can be: - 2-tuples (u,v) edge between u and v. 
- 3-tuples (u,v,k) where k is ignored. See Also -------- remove_edge : remove a single edge Notes ----- Will fail silently if an edge in ebunch is not in the graph. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> ebunch=[(1,2),(2,3)] >>> G.remove_edges_from(ebunch) """ for e in ebunch: (u,v)=e[:2] # ignore edge data if u in self.succ and v in self.succ[u]: del self.succ[u][v] del self.pred[v][u] def has_successor(self, u, v): """Return True if node u has successor v. This is true if graph has the edge u->v. """ return (u in self.succ and v in self.succ[u]) def has_predecessor(self, u, v): """Return True if node u has predecessor v. This is true if graph has the edge u<-v. """ return (u in self.pred and v in self.pred[u]) def successors_iter(self,n): """Return an iterator over successor nodes of n. neighbors_iter() and successors_iter() are the same. """ try: return iter(self.succ[n]) except KeyError: raise NetworkXError("The node %s is not in the digraph."%(n,)) def predecessors_iter(self,n): """Return an iterator over predecessor nodes of n.""" try: return iter(self.pred[n]) except KeyError: raise NetworkXError("The node %s is not in the digraph."%(n,)) def successors(self, n): """Return a list of successor nodes of n. neighbors() and successors() are the same function. """ return list(self.successors_iter(n)) def predecessors(self, n): """Return a list of predecessor nodes of n.""" return list(self.predecessors_iter(n)) # digraph definitions neighbors = successors neighbors_iter = successors_iter def edges_iter(self, nbunch=None, data=False): """Return an iterator over the edges. Edges are returned as tuples with optional data in the order (node, neighbor, data). Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. 
data : bool, optional (default=False) If True, return edge attribute dict in 3-tuple (u,v,data). Returns ------- edge_iter : iterator An iterator of (u,v) or (u,v,d) tuples of edges. See Also -------- edges : return a list of edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. Examples -------- >>> G = nx.DiGraph() # or MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> [e for e in G.edges_iter()] [(0, 1), (1, 2), (2, 3)] >>> list(G.edges_iter(data=True)) # default data is {} (empty dict) [(0, 1, {}), (1, 2, {}), (2, 3, {})] >>> list(G.edges_iter([0,2])) [(0, 1), (2, 3)] >>> list(G.edges_iter(0)) [(0, 1)] """ if nbunch is None: nodes_nbrs=self.adj.items() else: nodes_nbrs=((n,self.adj[n]) for n in self.nbunch_iter(nbunch)) if data: for n,nbrs in nodes_nbrs: for nbr,data in nbrs.items(): yield (n,nbr,data) else: for n,nbrs in nodes_nbrs: for nbr in nbrs: yield (n,nbr) # alias out_edges to edges out_edges_iter=edges_iter out_edges=TimingGraph.edges def in_edges_iter(self, nbunch=None, data=False): """Return an iterator over the incoming edges. Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : bool, optional (default=False) If True, return edge attribute dict in 3-tuple (u,v,data). Returns ------- in_edge_iter : iterator An iterator of (u,v) or (u,v,d) tuples of incoming edges. See Also -------- edges_iter : return an iterator of edges """ if nbunch is None: nodes_nbrs=self.pred.items() else: nodes_nbrs=((n,self.pred[n]) for n in self.nbunch_iter(nbunch)) if data: for n,nbrs in nodes_nbrs: for nbr,data in nbrs.items(): yield (nbr,n,data) else: for n,nbrs in nodes_nbrs: for nbr in nbrs: yield (nbr,n) def in_edges(self, nbunch=None, data=False): """Return a list of the incoming edges. 
See Also -------- edges : return a list of edges """ return list(self.in_edges_iter(nbunch, data)) def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree). The node degree is the number of edges adjacent to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). See Also -------- degree, in_degree, out_degree, in_degree_iter, out_degree_iter Examples -------- >>> G = nx.DiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs=zip(iter(self.succ.items()),iter(self.pred.items())) else: nodes_nbrs=zip( ((n,self.succ[n]) for n in self.nbunch_iter(nbunch)), ((n,self.pred[n]) for n in self.nbunch_iter(nbunch))) if weight is None: for (n,succ),(n2,pred) in nodes_nbrs: yield (n,len(succ)+len(pred)) else: # edge weighted graph - degree is sum of edge weights for (n,succ),(n2,pred) in nodes_nbrs: yield (n, sum((succ[nbr].get(weight,1) for nbr in succ))+ sum((pred[nbr].get(weight,1) for nbr in pred))) def in_degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, in-degree). The node in-degree is the number of edges pointing in to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. 
The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, in-degree). See Also -------- degree, in_degree, out_degree, out_degree_iter Examples -------- >>> G = nx.DiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.in_degree_iter(0)) # node 0 with degree 0 [(0, 0)] >>> list(G.in_degree_iter([0,1])) [(0, 0), (1, 1)] """ if nbunch is None: nodes_nbrs=self.pred.items() else: nodes_nbrs=((n,self.pred[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: yield (n,len(nbrs)) else: # edge weighted graph - degree is sum of edge weights for n,nbrs in nodes_nbrs: yield (n, sum(data.get(weight,1) for data in nbrs.values())) def out_degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, out-degree). The node out-degree is the number of edges pointing out of the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, out-degree). 
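The weighted branch of the degree iterators above sums a named edge attribute over a node's neighbor dict, defaulting each missing weight to 1. A minimal stand-alone version, with a plain dict standing in for one node's neighbor mapping (illustrative names, not the NetworkX API):

```python
def weighted_degree(nbrs, weight='weight'):
    """Sum the given edge attribute over a neighbor dict, defaulting to 1."""
    return sum(data.get(weight, 1) for data in nbrs.values())

# neighbor dict for one node: neighbor -> edge attribute dict
nbrs = {2: {'weight': 3}, 3: {}, 4: {'weight': 0.5}}
```

With weight=None the class methods instead fall back to len(nbrs), i.e. each edge counts as exactly 1 regardless of its attributes.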
See Also -------- degree, in_degree, out_degree, in_degree_iter Examples -------- >>> G = nx.DiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.out_degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.out_degree_iter([0,1])) [(0, 1), (1, 1)] """ if nbunch is None: nodes_nbrs=self.succ.items() else: nodes_nbrs=((n,self.succ[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: yield (n,len(nbrs)) else: # edge weighted graph - degree is sum of edge weights for n,nbrs in nodes_nbrs: yield (n, sum(data.get(weight,1) for data in nbrs.values())) def in_degree(self, nbunch=None, weight=None): """Return the in-degree of a node or nodes. The node in-degree is the number of edges pointing in to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd : dictionary, or number A dictionary with nodes as keys and in-degree as values or a number if a single node is specified. See Also -------- degree, out_degree, in_degree_iter Examples -------- >>> G = nx.DiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.in_degree(0) 0 >>> G.in_degree([0,1]) {0: 0, 1: 1} >>> list(G.in_degree([0,1]).values()) [0, 1] """ if nbunch in self: # return a single node return next(self.in_degree_iter(nbunch,weight))[1] else: # return a dict return dict(self.in_degree_iter(nbunch,weight)) def out_degree(self, nbunch=None, weight=None): """Return the out-degree of a node or nodes. The node out-degree is the number of edges pointing out of the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. 
weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd : dictionary, or number A dictionary with nodes as keys and out-degree as values or a number if a single node is specified. Examples -------- >>> G = nx.DiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.out_degree(0) 1 >>> G.out_degree([0,1]) {0: 1, 1: 1} >>> list(G.out_degree([0,1]).values()) [1, 1] """ if nbunch in self: # return a single node return next(self.out_degree_iter(nbunch,weight))[1] else: # return a dict return dict(self.out_degree_iter(nbunch,weight)) def clear(self): """Remove all nodes and edges from the graph. This also removes the name, and all graph, node, and edge attributes. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.clear() >>> G.nodes() [] >>> G.edges() [] """ self.succ.clear() self.pred.clear() self.node.clear() self.graph.clear() def is_multigraph(self): """Return True if graph is a multigraph, False otherwise.""" return False def is_directed(self): """Return True if graph is directed, False otherwise.""" return True def to_directed(self): """Return a directed copy of the graph. Returns ------- G : DiGraph A deepcopy of the graph. Notes ----- This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar D=DiGraph(G) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. 
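The deep-versus-shallow distinction drawn in the Notes above can be demonstrated directly with copy.deepcopy on an attribute dict; plain dicts stand in for the graph attribute structures here.

```python
import copy

graph_attrs = {'day': 'Friday', 'tags': ['a', 'b']}
shallow = dict(graph_attrs)        # shares the nested list, as DiGraph(G) shares data
deep = copy.deepcopy(graph_attrs)  # independent copy, as to_directed() produces

graph_attrs['tags'].append('c')    # mutate a container reachable from both
```

After the mutation, the shallow copy sees the new element while the deep copy does not, which is exactly the hazard the Notes section warns about.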
Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1), (1, 0)] If already directed, return a (deep) copy >>> G = nx.DiGraph() # or MultiDiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1)] """ return deepcopy(self) def to_undirected(self, reciprocal=False): """Return an undirected representation of the digraph. Parameters ---------- reciprocal : bool (optional) If True only keep edges that appear in both directions in the original digraph. Returns ------- G : Graph An undirected graph with the same name and nodes and with edge (u,v,data) if either (u,v,data) or (v,u,data) is in the digraph. If both edges exist in digraph and their edge data is different, only one edge is created with an arbitrary choice of which edge data to use. You must check and correct for this manually if desired. Notes ----- If edges in both directions (u,v) and (v,u) exist in the graph, attributes for the new undirected edge will be a combination of the attributes of the directed edges. The edge data is updated in the (arbitrary) order that the edges are encountered. For more customized control of the edge attributes use add_edge(). This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar G=DiGraph(D) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. 
""" H=TimingGraph() H.name=self.name H.add_nodes_from(self) if reciprocal is True: H.add_edges_from( (u,v,deepcopy(d)) for u,nbrs in self.adjacency_iter() for v,d in nbrs.items() if v in self.pred[u]) else: H.add_edges_from( (u,v,deepcopy(d)) for u,nbrs in self.adjacency_iter() for v,d in nbrs.items() ) H.graph=deepcopy(self.graph) H.node=deepcopy(self.node) return H def reverse(self, copy=True): """Return the reverse of the graph. The reverse is a graph with the same nodes and edges but with the directions of the edges reversed. Parameters ---------- copy : bool optional (default=True) If True, return a new DiGraph holding the reversed edges. If False, reverse the reverse graph is created using the original graph (this changes the original graph). """ if copy: H = self.__class__(name="Reverse of (%s)"%self.name) H.add_nodes_from(self) H.add_edges_from( (v,u,deepcopy(d)) for u,v,d in self.edges(data=True) ) H.graph=deepcopy(self.graph) H.node=deepcopy(self.node) else: self.pred,self.succ=self.succ,self.pred self.adj=self.succ H=self return H def subgraph(self, nbunch): """Return the subgraph induced on nodes in nbunch. The induced subgraph of the graph contains the nodes in nbunch and the edges between those nodes. Parameters ---------- nbunch : list, iterable A container of nodes which will be iterated through once. Returns ------- G : Graph A subgraph of the graph with the same edge attributes. Notes ----- The graph, edge or node attributes just point to the original graph. So changes to the node or edge structure will not be reflected in the original graph while changes to the attributes will. 
        To create a subgraph with its own copy of the edge/node attributes use:
        nx.Graph(G.subgraph(nbunch))

        If edge attributes are containers, a deep copy can be obtained using:
        G.subgraph(nbunch).copy()

        For an in-place reduction of a graph to a subgraph you can remove nodes:
        G.remove_nodes_from([n for n in G if n not in set(nbunch)])

        Examples
        --------
        >>> G = nx.Graph()   # or DiGraph, MultiGraph, MultiDiGraph, etc
        >>> G.add_path([0,1,2,3])
        >>> H = G.subgraph([0,1,2])
        >>> H.edges()
        [(0, 1), (1, 2)]
        """
        bunch = self.nbunch_iter(nbunch)
        # create new graph and copy subgraph into it
        H = self.__class__()
        # copy node and attribute dictionaries
        for n in bunch:
            H.node[n]=self.node[n]
        # namespace shortcuts for speed
        H_succ=H.succ
        H_pred=H.pred
        self_succ=self.succ
        # add nodes
        for n in H:
            H_succ[n]={}
            H_pred[n]={}
        # add edges
        for u in H_succ:
            Hnbrs=H_succ[u]
            for v,datadict in self_succ[u].items():
                if v in H_succ:
                    # add both representations of edge: u-v and v-u
                    Hnbrs[v]=datadict
                    H_pred[v][u]=datadict
        H.graph=self.graph
        return H


class TimingMultiGraph(TimingGraph):
    """
    An undirected graph class that can store multiedges.

    Multiedges are multiple edges between two nodes.  Each edge
    can hold optional data or attributes.

    A MultiGraph holds undirected edges.  Self loops are allowed.

    Nodes can be arbitrary (hashable) Python objects with optional
    key/value attributes.

    Edges are represented as links between nodes with optional
    key/value attributes.

    Parameters
    ----------
    data : input graph
        Data to initialize graph.  If data=None (default) an empty
        graph is created.  The data can be an edge list, or any
        NetworkX graph object.  If the corresponding optional Python
        packages are installed the data can also be a NumPy matrix
        or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph.

    attr : keyword arguments, optional (default= no attributes)
        Attributes to add to graph as key=value pairs.
See Also -------- Graph DiGraph MultiDiGraph Examples -------- Create an empty graph structure (a "null graph") with no nodes and no edges. >>> G = nx.MultiGraph() G can be grown in several ways. **Nodes:** Add one node at a time: >>> G.add_node(1) Add the nodes from any container (a list, dict, set or even the lines from a file or the nodes from another graph). >>> G.add_nodes_from([2,3]) >>> G.add_nodes_from(range(100,110)) >>> H=nx.Graph() >>> H.add_path([0,1,2,3,4,5,6,7,8,9]) >>> G.add_nodes_from(H) In addition to strings and integers any hashable Python object (except None) can represent a node, e.g. a customized node object, or even another Graph. >>> G.add_node(H) **Edges:** G can also be grown by adding edges. Add one edge, >>> G.add_edge(1, 2) a list of edges, >>> G.add_edges_from([(1,2),(1,3)]) or a collection of edges, >>> G.add_edges_from(H.edges()) If some edges connect nodes not yet in the graph, the nodes are added automatically. If an edge already exists, an additional edge is created and stored using a key to identify the edge. By default the key is the lowest unused integer. >>> G.add_edges_from([(4,5,dict(route=282)), (4,5,dict(route=37))]) >>> G[4] {3: {0: {}}, 5: {0: {}, 1: {'route': 282}, 2: {'route': 37}}} **Attributes:** Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). By default these are empty, but can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named graph, node and edge respectively. 
>>> G = nx.MultiGraph(day="Friday") >>> G.graph {'day': 'Friday'} Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> del G.node[1]['room'] # remove attribute >>> G.nodes(data=True) [(1, {'time': '5pm'}), (3, {'time': '2pm'})] Warning: adding a node to G.node does not add it to the graph. Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2][0]['weight'] = 4.7 >>> G.edge[1][2][0]['weight'] = 4 **Shortcuts:** Many common graph features allow python syntax to speed reporting. >>> 1 in G # check if node in graph True >>> [n for n in G if n<3] # iterate through nodes [1, 2] >>> len(G) # number of nodes in graph 5 >>> G[1] # adjacency dict keyed by neighbor to edge attributes ... # Note: you should not change this dict manually! {2: {0: {'weight': 4}, 1: {'color': 'blue'}}} The fastest way to traverse all edges of a graph is via adjacency_iter(), but the edges() method is often more convenient. >>> for n,nbrsdict in G.adjacency_iter(): ... for nbr,keydict in nbrsdict.items(): ... for key,eattr in keydict.items(): ... if 'weight' in eattr: ... (n,nbr,eattr['weight']) (1, 2, 4) (2, 1, 4) (2, 3, 8) (3, 2, 8) >>> [ (u,v,edata['weight']) for u,v,edata in G.edges(data=True) if 'weight' in edata ] [(1, 2, 4), (2, 3, 8)] **Reporting:** Simple graph information is obtained using methods. Iterator versions of many reporting methods exist for efficiency. Methods exist for reporting nodes(), edges(), neighbors() and degree() as well as the number of nodes and edges. For details on these and other miscellaneous methods, see below. """ def add_edge(self, u, v, key=None, attr_dict=None, **attr): """Add an edge between u and v. 
The nodes u and v will be automatically added if they are not already in the graph. Edge attributes can be specified with keywords or by providing a dictionary with key/value pairs. See examples below. Parameters ---------- u,v : nodes Nodes can be, for example, strings or numbers. Nodes must be hashable (and not None) Python objects. key : hashable identifier, optional (default=lowest unused integer) Used to distinguish multiedges between a pair of nodes. attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with the edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edges_from : add a collection of edges Notes ----- To replace/update edge data, use the optional key argument to identify a unique edge. Otherwise a new edge will be created. NetworkX algorithms designed for weighted graphs cannot use multigraphs directly because it is not clear how to handle multiedge weights. Convert to Graph using edge attribute 'weight' to enable weighted graph algorithms. 
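The conversion to a weighted simple graph mentioned in the Notes can be sketched with plain dicts shaped like the node -> neighbor -> key -> data adjacency structure used by this class; `collapse_weights` and the sample data below are illustrative only, not part of the NetworkX API.

```python
# Illustrative sketch (not NetworkX code): collapse parallel-edge weights
# into one 'weight' attribute per node pair, so weighted-graph algorithms
# can be applied. The dicts mirror the internal adj layout described above.
multi_adj = {
    1: {2: {0: {'weight': 2}, 1: {'weight': 3}}},  # two parallel edges 1-2
    2: {1: {0: {'weight': 2}, 1: {'weight': 3}}},
}

def collapse_weights(adj):
    """Sum parallel-edge weights into a simple-graph adjacency dict."""
    simple = {}
    for u, nbrs in adj.items():
        simple[u] = {}
        for v, keydict in nbrs.items():
            total = sum(d.get('weight', 1) for d in keydict.values())
            simple[u][v] = {'weight': total}
    return simple

print(collapse_weights(multi_adj)[1][2]['weight'])  # 2 + 3 -> 5
```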
Examples -------- The following all add the edge e=(1,2) to graph G: >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> e = (1,2) >>> G.add_edge(1, 2) # explicit two-node form >>> G.add_edge(*e) # single edge as tuple of two nodes >>> G.add_edges_from( [(1,2)] ) # add edges from iterable container Associate data to edges using keywords: >>> G.add_edge(1, 2, weight=3) >>> G.add_edge(1, 2, key=0, weight=4) # update data for key=0 >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7) """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dictionary.") # add nodes if u not in self.adj: self.adj[u] = {} self.node[u] = {} if v not in self.adj: self.adj[v] = {} self.node[v] = {} if v in self.adj[u]: keydict=self.adj[u][v] if key is None: # find a unique integer key # other methods might be better here? key=len(keydict) while key in keydict: key+=1 datadict=keydict.get(key,{}) datadict.update(attr_dict) keydict[key]=datadict else: # selfloops work this way without special treatment if key is None: key=0 datadict={} datadict.update(attr_dict) keydict={key:datadict} self.adj[u][v] = keydict self.adj[v][u] = keydict def add_edges_from(self, ebunch, attr_dict=None, **attr): """Add all the edges in ebunch. Parameters ---------- ebunch : container of edges Each edge given in the container will be added to the graph. The edges can be: - 2-tuples (u,v) or - 3-tuples (u,v,d) for an edge attribute dict d, or - 4-tuples (u,v,k,d) for an edge identified by key k attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with each edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. 
See Also -------- add_edge : add a single edge add_weighted_edges_from : convenient way to add weighted edges Notes ----- Adding the same edge twice has no effect but any edge data will be updated when each duplicate edge is added. Edge attributes specified in edges as a tuple take precedence over attributes specified generally. Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_edges_from([(0,1),(1,2)]) # using a list of edge tuples >>> e = zip(range(0,3),range(1,4)) >>> G.add_edges_from(e) # Add the path graph 0-1-2-3 Associate data to edges >>> G.add_edges_from([(1,2),(2,3)], weight=3) >>> G.add_edges_from([(3,4),(1,4)], label='WN2898') """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dictionary.") # process ebunch for e in ebunch: ne=len(e) if ne==4: u,v,key,dd = e elif ne==3: u,v,dd = e key=None elif ne==2: u,v = e dd = {} key=None else: raise NetworkXError(\ "Edge tuple %s must be a 2-tuple, 3-tuple or 4-tuple."%(e,)) if u in self.adj: keydict=self.adj[u].get(v,{}) else: keydict={} if key is None: # find a unique integer key # other methods might be better here? key=len(keydict) while key in keydict: key+=1 datadict=keydict.get(key,{}) datadict.update(attr_dict) datadict.update(dd) self.add_edge(u,v,key=key,attr_dict=datadict) def remove_edge(self, u, v, key=None): """Remove an edge between u and v. Parameters ---------- u,v: nodes Remove an edge between nodes u and v. key : hashable identifier, optional (default=None) Used to distinguish multiple edges between a pair of nodes. If None remove a single (arbitrary) edge between u and v. Raises ------ NetworkXError If there is not an edge between u and v, or if there is no edge with the specified key.
See Also -------- remove_edges_from : remove a collection of edges Examples -------- >>> G = nx.MultiGraph() >>> G.add_path([0,1,2,3]) >>> G.remove_edge(0,1) >>> e = (1,2) >>> G.remove_edge(*e) # unpacks e from an edge tuple For multiple edges >>> G = nx.MultiGraph() # or MultiDiGraph, etc >>> G.add_edges_from([(1,2),(1,2),(1,2)]) >>> G.remove_edge(1,2) # remove a single (arbitrary) edge For edges with keys >>> G = nx.MultiGraph() # or MultiDiGraph, etc >>> G.add_edge(1,2,key='first') >>> G.add_edge(1,2,key='second') >>> G.remove_edge(1,2,key='second') """ try: d=self.adj[u][v] except (KeyError): raise NetworkXError( "The edge %s-%s is not in the graph."%(u,v)) # remove the edge with specified data if key is None: d.popitem() else: try: del d[key] except (KeyError): raise NetworkXError( "The edge %s-%s with key %s is not in the graph."%(u,v,key)) if len(d)==0: # remove the key entries if last edge del self.adj[u][v] if u!=v: # check for selfloop del self.adj[v][u] def remove_edges_from(self, ebunch): """Remove all edges specified in ebunch. Parameters ---------- ebunch: list or container of edge tuples Each edge given in the list or container will be removed from the graph. The edges can be: - 2-tuples (u,v) All edges between u and v are removed. - 3-tuples (u,v,key) The edge identified by key is removed. - 4-tuples (u,v,key,data) where data is ignored. See Also -------- remove_edge : remove a single edge Notes ----- Will fail silently if an edge in ebunch is not in the graph. 
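The "fail silently" behaviour noted above comes from wrapping each removal in try/except; a minimal sketch with a plain edge set (names illustrative, not NetworkX code):

```python
# Illustrative sketch of remove_edges_from's silent-failure pattern: edges
# absent from the graph are skipped without raising.
edge_set = {(1, 2), (2, 3)}

def remove_edges_silently(ebunch):
    for e in ebunch:
        try:
            edge_set.remove(tuple(e[:2]))
        except KeyError:  # edge not present: ignore, as remove_edges_from does
            pass

remove_edges_silently([(1, 2), (1, 2), (9, 9)])  # extras/missing are ignored
print(edge_set)  # {(2, 3)}
```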
Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> ebunch=[(1,2),(2,3)] >>> G.remove_edges_from(ebunch) Removing multiple copies of edges >>> G = nx.MultiGraph() >>> G.add_edges_from([(1,2),(1,2),(1,2)]) >>> G.remove_edges_from([(1,2),(1,2)]) >>> G.edges() [(1, 2)] >>> G.remove_edges_from([(1,2),(1,2)]) # silently ignore extra copy >>> G.edges() # now empty graph [] """ for e in ebunch: try: self.remove_edge(*e[:3]) except NetworkXError: pass def has_edge(self, u, v, key=None): """Return True if the graph has an edge between nodes u and v. Parameters ---------- u,v : nodes Nodes can be, for example, strings or numbers. key : hashable identifier, optional (default=None) If specified return True only if the edge with key is found. Returns ------- edge_ind : bool True if edge is in the graph, False otherwise. Examples -------- Can be called either using two nodes u,v, an edge tuple (u,v), or an edge tuple (u,v,key). >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.has_edge(0,1) # using two nodes True >>> e = (0,1) >>> G.has_edge(*e) # e is a 2-tuple (u,v) True >>> G.add_edge(0,1,key='a') >>> G.has_edge(0,1,key='a') # specify key True >>> e=(0,1,'a') >>> G.has_edge(*e) # e is a 3-tuple (u,v,'a') True The following syntax are equivalent: >>> G.has_edge(0,1) True >>> 1 in G[0] # though this gives KeyError if 0 not in G True """ try: if key is None: return v in self.adj[u] else: return key in self.adj[u][v] except KeyError: return False def edges(self, nbunch=None, data=False, keys=False): """Return a list of edges. Edges are returned as tuples with optional data and keys in the order (node, neighbor, key, data). Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : bool, optional (default=False) Return two tuples (u,v) (False) or three-tuples (u,v,data) (True). 
keys : bool, optional (default=False) Return two tuples (u,v) (False) or three-tuples (u,v,key) (True). Returns -------- edge_list: list of edge tuples Edges that are adjacent to any node in nbunch, or a list of all edges if nbunch is not specified. See Also -------- edges_iter : return an iterator over the edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.edges() [(0, 1), (1, 2), (2, 3)] >>> G.edges(data=True) # default edge data is {} (empty dictionary) [(0, 1, {}), (1, 2, {}), (2, 3, {})] >>> G.edges(keys=True) # default keys are integers [(0, 1, 0), (1, 2, 0), (2, 3, 0)] >>> G.edges(data=True,keys=True) # default keys are integers [(0, 1, 0, {}), (1, 2, 0, {}), (2, 3, 0, {})] >>> G.edges([0,3]) [(0, 1), (3, 2)] >>> G.edges(0) [(0, 1)] """ return list(self.edges_iter(nbunch, data=data,keys=keys)) def edges_iter(self, nbunch=None, data=False, keys=False): """Return an iterator over the edges. Edges are returned as tuples with optional data and keys in the order (node, neighbor, key, data). Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : bool, optional (default=False) If True, return edge attribute dict with each edge. keys : bool, optional (default=False) If True, return edge keys with each edge. Returns ------- edge_iter : iterator An iterator of (u,v), (u,v,d) or (u,v,key,d) tuples of edges. See Also -------- edges : return a list of edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. 
Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> [e for e in G.edges_iter()] [(0, 1), (1, 2), (2, 3)] >>> list(G.edges_iter(data=True)) # default data is {} (empty dict) [(0, 1, {}), (1, 2, {}), (2, 3, {})] >>> list(G.edges(keys=True)) # default keys are integers [(0, 1, 0), (1, 2, 0), (2, 3, 0)] >>> list(G.edges(data=True,keys=True)) # default keys are integers [(0, 1, 0, {}), (1, 2, 0, {}), (2, 3, 0, {})] >>> list(G.edges_iter([0,3])) [(0, 1), (3, 2)] >>> list(G.edges_iter(0)) [(0, 1)] """ seen={} # helper dict to keep track of multiply stored edges if nbunch is None: nodes_nbrs = self.adj.items() else: nodes_nbrs=((n,self.adj[n]) for n in self.nbunch_iter(nbunch)) if data: for n,nbrs in nodes_nbrs: for nbr,keydict in nbrs.items(): if nbr not in seen: for key,data in keydict.items(): if keys: yield (n,nbr,key,data) else: yield (n,nbr,data) seen[n]=1 else: for n,nbrs in nodes_nbrs: for nbr,keydict in nbrs.items(): if nbr not in seen: for key,data in keydict.items(): if keys: yield (n,nbr,key) else: yield (n,nbr) seen[n] = 1 del seen def get_edge_data(self, u, v, key=None, default=None): """Return the attribute dictionary associated with edge (u,v). Parameters ---------- u,v : nodes default: any Python object (default=None) Value to return if the edge (u,v) is not found. key : hashable identifier, optional (default=None) Return data only for the edge with specified key. Returns ------- edge_dict : dictionary The edge attribute dictionary. Notes ----- It is faster to use G[u][v][key]. >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_edge(0,1,key='a',weight=7) >>> G[0][1]['a'] # key='a' {'weight': 7} Warning: Assigning G[u][v][key] corrupts the graph data structure. 
But it is safe to assign attributes to that dictionary, >>> G[0][1]['a']['weight'] = 10 >>> G[0][1]['a']['weight'] 10 >>> G[1][0]['a']['weight'] 10 Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_path([0,1,2,3]) >>> G.get_edge_data(0,1) {0: {}} >>> e = (0,1) >>> G.get_edge_data(*e) # tuple form {0: {}} >>> G.get_edge_data('a','b',default=0) # edge not in graph, return 0 0 """ try: if key is None: return self.adj[u][v] else: return self.adj[u][v][key] except KeyError: return default def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree). The node degree is the number of edges adjacent to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). 
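The degree rule implemented by degree_iter can be sketched on a plain dict shaped like the adj structure: every parallel edge contributes one per endpoint and every selfloop key contributes two. The `multidegree` helper below is illustrative, not part of NetworkX.

```python
# Illustrative sketch of the multigraph degree computation used below.
adj = {
    1: {2: {0: {}, 1: {}},   # two parallel edges 1-2
        1: {0: {}}},         # one selfloop on node 1
    2: {1: {0: {}, 1: {}}},
}

def multidegree(n, adj):
    nbrs = adj[n]
    deg = sum(len(keydict) for keydict in nbrs.values())
    if n in nbrs:            # each selfloop key counts twice
        deg += len(nbrs[n])
    return deg

print(multidegree(1, adj), multidegree(2, adj))  # 4 2
```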
See Also -------- degree Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs = self.adj.items() else: nodes_nbrs=((n,self.adj[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: deg = sum([len(data) for data in nbrs.values()]) yield (n, deg+(n in nbrs and len(nbrs[n]))) else: # edge weighted graph - degree is sum of nbr edge weights for n,nbrs in nodes_nbrs: deg = sum([d.get(weight,1) for data in nbrs.values() for d in data.values()]) if n in nbrs: deg += sum([d.get(weight,1) for key,d in nbrs[n].items()]) yield (n, deg) def is_multigraph(self): """Return True if graph is a multigraph, False otherwise.""" return True def is_directed(self): """Return True if graph is directed, False otherwise.""" return False def to_directed(self): """Return a directed representation of the graph. Returns ------- G : MultiDiGraph A directed graph with the same name, same nodes, and with each edge (u,v,data) replaced by two directed edges (u,v,data) and (v,u,data). Notes ----- This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar D=DiGraph(G) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. 
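The deep-vs-shallow distinction described in the Notes can be demonstrated with one nested attribute dict standing in for graph data (names below are illustrative):

```python
from copy import deepcopy

# Illustrative sketch: a shallow copy shares the inner attribute dicts
# (like constructing a new graph from an existing one), while deepcopy
# duplicates them (like to_directed()).
edge_attrs = {'route': {'length': 7}}

shallow = dict(edge_attrs)    # top-level copy; inner dict is shared
deep = deepcopy(edge_attrs)   # everything copied

edge_attrs['route']['length'] = 99
print(shallow['route']['length'], deep['route']['length'])  # 99 7
```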
Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1), (1, 0)] If already directed, return a (deep) copy >>> G = nx.DiGraph() # or MultiDiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1)] """ from networkx.classes.multidigraph import MultiDiGraph G=MultiDiGraph() G.add_nodes_from(self) G.add_edges_from( (u,v,key,deepcopy(datadict)) for u,nbrs in self.adjacency_iter() for v,keydict in nbrs.items() for key,datadict in keydict.items() ) G.graph=deepcopy(self.graph) G.node=deepcopy(self.node) return G def selfloop_edges(self, data=False, keys=False): """Return a list of selfloop edges. A selfloop edge has the same node at both ends. Parameters ----------- data : bool, optional (default=False) Return selfloop edges as two tuples (u,v) (data=False) or three-tuples (u,v,data) (data=True) keys : bool, optional (default=False) If True, return edge keys with each edge. Returns ------- edgelist : list of edge tuples A list of all selfloop edges. See Also -------- nodes_with_selfloops, number_of_selfloops Examples -------- >>> G = nx.MultiGraph() # or MultiDiGraph >>> G.add_edge(1,1) >>> G.add_edge(1,2) >>> G.selfloop_edges() [(1, 1)] >>> G.selfloop_edges(data=True) [(1, 1, {})] >>> G.selfloop_edges(keys=True) [(1, 1, 0)] >>> G.selfloop_edges(keys=True, data=True) [(1, 1, 0, {})] """ if data: if keys: return [ (n,n,k,d) for n,nbrs in self.adj.items() if n in nbrs for k,d in nbrs[n].items()] else: return [ (n,n,d) for n,nbrs in self.adj.items() if n in nbrs for d in nbrs[n].values()] else: if keys: return [ (n,n,k) for n,nbrs in self.adj.items() if n in nbrs for k in nbrs[n].keys()] else: return [ (n,n) for n,nbrs in self.adj.items() if n in nbrs for d in nbrs[n].values()] def number_of_edges(self, u=None, v=None): """Return the number of edges between two nodes. 
Parameters ---------- u,v : nodes, optional (default=all edges) If u and v are specified, return the number of edges between u and v. Otherwise return the total number of all edges. Returns ------- nedges : int The number of edges in the graph. If nodes u and v are specified return the number of edges between those nodes. See Also -------- size Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> G.number_of_edges() 3 >>> G.number_of_edges(0,1) 1 >>> e = (0,1) >>> G.number_of_edges(*e) 1 """ if u is None: return self.size() try: edgedata=self.adj[u][v] except KeyError: return 0 # no such edge return len(edgedata) def subgraph(self, nbunch): """Return the subgraph induced on nodes in nbunch. The induced subgraph of the graph contains the nodes in nbunch and the edges between those nodes. Parameters ---------- nbunch : list, iterable A container of nodes which will be iterated through once. Returns ------- G : Graph A subgraph of the graph with the same edge attributes. Notes ----- The graph, edge or node attributes just point to the original graph. So changes to the node or edge structure will not be reflected in the original graph while changes to the attributes will. 
To create a subgraph with its own copy of the edge/node attributes use: nx.Graph(G.subgraph(nbunch)) If edge attributes are containers, a deep copy can be obtained using: G.subgraph(nbunch).copy() For an in-place reduction of a graph to a subgraph you can remove nodes: G.remove_nodes_from([n for n in G if n not in set(nbunch)]) Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> H = G.subgraph([0,1,2]) >>> H.edges() [(0, 1), (1, 2)] """ bunch =self.nbunch_iter(nbunch) # create new graph and copy subgraph into it H = self.__class__() # copy node and attribute dictionaries for n in bunch: H.node[n]=self.node[n] # namespace shortcuts for speed H_adj=H.adj self_adj=self.adj # add nodes and edges (undirected method) for n in H: Hnbrs={} H_adj[n]=Hnbrs for nbr,edgedict in self_adj[n].items(): if nbr in H_adj: # add both representations of edge: n-nbr and nbr-n # they share the same edgedict ed=edgedict.copy() Hnbrs[nbr]=ed H_adj[nbr][n]=ed H.graph=self.graph return H class TimingMultiDiGraph(TimingMultiGraph,TimingDiGraph): """A directed graph class that can store multiedges. Multiedges are multiple edges between two nodes. Each edge can hold optional data or attributes. A MultiDiGraph holds directed edges. Self loops are allowed. Nodes can be arbitrary (hashable) Python objects with optional key/value attributes. Edges are represented as links between nodes with optional key/value attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs.
See Also -------- Graph DiGraph MultiGraph Examples -------- Create an empty graph structure (a "null graph") with no nodes and no edges. >>> G = nx.MultiDiGraph() G can be grown in several ways. **Nodes:** Add one node at a time: >>> G.add_node(1) Add the nodes from any container (a list, dict, set or even the lines from a file or the nodes from another graph). >>> G.add_nodes_from([2,3]) >>> G.add_nodes_from(range(100,110)) >>> H=nx.Graph() >>> H.add_path([0,1,2,3,4,5,6,7,8,9]) >>> G.add_nodes_from(H) In addition to strings and integers any hashable Python object (except None) can represent a node, e.g. a customized node object, or even another Graph. >>> G.add_node(H) **Edges:** G can also be grown by adding edges. Add one edge, >>> G.add_edge(1, 2) a list of edges, >>> G.add_edges_from([(1,2),(1,3)]) or a collection of edges, >>> G.add_edges_from(H.edges()) If some edges connect nodes not yet in the graph, the nodes are added automatically. If an edge already exists, an additional edge is created and stored using a key to identify the edge. By default the key is the lowest unused integer. >>> G.add_edges_from([(4,5,dict(route=282)), (4,5,dict(route=37))]) >>> G[4] {5: {0: {}, 1: {'route': 282}, 2: {'route': 37}}} **Attributes:** Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). By default these are empty, but can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named graph, node and edge respectively. >>> G = nx.MultiDiGraph(day="Friday") >>> G.graph {'day': 'Friday'} Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> del G.node[1]['room'] # remove attribute >>> G.nodes(data=True) [(1, {'time': '5pm'}), (3, {'time': '2pm'})] Warning: adding a node to G.node does not add it to the graph. 
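The warning above can be sketched with two plain dicts playing the roles of G.node and G.adj: membership and iteration are driven by the adjacency dict, so writing to the node-attribute dict alone leaves the node outside the graph. The names below are illustrative, not the NetworkX internals verbatim.

```python
# Illustrative sketch of why "adding a node to G.node does not add it to
# the graph": graph membership lives in the adjacency dict.
node_attrs = {}   # plays the role of G.node
adjacency = {}    # plays the role of G.adj; defines membership

node_attrs[42] = {'time': '5pm'}   # attribute stored, node still absent
print(42 in adjacency)             # False

adjacency[42] = {}                 # what add_node() does in addition
node_attrs.setdefault(42, {})
print(42 in adjacency)             # True
```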
Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2][0]['weight'] = 4.7 >>> G.edge[1][2][0]['weight'] = 4 **Shortcuts:** Many common graph features allow python syntax to speed reporting. >>> 1 in G # check if node in graph True >>> [n for n in G if n<3] # iterate through nodes [1, 2] >>> len(G) # number of nodes in graph 5 >>> G[1] # adjacency dict keyed by neighbor to edge attributes ... # Note: you should not change this dict manually! {2: {0: {'weight': 4}, 1: {'color': 'blue'}}} The fastest way to traverse all edges of a graph is via adjacency_iter(), but the edges() method is often more convenient. >>> for n,nbrsdict in G.adjacency_iter(): ... for nbr,keydict in nbrsdict.items(): ... for key,eattr in keydict.items(): ... if 'weight' in eattr: ... (n,nbr,eattr['weight']) (1, 2, 4) (2, 3, 8) >>> [ (u,v,edata['weight']) for u,v,edata in G.edges(data=True) if 'weight' in edata ] [(1, 2, 4), (2, 3, 8)] **Reporting:** Simple graph information is obtained using methods. Iterator versions of many reporting methods exist for efficiency. Methods exist for reporting nodes(), edges(), neighbors() and degree() as well as the number of nodes and edges. For details on these and other miscellaneous methods, see below. """ def add_edge(self, u, v, key=None, attr_dict=None, **attr): """Add an edge between u and v. The nodes u and v will be automatically added if they are not already in the graph. Edge attributes can be specified with keywords or by providing a dictionary with key/value pairs. See examples below. Parameters ---------- u,v : nodes Nodes can be, for example, strings or numbers. Nodes must be hashable (and not None) Python objects. key : hashable identifier, optional (default=lowest unused integer) Used to distinguish multiedges between a pair of nodes. 
attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with the edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edges_from : add a collection of edges Notes ----- To replace/update edge data, use the optional key argument to identify a unique edge. Otherwise a new edge will be created. NetworkX algorithms designed for weighted graphs cannot use multigraphs directly because it is not clear how to handle multiedge weights. Convert to Graph using edge attribute 'weight' to enable weighted graph algorithms. Examples -------- The following all add the edge e=(1,2) to graph G: >>> G = nx.MultiDiGraph() >>> e = (1,2) >>> G.add_edge(1, 2) # explicit two-node form >>> G.add_edge(*e) # single edge as tuple of two nodes >>> G.add_edges_from( [(1,2)] ) # add edges from iterable container Associate data to edges using keywords: >>> G.add_edge(1, 2, weight=3) >>> G.add_edge(1, 2, key=0, weight=4) # update data for key=0 >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7) """ # set up attribute dict if attr_dict is None: attr_dict=attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError(\ "The attr_dict argument must be a dictionary.") # add nodes if u not in self.succ: self.succ[u] = {} self.pred[u] = {} self.node[u] = {} if v not in self.succ: self.succ[v] = {} self.pred[v] = {} self.node[v] = {} if v in self.succ[u]: keydict=self.adj[u][v] if key is None: # find a unique integer key # other methods might be better here? 
key=len(keydict) while key in keydict: key+=1 datadict=keydict.get(key,{}) datadict.update(attr_dict) keydict[key]=datadict else: # selfloops work this way without special treatment if key is None: key=0 datadict={} datadict.update(attr_dict) keydict={key:datadict} self.succ[u][v] = keydict self.pred[v][u] = keydict def remove_edge(self, u, v, key=None): """Remove an edge between u and v. Parameters ---------- u,v: nodes Remove an edge between nodes u and v. key : hashable identifier, optional (default=None) Used to distinguish multiple edges between a pair of nodes. If None remove a single (arbitrary) edge between u and v. Raises ------ NetworkXError If there is not an edge between u and v, or if there is no edge with the specified key. See Also -------- remove_edges_from : remove a collection of edges Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2,3]) >>> G.remove_edge(0,1) >>> e = (1,2) >>> G.remove_edge(*e) # unpacks e from an edge tuple For multiple edges >>> G = nx.MultiDiGraph() >>> G.add_edges_from([(1,2),(1,2),(1,2)]) >>> G.remove_edge(1,2) # remove a single (arbitrary) edge For edges with keys >>> G = nx.MultiDiGraph() >>> G.add_edge(1,2,key='first') >>> G.add_edge(1,2,key='second') >>> G.remove_edge(1,2,key='second') """ try: d=self.adj[u][v] except (KeyError): raise NetworkXError( "The edge %s-%s is not in the graph."%(u,v)) # remove the edge with specified data if key is None: d.popitem() else: try: del d[key] except (KeyError): raise NetworkXError( "The edge %s-%s with key %s is not in the graph."%(u,v,key)) if len(d)==0: # remove the key entries if last edge del self.succ[u][v] del self.pred[v][u] def edges_iter(self, nbunch=None, data=False, keys=False): """Return an iterator over the edges. Edges are returned as tuples with optional data and keys in the order (node, neighbor, key, data). Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes.
The container will be iterated through once. data : bool, optional (default=False) If True, return edge attribute dict with each edge. keys : bool, optional (default=False) If True, return edge keys with each edge. Returns ------- edge_iter : iterator An iterator of (u,v), (u,v,d) or (u,v,key,d) tuples of edges. See Also -------- edges : return a list of edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2,3]) >>> [e for e in G.edges_iter()] [(0, 1), (1, 2), (2, 3)] >>> list(G.edges_iter(data=True)) # default data is {} (empty dict) [(0, 1, {}), (1, 2, {}), (2, 3, {})] >>> list(G.edges_iter([0,2])) [(0, 1), (2, 3)] >>> list(G.edges_iter(0)) [(0, 1)] """ if nbunch is None: nodes_nbrs = self.adj.items() else: nodes_nbrs=((n,self.adj[n]) for n in self.nbunch_iter(nbunch)) if data: for n,nbrs in nodes_nbrs: for nbr,keydict in nbrs.items(): for key,data in keydict.items(): if keys: yield (n,nbr,key,data) else: yield (n,nbr,data) else: for n,nbrs in nodes_nbrs: for nbr,keydict in nbrs.items(): for key,data in keydict.items(): if keys: yield (n,nbr,key) else: yield (n,nbr) # alias out_edges to edges out_edges_iter=edges_iter def out_edges(self, nbunch=None, keys=False, data=False): """Return a list of the outgoing edges. Edges are returned as tuples with optional data and keys in the order (node, neighbor, key, data). Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : bool, optional (default=False) If True, return edge attribute dict with each edge. keys : bool, optional (default=False) If True, return edge keys with each edge. Returns ------- out_edges : list A list of (u,v), (u,v,d) or (u,v,key,d) tuples of edges. Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored.
For directed graphs edges() is the same as out_edges(). See Also -------- in_edges: return a list of incoming edges """ return list(self.out_edges_iter(nbunch, keys=keys, data=data)) def in_edges_iter(self, nbunch=None, data=False, keys=False): """Return an iterator over the incoming edges. Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : bool, optional (default=False) If True, return edge attribute dict with each edge. keys : bool, optional (default=False) If True, return edge keys with each edge. Returns ------- in_edge_iter : iterator An iterator of (u,v), (u,v,d) or (u,v,key,d) tuples of edges. See Also -------- edges_iter : return an iterator of edges """ if nbunch is None: nodes_nbrs=self.pred.items() else: nodes_nbrs=((n,self.pred[n]) for n in self.nbunch_iter(nbunch)) if data: for n,nbrs in nodes_nbrs: for nbr,keydict in nbrs.items(): for key,data in keydict.items(): if keys: yield (nbr,n,key,data) else: yield (nbr,n,data) else: for n,nbrs in nodes_nbrs: for nbr,keydict in nbrs.items(): for key,data in keydict.items(): if keys: yield (nbr,n,key) else: yield (nbr,n) def in_edges(self, nbunch=None, keys=False, data=False): """Return a list of the incoming edges. Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : bool, optional (default=False) If True, return edge attribute dict with each edge. keys : bool, optional (default=False) If True, return edge keys with each edge. Returns ------- in_edges : list A list of (u,v), (u,v,d) or (u,v,key,d) tuples of edges. See Also -------- out_edges: return a list of outgoing edges """ return list(self.in_edges_iter(nbunch, keys=keys, data=data)) def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree). The node degree is the number of edges adjacent to the node. 
Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). See Also -------- degree Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs=zip(iter(self.succ.items()),iter(self.pred.items())) else: nodes_nbrs=zip( ((n,self.succ[n]) for n in self.nbunch_iter(nbunch)), ((n,self.pred[n]) for n in self.nbunch_iter(nbunch))) if weight is None: for (n,succ),(n2,pred) in nodes_nbrs: indeg = sum([len(data) for data in pred.values()]) outdeg = sum([len(data) for data in succ.values()]) yield (n, indeg + outdeg) else: # edge weighted graph - degree is sum of nbr edge weights for (n,succ),(n2,pred) in nodes_nbrs: deg = sum([d.get(weight,1) for data in pred.values() for d in data.values()]) deg += sum([d.get(weight,1) for data in succ.values() for d in data.values()]) yield (n, deg) def in_degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, in-degree). The node in-degree is the number of edges pointing in to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, in-degree). 
See Also -------- degree, in_degree, out_degree, out_degree_iter Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.in_degree_iter(0)) # node 0 with degree 0 [(0, 0)] >>> list(G.in_degree_iter([0,1])) [(0, 0), (1, 1)] """ if nbunch is None: nodes_nbrs=self.pred.items() else: nodes_nbrs=((n,self.pred[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: yield (n, sum([len(data) for data in nbrs.values()]) ) else: # edge weighted graph - degree is sum of nbr edge weights for n,pred in nodes_nbrs: deg = sum([d.get(weight,1) for data in pred.values() for d in data.values()]) yield (n, deg) def out_degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, out-degree). The node out-degree is the number of edges pointing out of the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, out-degree). 
See Also -------- degree, in_degree, out_degree, in_degree_iter Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.out_degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.out_degree_iter([0,1])) [(0, 1), (1, 1)] """ if nbunch is None: nodes_nbrs=self.succ.items() else: nodes_nbrs=((n,self.succ[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: yield (n, sum([len(data) for data in nbrs.values()]) ) else: for n,succ in nodes_nbrs: deg = sum([d.get(weight,1) for data in succ.values() for d in data.values()]) yield (n, deg) def is_multigraph(self): """Return True if graph is a multigraph, False otherwise.""" return True def is_directed(self): """Return True if graph is directed, False otherwise.""" return True def to_directed(self): """Return a directed copy of the graph. Returns ------- G : MultiDiGraph A deepcopy of the graph. Notes ----- If edges in both directions (u,v) and (v,u) exist in the graph, attributes for the new undirected edge will be a combination of the attributes of the directed edges. The edge data is updated in the (arbitrary) order that the edges are encountered. For more customized control of the edge attributes use add_edge(). This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar G=DiGraph(D) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1), (1, 0)] If already directed, return a (deep) copy >>> G = nx.MultiDiGraph() >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1)] """ return deepcopy(self) def to_undirected(self, reciprocal=False): """Return an undirected representation of the digraph. 
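The `to_directed` Notes stress the deepcopy-vs-shallow-copy distinction: `deepcopy(self)` duplicates nested attribute containers, while `DiGraph(D)`-style construction shares them. The core difference can be shown with plain dicts and the `copy` module (a sketch, not the graph classes themselves):

```python
from copy import deepcopy

attrs = {'color': ['red']}
shallow = dict(attrs)     # new outer dict, but the inner list is shared
deep = deepcopy(attrs)    # inner list duplicated as well

attrs['color'].append('blue')
print(shallow['color'])   # ['red', 'blue'] -- mutation is visible
print(deep['color'])      # ['red']         -- fully independent
```

This is why mutating an edge attribute list on a shallow-copied graph also changes the original, but a `to_directed()` result is isolated.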
Parameters ---------- reciprocal : bool (optional) If True only keep edges that appear in both directions in the original digraph. Returns ------- G : MultiGraph An undirected graph with the same name and nodes and with edge (u,v,data) if either (u,v,data) or (v,u,data) is in the digraph. If both edges exist in digraph and their edge data is different, only one edge is created with an arbitrary choice of which edge data to use. You must check and correct for this manually if desired. Notes ----- This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar D=DiGraph(G) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. """ H=MultiGraph() H.name=self.name H.add_nodes_from(self) if reciprocal is True: H.add_edges_from( (u,v,key,deepcopy(data)) for u,nbrs in self.adjacency_iter() for v,keydict in nbrs.items() for key,data in keydict.items() if self.has_edge(v,u,key)) else: H.add_edges_from( (u,v,key,deepcopy(data)) for u,nbrs in self.adjacency_iter() for v,keydict in nbrs.items() for key,data in keydict.items()) H.graph=deepcopy(self.graph) H.node=deepcopy(self.node) return H def subgraph(self, nbunch): """Return the subgraph induced on nodes in nbunch. The induced subgraph of the graph contains the nodes in nbunch and the edges between those nodes. Parameters ---------- nbunch : list, iterable A container of nodes which will be iterated through once. Returns ------- G : Graph A subgraph of the graph with the same edge attributes. Notes ----- The graph, edge or node attributes just point to the original graph. So changes to the node or edge structure will not be reflected in the original graph while changes to the attributes will. 
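The `reciprocal=True` branch of `to_undirected` keeps an edge `(u, v, key)` only if `(v, u, key)` is also present. A minimal sketch of that filter over a successor structure (the `has_edge` helper is illustrative):

```python
# a<->b is reciprocal; c->a has no matching a->c and is dropped.
succ = {'a': {'b': {0: {}}}, 'b': {'a': {0: {}}}, 'c': {'a': {0: {}}}}

def has_edge(succ, u, v, key):
    return v in succ.get(u, {}) and key in succ[u][v]

recip = [(u, v, k)
         for u, nbrs in succ.items()
         for v, kd in nbrs.items()
         for k in kd
         if has_edge(succ, v, u, k)]

print(sorted(recip))  # [('a', 'b', 0), ('b', 'a', 0)]
```

Note the key must match too: for multigraphs, reciprocity is checked per parallel edge, not per node pair.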
To create a subgraph with its own copy of the edge/node attributes use: nx.Graph(G.subgraph(nbunch)) If edge attributes are containers, a deep copy can be obtained using: G.subgraph(nbunch).copy() For an inplace reduction of a graph to a subgraph you can remove nodes: G.remove_nodes_from([ n in G if n not in set(nbunch)]) Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> H = G.subgraph([0,1,2]) >>> H.edges() [(0, 1), (1, 2)] """ bunch = self.nbunch_iter(nbunch) # create new graph and copy subgraph into it H = self.__class__() # copy node and attribute dictionaries for n in bunch: H.node[n]=self.node[n] # namespace shortcuts for speed H_succ=H.succ H_pred=H.pred self_succ=self.succ self_pred=self.pred # add nodes for n in H: H_succ[n]={} H_pred[n]={} # add edges for u in H_succ: Hnbrs=H_succ[u] for v,edgedict in self_succ[u].items(): if v in H_succ: # add both representations of edge: u-v and v-u # they share the same edgedict ed=edgedict.copy() Hnbrs[v]=ed H_pred[v][u]=ed H.graph=self.graph return H def reverse(self, copy=True): """Return the reverse of the graph. The reverse is a graph with the same nodes and edges but with the directions of the edges reversed. Parameters ---------- copy : bool optional (default=True) If True, return a new DiGraph holding the reversed edges. If False, reverse the reverse graph is created using the original graph (this changes the original graph). """ if copy: H = self.__class__(name="Reverse of (%s)"%self.name) H.add_nodes_from(self) H.add_edges_from( (v,u,k,deepcopy(d)) for u,v,k,d in self.edges(keys=True, data=True) ) H.graph=deepcopy(self.graph) H.node=deepcopy(self.node) else: self.pred,self.succ=self.succ,self.pred self.adj=self.succ H=self return H networkx-1.11/networkx/classes/tests/test_ordered.py0000644000175000017500000000050612637544450022723 0ustar aricaric00000000000000import networkx as nx class SmokeTestOrdered(object): # Just test instantiation. 
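The `reverse(copy=False)` branch above is notable for its cost: because a digraph stores both `succ` and `pred`, reversing in place is just swapping the two dict references, independent of graph size. A sketch of the idea (data is illustrative):

```python
# One edge a->b, stored in both directions' indexes.
succ = {'a': {'b': {0: {}}}, 'b': {}}
pred = {'a': {}, 'b': {'a': {0: {}}}}

# reverse(copy=False): swap the adjacency dicts in O(1).
succ, pred = pred, succ

edges = [(u, v) for u, nbrs in succ.items() for v in nbrs]
print(edges)  # [('b', 'a')]
```

The `copy=True` branch instead rebuilds a new graph with `(v, u, k, deepcopy(d))` tuples, leaving the original untouched.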
def test_graph(): G = nx.OrderedGraph() def test_digraph(): G = nx.OrderedDiGraph() def test_multigraph(): G = nx.OrderedMultiGraph() def test_multidigraph(): G = nx.OrderedMultiDiGraph() networkx-1.11/networkx/classes/tests/test_graph.py0000644000175000017500000005152212637544500022400 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx class BaseGraphTester(object): """ Tests for data-structure independent graph class features.""" def test_contains(self): G=self.K3 assert(1 in G ) assert(4 not in G ) assert('b' not in G ) assert([] not in G ) # no exception for nonhashable assert({1:1} not in G) # no exception for nonhashable def test_order(self): G=self.K3 assert_equal(len(G),3) assert_equal(G.order(),3) assert_equal(G.number_of_nodes(),3) def test_nodes_iter(self): G=self.K3 assert_equal(sorted(G.nodes_iter()),self.k3nodes) assert_equal(sorted(G.nodes_iter(data=True)),[(0,{}),(1,{}),(2,{})]) def test_nodes(self): G=self.K3 assert_equal(sorted(G.nodes()),self.k3nodes) assert_equal(sorted(G.nodes(data=True)),[(0,{}),(1,{}),(2,{})]) def test_has_node(self): G=self.K3 assert(G.has_node(1)) assert(not G.has_node(4)) assert(not G.has_node([])) # no exception for nonhashable assert(not G.has_node({1:1})) # no exception for nonhashable def test_has_edge(self): G=self.K3 assert_equal(G.has_edge(0,1),True) assert_equal(G.has_edge(0,-1),False) def test_neighbors(self): G=self.K3 assert_equal(sorted(G.neighbors(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.neighbors,-1) def test_neighbors_iter(self): G=self.K3 assert_equal(sorted(G.neighbors_iter(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.neighbors_iter,-1) def test_edges(self): G=self.K3 assert_equal(sorted(G.edges()),[(0,1),(0,2),(1,2)]) assert_equal(sorted(G.edges(0)),[(0,1),(0,2)]) assert_raises((KeyError,networkx.NetworkXError), G.edges,-1) def test_edges_iter(self): G=self.K3 assert_equal(sorted(G.edges_iter()),[(0,1),(0,2),(1,2)]) 
assert_equal(sorted(G.edges_iter(0)),[(0,1),(0,2)]) f=lambda x:list(G.edges_iter(x)) assert_raises((KeyError,networkx.NetworkXError), f, -1) def test_adjacency_list(self): G=self.K3 assert_equal(G.adjacency_list(),[[1,2],[0,2],[0,1]]) def test_degree(self): G=self.K3 assert_equal(list(G.degree().values()),[2,2,2]) assert_equal(G.degree(),{0:2,1:2,2:2}) assert_equal(G.degree(0),2) assert_equal(G.degree([0]),{0:2}) assert_raises((KeyError,networkx.NetworkXError), G.degree,-1) def test_weighted_degree(self): G=self.Graph() G.add_edge(1,2,weight=2) G.add_edge(2,3,weight=3) assert_equal(list(G.degree(weight='weight').values()),[2,5,3]) assert_equal(G.degree(weight='weight'),{1:2,2:5,3:3}) assert_equal(G.degree(1,weight='weight'),2) assert_equal(G.degree([1],weight='weight'),{1:2}) def test_degree_iter(self): G=self.K3 assert_equal(list(G.degree_iter()),[(0,2),(1,2),(2,2)]) assert_equal(dict(G.degree_iter()),{0:2,1:2,2:2}) assert_equal(list(G.degree_iter(0)),[(0,2)]) def test_size(self): G=self.K3 assert_equal(G.size(),3) assert_equal(G.number_of_edges(),3) def test_add_star(self): G=self.K3.copy() nlist=[12,13,14,15] G.add_star(nlist) assert_equal(sorted(G.edges(nlist)),[(12,13),(12,14),(12,15)]) G=self.K3.copy() G.add_star(nlist,weight=2.0) assert_equal(sorted(G.edges(nlist,data=True)),\ [(12,13,{'weight':2.}), (12,14,{'weight':2.}), (12,15,{'weight':2.})]) def test_add_path(self): G=self.K3.copy() nlist=[12,13,14,15] G.add_path(nlist) assert_equal(sorted(G.edges(nlist)),[(12,13),(13,14),(14,15)]) G=self.K3.copy() G.add_path(nlist,weight=2.0) assert_equal(sorted(G.edges(nlist,data=True)),\ [(12,13,{'weight':2.}), (13,14,{'weight':2.}), (14,15,{'weight':2.})]) def test_add_cycle(self): G=self.K3.copy() nlist=[12,13,14,15] oklists=[ [(12,13),(12,15),(13,14),(14,15)], \ [(12,13),(13,14),(14,15),(15,12)] ] G.add_cycle(nlist) assert_true(sorted(G.edges(nlist)) in oklists) G=self.K3.copy() oklists=[ [(12,13,{'weight':1.}),\ (12,15,{'weight':1.}),\ (13,14,{'weight':1.}),\ 
(14,15,{'weight':1.})], \ \ [(12,13,{'weight':1.}),\ (13,14,{'weight':1.}),\ (14,15,{'weight':1.}),\ (15,12,{'weight':1.})] \ ] G.add_cycle(nlist,weight=1.0) assert_true(sorted(G.edges(nlist,data=True)) in oklists) def test_nbunch_iter(self): G=self.K3 assert_equal(list(G.nbunch_iter()),self.k3nodes) # all nodes assert_equal(list(G.nbunch_iter(0)),[0]) # single node assert_equal(list(G.nbunch_iter([0,1])),[0,1]) # sequence # sequence with none in graph assert_equal(list(G.nbunch_iter([-1])),[]) # string sequence with none in graph assert_equal(list(G.nbunch_iter("foo")),[]) # node not in graph doesn't get caught upon creation of iterator bunch=G.nbunch_iter(-1) # but gets caught when iterator used assert_raises(networkx.NetworkXError,list,bunch) # unhashable doesn't get caught upon creation of iterator bunch=G.nbunch_iter([0,1,2,{}]) # but gets caught when iterator hits the unhashable assert_raises(networkx.NetworkXError,list,bunch) def test_selfloop_degree(self): G=self.Graph() G.add_edge(1,1) assert_equal(list(G.degree().values()),[2]) assert_equal(G.degree(),{1:2}) assert_equal(G.degree(1),2) assert_equal(G.degree([1]),{1:2}) assert_equal(G.degree([1],weight='weight'),{1:2}) def test_selfloops(self): G=self.K3.copy() G.add_edge(0,0) assert_equal(G.nodes_with_selfloops(),[0]) assert_equal(G.selfloop_edges(),[(0,0)]) assert_equal(G.number_of_selfloops(),1) G.remove_edge(0,0) G.add_edge(0,0) G.remove_edges_from([(0,0)]) G.add_edge(1,1) G.remove_node(1) G.add_edge(0,0) G.add_edge(1,1) G.remove_nodes_from([0,1]) class BaseAttrGraphTester(BaseGraphTester): """ Tests of graph class attribute features.""" def test_weighted_degree(self): G=self.Graph() G.add_edge(1,2,weight=2,other=3) G.add_edge(2,3,weight=3,other=4) assert_equal(list(G.degree(weight='weight').values()),[2,5,3]) assert_equal(G.degree(weight='weight'),{1:2,2:5,3:3}) assert_equal(G.degree(1,weight='weight'),2) assert_equal(G.degree([1],weight='weight'),{1:2}) 
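`test_selfloop_degree` above pins down a subtlety of the dict-of-dict representation: a self-loop is stored once in `adj[n][n]` but contributes 2 to the degree (and its weight twice to the weighted degree). A sketch of that counting rule, assuming the simple-graph layout where `adj[n][nbr]` is a single attribute dict (the `degree` helper is illustrative):

```python
# Node 1 has a weighted self-loop plus an edge to node 2.
adj = {1: {1: {'weight': 3}, 2: {}}, 2: {1: {}}}

def degree(adj, n, weight=None):
    nbrs = adj[n]
    if weight is None:
        # The self-loop appears once in nbrs but counts twice.
        return len(nbrs) + (n in nbrs)
    total = sum(d.get(weight, 1) for d in nbrs.values())
    # Add the self-loop's weight a second time.
    return total + (nbrs[n].get(weight, 1) if n in nbrs else 0)

print(degree(adj, 1))            # 3  (edge to 2, self-loop counted twice)
print(degree(adj, 1, 'weight'))  # 7  (3 + 1, plus the self-loop's 3 again)
```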
assert_equal(list(G.degree(weight='other').values()),[3,7,4]) assert_equal(G.degree(weight='other'),{1:3,2:7,3:4}) assert_equal(G.degree(1,weight='other'),3) assert_equal(G.degree([1],weight='other'),{1:3}) def add_attributes(self,G): G.graph['foo']=[] G.node[0]['foo']=[] G.remove_edge(1,2) ll=[] G.add_edge(1,2,foo=ll) G.add_edge(2,1,foo=ll) # attr_dict must be dict assert_raises(networkx.NetworkXError,G.add_edge,0,1,attr_dict=[]) def test_name(self): G=self.Graph(name='') assert_equal(G.name,"") G=self.Graph(name='test') assert_equal(G.__str__(),"test") assert_equal(G.name,"test") def test_copy(self): G=self.K3 self.add_attributes(G) H=G.copy() self.is_deepcopy(H,G) H=G.__class__(G) self.is_shallow_copy(H,G) def test_copy_attr(self): G=self.Graph(foo=[]) G.add_node(0,foo=[]) G.add_edge(1,2,foo=[]) H=G.copy() self.is_deepcopy(H,G) H=G.__class__(G) # just copy self.is_shallow_copy(H,G) def is_deepcopy(self,H,G): self.graphs_equal(H,G) self.different_attrdict(H,G) self.deep_copy_attrdict(H,G) def deep_copy_attrdict(self,H,G): self.deepcopy_graph_attr(H,G) self.deepcopy_node_attr(H,G) self.deepcopy_edge_attr(H,G) def deepcopy_graph_attr(self,H,G): assert_equal(G.graph['foo'],H.graph['foo']) G.graph['foo'].append(1) assert_not_equal(G.graph['foo'],H.graph['foo']) def deepcopy_node_attr(self,H,G): assert_equal(G.node[0]['foo'],H.node[0]['foo']) G.node[0]['foo'].append(1) assert_not_equal(G.node[0]['foo'],H.node[0]['foo']) def deepcopy_edge_attr(self,H,G): assert_equal(G[1][2]['foo'],H[1][2]['foo']) G[1][2]['foo'].append(1) assert_not_equal(G[1][2]['foo'],H[1][2]['foo']) def is_shallow_copy(self,H,G): self.graphs_equal(H,G) self.different_attrdict(H,G) self.shallow_copy_attrdict(H,G) def shallow_copy_attrdict(self,H,G): self.shallow_copy_graph_attr(H,G) self.shallow_copy_node_attr(H,G) self.shallow_copy_edge_attr(H,G) def shallow_copy_graph_attr(self,H,G): assert_equal(G.graph['foo'],H.graph['foo']) G.graph['foo'].append(1) assert_equal(G.graph['foo'],H.graph['foo']) def 
shallow_copy_node_attr(self,H,G): assert_equal(G.node[0]['foo'],H.node[0]['foo']) G.node[0]['foo'].append(1) assert_equal(G.node[0]['foo'],H.node[0]['foo']) def shallow_copy_edge_attr(self,H,G): assert_equal(G[1][2]['foo'],H[1][2]['foo']) G[1][2]['foo'].append(1) assert_equal(G[1][2]['foo'],H[1][2]['foo']) def same_attrdict(self, H, G): old_foo=H[1][2]['foo'] H.add_edge(1,2,foo='baz') assert_equal(G.edge,H.edge) H.add_edge(1,2,foo=old_foo) assert_equal(G.edge,H.edge) old_foo=H.node[0]['foo'] H.node[0]['foo']='baz' assert_equal(G.node,H.node) H.node[0]['foo']=old_foo assert_equal(G.node,H.node) def different_attrdict(self, H, G): old_foo=H[1][2]['foo'] H.add_edge(1,2,foo='baz') assert_not_equal(G.edge,H.edge) H.add_edge(1,2,foo=old_foo) assert_equal(G.edge,H.edge) old_foo=H.node[0]['foo'] H.node[0]['foo']='baz' assert_not_equal(G.node,H.node) H.node[0]['foo']=old_foo assert_equal(G.node,H.node) def graphs_equal(self,H,G): assert_equal(G.adj,H.adj) assert_equal(G.edge,H.edge) assert_equal(G.node,H.node) assert_equal(G.graph,H.graph) assert_equal(G.name,H.name) if not G.is_directed() and not H.is_directed(): assert_true(H.adj[1][2] is H.adj[2][1]) assert_true(G.adj[1][2] is G.adj[2][1]) else: # at least one is directed if not G.is_directed(): G.pred=G.adj G.succ=G.adj if not H.is_directed(): H.pred=H.adj H.succ=H.adj assert_equal(G.pred,H.pred) assert_equal(G.succ,H.succ) assert_true(H.succ[1][2] is H.pred[2][1]) assert_true(G.succ[1][2] is G.pred[2][1]) def test_graph_attr(self): G=self.K3 G.graph['foo']='bar' assert_equal(G.graph['foo'], 'bar') del G.graph['foo'] assert_equal(G.graph, {}) H=self.Graph(foo='bar') assert_equal(H.graph['foo'], 'bar') def test_node_attr(self): G=self.K3 G.add_node(1,foo='bar') assert_equal(G.nodes(), [0,1,2]) assert_equal(G.nodes(data=True), [(0,{}),(1,{'foo':'bar'}),(2,{})]) G.node[1]['foo']='baz' assert_equal(G.nodes(data=True), [(0,{}),(1,{'foo':'baz'}),(2,{})]) def test_node_attr2(self): G=self.K3 a={'foo':'bar'} 
G.add_node(3,attr_dict=a) assert_equal(G.nodes(), [0,1,2,3]) assert_equal(G.nodes(data=True), [(0,{}),(1,{}),(2,{}),(3,{'foo':'bar'})]) def test_edge_attr(self): G=self.Graph() G.add_edge(1,2,foo='bar') assert_equal(G.edges(data=True), [(1,2,{'foo':'bar'})]) assert_equal(G.edges(data='foo'), [(1,2,'bar')]) def test_edge_attr2(self): G=self.Graph() G.add_edges_from([(1,2),(3,4)],foo='foo') assert_equal(sorted(G.edges(data=True)), [(1,2,{'foo':'foo'}),(3,4,{'foo':'foo'})]) assert_equal(sorted(G.edges(data='foo')), [(1,2,'foo'),(3,4,'foo')]) def test_edge_attr3(self): G=self.Graph() G.add_edges_from([(1,2,{'weight':32}),(3,4,{'weight':64})],foo='foo') assert_equal(G.edges(data=True), [(1,2,{'foo':'foo','weight':32}),\ (3,4,{'foo':'foo','weight':64})]) G.remove_edges_from([(1,2),(3,4)]) G.add_edge(1,2,data=7,spam='bar',bar='foo') assert_equal(G.edges(data=True), [(1,2,{'data':7,'spam':'bar','bar':'foo'})]) def test_edge_attr4(self): G=self.Graph() G.add_edge(1,2,data=7,spam='bar',bar='foo') assert_equal(G.edges(data=True), [(1,2,{'data':7,'spam':'bar','bar':'foo'})]) G[1][2]['data']=10 # OK to set data like this assert_equal(G.edges(data=True), [(1,2,{'data':10,'spam':'bar','bar':'foo'})]) G.edge[1][2]['data']=20 # another spelling, "edge" assert_equal(G.edges(data=True), [(1,2,{'data':20,'spam':'bar','bar':'foo'})]) G.edge[1][2]['listdata']=[20,200] G.edge[1][2]['weight']=20 assert_equal(G.edges(data=True), [(1,2,{'data':20,'spam':'bar', 'bar':'foo','listdata':[20,200],'weight':20})]) def test_attr_dict_not_dict(self): # attr_dict must be dict G=self.Graph() edges=[(1,2)] assert_raises(networkx.NetworkXError,G.add_edges_from,edges, attr_dict=[]) def test_to_undirected(self): G=self.K3 self.add_attributes(G) H=networkx.Graph(G) self.is_shallow_copy(H,G) H=G.to_undirected() self.is_deepcopy(H,G) def test_to_directed(self): G=self.K3 self.add_attributes(G) H=networkx.DiGraph(G) self.is_shallow_copy(H,G) H=G.to_directed() self.is_deepcopy(H,G) def test_subgraph(self): 
G=self.K3 self.add_attributes(G) H=G.subgraph([0,1,2,5]) # assert_equal(H.name, 'Subgraph of ('+G.name+')') H.name=G.name self.graphs_equal(H,G) self.same_attrdict(H,G) self.shallow_copy_attrdict(H,G) H=G.subgraph(0) assert_equal(H.adj,{0:{}}) H=G.subgraph([]) assert_equal(H.adj,{}) assert_not_equal(G.adj,{}) def test_selfloops_attr(self): G=self.K3.copy() G.add_edge(0,0) G.add_edge(1,1,weight=2) assert_equal(G.selfloop_edges(data=True), [(0,0,{}),(1,1,{'weight':2})]) assert_equal(G.selfloop_edges(data='weight'), [(0,0,None),(1,1,2)]) class TestGraph(BaseAttrGraphTester): """Tests specific to dict-of-dict-of-dict graph data structure""" def setUp(self): self.Graph=networkx.Graph # build dict-of-dict-of-dict K3 ed1,ed2,ed3 = ({},{},{}) self.k3adj={0: {1: ed1, 2: ed2}, 1: {0: ed1, 2: ed3}, 2: {0: ed2, 1: ed3}} self.k3edges=[(0, 1), (0, 2), (1, 2)] self.k3nodes=[0, 1, 2] self.K3=self.Graph() self.K3.adj=self.K3.edge=self.k3adj self.K3.node={} self.K3.node[0]={} self.K3.node[1]={} self.K3.node[2]={} def test_data_input(self): G=self.Graph(data={1:[2],2:[1]}, name="test") assert_equal(G.name,"test") assert_equal(sorted(G.adj.items()),[(1, {2: {}}), (2, {1: {}})]) G=self.Graph({1:[2],2:[1]}, name="test") assert_equal(G.name,"test") assert_equal(sorted(G.adj.items()),[(1, {2: {}}), (2, {1: {}})]) def test_adjacency_iter(self): G=self.K3 assert_equal(dict(G.adjacency_iter()), {0: {1: {}, 2: {}}, 1: {0: {}, 2: {}}, 2: {0: {}, 1: {}}}) def test_getitem(self): G=self.K3 assert_equal(G[0],{1: {}, 2: {}}) assert_raises(KeyError, G.__getitem__, 'j') assert_raises((TypeError,networkx.NetworkXError), G.__getitem__, ['A']) def test_add_node(self): G=self.Graph() G.add_node(0) assert_equal(G.adj,{0:{}}) # test add attributes G.add_node(1,c='red') G.add_node(2,{'c':'blue'}) G.add_node(3,{'c':'blue'},c='red') assert_raises(networkx.NetworkXError, G.add_node, 4, []) assert_raises(networkx.NetworkXError, G.add_node, 4, 4) assert_equal(G.node[1]['c'],'red') 
assert_equal(G.node[2]['c'],'blue') assert_equal(G.node[3]['c'],'red') # test updating attributes G.add_node(1,c='blue') G.add_node(2,{'c':'red'}) G.add_node(3,{'c':'red'},c='blue') assert_equal(G.node[1]['c'],'blue') assert_equal(G.node[2]['c'],'red') assert_equal(G.node[3]['c'],'blue') def test_add_nodes_from(self): G=self.Graph() G.add_nodes_from([0,1,2]) assert_equal(G.adj,{0:{},1:{},2:{}}) # test add attributes G.add_nodes_from([0,1,2],c='red') assert_equal(G.node[0]['c'],'red') assert_equal(G.node[2]['c'],'red') # test that attribute dicts are not the same assert(G.node[0] is not G.node[1]) # test updating attributes G.add_nodes_from([0,1,2],c='blue') assert_equal(G.node[0]['c'],'blue') assert_equal(G.node[2]['c'],'blue') assert(G.node[0] is not G.node[1]) # test tuple input H=self.Graph() H.add_nodes_from(G.nodes(data=True)) assert_equal(H.node[0]['c'],'blue') assert_equal(H.node[2]['c'],'blue') assert(H.node[0] is not H.node[1]) # specific overrides general H.add_nodes_from([0,(1,{'c':'green'}),(3,{'c':'cyan'})],c='red') assert_equal(H.node[0]['c'],'red') assert_equal(H.node[1]['c'],'green') assert_equal(H.node[2]['c'],'blue') assert_equal(H.node[3]['c'],'cyan') def test_remove_node(self): G=self.K3 G.remove_node(0) assert_equal(G.adj,{1:{2:{}},2:{1:{}}}) assert_raises((KeyError,networkx.NetworkXError), G.remove_node,-1) # generator here to implement list,set,string... 
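The "specific overrides general" check in `test_add_nodes_from` documents the attribute-merge order: keyword arguments supply defaults, and a per-node `(node, attr_dict)` tuple wins on conflicts. The merge reduces to a dict update (a sketch; `merge` is an illustrative helper, not a networkx function):

```python
def merge(general, specific):
    d = dict(general)
    d.update(specific)   # per-node dict overrides keyword defaults
    return d

print(merge({'c': 'red'}, {'c': 'green'}))  # {'c': 'green'}
print(merge({'c': 'red'}, {}))              # {'c': 'red'}
```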
def test_remove_nodes_from(self): G=self.K3 G.remove_nodes_from([0,1]) assert_equal(G.adj,{2:{}}) G.remove_nodes_from([-1]) # silent fail def test_add_edge(self): G=self.Graph() G.add_edge(0,1) assert_equal(G.adj,{0: {1: {}}, 1: {0: {}}}) G=self.Graph() G.add_edge(*(0,1)) assert_equal(G.adj,{0: {1: {}}, 1: {0: {}}}) def test_add_edges_from(self): G=self.Graph() G.add_edges_from([(0,1),(0,2,{'weight':3})]) assert_equal(G.adj,{0: {1:{}, 2:{'weight':3}}, 1: {0:{}}, \ 2:{0:{'weight':3}}}) G=self.Graph() G.add_edges_from([(0,1),(0,2,{'weight':3}),(1,2,{'data':4})],data=2) assert_equal(G.adj,{\ 0: {1:{'data':2}, 2:{'weight':3,'data':2}}, \ 1: {0:{'data':2}, 2:{'data':4}}, \ 2: {0:{'weight':3,'data':2}, 1:{'data':4}} \ }) assert_raises(networkx.NetworkXError, G.add_edges_from,[(0,)]) # too few in tuple assert_raises(networkx.NetworkXError, G.add_edges_from,[(0,1,2,3)]) # too many in tuple assert_raises(TypeError, G.add_edges_from,[0]) # not a tuple def test_remove_edge(self): G=self.K3 G.remove_edge(0,1) assert_equal(G.adj,{0:{2:{}},1:{2:{}},2:{0:{},1:{}}}) assert_raises((KeyError,networkx.NetworkXError), G.remove_edge,-1,0) def test_remove_edges_from(self): G=self.K3 G.remove_edges_from([(0,1)]) assert_equal(G.adj,{0:{2:{}},1:{2:{}},2:{0:{},1:{}}}) G.remove_edges_from([(0,0)]) # silent fail def test_clear(self): G=self.K3 G.clear() assert_equal(G.adj,{}) def test_edges_data(self): G=self.K3 assert_equal(sorted(G.edges(data=True)),[(0,1,{}),(0,2,{}),(1,2,{})]) assert_equal(sorted(G.edges(0,data=True)),[(0,1,{}),(0,2,{})]) assert_raises((KeyError,networkx.NetworkXError), G.edges,-1) def test_get_edge_data(self): G=self.K3 assert_equal(G.get_edge_data(0,1),{}) assert_equal(G[0][1],{}) assert_equal(G.get_edge_data(10,20),None) assert_equal(G.get_edge_data(-1,0),None) assert_equal(G.get_edge_data(-1,0,default=1),1) networkx-1.11/networkx/classes/tests/test_special.py0000644000175000017500000000722012637544450022717 0ustar aricaric00000000000000#!/usr/bin/env python from 
nose.tools import * import networkx as nx from test_graph import TestGraph from test_digraph import TestDiGraph from test_multigraph import TestMultiGraph from test_multidigraph import TestMultiDiGraph try: # python 2.7+ from collections import OrderedDict except ImportError: # python 2.6 try: from ordereddict import OrderedDict except ImportError: from nose import SkipTest raise SkipTest('ordereddict not available') class SpecialGraphTester(TestGraph): def setUp(self): TestGraph.setUp(self) self.Graph=nx.Graph class OrderedGraphTester(TestGraph): def setUp(self): TestGraph.setUp(self) class MyGraph(nx.Graph): node_dict_factory = OrderedDict adjlist_dict_factory = OrderedDict edge_attr_dict_factory = OrderedDict self.Graph=MyGraph class ThinGraphTester(TestGraph): def setUp(self): all_edge_dict = {'weight' : 1} class MyGraph(nx.Graph): edge_attr_dict_factory = lambda : all_edge_dict self.Graph=MyGraph # build dict-of-dict-of-dict K3 ed1,ed2,ed3 = (all_edge_dict,all_edge_dict,all_edge_dict) self.k3adj={0: {1: ed1, 2: ed2}, 1: {0: ed1, 2: ed3}, 2: {0: ed2, 1: ed3}} self.k3edges=[(0, 1), (0, 2), (1, 2)] self.k3nodes=[0, 1, 2] self.K3=self.Graph() self.K3.adj=self.K3.edge=self.k3adj self.K3.node={} self.K3.node[0]={} self.K3.node[1]={} self.K3.node[2]={} class SpecialDiGraphTester(TestDiGraph): def setUp(self): TestDiGraph.setUp(self) self.Graph=nx.DiGraph class OrderedDiGraphTester(TestDiGraph): def setUp(self): TestGraph.setUp(self) class MyGraph(nx.DiGraph): node_dict_factory = OrderedDict adjlist_dict_factory = OrderedDict edge_attr_dict_factory = OrderedDict self.Graph=MyGraph class ThinDiGraphTester(TestDiGraph): def setUp(self): all_edge_dict = {'weight' : 1} class MyGraph(nx.DiGraph): edge_attr_dict_factory = lambda : all_edge_dict self.Graph=MyGraph # build dict-of-dict-of-dict K3 ed1,ed2,ed3 = (all_edge_dict,all_edge_dict,all_edge_dict) self.k3adj={0: {1: ed1, 2: ed2}, 1: {0: ed1, 2: ed3}, 2: {0: ed2, 1: ed3}} self.k3edges=[(0, 1), (0, 2), (1, 2)] 
self.k3nodes=[0, 1, 2] self.K3=self.Graph() self.K3.adj=self.K3.edge=self.k3adj self.K3.node={} self.K3.node[0]={} self.K3.node[1]={} self.K3.node[2]={} class SpecialMultiGraphTester(TestMultiGraph): def setUp(self): TestMultiGraph.setUp(self) self.Graph=nx.MultiGraph class OrderedMultiGraphTester(TestMultiGraph): def setUp(self): TestMultiGraph.setUp(self) class MyGraph(nx.MultiGraph): node_dict_factory = OrderedDict adjlist_dict_factory = OrderedDict edge_key_dict_factory = OrderedDict edge_attr_dict_factory = OrderedDict self.Graph=MyGraph class SpecialMultiDiGraphTester(TestMultiDiGraph): def setUp(self): TestMultiDiGraph.setUp(self) self.Graph=nx.MultiDiGraph class OrderedMultiDiGraphTester(TestMultiDiGraph): def setUp(self): TestMultiDiGraph.setUp(self) class MyGraph(nx.MultiDiGraph): node_dict_factory = OrderedDict adjlist_dict_factory = OrderedDict edge_key_dict_factory = OrderedDict edge_attr_dict_factory = OrderedDict self.Graph=MyGraph networkx-1.11/networkx/classes/tests/test_digraph.py0000644000175000017500000002500212637544500022707 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx from test_graph import BaseGraphTester, BaseAttrGraphTester, TestGraph class BaseDiGraphTester(BaseGraphTester): def test_has_successor(self): G=self.K3 assert_equal(G.has_successor(0,1),True) assert_equal(G.has_successor(0,-1),False) def test_successors(self): G=self.K3 assert_equal(sorted(G.successors(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.successors,-1) def test_successors_iter(self): G=self.K3 assert_equal(sorted(G.successors_iter(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.successors_iter,-1) def test_has_predecessor(self): G=self.K3 assert_equal(G.has_predecessor(0,1),True) assert_equal(G.has_predecessor(0,-1),False) def test_predecessors(self): G=self.K3 assert_equal(sorted(G.predecessors(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.predecessors,-1) def 
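The `ThinGraphTester` classes above exploit the factory hooks in an unusual way: `edge_attr_dict_factory` is replaced with a lambda that always returns the *same* dict, so every edge shares one attribute dict and per-edge storage shrinks to a reference. The mechanism in isolation (a sketch of the pattern, outside any graph class):

```python
# Every "edge" gets the same shared attribute dict.
all_edge_dict = {'weight': 1}
edge_attr_dict_factory = lambda: all_edge_dict

e1 = edge_attr_dict_factory()
e2 = edge_attr_dict_factory()
print(e1 is e2)  # True -- both edges alias one dict, saving memory
```

The trade-off is that writing an attribute on one edge silently changes all of them, which is why the thin-graph tests only exercise read paths. The `Ordered*` testers use the same hooks more conventionally, swapping in `OrderedDict` to fix iteration order.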
test_predecessors_iter(self): G=self.K3 assert_equal(sorted(G.predecessors_iter(0)),[1,2]) assert_raises((KeyError,networkx.NetworkXError), G.predecessors_iter,-1) def test_edges(self): G=self.K3 assert_equal(sorted(G.edges()),[(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.edges(0)),[(0,1),(0,2)]) assert_raises((KeyError,networkx.NetworkXError), G.edges,-1) def test_edges_iter(self): G=self.K3 assert_equal(sorted(G.edges_iter()), [(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.edges_iter(0)),[(0,1),(0,2)]) def test_edges_data(self): G=self.K3 assert_equal(sorted(G.edges(data=True)), [(0,1,{}),(0,2,{}),(1,0,{}),(1,2,{}),(2,0,{}),(2,1,{})]) assert_equal(sorted(G.edges(0,data=True)),[(0,1,{}),(0,2,{})]) assert_raises((KeyError,networkx.NetworkXError), G.edges,-1) def test_out_edges(self): G=self.K3 assert_equal(sorted(G.out_edges()), [(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.out_edges(0)),[(0,1),(0,2)]) assert_raises((KeyError,networkx.NetworkXError), G.out_edges,-1) def test_out_edges_iter(self): G=self.K3 assert_equal(sorted(G.out_edges_iter()), [(0,1),(0,2),(1,0),(1,2),(2,0),(2,1)]) assert_equal(sorted(G.edges_iter(0)),[(0,1),(0,2)]) def test_out_edges_dir(self): G=self.P3 assert_equal(sorted(G.out_edges()),[(0, 1), (1, 2)]) assert_equal(sorted(G.out_edges(0)),[(0, 1)]) assert_equal(sorted(G.out_edges(2)),[]) def test_out_edges_iter_dir(self): G=self.P3 assert_equal(sorted(G.out_edges_iter()),[(0, 1), (1, 2)]) assert_equal(sorted(G.out_edges_iter(0)),[(0, 1)]) assert_equal(sorted(G.out_edges_iter(2)),[]) def test_in_edges_dir(self): G=self.P3 assert_equal(sorted(G.in_edges()),[(0, 1), (1, 2)]) assert_equal(sorted(G.in_edges(0)),[]) assert_equal(sorted(G.in_edges(2)),[(1,2)]) def test_in_edges_iter_dir(self): G=self.P3 assert_equal(sorted(G.in_edges_iter()),[(0, 1), (1, 2)]) assert_equal(sorted(G.in_edges_iter(0)),[]) assert_equal(sorted(G.in_edges_iter(2)),[(1,2)]) def test_degree(self): G=self.K3 
assert_equal(list(G.degree().values()),[4,4,4]) assert_equal(G.degree(),{0:4,1:4,2:4}) assert_equal(G.degree(0),4) assert_equal(G.degree([0]),{0:4}) assert_raises((KeyError,networkx.NetworkXError), G.degree,-1) def test_degree_iter(self): G=self.K3 assert_equal(list(G.degree_iter()),[(0,4),(1,4),(2,4)]) assert_equal(dict(G.degree_iter()),{0:4,1:4,2:4}) assert_equal(list(G.degree_iter(0)),[(0,4)]) assert_equal(list(G.degree_iter(iter([0]))),[(0,4)]) #run through iterator def test_in_degree(self): G=self.K3 assert_equal(list(G.in_degree().values()),[2,2,2]) assert_equal(G.in_degree(),{0:2,1:2,2:2}) assert_equal(G.in_degree(0),2) assert_equal(G.in_degree([0]),{0:2}) assert_equal(G.in_degree(iter([0])),{0:2}) assert_raises((KeyError,networkx.NetworkXError), G.in_degree,-1) def test_in_degree_iter(self): G=self.K3 assert_equal(list(G.in_degree_iter()),[(0,2),(1,2),(2,2)]) assert_equal(dict(G.in_degree_iter()),{0:2,1:2,2:2}) assert_equal(list(G.in_degree_iter(0)),[(0,2)]) assert_equal(list(G.in_degree_iter(iter([0]))),[(0,2)]) #run through iterator def test_in_degree_iter_weighted(self): G=self.K3 G.add_edge(0,1,weight=0.3,other=1.2) assert_equal(list(G.in_degree_iter(weight='weight')),[(0,2),(1,1.3),(2,2)]) assert_equal(dict(G.in_degree_iter(weight='weight')),{0:2,1:1.3,2:2}) assert_equal(list(G.in_degree_iter(1,weight='weight')),[(1,1.3)]) assert_equal(list(G.in_degree_iter(weight='other')),[(0,2),(1,2.2),(2,2)]) assert_equal(dict(G.in_degree_iter(weight='other')),{0:2,1:2.2,2:2}) assert_equal(list(G.in_degree_iter(1,weight='other')),[(1,2.2)]) assert_equal(list(G.in_degree_iter(iter([1]),weight='other')),[(1,2.2)]) def test_out_degree(self): G=self.K3 assert_equal(list(G.out_degree().values()),[2,2,2]) assert_equal(G.out_degree(),{0:2,1:2,2:2}) assert_equal(G.out_degree(0),2) assert_equal(G.out_degree([0]),{0:2}) assert_equal(G.out_degree(iter([0])),{0:2}) assert_raises((KeyError,networkx.NetworkXError), G.out_degree,-1) def test_out_degree_iter_weighted(self): 
        G=self.K3
        G.add_edge(0,1,weight=0.3,other=1.2)
        assert_equal(list(G.out_degree_iter(weight='weight')),[(0,1.3),(1,2),(2,2)])
        assert_equal(dict(G.out_degree_iter(weight='weight')),{0:1.3,1:2,2:2})
        assert_equal(list(G.out_degree_iter(0,weight='weight')),[(0,1.3)])
        assert_equal(list(G.out_degree_iter(weight='other')),[(0,2.2),(1,2),(2,2)])
        assert_equal(dict(G.out_degree_iter(weight='other')),{0:2.2,1:2,2:2})
        assert_equal(list(G.out_degree_iter(0,weight='other')),[(0,2.2)])
        assert_equal(list(G.out_degree_iter(iter([0]),weight='other')),[(0,2.2)])

    def test_out_degree_iter(self):
        G=self.K3
        assert_equal(list(G.out_degree_iter()),[(0,2),(1,2),(2,2)])
        assert_equal(dict(G.out_degree_iter()),{0:2,1:2,2:2})
        assert_equal(list(G.out_degree_iter(0)),[(0,2)])
        assert_equal(list(G.out_degree_iter(iter([0]))),[(0,2)])

    def test_size(self):
        G=self.K3
        assert_equal(G.size(),6)
        assert_equal(G.number_of_edges(),6)

    def test_to_undirected_reciprocal(self):
        G=self.Graph()
        G.add_edge(1,2)
        assert_true(G.to_undirected().has_edge(1,2))
        assert_false(G.to_undirected(reciprocal=True).has_edge(1,2))
        G.add_edge(2,1)
        assert_true(G.to_undirected(reciprocal=True).has_edge(1,2))

    def test_reverse_copy(self):
        G=networkx.DiGraph([(0,1),(1,2)])
        R=G.reverse()
        assert_equal(sorted(R.edges()),[(1,0),(2,1)])
        R.remove_edge(1,0)
        assert_equal(sorted(R.edges()),[(2,1)])
        assert_equal(sorted(G.edges()),[(0,1),(1,2)])

    def test_reverse_nocopy(self):
        G=networkx.DiGraph([(0,1),(1,2)])
        R=G.reverse(copy=False)
        assert_equal(sorted(R.edges()),[(1,0),(2,1)])
        R.remove_edge(1,0)
        assert_equal(sorted(R.edges()),[(2,1)])
        assert_equal(sorted(G.edges()),[(2,1)])


class BaseAttrDiGraphTester(BaseDiGraphTester,BaseAttrGraphTester):
    pass


class TestDiGraph(BaseAttrDiGraphTester,TestGraph):
    """Tests specific to dict-of-dict-of-dict digraph data structure"""
    def setUp(self):
        self.Graph=networkx.DiGraph
        # build dict-of-dict-of-dict K3
        ed1,ed2,ed3,ed4,ed5,ed6 = ({},{},{},{},{},{})
        self.k3adj={0: {1: ed1, 2: ed2},
                    1: {0: ed3, 2: ed4},
                    2: {0: ed5, 1: ed6}}
        self.k3edges=[(0, 1), (0, 2), (1, 2)]
        self.k3nodes=[0, 1, 2]
        self.K3=self.Graph()
        self.K3.adj = self.K3.succ = self.K3.edge = self.k3adj
        self.K3.pred={0: {1: ed3, 2: ed5},
                      1: {0: ed1, 2: ed6},
                      2: {0: ed2, 1: ed4}}
        ed1,ed2 = ({},{})
        self.P3=self.Graph()
        self.P3.adj={0: {1: ed1}, 1: {2: ed2}, 2: {}}
        self.P3.succ=self.P3.adj
        self.P3.pred={0: {}, 1: {0: ed1}, 2: {1: ed2}}
        self.K3.node={}
        self.K3.node[0]={}
        self.K3.node[1]={}
        self.K3.node[2]={}
        self.P3.node={}
        self.P3.node[0]={}
        self.P3.node[1]={}
        self.P3.node[2]={}

    def test_data_input(self):
        G=self.Graph(data={1:[2],2:[1]}, name="test")
        assert_equal(G.name,"test")
        assert_equal(sorted(G.adj.items()),[(1, {2: {}}), (2, {1: {}})])
        assert_equal(sorted(G.succ.items()),[(1, {2: {}}), (2, {1: {}})])
        assert_equal(sorted(G.pred.items()),[(1, {2: {}}), (2, {1: {}})])

    def test_add_edge(self):
        G=self.Graph()
        G.add_edge(0,1)
        assert_equal(G.adj,{0: {1: {}}, 1: {}})
        assert_equal(G.succ,{0: {1: {}}, 1: {}})
        assert_equal(G.pred,{0: {}, 1: {0:{}}})
        G=self.Graph()
        G.add_edge(*(0,1))
        assert_equal(G.adj,{0: {1: {}}, 1: {}})
        assert_equal(G.succ,{0: {1: {}}, 1: {}})
        assert_equal(G.pred,{0: {}, 1: {0:{}}})

    def test_add_edges_from(self):
        G=self.Graph()
        G.add_edges_from([(0,1),(0,2,{'data':3})],data=2)
        assert_equal(G.adj,{0: {1: {'data':2}, 2: {'data':3}}, 1: {}, 2: {}})
        assert_equal(G.succ,{0: {1: {'data':2}, 2: {'data':3}}, 1: {}, 2: {}})
        assert_equal(G.pred,{0: {}, 1: {0: {'data':2}}, 2: {0: {'data':3}}})
        assert_raises(networkx.NetworkXError, G.add_edges_from,[(0,)])      # too few in tuple
        assert_raises(networkx.NetworkXError, G.add_edges_from,[(0,1,2,3)]) # too many in tuple
        assert_raises(TypeError, G.add_edges_from,[0])  # not a tuple

    def test_remove_edge(self):
        G=self.K3
        G.remove_edge(0,1)
        assert_equal(G.succ,{0:{2:{}},1:{0:{},2:{}},2:{0:{},1:{}}})
        assert_equal(G.pred,{0:{1:{}, 2:{}}, 1:{2:{}}, 2:{0:{},1:{}}})
        assert_raises((KeyError,networkx.NetworkXError), G.remove_edge,-1,0)

    def test_remove_edges_from(self):
        G=self.K3
        G.remove_edges_from([(0,1)])
        assert_equal(G.succ,{0:{2:{}},1:{0:{},2:{}},2:{0:{},1:{}}})
        assert_equal(G.pred,{0:{1:{}, 2:{}}, 1:{2:{}}, 2:{0:{},1:{}}})
        G.remove_edges_from([(0,0)]) # silent fail


# ---- networkx-1.11/networkx/classes/tests/test_multigraph.py ----
#!/usr/bin/env python
from nose.tools import *
import networkx
from test_graph import BaseAttrGraphTester, TestGraph


class BaseMultiGraphTester(BaseAttrGraphTester):
    def test_has_edge(self):
        G=self.K3
        assert_equal(G.has_edge(0,1),True)
        assert_equal(G.has_edge(0,-1),False)
        assert_equal(G.has_edge(0,1,0),True)
        assert_equal(G.has_edge(0,1,1),False)

    def test_get_edge_data(self):
        G=self.K3
        assert_equal(G.get_edge_data(0,1),{0:{}})
        assert_equal(G[0][1],{0:{}})
        assert_equal(G[0][1][0],{})
        assert_equal(G.get_edge_data(10,20),None)
        assert_equal(G.get_edge_data(0,1,0),{})

    def test_adjacency_iter(self):
        G=self.K3
        assert_equal(dict(G.adjacency_iter()),
                     {0: {1: {0:{}}, 2: {0:{}}},
                      1: {0: {0:{}}, 2: {0:{}}},
                      2: {0: {0:{}}, 1: {0:{}}}})

    def deepcopy_edge_attr(self,H,G):
        assert_equal(G[1][2][0]['foo'],H[1][2][0]['foo'])
        G[1][2][0]['foo'].append(1)
        assert_not_equal(G[1][2][0]['foo'],H[1][2][0]['foo'])

    def shallow_copy_edge_attr(self,H,G):
        assert_equal(G[1][2][0]['foo'],H[1][2][0]['foo'])
        G[1][2][0]['foo'].append(1)
        assert_equal(G[1][2][0]['foo'],H[1][2][0]['foo'])

    def same_attrdict(self, H, G):
        # same attrdict in the edgedata
        old_foo=H[1][2][0]['foo']
        H.add_edge(1,2,0,foo='baz')
        assert_equal(G.edge,H.edge)
        H.add_edge(1,2,0,foo=old_foo)
        assert_equal(G.edge,H.edge)
        # but not same edgedata dict
        H.add_edge(1,2,foo='baz')
        assert_not_equal(G.edge,H.edge)
        old_foo=H.node[0]['foo']
        H.node[0]['foo']='baz'
        assert_equal(G.node,H.node)
        H.node[0]['foo']=old_foo
        assert_equal(G.node,H.node)

    def different_attrdict(self, H, G):
        # used by graph_equal_but_different
        old_foo=H[1][2][0]['foo']
        H.add_edge(1,2,0,foo='baz')
        assert_not_equal(G.edge,H.edge)
        H.add_edge(1,2,0,foo=old_foo)
        assert_equal(G.edge,H.edge)
        HH=H.copy()
        H.add_edge(1,2,foo='baz')
        assert_not_equal(G.edge,H.edge)
        H=HH
        old_foo=H.node[0]['foo']
        H.node[0]['foo']='baz'
        assert_not_equal(G.node,H.node)
        H.node[0]['foo']=old_foo
        assert_equal(G.node,H.node)

    def test_to_undirected(self):
        G=self.K3
        self.add_attributes(G)
        H=networkx.MultiGraph(G)
        self.is_shallow_copy(H,G)
        H=G.to_undirected()
        self.is_deepcopy(H,G)

    def test_to_directed(self):
        G=self.K3
        self.add_attributes(G)
        H=networkx.MultiDiGraph(G)
        self.is_shallow_copy(H,G)
        H=G.to_directed()
        self.is_deepcopy(H,G)

    def test_selfloops(self):
        G=self.K3
        G.add_edge(0,0)
        assert_equal(G.nodes_with_selfloops(),[0])
        assert_equal(G.selfloop_edges(),[(0,0)])
        assert_equal(G.selfloop_edges(data=True),[(0,0,{})])
        assert_equal(G.number_of_selfloops(),1)

    def test_selfloops2(self):
        G=self.K3
        G.add_edge(0,0)
        G.add_edge(0,0)
        G.add_edge(0,0,key='parallel edge')
        G.remove_edge(0,0,key='parallel edge')
        assert_equal(G.number_of_edges(0,0),2)
        G.remove_edge(0,0)
        assert_equal(G.number_of_edges(0,0),1)

    def test_edge_attr4(self):
        G=self.Graph()
        G.add_edge(1,2,key=0,data=7,spam='bar',bar='foo')
        assert_equal(G.edges(data=True),
                     [(1,2,{'data':7,'spam':'bar','bar':'foo'})])
        G[1][2][0]['data']=10 # OK to set data like this
        assert_equal(G.edges(data=True),
                     [(1,2,{'data':10,'spam':'bar','bar':'foo'})])
        G.edge[1][2][0]['data']=20 # another spelling, "edge"
        assert_equal(G.edges(data=True),
                     [(1,2,{'data':20,'spam':'bar','bar':'foo'})])
        G.edge[1][2][0]['listdata']=[20,200]
        G.edge[1][2][0]['weight']=20
        assert_equal(G.edges(data=True),
                     [(1,2,{'data':20,'spam':'bar',
                            'bar':'foo','listdata':[20,200],'weight':20})])


class TestMultiGraph(BaseMultiGraphTester,TestGraph):
    def setUp(self):
        self.Graph=networkx.MultiGraph
        # build K3
        ed1,ed2,ed3 = ({0:{}},{0:{}},{0:{}})
        self.k3adj={0: {1: ed1, 2: ed2},
                    1: {0: ed1, 2: ed3},
                    2: {0: ed2, 1: ed3}}
        self.k3edges=[(0, 1), (0, 2), (1, 2)]
        self.k3nodes=[0, 1, 2]
        self.K3=self.Graph()
        self.K3.adj = self.K3.edge = self.k3adj
        self.K3.node={}
        self.K3.node[0]={}
        self.K3.node[1]={}
        self.K3.node[2]={}

    def test_data_input(self):
        G=self.Graph(data={1:[2],2:[1]}, name="test")
        assert_equal(G.name,"test")
        assert_equal(sorted(G.adj.items()),[(1, {2: {0:{}}}), (2, {1: {0:{}}})])

    def test_getitem(self):
        G=self.K3
        assert_equal(G[0],{1: {0:{}}, 2: {0:{}}})
        assert_raises(KeyError, G.__getitem__, 'j')
        assert_raises((TypeError,networkx.NetworkXError), G.__getitem__, ['A'])

    def test_remove_node(self):
        G=self.K3
        G.remove_node(0)
        assert_equal(G.adj,{1:{2:{0:{}}},2:{1:{0:{}}}})
        assert_raises((KeyError,networkx.NetworkXError), G.remove_node,-1)

    def test_add_edge(self):
        G=self.Graph()
        G.add_edge(0,1)
        assert_equal(G.adj,{0: {1: {0:{}}}, 1: {0: {0:{}}}})
        G=self.Graph()
        G.add_edge(*(0,1))
        assert_equal(G.adj,{0: {1: {0:{}}}, 1: {0: {0:{}}}})

    def test_add_edge_conflicting_key(self):
        G=self.Graph()
        G.add_edge(0,1,key=1)
        G.add_edge(0,1)
        assert_equal(G.number_of_edges(),2)
        G=self.Graph()
        G.add_edges_from([(0,1,1,{})])
        G.add_edges_from([(0,1)])
        assert_equal(G.number_of_edges(),2)

    def test_add_edges_from(self):
        G=self.Graph()
        G.add_edges_from([(0,1),(0,1,{'weight':3})])
        assert_equal(G.adj,{0: {1: {0:{},1:{'weight':3}}},
                            1: {0: {0:{},1:{'weight':3}}}})
        G.add_edges_from([(0,1),(0,1,{'weight':3})],weight=2)
        assert_equal(G.adj,{0: {1: {0:{},1:{'weight':3},
                                    2:{'weight':2},3:{'weight':3}}},
                            1: {0: {0:{},1:{'weight':3},
                                    2:{'weight':2},3:{'weight':3}}}})
        # too few in tuple
        assert_raises(networkx.NetworkXError, G.add_edges_from,[(0,)])
        # too many in tuple
        assert_raises(networkx.NetworkXError, G.add_edges_from,[(0,1,2,3,4)])
        assert_raises(TypeError, G.add_edges_from,[0])  # not a tuple

    def test_remove_edge(self):
        G=self.K3
        G.remove_edge(0,1)
        assert_equal(G.adj,{0: {2: {0: {}}},
                            1: {2: {0: {}}},
                            2: {0: {0: {}}, 1: {0: {}}}})
        assert_raises((KeyError,networkx.NetworkXError), G.remove_edge,-1,0)
        assert_raises((KeyError,networkx.NetworkXError), G.remove_edge,0,2, key=1)

    def test_remove_edges_from(self):
        G=self.K3.copy()
        G.remove_edges_from([(0,1)])
        assert_equal(G.adj,{0:{2:{0:{}}},1:{2:{0:{}}},2:{0:{0:{}},1:{0:{}}}})
        G.remove_edges_from([(0,0)]) # silent fail
        self.K3.add_edge(0,1)
        G=self.K3.copy()
        G.remove_edges_from(G.edges(data=True,keys=True))
        assert_equal(G.adj,{0:{},1:{},2:{}})
        G=self.K3.copy()
        G.remove_edges_from(G.edges(data=False,keys=True))
        assert_equal(G.adj,{0:{},1:{},2:{}})
        G=self.K3.copy()
        G.remove_edges_from(G.edges(data=False,keys=False))
        assert_equal(G.adj,{0:{},1:{},2:{}})
        G=self.K3.copy()
        G.remove_edges_from([(0,1,0),(0,2,0,{}),(1,2)])
        assert_equal(G.adj,{0:{1:{1:{}}},1:{0:{1:{}}},2:{}})

    def test_remove_multiedge(self):
        G=self.K3
        G.add_edge(0,1,key='parallel edge')
        G.remove_edge(0,1,key='parallel edge')
        assert_equal(G.adj,{0: {1: {0:{}}, 2: {0:{}}},
                            1: {0: {0:{}}, 2: {0:{}}},
                            2: {0: {0:{}}, 1: {0:{}}}})
        G.remove_edge(0,1)
        assert_equal(G.adj,{0:{2:{0:{}}},1:{2:{0:{}}},2:{0:{0:{}},1:{0:{}}}})
        assert_raises((KeyError,networkx.NetworkXError), G.remove_edge,-1,0)


# ---- networkx-1.11/networkx/classes/tests/historical_tests.py ----
#!/usr/bin/env python
"""Original NetworkX graph tests"""
from nose.tools import *
import networkx
import networkx as nx
from networkx import convert_node_labels_to_integers as cnlti
from networkx.testing import *


class HistoricalTests(object):
    def setUp(self):
        self.null=nx.null_graph()
        self.P1=cnlti(nx.path_graph(1),first_label=1)
        self.P3=cnlti(nx.path_graph(3),first_label=1)
        self.P10=cnlti(nx.path_graph(10),first_label=1)
        self.K1=cnlti(nx.complete_graph(1),first_label=1)
        self.K3=cnlti(nx.complete_graph(3),first_label=1)
        self.K4=cnlti(nx.complete_graph(4),first_label=1)
        self.K5=cnlti(nx.complete_graph(5),first_label=1)
        self.K10=cnlti(nx.complete_graph(10),first_label=1)
        self.G=nx.Graph

    def test_name(self):
        G=self.G(name="test")
        assert_equal(str(G),'test')
        assert_equal(G.name,'test')
        H=self.G()
        assert_equal(H.name,'')

    # Nodes
    def test_add_remove_node(self):
        G=self.G()
        G.add_node('A')
        assert_true(G.has_node('A'))
        G.remove_node('A')
        assert_false(G.has_node('A'))

    def test_nonhashable_node(self):
        # Test if a non-hashable object is in the Graph.  A python dict will
        # raise a TypeError, but for a Graph class a simple False should be
        # returned (see Graph __contains__).  If it cannot be a node then it
        # is not a node.
        G=self.G()
        assert_false(G.has_node(['A']))
        assert_false(G.has_node({'A':1}))

    def test_add_nodes_from(self):
        G=self.G()
        G.add_nodes_from(list("ABCDEFGHIJKL"))
        assert_true(G.has_node("L"))
        G.remove_nodes_from(['H','I','J','K','L'])
        G.add_nodes_from([1,2,3,4])
        assert_equal(sorted(G.nodes(),key=str),
                     [1, 2, 3, 4, 'A', 'B', 'C', 'D', 'E', 'F', 'G'])
        # test __iter__
        assert_equal(sorted(G,key=str),
                     [1, 2, 3, 4, 'A', 'B', 'C', 'D', 'E', 'F', 'G'])

    def test_contains(self):
        G=self.G()
        G.add_node('A')
        assert_true('A' in G)
        assert_false([] in G)  # never raise a Key or TypeError in this test
        assert_false({1:1} in G)

    def test_add_remove(self):
        # Test add_node and remove_node acting for various nbunch
        G=self.G()
        G.add_node('m')
        assert_true(G.has_node('m'))
        G.add_node('m')  # no complaints
        assert_raises(nx.NetworkXError,G.remove_node,'j')
        G.remove_node('m')
        assert_equal(G.nodes(),[])

    def test_nbunch_is_list(self):
        G=self.G()
        G.add_nodes_from(list("ABCD"))
        G.add_nodes_from(self.P3)  # add nbunch of nodes (nbunch=Graph)
        assert_equal(sorted(G.nodes(),key=str),
                     [1, 2, 3, 'A', 'B', 'C', 'D'])
        G.remove_nodes_from(self.P3)  # remove nbunch of nodes (nbunch=Graph)
        assert_equal(sorted(G.nodes(),key=str),
                     ['A', 'B', 'C', 'D'])

    def test_nbunch_is_set(self):
        G=self.G()
        nbunch=set("ABCDEFGHIJKL")
        G.add_nodes_from(nbunch)
        assert_true(G.has_node("L"))

    def test_nbunch_dict(self):
        # nbunch is a dict with nodes as keys
        G=self.G()
        nbunch=set("ABCDEFGHIJKL")
        G.add_nodes_from(nbunch)
        nbunch={'I':"foo",'J':2,'K':True,'L':"spam"}
        G.remove_nodes_from(nbunch)
        assert_true(sorted(G.nodes(),key=str),
                    ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H'])

    def test_nbunch_iterator(self):
        G=self.G()
        G.add_nodes_from(['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H'])
        n_iter=self.P3.nodes_iter()
        G.add_nodes_from(n_iter)
        assert_equal(sorted(G.nodes(),key=str),
                     [1, 2, 3, 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H'])
        n_iter=self.P3.nodes_iter()  # rebuild same iterator
        G.remove_nodes_from(n_iter)  # remove nbunch of nodes (nbunch=iterator)
        assert_equal(sorted(G.nodes(),key=str),
                     ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H'])

    def test_nbunch_graph(self):
        G=self.G()
        G.add_nodes_from(['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H'])
        nbunch=self.K3
        G.add_nodes_from(nbunch)
        assert_true(sorted(G.nodes(),key=str),
                    [1, 2, 3, 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H'])

    # Edges
    def test_add_edge(self):
        G=self.G()
        assert_raises(TypeError,G.add_edge,'A')
        G.add_edge('A','B')  # testing add_edge()
        G.add_edge('A','B')  # should fail silently
        assert_true(G.has_edge('A','B'))
        assert_false(G.has_edge('A','C'))
        assert_true(G.has_edge( *('A','B') ))
        if G.is_directed():
            assert_false(G.has_edge('B','A'))
        else:
            # G is undirected, so B->A is an edge
            assert_true(G.has_edge('B','A'))
        G.add_edge('A','C')  # test directedness
        G.add_edge('C','A')
        G.remove_edge('C','A')
        if G.is_directed():
            assert_true(G.has_edge('A','C'))
        else:
            assert_false(G.has_edge('A','C'))
        assert_false(G.has_edge('C','A'))

    def test_self_loop(self):
        G=self.G()
        G.add_edge('A','A')  # test self loops
        assert_true(G.has_edge('A','A'))
        G.remove_edge('A','A')
        G.add_edge('X','X')
        assert_true(G.has_node('X'))
        G.remove_node('X')
        G.add_edge('A','Z')  # should add the node silently
        assert_true(G.has_node('Z'))

    def test_add_edges_from(self):
        G=self.G()
        G.add_edges_from([('B','C')])  # test add_edges_from()
        assert_true(G.has_edge('B','C'))
        if G.is_directed():
            assert_false(G.has_edge('C','B'))
        else:
            assert_true(G.has_edge('C','B'))  # undirected
        G.add_edges_from([('D','F'),('B','D')])
        assert_true(G.has_edge('D','F'))
        assert_true(G.has_edge('B','D'))
        if G.is_directed():
            assert_false(G.has_edge('D','B'))
        else:
            assert_true(G.has_edge('D','B'))  # undirected

    def test_add_edges_from2(self):
        G=self.G()
        # after failing silently, should add 2nd edge
        G.add_edges_from([tuple('IJ'),list('KK'),tuple('JK')])
        assert_true(G.has_edge(*('I','J')))
        assert_true(G.has_edge(*('K','K')))
        assert_true(G.has_edge(*('J','K')))
        if G.is_directed():
            assert_false(G.has_edge(*('K','J')))
        else:
            assert_true(G.has_edge(*('K','J')))

    def test_add_edges_from3(self):
        G=self.G()
        G.add_edges_from(zip(list('ACD'),list('CDE')))
        assert_true(G.has_edge('D','E'))
        assert_false(G.has_edge('E','C'))

    def test_remove_edge(self):
        G=self.G()
        G.add_nodes_from([1, 2, 3, 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H'])
        G.add_edges_from(zip(list('MNOP'),list('NOPM')))
        assert_true(G.has_edge('O','P'))
        assert_true(G.has_edge('P','M'))
        G.remove_node('P')  # tests remove_node()'s handling of edges.
        assert_false(G.has_edge('P','M'))
        assert_raises(TypeError,G.remove_edge,'M')
        G.add_edge('N','M')
        assert_true(G.has_edge('M','N'))
        G.remove_edge('M','N')
        assert_false(G.has_edge('M','N'))
        # self loop fails silently
        G.remove_edges_from([list('HI'),list('DF'),
                             tuple('KK'),tuple('JK')])
        assert_false(G.has_edge('H','I'))
        assert_false(G.has_edge('J','K'))
        G.remove_edges_from([list('IJ'),list('KK'),list('JK')])
        assert_false(G.has_edge('I','J'))
        G.remove_nodes_from(set('ZEFHIMNO'))
        G.add_edge('J','K')

    def test_edges_nbunch(self):
        # Test G.edges(nbunch) with various forms of nbunch
        G=self.G()
        G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                          ('C', 'B'), ('C', 'D')])
        # node not in nbunch should be quietly ignored
        assert_raises(nx.NetworkXError,G.edges,6)
        assert_equals(G.edges('Z'),[])  # iterable non-node
        # nbunch can be an empty list
        assert_equals(G.edges([]),[])
        if G.is_directed():
            elist=[('A', 'B'), ('A', 'C'), ('B', 'D')]
        else:
            elist=[('A', 'B'), ('A', 'C'), ('B', 'C'), ('B', 'D')]
        # nbunch can be a list
        assert_edges_equal(G.edges(['A','B']),elist)
        # nbunch can be a set
        assert_edges_equal(G.edges(set(['A','B'])),elist)
        # nbunch can be a graph
        G1=self.G()
        G1.add_nodes_from('AB')
        assert_edges_equal(G.edges(G1),elist)
        # nbunch can be a dict with nodes as keys
        ndict={'A': "thing1", 'B': "thing2"}
        assert_edges_equal(G.edges(ndict),elist)
        # nbunch can be a single node
        assert_edges_equal(G.edges('A'), [('A', 'B'), ('A', 'C')])
        assert_edges_equal(G.nodes_iter(), ['A', 'B', 'C', 'D'])

    def test_edges_iter_nbunch(self):
        G=self.G()
        G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                          ('C', 'B'), ('C', 'D')])
        # Test G.edges_iter(nbunch) with various forms of nbunch
        # node not in nbunch should be quietly ignored
        assert_equals(list(G.edges_iter('Z')),[])
        # nbunch can be an empty list
        assert_equals(sorted(G.edges_iter([])),[])
        if G.is_directed():
            elist=[('A', 'B'), ('A', 'C'), ('B', 'D')]
        else:
            elist=[('A', 'B'), ('A', 'C'), ('B', 'C'), ('B', 'D')]
        # nbunch can be a list
        assert_edges_equal(G.edges_iter(['A','B']),elist)
        # nbunch can be a set
        assert_edges_equal(G.edges_iter(set(['A','B'])),elist)
        # nbunch can be a graph
        G1=self.G()
        G1.add_nodes_from(['A','B'])
        assert_edges_equal(G.edges_iter(G1),elist)
        # nbunch can be a dict with nodes as keys
        ndict={'A': "thing1", 'B': "thing2"}
        assert_edges_equal(G.edges_iter(ndict),elist)
        # nbunch can be a single node
        assert_edges_equal(G.edges_iter('A'), [('A', 'B'), ('A', 'C')])
        # nbunch can be nothing (whole graph)
        assert_edges_equal(G.edges_iter(),
                           [('A', 'B'), ('A', 'C'), ('B', 'D'),
                            ('C', 'B'), ('C', 'D')])

    def test_degree(self):
        G=self.G()
        G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                          ('C', 'B'), ('C', 'D')])
        assert_equal(G.degree('A'),2)
        # degree of single node in iterable container must return dict
        assert_equal(list(G.degree(['A']).values()),[2])
        assert_equal(G.degree(['A']),{'A': 2})
        assert_equal(sorted(G.degree(['A','B']).values()),[2, 3])
        assert_equal(G.degree(['A','B']),{'A': 2, 'B': 3})
        assert_equal(sorted(G.degree().values()),[2, 2, 3, 3])
        assert_equal(sorted([v for k,v in G.degree_iter()]),
                     [2, 2, 3, 3])

    def test_degree2(self):
        H=self.G()
        H.add_edges_from([(1,24),(1,2)])
        assert_equal(sorted(H.degree([1,24]).values()),[1, 2])

    def test_degree_graph(self):
        P3=nx.path_graph(3)
        P5=nx.path_graph(5)
        # silently ignore nodes not in P3
        assert_equal(P3.degree(['A','B']),{})
        # nbunch can be a graph
        assert_equal(sorted(P5.degree(P3).values()),[1, 2, 2])
        # nbunch can be a graph that's way too big
        assert_equal(sorted(P3.degree(P5).values()),[1, 1, 2])
        assert_equal(P5.degree([]),{})
        assert_equal(list(P5.degree_iter([])),[])
        assert_equal(dict(P5.degree_iter([])),{})

    def test_null(self):
        null=nx.null_graph()
        assert_equal(null.degree(),{})
        assert_equal(list(null.degree_iter()),[])
        assert_equal(dict(null.degree_iter()),{})

    def test_order_size(self):
        G=self.G()
        G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                          ('C', 'B'), ('C', 'D')])
        assert_equal(G.order(),4)
        assert_equal(G.size(),5)
        assert_equal(G.number_of_edges(),5)
        assert_equal(G.number_of_edges('A','B'),1)
        assert_equal(G.number_of_edges('A','D'),0)

    def test_copy(self):
        G=self.G()
        H=G.copy()  # copy
        assert_equal(H.adj,G.adj)
        assert_equal(H.name,G.name)
        assert_not_equal(H,G)

    def test_subgraph(self):
        G=self.G()
        G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                          ('C', 'B'), ('C', 'D')])
        SG=G.subgraph(['A','B','D'])
        assert_nodes_equal(SG.nodes(),['A', 'B', 'D'])
        assert_edges_equal(SG.edges(),[('A', 'B'), ('B', 'D')])

    def test_to_directed(self):
        G=self.G()
        if not G.is_directed():
            G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                              ('C', 'B'), ('C', 'D')])
            DG=G.to_directed()
            assert_not_equal(DG,G)  # directed copy or copy
            assert_true(DG.is_directed())
            assert_equal(DG.name,G.name)
            assert_equal(DG.adj,G.adj)
            assert_equal(sorted(DG.out_edges(list('AB'))),
                         [('A', 'B'), ('A', 'C'), ('B', 'A'),
                          ('B', 'C'), ('B', 'D')])
            DG.remove_edge('A','B')
            assert_true(DG.has_edge('B','A'))  # this removes B-A but not A-B
            assert_false(DG.has_edge('A','B'))

    def test_to_undirected(self):
        G=self.G()
        if G.is_directed():
            G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                              ('C', 'B'), ('C', 'D')])
            UG=G.to_undirected()  # to_undirected
            assert_not_equal(UG,G)
            assert_false(UG.is_directed())
            assert_true(G.is_directed())
            assert_equal(UG.name,G.name)
            assert_not_equal(UG.adj,G.adj)
            assert_equal(sorted(UG.edges(list('AB'))),
                         [('A', 'B'), ('A', 'C'), ('B', 'C'), ('B', 'D')])
            assert_equal(sorted(UG.edges(['A','B'])),
                         [('A', 'B'), ('A', 'C'), ('B', 'C'), ('B', 'D')])
            UG.remove_edge('A','B')
            assert_false(UG.has_edge('B','A'))
            assert_false(UG.has_edge('A','B'))

    def test_neighbors(self):
        G=self.G()
        G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                          ('C', 'B'), ('C', 'D')])
        G.add_nodes_from('GJK')
        assert_equal(sorted(G['A']),['B', 'C'])
        assert_equal(sorted(G.neighbors('A')),['B', 'C'])
        assert_equal(sorted(G.neighbors_iter('A')),['B', 'C'])
        assert_equal(sorted(G.neighbors('G')),[])
        assert_raises(nx.NetworkXError,G.neighbors,'j')

    def test_iterators(self):
        G=self.G()
        G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'),
                          ('C', 'B'), ('C', 'D')])
        G.add_nodes_from('GJK')
        assert_equal(sorted(G.nodes_iter()),
                     ['A', 'B', 'C', 'D', 'G', 'J', 'K'])
        assert_edges_equal(G.edges_iter(),
                           [('A', 'B'), ('A', 'C'), ('B', 'D'),
                            ('C', 'B'), ('C', 'D')])
        assert_equal(sorted([v for k,v in G.degree_iter()]),
                     [0, 0, 0, 2, 2, 3, 3])
        assert_equal(sorted(G.degree_iter(),key=str),
                     [('A', 2), ('B', 3), ('C', 3), ('D', 2),
                      ('G', 0), ('J', 0), ('K', 0)])
        assert_equal(sorted(G.neighbors_iter('A')),['B', 'C'])
        assert_raises(nx.NetworkXError,G.neighbors_iter,'X')
        G.clear()
        assert_equal(nx.number_of_nodes(G),0)
        assert_equal(nx.number_of_edges(G),0)

    def test_null_subgraph(self):
        # Subgraph of a null graph is a null graph
        nullgraph=nx.null_graph()
        G=nx.null_graph()
        H=G.subgraph([])
        assert_true(nx.is_isomorphic(H,nullgraph))

    def test_empty_subgraph(self):
        # Subgraph of an empty graph is an empty graph. test 1
        nullgraph=nx.null_graph()
        E5=nx.empty_graph(5)
        E10=nx.empty_graph(10)
        H=E10.subgraph([])
        assert_true(nx.is_isomorphic(H,nullgraph))
        H=E10.subgraph([1,2,3,4,5])
        assert_true(nx.is_isomorphic(H,E5))

    def test_complete_subgraph(self):
        # Subgraph of a complete graph is a complete graph
        K1=nx.complete_graph(1)
        K3=nx.complete_graph(3)
        K5=nx.complete_graph(5)
        H=K5.subgraph([1,2,3])
        assert_true(nx.is_isomorphic(H,K3))

    def test_subgraph_nbunch(self):
        nullgraph=nx.null_graph()
        K1=nx.complete_graph(1)
        K3=nx.complete_graph(3)
        K5=nx.complete_graph(5)
        # Test G.subgraph(nbunch), where nbunch is a single node
        H=K5.subgraph(1)
        assert_true(nx.is_isomorphic(H,K1))
        # Test G.subgraph(nbunch), where nbunch is a set
        H=K5.subgraph(set([1]))
        assert_true(nx.is_isomorphic(H,K1))
        # Test G.subgraph(nbunch), where nbunch is an iterator
        H=K5.subgraph(iter(K3))
        assert_true(nx.is_isomorphic(H,K3))
        # Test G.subgraph(nbunch), where nbunch is another graph
        H=K5.subgraph(K3)
        assert_true(nx.is_isomorphic(H,K3))
        H=K5.subgraph([9])
        assert_true(nx.is_isomorphic(H,nullgraph))

    def test_node_tuple_error(self):
        H=self.G()
        # Test error handling of tuple as a node
        assert_raises(nx.NetworkXError,H.remove_node,(1,2))
        H.remove_nodes_from([(1,2)])  # no error
        assert_raises(nx.NetworkXError,H.neighbors,(1,2))


# ---- networkx-1.11/networkx/classes/tests/test_timing.py ----
#!/usr/bin/env python
from __future__ import print_function
from nose.tools import *
from collections import OrderedDict
import networkx as nx
from timeit import Timer

# dict pointing to graph type to compare to.
graph_type = {
    (False, False): "TimingGraph",
    (False, True): "TimingMultiGraph",
    (True, False): "TimingDiGraph",
    (True, True): "TimingMultiDiGraph",
}

# Setup tests
N,p = 200,0.1
basic_setup=('for (u,v) in NX.binomial_graph(%s,%s).edges():\n'
             '    G.add_weighted_edges_from([(u,v,2),(v,u,2)])'%(N,p)
             )
elist_setup=('elist=[(i,i+3) for i in range(%s-3)]\n'
             'G.add_nodes_from(range(%i))'%(N,N)
             )

all_tests=[
    # Format: (name, (test_string, setup_string, runs, reps, cutoff_ratio)),
    ('add_nodes',
     ('G.clear()\nG.add_nodes_from(nlist)', 'nlist=range(%i)'%N, 3, 10)),
    ('add_edges', ('G.add_edges_from(elist)', elist_setup, 3, 10)),
    ('add_and_remove_nodes',
     ('G.add_nodes_from(nlist)\nG.remove_nodes_from(nlist)',
      'nlist=range(%i)'%N, 3, 10)),
    ('add_and_remove_edges',
     ('G.add_edges_from(elist)\nG.remove_edges_from(elist)',
      elist_setup, 3, 10)),
    ('neighbors',
     ('for n in G:\n    for nbr in G.neighbors(n):\n        pass',
      basic_setup, 3, 1)),
    ('edges',
     ('for n in G:\n    for e in G.edges(n):\n        pass',
      basic_setup, 3, 1)),
    ('edge_data',
     ('for n in G:\n    for e in G.edges(n,data=True):\n        pass',
      basic_setup, 3, 1)),
    ('all_edges',
     (('for n,nbrs in G.adjacency_iter():\n'
       '    for nbr,data in nbrs.items():\n        pass'),
      basic_setup, 3, 1)),
    ('degree', ('for d in G.degree():\n    pass', basic_setup, 3, 1)),
    ('copy', ('H=G.copy()', basic_setup, 3, 1)),
    ('dijkstra',
     ('p=NX.single_source_dijkstra(G,i)', 'i=6\n'+basic_setup, 3, 1)),
    ('shortest_path',
     ('p=NX.single_source_shortest_path(G,i)', 'i=6\n'+basic_setup, 3, 1)),
    ('subgraph',
     ('G.subgraph(nlist)', 'nlist=range(100,150)\n'+basic_setup, 3, 1)),
    #('numpy_matrix', ('NX.to_numpy_matrix(G)', basic_setup, 3, 1)),
]


class Benchmark(object):
    """Class to benchmark (time) various Graph routines.

    Parameters
    ----------
    graph_classes : List of classes to test.
    tests : List of tests to run on each class.
        Format for tests:
        (name, (test_string, setup_string, runs, repeats, [cutoff_ratio]))
        name: A string used to identify this test when reporting results.
        test_string: The code-string used repeatedly in the test.
        setup_string: The code-string used once before running the test.
            Some text is prepended to setup_string.  It imports NetworkX
            and creates an instance (called G) of the tested graph class.
        runs: Number of timing runs.
        repeats: Number of repeats of the test for each run.
        cutoff_ratio: optional ratio of times [current/TimingClass]
            If (ratio > cutoff_ratio) then check_ratios() returns False.

    Notes
    -----
    Benchmark uses the timeit package and timeit.Timer class.
    """
    def __init__(self, graph_classes, tests=all_tests):
        self.gc = graph_classes
        self.tests = tests

    def run(self, verbose=False, cutoff_default=3):
        errors=''
        headers=list(self.gc)
        if verbose:
            raw_times=" ".join( gc.rjust(12) for gc in headers )
            print('Raw Times'.ljust(23) + raw_times)
            print("="*79, end=" ")
        results=[]
        # for each test list times for each graph_class
        for tst,params in self.tests:
            if verbose:
                print()  # add newline
                print(tst.ljust(22), end=" ")
            times=[]
            for gc in headers:
                t,bt = self.time_me(gc, tst, params[:4])
                cutoff=params[4] if len(params)>4 else cutoff_default
                rat=t/bt
                times.append( (tst, params, gc, t, bt, rat, cutoff) )
                if rat > cutoff:
                    errors+='Timing "'+tst+'" failed for class "'+gc+'". '
                    errors+='Time ratio (new/base): {:f}\n'.format(rat)
                if verbose:
                    print("{:12.3e}".format(t), end=" ")
            results.append(times)
        if verbose:
            print('\n')
        hdrs=" ".join(gc.rjust(12) for gc in headers)
        print('Time Ratio to Baseline'.ljust(23) + hdrs)
        print("="*(23+len(hdrs)))
        for res in results:
            tst=res[0][0]
            output = " ".join( "{:12.3f}".format(t[5]) for t in res )
            print(tst.ljust(23) + output)
        self.results=results
        if errors != '':
            print(errors)
            return False  # not all passed
        return True  # all passed

    def time_me(self, gc, tst, params):
        """Time the test for class gc and its comparison TimingClass"""
        stmt,t_setup,runs,reps = params
        #
        setup="import networkx as NX\nG=NX.%s()\n"%gc + t_setup
        G = eval("nx."+gc+"()")
        cc = graph_type[ (G.is_directed(), G.is_multigraph()) ]
        compare_setup=("import networkx as NX\n"
                       "import timingclasses as tc\n"
                       "G=tc.%s()\n"%(cc,)) + t_setup
        #
        tgc = Timer(stmt, setup)
        tcc = Timer(stmt, compare_setup)
        #
        t = tgc.repeat(repeat = runs, number = reps)
        bt = tcc.repeat(repeat = runs, number = reps)
        return min(t),min(bt)

# The fluctuations in timing make this problematic for travis-CI
# uncomment it to use with nosetests.
#class Test_Benchmark(Benchmark):
#    """ Class to allow nosetests to perform benchmark tests """
#    def __init__(self):
#        classes=['Graph','MultiGraph','DiGraph','MultiDiGraph']
#        self.gc = classes
#        self.tests = all_tests
#
#    def test_ratios(self):
#        assert_true(self.run(verbose=False, cutoff_default=3))

if __name__ == "__main__":
    classes=['Graph','MultiGraph','DiGraph','MultiDiGraph']
#    classes=['SpecialGraph','SpecialMultiGraph',\
#             'SpecialDiGraph','SpecialMultiDiGraph']
    b=Benchmark(classes,tests=all_tests)
#    b=Benchmark(classes,tests=dict( (k,v) for k,v in all_tests.items() if "add" in k ))
    assert b.run(verbose=True)


# ---- networkx-1.11/networkx/classes/tests/test_function.py ----
#!/usr/bin/env python
import random
from nose.tools import *
import networkx as nx


class TestFunction(object):
    def setUp(self):
        self.G=nx.Graph({0:[1,2,3], 1:[1,2,0], 4:[]}, name='Test')
        self.Gdegree={0:3, 1:2, 2:2, 3:1, 4:0}
        self.Gnodes=list(range(5))
        self.Gedges=[(0,1),(0,2),(0,3),(1,0),(1,1),(1,2)]
        self.DG=nx.DiGraph({0:[1,2,3], 1:[1,2,0], 4:[]})
        self.DGin_degree={0:1, 1:2, 2:2, 3:1, 4:0}
        self.DGout_degree={0:3, 1:3, 2:0, 3:0, 4:0}
        self.DGnodes=list(range(5))
        self.DGedges=[(0,1),(0,2),(0,3),(1,0),(1,1),(1,2)]

    def test_nodes(self):
        assert_equal(self.G.nodes(),nx.nodes(self.G))
        assert_equal(self.DG.nodes(),nx.nodes(self.DG))

    def test_edges(self):
        assert_equal(self.G.edges(),nx.edges(self.G))
        assert_equal(self.DG.edges(),nx.edges(self.DG))
        assert_equal(self.G.edges(nbunch=[0,1,3]),nx.edges(self.G,nbunch=[0,1,3]))
        assert_equal(self.DG.edges(nbunch=[0,1,3]),nx.edges(self.DG,nbunch=[0,1,3]))

    def test_nodes_iter(self):
        assert_equal(list(self.G.nodes_iter()),list(nx.nodes_iter(self.G)))
        assert_equal(list(self.DG.nodes_iter()),list(nx.nodes_iter(self.DG)))

    def test_edges_iter(self):
        assert_equal(list(self.G.edges_iter()),list(nx.edges_iter(self.G)))
        assert_equal(list(self.DG.edges_iter()),list(nx.edges_iter(self.DG)))
        assert_equal(list(self.G.edges_iter(nbunch=[0,1,3])),
                     list(nx.edges_iter(self.G,nbunch=[0,1,3])))
        assert_equal(list(self.DG.edges_iter(nbunch=[0,1,3])),
                     list(nx.edges_iter(self.DG,nbunch=[0,1,3])))

    def test_degree(self):
        assert_equal(self.G.degree(),nx.degree(self.G))
        assert_equal(self.DG.degree(),nx.degree(self.DG))
        assert_equal(self.G.degree(nbunch=[0,1]),nx.degree(self.G,nbunch=[0,1]))
        assert_equal(self.DG.degree(nbunch=[0,1]),nx.degree(self.DG,nbunch=[0,1]))
        assert_equal(self.G.degree(weight='weight'),nx.degree(self.G,weight='weight'))
        assert_equal(self.DG.degree(weight='weight'),nx.degree(self.DG,weight='weight'))

    def test_neighbors(self):
        assert_equal(self.G.neighbors(1),nx.neighbors(self.G,1))
        assert_equal(self.DG.neighbors(1),nx.neighbors(self.DG,1))

    def test_number_of_nodes(self):
        assert_equal(self.G.number_of_nodes(),nx.number_of_nodes(self.G))
        assert_equal(self.DG.number_of_nodes(),nx.number_of_nodes(self.DG))

    def test_number_of_edges(self):
        assert_equal(self.G.number_of_edges(),nx.number_of_edges(self.G))
        assert_equal(self.DG.number_of_edges(),nx.number_of_edges(self.DG))

    def test_is_directed(self):
        assert_equal(self.G.is_directed(),nx.is_directed(self.G))
        assert_equal(self.DG.is_directed(),nx.is_directed(self.DG))

    def test_subgraph(self):
        assert_equal(self.G.subgraph([0,1,2,4]).adj,nx.subgraph(self.G,[0,1,2,4]).adj)
        assert_equal(self.DG.subgraph([0,1,2,4]).adj,nx.subgraph(self.DG,[0,1,2,4]).adj)

    def test_create_empty_copy(self):
        G=nx.create_empty_copy(self.G, with_nodes=False)
        assert_equal(G.nodes(),[])
        assert_equal(G.graph,{})
        assert_equal(G.node,{})
        assert_equal(G.edge,{})
        G=nx.create_empty_copy(self.G)
        assert_equal(G.nodes(),self.G.nodes())
        assert_equal(G.graph,{})
        assert_equal(G.node,{}.fromkeys(self.G.nodes(),{}))
        assert_equal(G.edge,{}.fromkeys(self.G.nodes(),{}))

    def test_degree_histogram(self):
        assert_equal(nx.degree_histogram(self.G), [1,1,1,1,1])

    def test_density(self):
        assert_equal(nx.density(self.G), 0.5)
        assert_equal(nx.density(self.DG), 0.3)
        G=nx.Graph()
        G.add_node(1)
        assert_equal(nx.density(G), 0.0)

    def test_density_selfloop(self):
        G = nx.Graph()
        G.add_edge(1,1)
        assert_equal(nx.density(G), 0.0)
        G.add_edge(1,2)
        assert_equal(nx.density(G), 2.0)

    def test_freeze(self):
        G=nx.freeze(self.G)
        assert_equal(G.frozen,True)
        assert_raises(nx.NetworkXError, G.add_node, 1)
        assert_raises(nx.NetworkXError, G.add_nodes_from, [1])
        assert_raises(nx.NetworkXError, G.remove_node, 1)
        assert_raises(nx.NetworkXError, G.remove_nodes_from, [1])
        assert_raises(nx.NetworkXError, G.add_edge, 1,2)
        assert_raises(nx.NetworkXError, G.add_edges_from, [(1,2)])
        assert_raises(nx.NetworkXError, G.remove_edge, 1,2)
        assert_raises(nx.NetworkXError, G.remove_edges_from, [(1,2)])
        assert_raises(nx.NetworkXError, G.clear)

    def test_is_frozen(self):
        assert_equal(nx.is_frozen(self.G), False)
        G=nx.freeze(self.G)
        assert_equal(G.frozen, nx.is_frozen(self.G))
        assert_equal(G.frozen,True)

    def test_info(self):
        G=nx.path_graph(5)
        info=nx.info(G)
        expected_graph_info='\n'.join(['Name: path_graph(5)',
                                       'Type: Graph',
                                       'Number of nodes: 5',
                                       'Number of edges: 4',
                                       'Average degree: 1.6000'])
        assert_equal(info,expected_graph_info)
        info=nx.info(G,n=1)
        expected_node_info='\n'.join(
            ['Node 1 has the following properties:',
             'Degree: 2',
             'Neighbors: 0 2'])
        assert_equal(info,expected_node_info)

    def test_info_digraph(self):
        G=nx.DiGraph(name='path_graph(5)')
        G.add_path([0,1,2,3,4])
        info=nx.info(G)
        expected_graph_info='\n'.join(['Name: path_graph(5)',
                                       'Type: DiGraph',
                                       'Number of nodes: 5',
                                       'Number of edges: 4',
                                       'Average in degree: 0.8000',
                                       'Average out degree: 0.8000'])
        assert_equal(info,expected_graph_info)
        info=nx.info(G,n=1)
        expected_node_info='\n'.join(
            ['Node 1 has the following properties:',
             'Degree: 2',
             'Neighbors: 2'])
        assert_equal(info,expected_node_info)
        assert_raises(nx.NetworkXError,nx.info,G,n=-1)

    def test_neighbors(self):
        graph = nx.complete_graph(100)
        pop = random.sample(graph.nodes(), 1)
        nbors = list(nx.neighbors(graph, pop[0]))
        # should be all the other
vertices in the graph assert_equal(len(nbors), len(graph) - 1) graph = nx.path_graph(100) node = random.sample(graph.nodes(), 1)[0] nbors = list(nx.neighbors(graph, node)) # should be all the other vertices in the graph if node != 0 and node != 99: assert_equal(len(nbors), 2) else: assert_equal(len(nbors), 1) # create a star graph with 99 outer nodes graph = nx.star_graph(99) nbors = list(nx.neighbors(graph, 0)) assert_equal(len(nbors), 99) def test_non_neighbors(self): graph = nx.complete_graph(100) pop = random.sample(graph.nodes(), 1) nbors = list(nx.non_neighbors(graph, pop[0])) # should be all the other vertices in the graph assert_equal(len(nbors), 0) graph = nx.path_graph(100) node = random.sample(graph.nodes(), 1)[0] nbors = list(nx.non_neighbors(graph, node)) # should be all the other vertices in the graph if node != 0 and node != 99: assert_equal(len(nbors), 97) else: assert_equal(len(nbors), 98) # create a star graph with 99 outer nodes graph = nx.star_graph(99) nbors = list(nx.non_neighbors(graph, 0)) assert_equal(len(nbors), 0) # disconnected graph graph = nx.Graph() graph.add_nodes_from(range(10)) nbors = list(nx.non_neighbors(graph, 0)) assert_equal(len(nbors), 9) def test_non_edges(self): # All possible edges exist graph = nx.complete_graph(5) nedges = list(nx.non_edges(graph)) assert_equal(len(nedges), 0) graph = nx.path_graph(4) expected = [(0, 2), (0, 3), (1, 3)] nedges = list(nx.non_edges(graph)) for (u, v) in expected: assert_true( (u, v) in nedges or (v, u) in nedges ) graph = nx.star_graph(4) expected = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)] nedges = list(nx.non_edges(graph)) for (u, v) in expected: assert_true( (u, v) in nedges or (v, u) in nedges ) # Directed graphs graph = nx.DiGraph() graph.add_edges_from([(0, 2), (2, 0), (2, 1)]) expected = [(0, 1), (1, 0), (1, 2)] nedges = list(nx.non_edges(graph)) for e in expected: assert_true(e in nedges) def test_is_weighted(self): G = nx.Graph() assert_false(nx.is_weighted(G)) G = 
nx.path_graph(4) assert_false(nx.is_weighted(G)) assert_false(nx.is_weighted(G, (2, 3))) G.add_node(4) G.add_edge(3, 4, weight=4) assert_false(nx.is_weighted(G)) assert_true(nx.is_weighted(G, (3, 4))) G = nx.DiGraph() G.add_weighted_edges_from([('0', '3', 3), ('0', '1', -5), ('1', '0', -5), ('0', '2', 2), ('1', '2', 4), ('2', '3', 1)]) assert_true(nx.is_weighted(G)) assert_true(nx.is_weighted(G, ('1', '0'))) G = G.to_undirected() assert_true(nx.is_weighted(G)) assert_true(nx.is_weighted(G, ('1', '0'))) assert_raises(nx.NetworkXError, nx.is_weighted, G, (1, 2)) def test_is_negatively_weighted(self): G = nx.Graph() assert_false(nx.is_negatively_weighted(G)) G.add_node(1) G.add_nodes_from([2, 3, 4, 5]) assert_false(nx.is_negatively_weighted(G)) G.add_edge(1, 2, weight=4) assert_false(nx.is_negatively_weighted(G, (1, 2))) G.add_edges_from([(1, 3), (2, 4), (2, 6)]) G[1][3]['color'] = 'blue' assert_false(nx.is_negatively_weighted(G)) assert_false(nx.is_negatively_weighted(G, (1, 3))) G[2][4]['weight'] = -2 assert_true(nx.is_negatively_weighted(G, (2, 4))) assert_true(nx.is_negatively_weighted(G)) G = nx.DiGraph() G.add_weighted_edges_from([('0', '3', 3), ('0', '1', -5), ('1', '0', -2), ('0', '2', 2), ('1', '2', -3), ('2', '3', 1)]) assert_true(nx.is_negatively_weighted(G)) assert_false(nx.is_negatively_weighted(G, ('0', '3'))) assert_true(nx.is_negatively_weighted(G, ('1', '0'))) assert_raises(nx.NetworkXError, nx.is_negatively_weighted, G, (1, 4)) class TestCommonNeighbors(): def setUp(self): self.func = nx.common_neighbors def test_func(G, u, v, expected): result = sorted(self.func(G, u, v)) assert_equal(result, expected) self.test = test_func def test_K5(self): G = nx.complete_graph(5) self.test(G, 0, 1, [2, 3, 4]) def test_P3(self): G = nx.path_graph(3) self.test(G, 0, 2, [1]) def test_S4(self): G = nx.star_graph(4) self.test(G, 1, 2, [0]) @raises(nx.NetworkXNotImplemented) def test_digraph(self): G = nx.DiGraph() G.add_edges_from([(0, 1), (1, 2)]) self.func(G, 0, 2) 
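The common-neighbor behaviour exercised by TestCommonNeighbors above can also be checked directly outside the test harness; a minimal standalone sketch (assumes only that networkx is installed, and uses the same `nx.common_neighbors(G, u, v)` call the tests rely on):

```python
import networkx as nx

# In K5, nodes 0 and 1 are adjacent to every other node,
# so their common neighbors are exactly the remaining nodes.
K5 = nx.complete_graph(5)
print(sorted(nx.common_neighbors(K5, 0, 1)))  # [2, 3, 4]

# In the path 0-1-2, the endpoints share only the middle node.
P3 = nx.path_graph(3)
print(sorted(nx.common_neighbors(P3, 0, 2)))  # [1]
```

Note that `common_neighbors` returns an iterator of nodes, which is why the test helper (and this sketch) sorts it into a list before comparing.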
def test_nonexistent_nodes(self): G = nx.complete_graph(5) assert_raises(nx.NetworkXError, nx.common_neighbors, G, 5, 4) assert_raises(nx.NetworkXError, nx.common_neighbors, G, 4, 5) assert_raises(nx.NetworkXError, nx.common_neighbors, G, 5, 6) def test_custom1(self): """Case of no common neighbors.""" G = nx.Graph() G.add_nodes_from([0, 1]) self.test(G, 0, 1, []) def test_custom2(self): """Case of equal nodes.""" G = nx.complete_graph(4) self.test(G, 0, 0, [1, 2, 3]) def test_set_node_attributes(): graphs = [nx.Graph(), nx.DiGraph(), nx.MultiGraph(), nx.MultiDiGraph()] for G in graphs: G = nx.path_graph(3, create_using=G) # Test single value attr = 'hello' vals = 100 nx.set_node_attributes(G, attr, vals) assert_equal(G.node[0][attr], vals) assert_equal(G.node[1][attr], vals) assert_equal(G.node[2][attr], vals) # Test multiple values attr = 'hi' vals = dict(zip(sorted(G.nodes()), range(len(G)))) nx.set_node_attributes(G, attr, vals) assert_equal(G.node[0][attr], 0) assert_equal(G.node[1][attr], 1) assert_equal(G.node[2][attr], 2) def test_set_edge_attributes(): graphs = [nx.Graph(), nx.DiGraph()] for G in graphs: G = nx.path_graph(3, create_using=G) # Test single value attr = 'hello' vals = 3 nx.set_edge_attributes(G, attr, vals) assert_equal(G[0][1][attr], vals) assert_equal(G[1][2][attr], vals) # Test multiple values attr = 'hi' edges = [(0,1), (1,2)] vals = dict(zip(edges, range(len(edges)))) nx.set_edge_attributes(G, attr, vals) assert_equal(G[0][1][attr], 0) assert_equal(G[1][2][attr], 1) def test_set_edge_attributes_multi(): graphs = [nx.MultiGraph(), nx.MultiDiGraph()] for G in graphs: G = nx.path_graph(3, create_using=G) # Test single value attr = 'hello' vals = 3 nx.set_edge_attributes(G, attr, vals) assert_equal(G[0][1][0][attr], vals) assert_equal(G[1][2][0][attr], vals) # Test multiple values attr = 'hi' edges = [(0,1,0), (1,2,0)] vals = dict(zip(edges, range(len(edges)))) nx.set_edge_attributes(G, attr, vals) assert_equal(G[0][1][0][attr], 0) 
assert_equal(G[1][2][0][attr], 1) def test_get_node_attributes(): graphs = [nx.Graph(), nx.DiGraph(), nx.MultiGraph(), nx.MultiDiGraph()] for G in graphs: G = nx.path_graph(3, create_using=G) attr = 'hello' vals = 100 nx.set_node_attributes(G, attr, vals) attrs = nx.get_node_attributes(G, attr) assert_equal(attrs[0], vals) assert_equal(attrs[1], vals) assert_equal(attrs[2], vals) def test_get_edge_attributes(): graphs = [nx.Graph(), nx.DiGraph(), nx.MultiGraph(), nx.MultiDiGraph()] for G in graphs: G = nx.path_graph(3, create_using=G) attr = 'hello' vals = 100 nx.set_edge_attributes(G, attr, vals) attrs = nx.get_edge_attributes(G, attr) assert_equal(len(attrs), 2) if G.is_multigraph(): keys = [(0,1,0), (1,2,0)] else: keys = [(0,1), (1,2)] for key in keys: assert_equal(attrs[key], 100) def test_is_empty(): graphs = [nx.Graph(), nx.DiGraph(), nx.MultiGraph(), nx.MultiDiGraph()] for G in graphs: assert_true(nx.is_empty(G)) G.add_nodes_from(range(5)) assert_true(nx.is_empty(G)) G.add_edges_from([(1, 2), (3, 4)]) assert_false(nx.is_empty(G)) networkx-1.11/networkx/classes/tests/test_graph_historical.py0000644000175000017500000000044712637544450024625 0ustar aricaric00000000000000#!/usr/bin/env python """Original NetworkX graph tests""" from nose.tools import * import networkx import networkx as nx from historical_tests import HistoricalTests class TestGraphHistorical(HistoricalTests): def setUp(self): HistoricalTests.setUp(self) self.G=nx.Graph networkx-1.11/networkx/classes/tests/test_digraph_historical.py0000644000175000017500000001045112637544500025132 0ustar aricaric00000000000000#!/usr/bin/env python """Original NetworkX graph tests""" from nose.tools import * import networkx import networkx as nx from historical_tests import HistoricalTests class TestDiGraphHistorical(HistoricalTests): def setUp(self): HistoricalTests.setUp(self) self.G=nx.DiGraph def test_in_degree(self): G=self.G() G.add_nodes_from('GJK') G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'), 
('B', 'C'), ('C', 'D')]) assert_equal(sorted(G.in_degree().values()),[0, 0, 0, 0, 1, 2, 2]) assert_equal(G.in_degree(), {'A': 0, 'C': 2, 'B': 1, 'D': 2, 'G': 0, 'K': 0, 'J': 0}) assert_equal(sorted([v for k,v in G.in_degree_iter()]), [0, 0, 0, 0, 1, 2, 2]) assert_equal(dict(G.in_degree_iter()), {'A': 0, 'C': 2, 'B': 1, 'D': 2, 'G': 0, 'K': 0, 'J': 0}) def test_out_degree(self): G=self.G() G.add_nodes_from('GJK') G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'), ('B', 'C'), ('C', 'D')]) assert_equal(sorted(G.out_degree().values()),[0, 0, 0, 0, 1, 2, 2]) assert_equal(G.out_degree(), {'A': 2, 'C': 1, 'B': 2, 'D': 0, 'G': 0, 'K': 0, 'J': 0}) assert_equal(sorted([v for k,v in G.in_degree_iter()]), [0, 0, 0, 0, 1, 2, 2]) assert_equal(dict(G.out_degree_iter()), {'A': 2, 'C': 1, 'B': 2, 'D': 0, 'G': 0, 'K': 0, 'J': 0}) def test_degree_digraph(self): H=nx.DiGraph() H.add_edges_from([(1,24),(1,2)]) assert_equal(sorted(H.in_degree([1,24]).values()),[0, 1]) assert_equal(sorted(H.out_degree([1,24]).values()),[0, 2]) assert_equal(sorted(H.degree([1,24]).values()),[1, 2]) def test_neighbors(self): G=self.G() G.add_nodes_from('GJK') G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'), ('B', 'C'), ('C', 'D')]) assert_equal(sorted(G.neighbors('C')),['D']) assert_equal(sorted(G['C']),['D']) assert_equal(sorted(G.neighbors('A')),['B', 'C']) assert_equal(sorted(G.neighbors_iter('A')),['B', 'C']) assert_equal(sorted(G.neighbors_iter('C')),['D']) assert_equal(sorted(G.neighbors('A')),['B', 'C']) assert_raises(nx.NetworkXError,G.neighbors,'j') assert_raises(nx.NetworkXError,G.neighbors_iter,'j') def test_successors(self): G=self.G() G.add_nodes_from('GJK') G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'), ('B', 'C'), ('C', 'D')]) assert_equal(sorted(G.successors('A')),['B', 'C']) assert_equal(sorted(G.successors_iter('A')),['B', 'C']) assert_equal(sorted(G.successors('G')),[]) assert_equal(sorted(G.successors('D')),[]) assert_equal(sorted(G.successors_iter('G')),[]) 
assert_raises(nx.NetworkXError,G.successors,'j') assert_raises(nx.NetworkXError,G.successors_iter,'j') def test_predecessors(self): G=self.G() G.add_nodes_from('GJK') G.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'D'), ('B', 'C'), ('C', 'D')]) assert_equal(sorted(G.predecessors('C')),['A', 'B']) assert_equal(sorted(G.predecessors_iter('C')),['A', 'B']) assert_equal(sorted(G.predecessors('G')),[]) assert_equal(sorted(G.predecessors('A')),[]) assert_equal(sorted(G.predecessors_iter('G')),[]) assert_equal(sorted(G.predecessors_iter('A')),[]) assert_equal(sorted(G.successors_iter('D')),[]) assert_raises(nx.NetworkXError,G.predecessors,'j') assert_raises(nx.NetworkXError,G.predecessors,'j') def test_reverse(self): G=nx.complete_graph(10) H=G.to_directed() HR=H.reverse() assert_true(nx.is_isomorphic(H,HR)) assert_equal(sorted(H.edges()),sorted(HR.edges())) def test_reverse2(self): H=nx.DiGraph() foo=[H.add_edge(u,u+1) for u in range(0,5)] HR=H.reverse() for u in range(0,5): assert_true(HR.has_edge(u+1,u)) def test_reverse3(self): H=nx.DiGraph() H.add_nodes_from([1,2,3,4]) HR=H.reverse() assert_equal(sorted(HR.nodes()),[1, 2, 3, 4]) networkx-1.11/networkx/classes/multidigraph.py0000644000175000017500000010123112637544500021560 0ustar aricaric00000000000000"""Base class for MultiDiGraph.""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from copy import deepcopy import networkx as nx from networkx.classes.graph import Graph # for doctests from networkx.classes.digraph import DiGraph from networkx.classes.multigraph import MultiGraph from networkx.exception import NetworkXError __author__ = """\n""".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult(dschult@colgate.edu)']) class MultiDiGraph(MultiGraph,DiGraph): """A directed graph class that can store multiedges. Multiedges are multiple edges between two nodes. Each edge can hold optional data or attributes. 
A MultiDiGraph holds directed edges. Self loops are allowed. Nodes can be arbitrary (hashable) Python objects with optional key/value attributes. Edges are represented as links between nodes with optional key/value attributes. Parameters ---------- data : input graph Data to initialize graph. If data=None (default) an empty graph is created. The data can be an edge list, or any NetworkX graph object. If the corresponding optional Python packages are installed the data can also be a NumPy matrix or 2d ndarray, a SciPy sparse matrix, or a PyGraphviz graph. attr : keyword arguments, optional (default= no attributes) Attributes to add to graph as key=value pairs. See Also -------- Graph DiGraph MultiGraph Examples -------- Create an empty graph structure (a "null graph") with no nodes and no edges. >>> G = nx.MultiDiGraph() G can be grown in several ways. **Nodes:** Add one node at a time: >>> G.add_node(1) Add the nodes from any container (a list, dict, set or even the lines from a file or the nodes from another graph). >>> G.add_nodes_from([2,3]) >>> G.add_nodes_from(range(100,110)) >>> H=nx.Graph() >>> H.add_path([0,1,2,3,4,5,6,7,8,9]) >>> G.add_nodes_from(H) In addition to strings and integers any hashable Python object (except None) can represent a node, e.g. a customized node object, or even another Graph. >>> G.add_node(H) **Edges:** G can also be grown by adding edges. Add one edge, >>> G.add_edge(1, 2) a list of edges, >>> G.add_edges_from([(1,2),(1,3)]) or a collection of edges, >>> G.add_edges_from(H.edges()) If some edges connect nodes not yet in the graph, the nodes are added automatically. If an edge already exists, an additional edge is created and stored using a key to identify the edge. By default the key is the lowest unused integer. 
>>> G.add_edges_from([(4,5,dict(route=282)), (4,5,dict(route=37))]) >>> G[4] {5: {0: {}, 1: {'route': 282}, 2: {'route': 37}}} **Attributes:** Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). By default these are empty, but can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named graph, node and edge respectively. >>> G = nx.MultiDiGraph(day="Friday") >>> G.graph {'day': 'Friday'} Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> del G.node[1]['room'] # remove attribute >>> G.nodes(data=True) [(1, {'time': '5pm'}), (3, {'time': '2pm'})] Warning: adding a node to G.node does not add it to the graph. Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2][0]['weight'] = 4.7 >>> G.edge[1][2][0]['weight'] = 4 **Shortcuts:** Many common graph features allow python syntax to speed reporting. >>> 1 in G # check if node in graph True >>> [n for n in G if n<3] # iterate through nodes [1, 2] >>> len(G) # number of nodes in graph 5 >>> G[1] # adjacency dict keyed by neighbor to edge attributes ... # Note: you should not change this dict manually! {2: {0: {'weight': 4}, 1: {'color': 'blue'}}} The fastest way to traverse all edges of a graph is via adjacency_iter(), but the edges() method is often more convenient. >>> for n,nbrsdict in G.adjacency_iter(): ... for nbr,keydict in nbrsdict.items(): ... for key,eattr in keydict.items(): ... if 'weight' in eattr: ... 
(n,nbr,eattr['weight']) (1, 2, 4) (2, 3, 8) >>> G.edges(data='weight') [(1, 2, 4), (1, 2, None), (2, 3, 8), (3, 4, None), (4, 5, None)] **Reporting:** Simple graph information is obtained using methods. Iterator versions of many reporting methods exist for efficiency. Methods exist for reporting nodes(), edges(), neighbors() and degree() as well as the number of nodes and edges. For details on these and other miscellaneous methods, see below. **Subclasses (Advanced):** The MultiDiGraph class uses a dict-of-dict-of-dict-of-dict structure. The outer dict (node_dict) holds adjacency lists keyed by node. The next dict (adjlist) represents the adjacency list and holds edge_key dicts keyed by neighbor. The edge_key dict holds each edge_attr dict keyed by edge key. The inner dict (edge_attr) represents the edge data and holds edge attribute values keyed by attribute names. Each of these four dicts in the dict-of-dict-of-dict-of-dict structure can be replaced by a user defined dict-like object. In general, the dict-like features should be maintained but extra features can be added. To replace one of the dicts create a new graph class by changing the class(!) variable holding the factory for that dict-like structure. The variable names are node_dict_factory, adjlist_dict_factory, edge_key_dict_factory and edge_attr_dict_factory. node_dict_factory : function, (default: dict) Factory function to be used to create the outer-most dict in the data structure that holds adjacency lists keyed by node. It should require no arguments and return a dict-like object. adjlist_dict_factory : function, (default: dict) Factory function to be used to create the adjacency list dict which holds multiedge key dicts keyed by neighbor. It should require no arguments and return a dict-like object. edge_key_dict_factory : function, (default: dict) Factory function to be used to create the edge key dict which holds edge data keyed by edge key. 
    It should require no arguments and return a dict-like object.

    edge_attr_dict_factory : function, (default: dict)
        Factory function to be used to create the edge attribute
        dict which holds attribute values keyed by attribute name.
        It should require no arguments and return a dict-like object.

    Examples
    --------
    Create a multigraph object that tracks the order nodes are added.

    >>> from collections import OrderedDict
    >>> class OrderedGraph(nx.MultiDiGraph):
    ...    node_dict_factory = OrderedDict
    >>> G = OrderedGraph()
    >>> G.add_nodes_from( (2,1) )
    >>> G.nodes()
    [2, 1]
    >>> G.add_edges_from( ((2,2), (2,1), (2,1), (1,1)) )
    >>> G.edges()
    [(2, 1), (2, 1), (2, 2), (1, 1)]

    Create a multidigraph object that tracks the order nodes are added
    and for each node tracks the order that neighbors are added and for
    each neighbor tracks the order that multiedges are added.

    >>> class OrderedGraph(nx.MultiDiGraph):
    ...    node_dict_factory = OrderedDict
    ...    adjlist_dict_factory = OrderedDict
    ...    edge_key_dict_factory = OrderedDict
    >>> G = OrderedGraph()
    >>> G.add_nodes_from( (2,1) )
    >>> G.nodes()
    [2, 1]
    >>> G.add_edges_from( ((2,2), (2,1,2,{'weight':0.1}), (2,1,1,{'weight':0.2}), (1,1)) )
    >>> G.edges(keys=True)
    [(2, 2, 0), (2, 1, 2), (2, 1, 1), (1, 1, 0)]
    """
    # node_dict_factory = dict    # already assigned in Graph
    # adjlist_dict_factory = dict
    edge_key_dict_factory = dict
    # edge_attr_dict_factory = dict

    def __init__(self, data=None, **attr):
        # copy the class-level factory onto the instance
        self.edge_key_dict_factory = self.edge_key_dict_factory
        DiGraph.__init__(self, data, **attr)

    def add_edge(self, u, v, key=None, attr_dict=None, **attr):
        """Add an edge between u and v.

        The nodes u and v will be automatically added if they are not
        already in the graph.

        Edge attributes can be specified with keywords or by providing
        a dictionary with key/value pairs.  See examples below.

        Parameters
        ----------
        u, v : nodes
            Nodes can be, for example, strings or numbers.
            Nodes must be hashable (and not None) Python objects.
key : hashable identifier, optional (default=lowest unused integer) Used to distinguish multiedges between a pair of nodes. attr_dict : dictionary, optional (default= no attributes) Dictionary of edge attributes. Key/value pairs will update existing data associated with the edge. attr : keyword arguments, optional Edge data (or labels or objects) can be assigned using keyword arguments. See Also -------- add_edges_from : add a collection of edges Notes ----- To replace/update edge data, use the optional key argument to identify a unique edge. Otherwise a new edge will be created. NetworkX algorithms designed for weighted graphs cannot use multigraphs directly because it is not clear how to handle multiedge weights. Convert to Graph using edge attribute 'weight' to enable weighted graph algorithms. Examples -------- The following all add the edge e=(1,2) to graph G: >>> G = nx.MultiDiGraph() >>> e = (1,2) >>> G.add_edge(1, 2) # explicit two-node form >>> G.add_edge(*e) # single edge as tuple of two nodes >>> G.add_edges_from( [(1,2)] ) # add edges from iterable container Associate data to edges using keywords: >>> G.add_edge(1, 2, weight=3) >>> G.add_edge(1, 2, key=0, weight=4) # update data for key=0 >>> G.add_edge(1, 3, weight=7, capacity=15, length=342.7) """ # set up attribute dict if attr_dict is None: attr_dict = attr else: try: attr_dict.update(attr) except AttributeError: raise NetworkXError( "The attr_dict argument must be a dictionary.") # add nodes if u not in self.succ: self.succ[u] = self.adjlist_dict_factory() self.pred[u] = self.adjlist_dict_factory() self.node[u] = {} if v not in self.succ: self.succ[v] = self.adjlist_dict_factory() self.pred[v] = self.adjlist_dict_factory() self.node[v] = {} if v in self.succ[u]: keydict = self.adj[u][v] if key is None: # find a unique integer key # other methods might be better here? 
                key = len(keydict)
                while key in keydict:
                    key += 1
            # edge data dict; the attribute-dict factory is the right one
            # here (both factories default to dict)
            datadict = keydict.get(key, self.edge_attr_dict_factory())
            datadict.update(attr_dict)
            keydict[key] = datadict
        else:
            # selfloops work this way without special treatment
            if key is None:
                key = 0
            datadict = self.edge_attr_dict_factory()
            datadict.update(attr_dict)
            keydict = self.edge_key_dict_factory()
            keydict[key] = datadict
            self.succ[u][v] = keydict
            self.pred[v][u] = keydict

    def remove_edge(self, u, v, key=None):
        """Remove an edge between u and v.

        Parameters
        ----------
        u, v : nodes
            Remove an edge between nodes u and v.
        key : hashable identifier, optional (default=None)
            Used to distinguish multiple edges between a pair of nodes.
            If None remove a single (arbitrary) edge between u and v.

        Raises
        ------
        NetworkXError
            If there is not an edge between u and v, or
            if there is no edge with the specified key.

        See Also
        --------
        remove_edges_from : remove a collection of edges

        Examples
        --------
        >>> G = nx.MultiDiGraph()
        >>> G.add_path([0,1,2,3])
        >>> G.remove_edge(0,1)
        >>> e = (1,2)
        >>> G.remove_edge(*e) # unpacks e from an edge tuple

        For multiple edges

        >>> G = nx.MultiDiGraph()
        >>> G.add_edges_from([(1,2),(1,2),(1,2)])
        >>> G.remove_edge(1,2) # remove a single (arbitrary) edge

        For edges with keys

        >>> G = nx.MultiDiGraph()
        >>> G.add_edge(1,2,key='first')
        >>> G.add_edge(1,2,key='second')
        >>> G.remove_edge(1,2,key='second')
        """
        try:
            d = self.adj[u][v]
        except KeyError:
            raise NetworkXError(
                "The edge %s-%s is not in the graph." % (u, v))
        # remove the edge with specified data
        if key is None:
            d.popitem()
        else:
            try:
                del d[key]
            except KeyError:
                raise NetworkXError(
                    "The edge %s-%s with key %s is not in the graph."
                    % (u, v, key))
        if len(d) == 0:
            # remove the key entries if last edge
            del self.succ[u][v]
            del self.pred[v][u]

    def edges_iter(self, nbunch=None, data=False, keys=False, default=None):
        """Return an iterator over the edges.

        Edges are returned as tuples with optional data and keys
        in the order (node, neighbor, key, data).
Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : string or bool, optional (default=False) The edge attribute returned in 3-tuple (u,v,ddict[data]). If True, return edge attribute dict in 3-tuple (u,v,ddict). If False, return 2-tuple (u,v). keys : bool, optional (default=False) If True, return edge keys with each edge. default : value, optional (default=None) Value used for edges that dont have the requested attribute. Only relevant if data is not True or False. Returns ------- edge_iter : iterator An iterator of (u,v), (u,v,d) or (u,v,key,d) tuples of edges. See Also -------- edges : return a list of edges Notes ----- Nodes in nbunch that are not in the graph will be (quietly) ignored. For directed graphs this returns the out-edges. Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2]) >>> G.add_edge(2,3,weight=5) >>> [e for e in G.edges_iter()] [(0, 1), (1, 2), (2, 3)] >>> list(G.edges_iter(data=True)) # default data is {} (empty dict) [(0, 1, {}), (1, 2, {}), (2, 3, {'weight': 5})] >>> list(G.edges_iter(data='weight', default=1)) [(0, 1, 1), (1, 2, 1), (2, 3, 5)] >>> list(G.edges(keys=True)) # default keys are integers [(0, 1, 0), (1, 2, 0), (2, 3, 0)] >>> list(G.edges(data=True,keys=True)) # default keys are integers [(0, 1, 0, {}), (1, 2, 0, {}), (2, 3, 0, {'weight': 5})] >>> list(G.edges(data='weight',default=1,keys=True)) [(0, 1, 0, 1), (1, 2, 0, 1), (2, 3, 0, 5)] >>> list(G.edges_iter([0,2])) [(0, 1), (2, 3)] >>> list(G.edges_iter(0)) [(0, 1)] """ if nbunch is None: nodes_nbrs = self.adj.items() else: nodes_nbrs = ((n, self.adj[n]) for n in self.nbunch_iter(nbunch)) if data is True: for n, nbrs in nodes_nbrs: for nbr, keydict in nbrs.items(): for key, ddict in keydict.items(): yield (n, nbr, key, ddict) if keys else (n, nbr, ddict) elif data is not False: for n, nbrs in nodes_nbrs: for nbr, keydict in nbrs.items(): for key, ddict in 
keydict.items():
                        d = ddict[data] if data in ddict else default
                        yield (n, nbr, key, d) if keys else (n, nbr, d)
        else:
            for n, nbrs in nodes_nbrs:
                for nbr, keydict in nbrs.items():
                    for key in keydict:
                        yield (n, nbr, key) if keys else (n, nbr)

    # alias out_edges to edges
    out_edges_iter = edges_iter

    def out_edges(self, nbunch=None, keys=False, data=False):
        """Return a list of the outgoing edges.

        Edges are returned as tuples with optional data and keys
        in the order (node, neighbor, key, data).

        Parameters
        ----------
        nbunch : iterable container, optional (default= all nodes)
            A container of nodes.  The container will be iterated
            through once.
        data : bool, optional (default=False)
            If True, return edge attribute dict with each edge.
        keys : bool, optional (default=False)
            If True, return edge keys with each edge.

        Returns
        -------
        out_edges : list
            A list of (u,v), (u,v,d) or (u,v,key,d) tuples of edges.

        Notes
        -----
        Nodes in nbunch that are not in the graph will be (quietly) ignored.
        For directed graphs edges() is the same as out_edges().

        See Also
        --------
        in_edges: return a list of incoming edges
        """
        return list(self.out_edges_iter(nbunch, keys=keys, data=data))

    def in_edges_iter(self, nbunch=None, data=False, keys=False):
        """Return an iterator over the incoming edges.

        Parameters
        ----------
        nbunch : iterable container, optional (default= all nodes)
            A container of nodes.  The container will be iterated
            through once.
        data : bool, optional (default=False)
            If True, return edge attribute dict with each edge.
        keys : bool, optional (default=False)
            If True, return edge keys with each edge.

        Returns
        -------
        in_edge_iter : iterator
            An iterator of (u,v), (u,v,d) or (u,v,key,d) tuples of edges.
See Also -------- edges_iter : return an iterator of edges """ if nbunch is None: nodes_nbrs = self.pred.items() else: nodes_nbrs = ((n, self.pred[n]) for n in self.nbunch_iter(nbunch)) if data: for n, nbrs in nodes_nbrs: for nbr, keydict in nbrs.items(): for key, data in keydict.items(): if keys: yield (nbr, n, key, data) else: yield (nbr, n, data) else: for n, nbrs in nodes_nbrs: for nbr, keydict in nbrs.items(): for key, data in keydict.items(): if keys: yield (nbr, n, key) else: yield (nbr, n) def in_edges(self, nbunch=None, keys=False, data=False): """Return a list of the incoming edges. Parameters ---------- nbunch : iterable container, optional (default= all nodes) A container of nodes. The container will be iterated through once. data : bool, optional (default=False) If True, return edge attribute dict with each edge. keys : bool, optional (default=False) If True, return edge keys with each edge. Returns ------- in_edges : list A list of (u,v), (u,v,d) or (u,v,key,d) tuples of edges. See Also -------- out_edges: return a list of outgoing edges """ return list(self.in_edges_iter(nbunch, keys=keys, data=data)) def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree). The node degree is the number of edges adjacent to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). 
See Also -------- degree Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs = ( (n, succ, self.pred[n]) for n,succ in self.succ.items() ) else: nodes_nbrs = ( (n, self.succ[n], self.pred[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n, succ, pred in nodes_nbrs: indeg = sum([len(data) for data in pred.values()]) outdeg = sum([len(data) for data in succ.values()]) yield (n, indeg + outdeg) else: # edge weighted graph - degree is sum of nbr edge weights for n, succ, pred in nodes_nbrs: deg = sum([d.get(weight, 1) for data in pred.values() for d in data.values()]) deg += sum([d.get(weight, 1) for data in succ.values() for d in data.values()]) yield (n, deg) def in_degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, in-degree). The node in-degree is the number of edges pointing in to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, in-degree). 
See Also -------- degree, in_degree, out_degree, out_degree_iter Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.in_degree_iter(0)) # node 0 with degree 0 [(0, 0)] >>> list(G.in_degree_iter([0,1])) [(0, 0), (1, 1)] """ if nbunch is None: nodes_nbrs = self.pred.items() else: nodes_nbrs = ((n, self.pred[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n, nbrs in nodes_nbrs: yield (n, sum([len(data) for data in nbrs.values()])) else: # edge weighted graph - degree is sum of nbr edge weights for n, pred in nodes_nbrs: deg = sum([d.get(weight, 1) for data in pred.values() for d in data.values()]) yield (n, deg) def out_degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, out-degree). The node out-degree is the number of edges pointing out of the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, out-degree). 
See Also -------- degree, in_degree, out_degree, in_degree_iter Examples -------- >>> G = nx.MultiDiGraph() >>> G.add_path([0,1,2,3]) >>> list(G.out_degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.out_degree_iter([0,1])) [(0, 1), (1, 1)] """ if nbunch is None: nodes_nbrs = self.succ.items() else: nodes_nbrs = ((n, self.succ[n]) for n in self.nbunch_iter(nbunch)) if weight is None: for n, nbrs in nodes_nbrs: yield (n, sum([len(data) for data in nbrs.values()])) else: for n, succ in nodes_nbrs: deg = sum([d.get(weight, 1) for data in succ.values() for d in data.values()]) yield (n, deg) def is_multigraph(self): """Return True if graph is a multigraph, False otherwise.""" return True def is_directed(self): """Return True if graph is directed, False otherwise.""" return True def to_directed(self): """Return a directed copy of the graph. Returns ------- G : MultiDiGraph A deepcopy of the graph. Notes ----- If edges in both directions (u,v) and (v,u) exist in the graph, attributes for the new undirected edge will be a combination of the attributes of the directed edges. The edge data is updated in the (arbitrary) order that the edges are encountered. For more customized control of the edge attributes use add_edge(). This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar G=DiGraph(D) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. Examples -------- >>> G = nx.Graph() # or MultiGraph, etc >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1), (1, 0)] If already directed, return a (deep) copy >>> G = nx.MultiDiGraph() >>> G.add_path([0,1]) >>> H = G.to_directed() >>> H.edges() [(0, 1)] """ return deepcopy(self) def to_undirected(self, reciprocal=False): """Return an undirected representation of the digraph. 
Parameters ---------- reciprocal : bool (optional) If True only keep edges that appear in both directions in the original digraph. Returns ------- G : MultiGraph An undirected graph with the same name and nodes and with edge (u,v,data) if either (u,v,data) or (v,u,data) is in the digraph. If both edges exist in digraph and their edge data is different, only one edge is created with an arbitrary choice of which edge data to use. You must check and correct for this manually if desired. Notes ----- This returns a "deepcopy" of the edge, node, and graph attributes which attempts to completely copy all of the data and references. This is in contrast to the similar D=DiGraph(G) which returns a shallow copy of the data. See the Python copy module for more information on shallow and deep copies, http://docs.python.org/library/copy.html. Warning: If you have subclassed MultiGraph to use dict-like objects in the data structure, those changes do not transfer to the MultiDiGraph created by this method. """ H = MultiGraph() H.name = self.name H.add_nodes_from(self) if reciprocal is True: H.add_edges_from((u, v, key, deepcopy(data)) for u, nbrs in self.adjacency_iter() for v, keydict in nbrs.items() for key, data in keydict.items() if self.has_edge(v, u, key)) else: H.add_edges_from((u, v, key, deepcopy(data)) for u, nbrs in self.adjacency_iter() for v, keydict in nbrs.items() for key, data in keydict.items()) H.graph = deepcopy(self.graph) H.node = deepcopy(self.node) return H def subgraph(self, nbunch): """Return the subgraph induced on nodes in nbunch. The induced subgraph of the graph contains the nodes in nbunch and the edges between those nodes. Parameters ---------- nbunch : list, iterable A container of nodes which will be iterated through once. Returns ------- G : Graph A subgraph of the graph with the same edge attributes. Notes ----- The graph, edge or node attributes just point to the original graph. 
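The `reciprocal=True` branch of `to_undirected` above keeps an edge `(u, v, key)` only when the opposite edge `(v, u, key)` is also present. A sketch of that filter on a plain dict-of-dict-of-dict adjacency (node -> neighbor -> key -> data); the graph here is illustrative:

```python
# Sketch of the reciprocal filter used by to_undirected(reciprocal=True).
adj = {
    'a': {'b': {0: {}}},                 # a->b with key 0
    'b': {'a': {0: {}}, 'c': {0: {}}},   # b->a (reciprocal of a->b) and b->c
    'c': {},
}

def has_edge(adj, u, v, key):
    return key in adj.get(u, {}).get(v, {})

# keep (u, v, key) only when the opposite edge (v, u, key) also exists
reciprocal = [(u, v, k) for u, nbrs in adj.items()
              for v, keydict in nbrs.items()
              for k in keydict
              if has_edge(adj, v, u, k)]
print(sorted(reciprocal))  # [('a', 'b', 0), ('b', 'a', 0)]
```

The one-directional edge `b->c` is dropped, matching the documented behavior that only edges appearing in both directions survive.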
So changes to the node or edge structure will not be reflected in the original graph while changes to the attributes will. To create a subgraph with its own copy of the edge/node attributes use: nx.Graph(G.subgraph(nbunch)) If edge attributes are containers, a deep copy can be obtained using: G.subgraph(nbunch).copy() For an inplace reduction of a graph to a subgraph you can remove nodes: G.remove_nodes_from([n for n in G if n not in set(nbunch)]) Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> H = G.subgraph([0,1,2]) >>> H.edges() [(0, 1), (1, 2)] """ bunch = self.nbunch_iter(nbunch) # create new graph and copy subgraph into it H = self.__class__() # copy node and attribute dictionaries for n in bunch: H.node[n] = self.node[n] # namespace shortcuts for speed H_succ = H.succ H_pred = H.pred self_succ = self.succ self_pred = self.pred # add nodes for n in H: H_succ[n] = H.adjlist_dict_factory() H_pred[n] = H.adjlist_dict_factory() # add edges for u in H_succ: Hnbrs = H_succ[u] for v, edgedict in self_succ[u].items(): if v in H_succ: # add both representations of edge: u-v and v-u # they share the same edgedict ed = edgedict.copy() Hnbrs[v] = ed H_pred[v][u] = ed H.graph = self.graph return H def reverse(self, copy=True): """Return the reverse of the graph. The reverse is a graph with the same nodes and edges but with the directions of the edges reversed. Parameters ---------- copy : bool, optional (default=True) If True, return a new DiGraph holding the reversed edges. If False, the graph is reversed in place (this changes the original graph).
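The `subgraph` method above keeps a node when it is in `nbunch` and keeps an edge only when both endpoints survive. A stripped-down sketch of that induced-subgraph construction on a plain dict-of-dicts adjacency (illustrative names, not networkx API):

```python
# Sketch of the induced-subgraph construction used by subgraph() above,
# on a plain dict-of-dicts successor map: node -> nbr -> data dict.
succ = {0: {1: {}}, 1: {2: {}}, 2: {3: {}}, 3: {}}

def induced_subgraph(succ, nbunch):
    keep = set(nbunch) & set(succ)
    # keep an edge only when both endpoints survive
    return {u: {v: d for v, d in succ[u].items() if v in keep} for u in keep}

H = induced_subgraph(succ, [0, 1, 2])
print(sorted((u, v) for u, nbrs in H.items() for v in nbrs))  # [(0, 1), (1, 2)]
```

As in the docstring's example, the path 0-1-2-3 restricted to [0, 1, 2] keeps edges (0, 1) and (1, 2) and drops (2, 3).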
""" if copy: H = self.__class__(name="Reverse of (%s)"%self.name) H.add_nodes_from(self) H.add_edges_from((v, u, k, deepcopy(d)) for u, v, k, d in self.edges(keys=True, data=True)) H.graph = deepcopy(self.graph) H.node = deepcopy(self.node) else: self.pred, self.succ = self.succ, self.pred self.adj = self.succ H = self return H networkx-1.11/networkx/classes/ordered.py0000644000175000017500000000202112637544450020514 0ustar aricaric00000000000000""" OrderedDict variants of the default base classes. """ from collections import OrderedDict from .graph import Graph from .multigraph import MultiGraph from .digraph import DiGraph from .multidigraph import MultiDiGraph __all__ = [] __all__.extend([ 'OrderedGraph', 'OrderedDiGraph', 'OrderedMultiGraph', 'OrderedMultiDiGraph', ]) class OrderedGraph(Graph): node_dict_factory = OrderedDict adjlist_dict_factory = OrderedDict edge_attr_dict_factory = OrderedDict class OrderedDiGraph(DiGraph): node_dict_factory = OrderedDict adjlist_dict_factory = OrderedDict edge_attr_dict_factory = OrderedDict class OrderedMultiGraph(MultiGraph): node_dict_factory = OrderedDict adjlist_dict_factory = OrderedDict edge_key_dict_factory = OrderedDict edge_attr_dict_factory = OrderedDict class OrderedMultiDiGraph(MultiDiGraph): node_dict_factory = OrderedDict adjlist_dict_factory = OrderedDict edge_key_dict_factory = OrderedDict edge_attr_dict_factory = OrderedDict networkx-1.11/networkx/tests/0000755000175000017500000000000012653231454016223 5ustar aricaric00000000000000networkx-1.11/networkx/tests/benchmark.py0000644000175000017500000001647012637544500020540 0ustar aricaric00000000000000from timeit import Timer # This is gratefully modeled after the benchmarks found in # the numpy svn repository. http://svn.scipy.org/svn/numpy/trunk class Benchmark(object): """ Benchmark a method or simple bit of code using different Graph classes. 
If the test code is the same for each graph class, then you can set it during instantiation through the argument test_string. The argument test_string can also be a tuple of test code and setup code. The code is entered as a string valid for use with the timeit module. Example: >>> b=Benchmark(['Graph','DiGraph']) >>> b['Graph']=('G.add_nodes_from(nlist)','nlist=range(100)') >>> b.run() """ def __init__(self,graph_classes,title='',test_string=None,runs=3,reps=1000): self.runs = runs self.reps = reps self.title = title self.class_tests = dict((gc,'') for gc in graph_classes) # set up the test string if it is the same for all classes. if test_string is not None: if isinstance(test_string,tuple): self['all']=test_string else: self['all']=(test_string,'') def __setitem__(self,graph_class,some_strs): """ Set a simple bit of code and setup string for the test. Use this for cases where the code differs from one class to another. """ test_str, setup_str = some_strs if graph_class == 'all': graph_class = self.class_tests.keys() elif not isinstance(graph_class,list): graph_class = [graph_class] for GC in graph_class: # setup_string='import networkx as NX\nG=NX.%s.%s()\n'%(GC.lower(),GC) \ # + setup_str setup_string='import networkx as NX\nG=NX.%s()\n'%(GC,) \ + setup_str self.class_tests[GC] = Timer(test_str, setup_string) def run(self): """Run the benchmark for each class and print results.""" column_len = max(len(G) for G in self.class_tests) print('='*72) if self.title: print("%s: %s runs, %s reps"% (self.title,self.runs,self.reps)) print('='*72) times=[] for GC,timer in self.class_tests.items(): name = GC.ljust(column_len) try: # t=sum(timer.repeat(self.runs,self.reps))/self.runs t=min(timer.repeat(self.runs,self.reps)) # print "%s: %s" % (name, timer.repeat(self.runs,self.reps)) times.append((t,name)) except Exception as e: print("%s: Failed to benchmark (%s)."
% (name,e)) times.sort() tmin=times[0][0] for t,name in times: print("%s: %5.2f %s" % (name, t/tmin*100.,t)) print('-'*72) print() if __name__ == "__main__": # set up for all routines: classes=['Graph','MultiGraph','DiGraph','MultiDiGraph'] # classes=['Graph','MultiGraph','DiGraph','MultiDiGraph', # 'SpecialGraph','SpecialDiGraph','SpecialMultiGraph','SpecialMultiDiGraph'] # classes=['Graph','SpecialGraph'] all_tests=['add_nodes','add_edges','remove_nodes','remove_edges',\ 'neighbors','edges','degree','dijkstra','shortest path',\ 'subgraph','edgedata_subgraph','laplacian'] # Choose which tests to run tests=all_tests # tests=['edges','laplacian'] #tests=all_tests[-1:] N=100 if 'add_nodes' in tests: title='Benchmark: Adding nodes' test_string=('G.add_nodes_from(nlist)','nlist=range(%i)'%N) b=Benchmark(classes,title,test_string,runs=3,reps=1000) b.run() if 'add_edges' in tests: title='Benchmark: Adding edges' setup='elist=[(i,i+3) for i in range(%s-3)]\nG.add_nodes_from(range(%i))'%(N,N) test_string=('G.add_edges_from(elist)',setup) b=Benchmark(classes,title,test_string,runs=3,reps=1000) b.run() if 'remove_nodes' in tests: title='Benchmark: Adding and Deleting nodes' setup='nlist=range(%i)'%N test_string=('G.add_nodes_from(nlist)\nG.remove_nodes_from(nlist)',setup) b=Benchmark(classes,title,test_string,runs=3,reps=1000) b.run() if 'remove_edges' in tests: title='Benchmark: Adding and Deleting edges' setup='elist=[(i,i+3) for i in range(%s-3)]'%N test_string=('G.add_edges_from(elist)\nG.remove_edges_from(elist)',setup) b=Benchmark(classes,title,test_string,runs=3,reps=1000) b.run() if 'neighbors' in tests: N=500 p=0.3 title='Benchmark: reporting neighbors' setup='H=NX.binomial_graph(%s,%s)\nfor (u,v) in H.edges_iter():\n G.add_edges_from([(u,v),(v,u)])'%(N,p) test_string=('for n in G:\n for nbr in G.neighbors(n):\n pass',setup) b=Benchmark(classes,title,test_string,runs=3,reps=10) b.run() if 'edges' in tests: N=500 p=0.3 title='Benchmark: reporting edges' 
setup='H=NX.binomial_graph(%s,%s)\nfor (u,v) in H.edges_iter():\n G.add_edges_from([(u,v),(v,u)])'%(N,p) test_string=('for n in G:\n for e in G.edges(n):\n pass',setup) b=Benchmark(classes,title,test_string,runs=3,reps=10) b.run() if 'degree' in tests: N=500 p=0.3 title='Benchmark: reporting degree' setup='H=NX.binomial_graph(%s,%s)\nfor (u,v) in H.edges_iter():\n G.add_edges_from([(u,v),(v,u)])'%(N,p) test_string=('for d in G.degree():\n pass',setup) b=Benchmark(classes,title,test_string,runs=3,reps=10) b.run() if 'dijkstra' in tests: N=500 p=0.3 title='dijkstra single source shortest path' setup='i=6\nH=NX.binomial_graph(%s,%s)\nfor (u,v) in H.edges_iter():\n G.add_edges_from([(u,v),(v,u)])'%(N,p) test_string=('p=NX.single_source_dijkstra(G,i)',setup) b=Benchmark(classes,title,test_string,runs=3,reps=10) b.run() if 'shortest path' in tests: N=500 p=0.3 title='single source shortest path' setup='i=6\nH=NX.binomial_graph(%s,%s)\nfor (u,v) in H.edges_iter():\n G.add_edges_from([(u,v),(v,u)])'%(N,p) test_string=('p=NX.single_source_shortest_path(G,i)',setup) b=Benchmark(classes,title,test_string,runs=3,reps=10) b.run() if 'subgraph' in tests: N=500 p=0.3 title='subgraph method' setup='nlist=range(100,150)\nH=NX.binomial_graph(%s,%s)\nfor (u,v) in H.edges_iter():\n G.add_edges_from([(u,v),(v,u)])'%(N,p) test_string=('G.subgraph(nlist)',setup) b=Benchmark(classes,title,test_string,runs=3,reps=10) b.run() if 'edgedata_subgraph' in tests: N=500 p=0.3 title='subgraph method with edge data present' setup='nlist=range(100,150)\nH=NX.binomial_graph(%s,%s)\nfor (u,v) in H.edges_iter():\n G.add_edges_from([(u,v,dict(hi=2)),(v,u,dict(hi=2))])'%(N,p) test_string=('G.subgraph(nlist)',setup) b=Benchmark(classes,title,test_string,runs=3,reps=10) b.run() if 'laplacian' in tests: N=500 p=0.3 title='creation of laplacian matrix' setup='H=NX.binomial_graph(%s,%s)\nfor (u,v) in H.edges_iter():\n G.add_edges_from([(u,v),(v,u)])'%(N,p) test_string=('NX.laplacian_matrix(G)',setup) 
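`Benchmark.run()` above times each class with `timeit.Timer` and reports the minimum over several runs rather than the mean. A stripped-down sketch of that min-of-repeats pattern with stdlib-only test and setup strings (the strings here are illustrative, not taken from the benchmark):

```python
# Stripped-down sketch of the min-of-repeats timing pattern used by
# Benchmark.run() above; test/setup strings here are illustrative.
from timeit import Timer

runs, reps = 3, 1000
timer = Timer('sum(nlist)', 'nlist = list(range(100))')
# taking the minimum over runs filters out scheduling noise, as in run()
best = min(timer.repeat(runs, reps))
assert best > 0
print('best of %d runs: %.6f s for %d reps' % (runs, best, reps))
```

Using the minimum follows the usual timeit advice: higher measurements are caused by interference from other processes, so the fastest run is the best estimate of the code's intrinsic cost.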
b=Benchmark(classes,title,test_string,runs=3,reps=1) b.run() networkx-1.11/networkx/tests/test_convert.py0000644000175000017500000002271712637544500021326 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from networkx import * from networkx.convert import * from networkx.algorithms.operators import * from networkx.generators.classic import barbell_graph,cycle_graph class TestConvert(): def edgelists_equal(self,e1,e2): return sorted(sorted(e) for e in e1)==sorted(sorted(e) for e in e2) def test_simple_graphs(self): for dest, source in [(to_dict_of_dicts, from_dict_of_dicts), (to_dict_of_lists, from_dict_of_lists)]: G=barbell_graph(10,3) dod=dest(G) # Dict of [dicts, lists] GG=source(dod) assert_equal(sorted(G.nodes()), sorted(GG.nodes())) assert_equal(sorted(G.edges()), sorted(GG.edges())) GW=to_networkx_graph(dod) assert_equal(sorted(G.nodes()), sorted(GW.nodes())) assert_equal(sorted(G.edges()), sorted(GW.edges())) GI=Graph(dod) assert_equal(sorted(G.nodes()), sorted(GI.nodes())) assert_equal(sorted(G.edges()), sorted(GI.edges())) # With nodelist keyword P4=path_graph(4) P3=path_graph(3) dod=dest(P4,nodelist=[0,1,2]) Gdod=Graph(dod) assert_equal(sorted(Gdod.nodes()), sorted(P3.nodes())) assert_equal(sorted(Gdod.edges()), sorted(P3.edges())) def test_digraphs(self): for dest, source in [(to_dict_of_dicts, from_dict_of_dicts), (to_dict_of_lists, from_dict_of_lists)]: G=cycle_graph(10) # Dict of [dicts, lists] dod=dest(G) GG=source(dod) assert_equal(sorted(G.nodes()), sorted(GG.nodes())) assert_equal(sorted(G.edges()), sorted(GG.edges())) GW=to_networkx_graph(dod) assert_equal(sorted(G.nodes()), sorted(GW.nodes())) assert_equal(sorted(G.edges()), sorted(GW.edges())) GI=Graph(dod) assert_equal(sorted(G.nodes()), sorted(GI.nodes())) assert_equal(sorted(G.edges()), sorted(GI.edges())) G=cycle_graph(10,create_using=DiGraph()) dod=dest(G) GG=source(dod, create_using=DiGraph()) assert_equal(sorted(G.nodes()), sorted(GG.nodes())) 
assert_equal(sorted(G.edges()), sorted(GG.edges())) GW=to_networkx_graph(dod, create_using=DiGraph()) assert_equal(sorted(G.nodes()), sorted(GW.nodes())) assert_equal(sorted(G.edges()), sorted(GW.edges())) GI=DiGraph(dod) assert_equal(sorted(G.nodes()), sorted(GI.nodes())) assert_equal(sorted(G.edges()), sorted(GI.edges())) def test_graph(self): G=cycle_graph(10) e=G.edges() source=[u for u,v in e] dest=[v for u,v in e] ex=zip(source,dest,source) G=Graph() G.add_weighted_edges_from(ex) # Dict of dicts dod=to_dict_of_dicts(G) GG=from_dict_of_dicts(dod,create_using=Graph()) assert_equal(sorted(G.nodes()), sorted(GG.nodes())) assert_equal(sorted(G.edges()), sorted(GG.edges())) GW=to_networkx_graph(dod,create_using=Graph()) assert_equal(sorted(G.nodes()), sorted(GW.nodes())) assert_equal(sorted(G.edges()), sorted(GW.edges())) GI=Graph(dod) assert_equal(sorted(G.nodes()), sorted(GI.nodes())) assert_equal(sorted(G.edges()), sorted(GI.edges())) # Dict of lists dol=to_dict_of_lists(G) GG=from_dict_of_lists(dol,create_using=Graph()) # dict of lists throws away edge data so set it to none enone=[(u,v,{}) for (u,v,d) in G.edges(data=True)] assert_equal(sorted(G.nodes()), sorted(GG.nodes())) assert_equal(enone, sorted(GG.edges(data=True))) GW=to_networkx_graph(dol,create_using=Graph()) assert_equal(sorted(G.nodes()), sorted(GW.nodes())) assert_equal(enone, sorted(GW.edges(data=True))) GI=Graph(dol) assert_equal(sorted(G.nodes()), sorted(GI.nodes())) assert_equal(enone, sorted(GI.edges(data=True))) def test_with_multiedges_self_loops(self): G=cycle_graph(10) e=G.edges() source,dest = list(zip(*e)) ex=list(zip(source,dest,source)) XG=Graph() XG.add_weighted_edges_from(ex) XGM=MultiGraph() XGM.add_weighted_edges_from(ex) XGM.add_edge(0,1,weight=2) # multiedge XGS=Graph() XGS.add_weighted_edges_from(ex) XGS.add_edge(0,0,weight=100) # self loop # Dict of dicts # with self loops, OK dod=to_dict_of_dicts(XGS) GG=from_dict_of_dicts(dod,create_using=Graph()) 
assert_equal(sorted(XGS.nodes()), sorted(GG.nodes())) assert_equal(sorted(XGS.edges()), sorted(GG.edges())) GW=to_networkx_graph(dod,create_using=Graph()) assert_equal(sorted(XGS.nodes()), sorted(GW.nodes())) assert_equal(sorted(XGS.edges()), sorted(GW.edges())) GI=Graph(dod) assert_equal(sorted(XGS.nodes()), sorted(GI.nodes())) assert_equal(sorted(XGS.edges()), sorted(GI.edges())) # Dict of lists # with self loops, OK dol=to_dict_of_lists(XGS) GG=from_dict_of_lists(dol,create_using=Graph()) # dict of lists throws away edge data so set it to none enone=[(u,v,{}) for (u,v,d) in XGS.edges(data=True)] assert_equal(sorted(XGS.nodes()), sorted(GG.nodes())) assert_equal(enone, sorted(GG.edges(data=True))) GW=to_networkx_graph(dol,create_using=Graph()) assert_equal(sorted(XGS.nodes()), sorted(GW.nodes())) assert_equal(enone, sorted(GW.edges(data=True))) GI=Graph(dol) assert_equal(sorted(XGS.nodes()), sorted(GI.nodes())) assert_equal(enone, sorted(GI.edges(data=True))) # Dict of dicts # with multiedges, OK dod=to_dict_of_dicts(XGM) GG=from_dict_of_dicts(dod,create_using=MultiGraph(), multigraph_input=True) assert_equal(sorted(XGM.nodes()), sorted(GG.nodes())) assert_equal(sorted(XGM.edges()), sorted(GG.edges())) GW=to_networkx_graph(dod,create_using=MultiGraph(),multigraph_input=True) assert_equal(sorted(XGM.nodes()), sorted(GW.nodes())) assert_equal(sorted(XGM.edges()), sorted(GW.edges())) GI=MultiGraph(dod) # convert can't tell whether to duplicate edges! 
assert_equal(sorted(XGM.nodes()), sorted(GI.nodes())) #assert_not_equal(sorted(XGM.edges()), sorted(GI.edges())) assert_false(sorted(XGM.edges()) == sorted(GI.edges())) GE=from_dict_of_dicts(dod,create_using=MultiGraph(), multigraph_input=False) assert_equal(sorted(XGM.nodes()), sorted(GE.nodes())) assert_not_equal(sorted(XGM.edges()), sorted(GE.edges())) GI=MultiGraph(XGM) assert_equal(sorted(XGM.nodes()), sorted(GI.nodes())) assert_equal(sorted(XGM.edges()), sorted(GI.edges())) GM=MultiGraph(G) assert_equal(sorted(GM.nodes()), sorted(G.nodes())) assert_equal(sorted(GM.edges()), sorted(G.edges())) # Dict of lists # with multiedges, OK, but better write as DiGraph else you'll # get double edges dol=to_dict_of_lists(G) GG=from_dict_of_lists(dol,create_using=MultiGraph()) assert_equal(sorted(G.nodes()), sorted(GG.nodes())) assert_equal(sorted(G.edges()), sorted(GG.edges())) GW=to_networkx_graph(dol,create_using=MultiGraph()) assert_equal(sorted(G.nodes()), sorted(GW.nodes())) assert_equal(sorted(G.edges()), sorted(GW.edges())) GI=MultiGraph(dol) assert_equal(sorted(G.nodes()), sorted(GI.nodes())) assert_equal(sorted(G.edges()), sorted(GI.edges())) def test_edgelists(self): P=path_graph(4) e=[(0,1),(1,2),(2,3)] G=Graph(e) assert_equal(sorted(G.nodes()), sorted(P.nodes())) assert_equal(sorted(G.edges()), sorted(P.edges())) assert_equal(sorted(G.edges(data=True)), sorted(P.edges(data=True))) e=[(0,1,{}),(1,2,{}),(2,3,{})] G=Graph(e) assert_equal(sorted(G.nodes()), sorted(P.nodes())) assert_equal(sorted(G.edges()), sorted(P.edges())) assert_equal(sorted(G.edges(data=True)), sorted(P.edges(data=True))) e=((n,n+1) for n in range(3)) G=Graph(e) assert_equal(sorted(G.nodes()), sorted(P.nodes())) assert_equal(sorted(G.edges()), sorted(P.edges())) assert_equal(sorted(G.edges(data=True)), sorted(P.edges(data=True))) def test_directed_to_undirected(self): edges1 = [(0, 1), (1, 2), (2, 0)] edges2 = [(0, 1), (1, 2), (0, 2)] 
assert_true(self.edgelists_equal(nx.Graph(nx.DiGraph(edges1)).edges(),edges1)) assert_true(self.edgelists_equal(nx.Graph(nx.DiGraph(edges2)).edges(),edges1)) assert_true(self.edgelists_equal(nx.MultiGraph(nx.DiGraph(edges1)).edges(),edges1)) assert_true(self.edgelists_equal(nx.MultiGraph(nx.DiGraph(edges2)).edges(),edges1)) assert_true(self.edgelists_equal(nx.MultiGraph(nx.MultiDiGraph(edges1)).edges(), edges1)) assert_true(self.edgelists_equal(nx.MultiGraph(nx.MultiDiGraph(edges2)).edges(), edges1)) assert_true(self.edgelists_equal(nx.Graph(nx.MultiDiGraph(edges1)).edges(),edges1)) assert_true(self.edgelists_equal(nx.Graph(nx.MultiDiGraph(edges2)).edges(),edges1)) networkx-1.11/networkx/tests/test_relabel.py0000644000175000017500000001527212637544500021252 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from networkx import * from networkx.convert import * from networkx.algorithms.operators import * from networkx.generators.classic import barbell_graph,cycle_graph from networkx.testing import * class TestRelabel(): def test_convert_node_labels_to_integers(self): # test that empty graph converts fine for all options G=empty_graph() H=convert_node_labels_to_integers(G,100) assert_equal(H.name, '(empty_graph(0))_with_int_labels') assert_equal(H.nodes(), []) assert_equal(H.edges(), []) for opt in ["default", "sorted", "increasing degree", "decreasing degree"]: G=empty_graph() H=convert_node_labels_to_integers(G,100, ordering=opt) assert_equal(H.name, '(empty_graph(0))_with_int_labels') assert_equal(H.nodes(), []) assert_equal(H.edges(), []) G=empty_graph() G.add_edges_from([('A','B'),('A','C'),('B','C'),('C','D')]) G.name="paw" H=convert_node_labels_to_integers(G) degH=H.degree().values() degG=G.degree().values() assert_equal(sorted(degH), sorted(degG)) H=convert_node_labels_to_integers(G,1000) degH=H.degree().values() degG=G.degree().values() assert_equal(sorted(degH), sorted(degG)) assert_equal(H.nodes(), [1000, 1001, 1002, 1003]) 
H=convert_node_labels_to_integers(G,ordering="increasing degree") degH=H.degree().values() degG=G.degree().values() assert_equal(sorted(degH), sorted(degG)) assert_equal(degree(H,0), 1) assert_equal(degree(H,1), 2) assert_equal(degree(H,2), 2) assert_equal(degree(H,3), 3) H=convert_node_labels_to_integers(G,ordering="decreasing degree") degH=H.degree().values() degG=G.degree().values() assert_equal(sorted(degH), sorted(degG)) assert_equal(degree(H,0), 3) assert_equal(degree(H,1), 2) assert_equal(degree(H,2), 2) assert_equal(degree(H,3), 1) H=convert_node_labels_to_integers(G,ordering="increasing degree", label_attribute='label') degH=H.degree().values() degG=G.degree().values() assert_equal(sorted(degH), sorted(degG)) assert_equal(degree(H,0), 1) assert_equal(degree(H,1), 2) assert_equal(degree(H,2), 2) assert_equal(degree(H,3), 3) # check mapping assert_equal(H.node[3]['label'],'C') assert_equal(H.node[0]['label'],'D') assert_true(H.node[1]['label']=='A' or H.node[2]['label']=='A') assert_true(H.node[1]['label']=='B' or H.node[2]['label']=='B') def test_convert_to_integers2(self): G=empty_graph() G.add_edges_from([('C','D'),('A','B'),('A','C'),('B','C')]) G.name="paw" H=convert_node_labels_to_integers(G,ordering="sorted") degH=H.degree().values() degG=G.degree().values() assert_equal(sorted(degH), sorted(degG)) H=convert_node_labels_to_integers(G,ordering="sorted", label_attribute='label') assert_equal(H.node[0]['label'],'A') assert_equal(H.node[1]['label'],'B') assert_equal(H.node[2]['label'],'C') assert_equal(H.node[3]['label'],'D') @raises(nx.NetworkXError) def test_convert_to_integers_raise(self): G = nx.Graph() H=convert_node_labels_to_integers(G,ordering="increasing age") def test_relabel_nodes_copy(self): G=empty_graph() G.add_edges_from([('A','B'),('A','C'),('B','C'),('C','D')]) mapping={'A':'aardvark','B':'bear','C':'cat','D':'dog'} H=relabel_nodes(G,mapping) assert_equal(sorted(H.nodes()), ['aardvark', 'bear', 'cat', 'dog']) def 
test_relabel_nodes_function(self): G=empty_graph() G.add_edges_from([('A','B'),('A','C'),('B','C'),('C','D')]) # function mapping no longer encouraged but works def mapping(n): return ord(n) H=relabel_nodes(G,mapping) assert_equal(sorted(H.nodes()), [65, 66, 67, 68]) def test_relabel_nodes_graph(self): G=Graph([('A','B'),('A','C'),('B','C'),('C','D')]) mapping={'A':'aardvark','B':'bear','C':'cat','D':'dog'} H=relabel_nodes(G,mapping) assert_equal(sorted(H.nodes()), ['aardvark', 'bear', 'cat', 'dog']) def test_relabel_nodes_digraph(self): G=DiGraph([('A','B'),('A','C'),('B','C'),('C','D')]) mapping={'A':'aardvark','B':'bear','C':'cat','D':'dog'} H=relabel_nodes(G,mapping,copy=False) assert_equal(sorted(H.nodes()), ['aardvark', 'bear', 'cat', 'dog']) def test_relabel_nodes_multigraph(self): G=MultiGraph([('a','b'),('a','b')]) mapping={'a':'aardvark','b':'bear'} G=relabel_nodes(G,mapping,copy=False) assert_equal(sorted(G.nodes()), ['aardvark', 'bear']) assert_edges_equal(sorted(G.edges()), [('aardvark', 'bear'), ('aardvark', 'bear')]) def test_relabel_nodes_multidigraph(self): G=MultiDiGraph([('a','b'),('a','b')]) mapping={'a':'aardvark','b':'bear'} G=relabel_nodes(G,mapping,copy=False) assert_equal(sorted(G.nodes()), ['aardvark', 'bear']) assert_equal(sorted(G.edges()), [('aardvark', 'bear'), ('aardvark', 'bear')]) def test_relabel_isolated_nodes_to_same(self): G=Graph() G.add_nodes_from(range(4)) mapping={1:1} H=relabel_nodes(G, mapping, copy=False) assert_equal(sorted(H.nodes()), list(range(4))) @raises(KeyError) def test_relabel_nodes_missing(self): G=Graph([('A','B'),('A','C'),('B','C'),('C','D')]) mapping={0:'aardvark'} G=relabel_nodes(G,mapping,copy=False) def test_relabel_toposort(self): K5=nx.complete_graph(4) G=nx.complete_graph(4) G=nx.relabel_nodes(G,dict( [(i,i+1) for i in range(4)]),copy=False) nx.is_isomorphic(K5,G) G=nx.complete_graph(4) G=nx.relabel_nodes(G,dict( [(i,i-1) for i in range(4)]),copy=False) nx.is_isomorphic(K5,G) def 
test_relabel_selfloop(self): G = nx.DiGraph([(1, 1), (1, 2), (2, 3)]) G = nx.relabel_nodes(G, {1: 'One', 2: 'Two', 3: 'Three'}, copy=False) assert_equal(sorted(G.nodes()),['One','Three','Two']) G = nx.MultiDiGraph([(1, 1), (1, 2), (2, 3)]) G = nx.relabel_nodes(G, {1: 'One', 2: 'Two', 3: 'Three'}, copy=False) assert_equal(sorted(G.nodes()),['One','Three','Two']) G = nx.MultiDiGraph([(1, 1)]) G = nx.relabel_nodes(G, {1: 0}, copy=False) assert_equal(G.nodes(), [0]) networkx-1.11/networkx/tests/test.py0000644000175000017500000000233412637544450017563 0ustar aricaric00000000000000#!/usr/bin/env python import sys from os import path,getcwd def run(verbosity=1,doctest=False,numpy=True): """Run NetworkX tests. Parameters ---------- verbosity: integer, optional Level of detail in test reports. Higher numbers provide more detail. doctest: bool, optional True to run doctests in code modules numpy: bool, optional True to test modules dependent on numpy """ try: import nose except ImportError: raise ImportError(\ "The nose package is needed to run the NetworkX tests.") sys.stderr.write("Running NetworkX tests:") nx_install_dir=path.join(path.dirname(__file__), path.pardir) # stop if running from source directory if getcwd() == path.abspath(path.join(nx_install_dir,path.pardir)): raise RuntimeError("Can't run tests from source directory.\n" "Run 'nosetests' from the command line.") argv=[' ','--verbosity=%d'%verbosity, '-w',nx_install_dir, '-exe'] if doctest: argv.extend(['--with-doctest','--doctest-extension=txt']) if not numpy: argv.extend(['-A not numpy']) nose.run(argv=argv) if __name__=="__main__": run() networkx-1.11/networkx/tests/__init__.py0000644000175000017500000000000012637544450020327 0ustar aricaric00000000000000networkx-1.11/networkx/tests/test_convert_pandas.py0000644000175000017500000000420112637544450022644 0ustar aricaric00000000000000from nose import SkipTest from nose.tools import assert_true import networkx as nx class TestConvertPandas(object): numpy=1 # 
nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): try: import pandas as pd except ImportError: raise SkipTest('Pandas not available.') def __init__(self, ): global pd import pandas as pd self.r = pd.np.random.RandomState(seed=5) ints = self.r.random_integers(1, 10, size=(3,2)) a = ['A', 'B', 'C'] b = ['D', 'A', 'E'] df = pd.DataFrame(ints, columns=['weight', 'cost']) df[0] = a # Column label 0 (int) df['b'] = b # Column label 'b' (str) self.df = df def assert_equal(self, G1, G2): assert_true( nx.is_isomorphic(G1, G2, edge_match=lambda x, y: x == y )) def test_from_dataframe_all_attr(self, ): Gtrue = nx.Graph([('E', 'C', {'cost': 9, 'weight': 10}), ('B', 'A', {'cost': 1, 'weight': 7}), ('A', 'D', {'cost': 7, 'weight': 4})]) G=nx.from_pandas_dataframe(self.df, 0, 'b', True) self.assert_equal(G, Gtrue) def test_from_dataframe_multi_attr(self, ): Gtrue = nx.Graph([('E', 'C', {'cost': 9, 'weight': 10}), ('B', 'A', {'cost': 1, 'weight': 7}), ('A', 'D', {'cost': 7, 'weight': 4})]) G=nx.from_pandas_dataframe(self.df, 0, 'b', ['weight', 'cost']) self.assert_equal(G, Gtrue) def test_from_dataframe_one_attr(self, ): Gtrue = nx.Graph([('E', 'C', {'weight': 10}), ('B', 'A', {'weight': 7}), ('A', 'D', {'weight': 4})]) G=nx.from_pandas_dataframe(self.df, 0, 'b', 'weight') self.assert_equal(G, Gtrue) def test_from_dataframe_no_attr(self, ): Gtrue = nx.Graph([('E', 'C', {}), ('B', 'A', {}), ('A', 'D', {})]) G=nx.from_pandas_dataframe(self.df, 0, 'b',) self.assert_equal(G, Gtrue) networkx-1.11/networkx/tests/test_convert_scipy.py0000644000175000017500000002222212637544500022524 0ustar aricaric00000000000000from nose import SkipTest from nose.tools import assert_raises, assert_true, assert_equal, raises import networkx as nx from networkx.testing import assert_graphs_equal from networkx.generators.classic import barbell_graph,cycle_graph,path_graph from networkx.testing.utils import assert_graphs_equal class TestConvertNumpy(object): 
@classmethod def setupClass(cls): global np, sp, sparse, np_assert_equal try: import numpy as np import scipy as sp import scipy.sparse as sparse np_assert_equal=np.testing.assert_equal except ImportError: raise SkipTest('SciPy sparse library not available.') def __init__(self): self.G1 = barbell_graph(10, 3) self.G2 = cycle_graph(10, create_using=nx.DiGraph()) self.G3 = self.create_weighted(nx.Graph()) self.G4 = self.create_weighted(nx.DiGraph()) def create_weighted(self, G): g = cycle_graph(4) e = g.edges() source = [u for u,v in e] dest = [v for u,v in e] weight = [s+10 for s in source] ex = zip(source, dest, weight) G.add_weighted_edges_from(ex) return G def assert_equal(self, G1, G2): assert_true( sorted(G1.nodes())==sorted(G2.nodes()) ) assert_true( sorted(G1.edges())==sorted(G2.edges()) ) def identity_conversion(self, G, A, create_using): GG = nx.from_scipy_sparse_matrix(A, create_using=create_using) self.assert_equal(G, GG) GW = nx.to_networkx_graph(A, create_using=create_using) self.assert_equal(G, GW) GI = create_using.__class__(A) self.assert_equal(G, GI) ACSR = A.tocsr() GI = create_using.__class__(ACSR) self.assert_equal(G, GI) ACOO = A.tocoo() GI = create_using.__class__(ACOO) self.assert_equal(G, GI) ACSC = A.tocsc() GI = create_using.__class__(ACSC) self.assert_equal(G, GI) AD = A.todense() GI = create_using.__class__(AD) self.assert_equal(G, GI) AA = A.toarray() GI = create_using.__class__(AA) self.assert_equal(G, GI) def test_shape(self): "Conversion from non-square sparse array." A = sp.sparse.lil_matrix([[1,2,3],[4,5,6]]) assert_raises(nx.NetworkXError, nx.from_scipy_sparse_matrix, A) def test_identity_graph_matrix(self): "Conversion from graph to sparse matrix to graph." A = nx.to_scipy_sparse_matrix(self.G1) self.identity_conversion(self.G1, A, nx.Graph()) def test_identity_digraph_matrix(self): "Conversion from digraph to sparse matrix to digraph." 
A = nx.to_scipy_sparse_matrix(self.G2) self.identity_conversion(self.G2, A, nx.DiGraph()) def test_identity_weighted_graph_matrix(self): """Conversion from weighted graph to sparse matrix to weighted graph.""" A = nx.to_scipy_sparse_matrix(self.G3) self.identity_conversion(self.G3, A, nx.Graph()) def test_identity_weighted_digraph_matrix(self): """Conversion from weighted digraph to sparse matrix to weighted digraph.""" A = nx.to_scipy_sparse_matrix(self.G4) self.identity_conversion(self.G4, A, nx.DiGraph()) def test_nodelist(self): """Conversion from graph to sparse matrix to graph with nodelist.""" P4 = path_graph(4) P3 = path_graph(3) nodelist = P3.nodes() A = nx.to_scipy_sparse_matrix(P4, nodelist=nodelist) GA = nx.Graph(A) self.assert_equal(GA, P3) # Make nodelist ambiguous by containing duplicates. nodelist += [nodelist[0]] assert_raises(nx.NetworkXError, nx.to_numpy_matrix, P3, nodelist=nodelist) def test_weight_keyword(self): WP4 = nx.Graph() WP4.add_edges_from( (n,n+1,dict(weight=0.5,other=0.3)) for n in range(3) ) P4 = path_graph(4) A = nx.to_scipy_sparse_matrix(P4) np_assert_equal(A.todense(), nx.to_scipy_sparse_matrix(WP4,weight=None).todense()) np_assert_equal(0.5*A.todense(), nx.to_scipy_sparse_matrix(WP4).todense()) np_assert_equal(0.3*A.todense(), nx.to_scipy_sparse_matrix(WP4,weight='other').todense()) def test_format_keyword(self): WP4 = nx.Graph() WP4.add_edges_from( (n,n+1,dict(weight=0.5,other=0.3)) for n in range(3) ) P4 = path_graph(4) A = nx.to_scipy_sparse_matrix(P4, format='csr') np_assert_equal(A.todense(), nx.to_scipy_sparse_matrix(WP4,weight=None).todense()) A = nx.to_scipy_sparse_matrix(P4, format='csc') np_assert_equal(A.todense(), nx.to_scipy_sparse_matrix(WP4,weight=None).todense()) A = nx.to_scipy_sparse_matrix(P4, format='coo') np_assert_equal(A.todense(), nx.to_scipy_sparse_matrix(WP4,weight=None).todense()) A = nx.to_scipy_sparse_matrix(P4, format='bsr') np_assert_equal(A.todense(), 
nx.to_scipy_sparse_matrix(WP4,weight=None).todense()) A = nx.to_scipy_sparse_matrix(P4, format='lil') np_assert_equal(A.todense(), nx.to_scipy_sparse_matrix(WP4,weight=None).todense()) A = nx.to_scipy_sparse_matrix(P4, format='dia') np_assert_equal(A.todense(), nx.to_scipy_sparse_matrix(WP4,weight=None).todense()) A = nx.to_scipy_sparse_matrix(P4, format='dok') np_assert_equal(A.todense(), nx.to_scipy_sparse_matrix(WP4,weight=None).todense()) @raises(nx.NetworkXError) def test_format_keyword_fail(self): WP4 = nx.Graph() WP4.add_edges_from( (n,n+1,dict(weight=0.5,other=0.3)) for n in range(3) ) P4 = path_graph(4) nx.to_scipy_sparse_matrix(P4, format='any_other') @raises(nx.NetworkXError) def test_null_fail(self): nx.to_scipy_sparse_matrix(nx.Graph()) def test_empty(self): G = nx.Graph() G.add_node(1) M = nx.to_scipy_sparse_matrix(G) np_assert_equal(M.todense(), np.matrix([[0]])) def test_ordering(self): G = nx.DiGraph() G.add_edge(1,2) G.add_edge(2,3) G.add_edge(3,1) M = nx.to_scipy_sparse_matrix(G,nodelist=[3,2,1]) np_assert_equal(M.todense(), np.matrix([[0,0,1],[1,0,0],[0,1,0]])) def test_selfloop_graph(self): G = nx.Graph([(1,1)]) M = nx.to_scipy_sparse_matrix(G) np_assert_equal(M.todense(), np.matrix([[1]])) def test_selfloop_digraph(self): G = nx.DiGraph([(1,1)]) M = nx.to_scipy_sparse_matrix(G) np_assert_equal(M.todense(), np.matrix([[1]])) def test_from_scipy_sparse_matrix_parallel_edges(self): """Tests that the :func:`networkx.from_scipy_sparse_matrix` function interprets integer weights as the number of parallel edges when creating a multigraph. """ A = sparse.csr_matrix([[1, 1], [1, 2]]) # First, with a simple graph, each integer entry in the adjacency # matrix is interpreted as the weight of a single edge in the graph. 
expected = nx.DiGraph() edges = [(0, 0), (0, 1), (1, 0)] expected.add_weighted_edges_from([(u, v, 1) for (u, v) in edges]) expected.add_edge(1, 1, weight=2) actual = nx.from_scipy_sparse_matrix(A, parallel_edges=True, create_using=nx.DiGraph()) assert_graphs_equal(actual, expected) actual = nx.from_scipy_sparse_matrix(A, parallel_edges=False, create_using=nx.DiGraph()) assert_graphs_equal(actual, expected) # Now each integer entry in the adjacency matrix is interpreted as the # number of parallel edges in the graph if the appropriate keyword # argument is specified. edges = [(0, 0), (0, 1), (1, 0), (1, 1), (1, 1)] expected = nx.MultiDiGraph() expected.add_weighted_edges_from([(u, v, 1) for (u, v) in edges]) actual = nx.from_scipy_sparse_matrix(A, parallel_edges=True, create_using=nx.MultiDiGraph()) assert_graphs_equal(actual, expected) expected = nx.MultiDiGraph() expected.add_edges_from(set(edges), weight=1) # The sole self-loop (edge 0) on vertex 1 should have weight 2. expected[1][1][0]['weight'] = 2 actual = nx.from_scipy_sparse_matrix(A, parallel_edges=False, create_using=nx.MultiDiGraph()) assert_graphs_equal(actual, expected) def test_symmetric(self): """Tests that a symmetric matrix has edges added only once to an undirected multigraph when using :func:`networkx.from_scipy_sparse_matrix`. 
""" A = sparse.csr_matrix([[0, 1], [1, 0]]) G = nx.from_scipy_sparse_matrix(A, create_using=nx.MultiGraph()) expected = nx.MultiGraph() expected.add_edge(0, 1, weight=1) assert_graphs_equal(G, expected) networkx-1.11/networkx/tests/test_convert_numpy.py0000644000175000017500000002070212637544500022546 0ustar aricaric00000000000000from nose import SkipTest from nose.tools import assert_raises, assert_true, assert_equal import networkx as nx from networkx.generators.classic import barbell_graph,cycle_graph,path_graph from networkx.testing.utils import assert_graphs_equal class TestConvertNumpy(object): numpy=1 # nosetests attribute, use nosetests -a 'not numpy' to skip test @classmethod def setupClass(cls): global np global np_assert_equal try: import numpy as np np_assert_equal=np.testing.assert_equal except ImportError: raise SkipTest('NumPy not available.') def __init__(self): self.G1 = barbell_graph(10, 3) self.G2 = cycle_graph(10, create_using=nx.DiGraph()) self.G3 = self.create_weighted(nx.Graph()) self.G4 = self.create_weighted(nx.DiGraph()) def create_weighted(self, G): g = cycle_graph(4) e = g.edges() source = [u for u,v in e] dest = [v for u,v in e] weight = [s+10 for s in source] ex = zip(source, dest, weight) G.add_weighted_edges_from(ex) return G def assert_equal(self, G1, G2): assert_true( sorted(G1.nodes())==sorted(G2.nodes()) ) assert_true( sorted(G1.edges())==sorted(G2.edges()) ) def identity_conversion(self, G, A, create_using): GG = nx.from_numpy_matrix(A, create_using=create_using) self.assert_equal(G, GG) GW = nx.to_networkx_graph(A, create_using=create_using) self.assert_equal(G, GW) GI = create_using.__class__(A) self.assert_equal(G, GI) def test_shape(self): "Conversion from non-square array." A=np.array([[1,2,3],[4,5,6]]) assert_raises(nx.NetworkXError, nx.from_numpy_matrix, A) def test_identity_graph_matrix(self): "Conversion from graph to matrix to graph." 
A = nx.to_numpy_matrix(self.G1) self.identity_conversion(self.G1, A, nx.Graph()) def test_identity_graph_array(self): "Conversion from graph to array to graph." A = nx.to_numpy_matrix(self.G1) A = np.asarray(A) self.identity_conversion(self.G1, A, nx.Graph()) def test_identity_digraph_matrix(self): """Conversion from digraph to matrix to digraph.""" A = nx.to_numpy_matrix(self.G2) self.identity_conversion(self.G2, A, nx.DiGraph()) def test_identity_digraph_array(self): """Conversion from digraph to array to digraph.""" A = nx.to_numpy_matrix(self.G2) A = np.asarray(A) self.identity_conversion(self.G2, A, nx.DiGraph()) def test_identity_weighted_graph_matrix(self): """Conversion from weighted graph to matrix to weighted graph.""" A = nx.to_numpy_matrix(self.G3) self.identity_conversion(self.G3, A, nx.Graph()) def test_identity_weighted_graph_array(self): """Conversion from weighted graph to array to weighted graph.""" A = nx.to_numpy_matrix(self.G3) A = np.asarray(A) self.identity_conversion(self.G3, A, nx.Graph()) def test_identity_weighted_digraph_matrix(self): """Conversion from weighted digraph to matrix to weighted digraph.""" A = nx.to_numpy_matrix(self.G4) self.identity_conversion(self.G4, A, nx.DiGraph()) def test_identity_weighted_digraph_array(self): """Conversion from weighted digraph to array to weighted digraph.""" A = nx.to_numpy_matrix(self.G4) A = np.asarray(A) self.identity_conversion(self.G4, A, nx.DiGraph()) def test_nodelist(self): """Conversion from graph to matrix to graph with nodelist.""" P4 = path_graph(4) P3 = path_graph(3) nodelist = P3.nodes() A = nx.to_numpy_matrix(P4, nodelist=nodelist) GA = nx.Graph(A) self.assert_equal(GA, P3) # Make nodelist ambiguous by containing duplicates. 
nodelist += [nodelist[0]] assert_raises(nx.NetworkXError, nx.to_numpy_matrix, P3, nodelist=nodelist) def test_weight_keyword(self): WP4 = nx.Graph() WP4.add_edges_from( (n,n+1,dict(weight=0.5,other=0.3)) for n in range(3) ) P4 = path_graph(4) A = nx.to_numpy_matrix(P4) np_assert_equal(A, nx.to_numpy_matrix(WP4,weight=None)) np_assert_equal(0.5*A, nx.to_numpy_matrix(WP4)) np_assert_equal(0.3*A, nx.to_numpy_matrix(WP4,weight='other')) def test_from_numpy_matrix_type(self): A=np.matrix([[1]]) G=nx.from_numpy_matrix(A) assert_equal(type(G[0][0]['weight']),int) A=np.matrix([[1]]).astype(np.float) G=nx.from_numpy_matrix(A) assert_equal(type(G[0][0]['weight']),float) A=np.matrix([[1]]).astype(np.str) G=nx.from_numpy_matrix(A) assert_equal(type(G[0][0]['weight']),str) A=np.matrix([[1]]).astype(np.bool) G=nx.from_numpy_matrix(A) assert_equal(type(G[0][0]['weight']),bool) A=np.matrix([[1]]).astype(np.complex) G=nx.from_numpy_matrix(A) assert_equal(type(G[0][0]['weight']),complex) A=np.matrix([[1]]).astype(np.object) assert_raises(TypeError,nx.from_numpy_matrix,A) def test_from_numpy_matrix_dtype(self): dt=[('weight',float),('cost',int)] A=np.matrix([[(1.0,2)]],dtype=dt) G=nx.from_numpy_matrix(A) assert_equal(type(G[0][0]['weight']),float) assert_equal(type(G[0][0]['cost']),int) assert_equal(G[0][0]['cost'],2) assert_equal(G[0][0]['weight'],1.0) def test_to_numpy_recarray(self): G=nx.Graph() G.add_edge(1,2,weight=7.0,cost=5) A=nx.to_numpy_recarray(G,dtype=[('weight',float),('cost',int)]) assert_equal(sorted(A.dtype.names),['cost','weight']) assert_equal(A.weight[0,1],7.0) assert_equal(A.weight[0,0],0.0) assert_equal(A.cost[0,1],5) assert_equal(A.cost[0,0],0) def test_numpy_multigraph(self): G=nx.MultiGraph() G.add_edge(1,2,weight=7) G.add_edge(1,2,weight=70) A=nx.to_numpy_matrix(G) assert_equal(A[1,0],77) A=nx.to_numpy_matrix(G,multigraph_weight=min) assert_equal(A[1,0],7) A=nx.to_numpy_matrix(G,multigraph_weight=max) assert_equal(A[1,0],70) def 
test_from_numpy_matrix_parallel_edges(self): """Tests that the :func:`networkx.from_numpy_matrix` function interprets integer weights as the number of parallel edges when creating a multigraph. """ A = np.matrix([[1, 1], [1, 2]]) # First, with a simple graph, each integer entry in the adjacency # matrix is interpreted as the weight of a single edge in the graph. expected = nx.DiGraph() edges = [(0, 0), (0, 1), (1, 0)] expected.add_weighted_edges_from([(u, v, 1) for (u, v) in edges]) expected.add_edge(1, 1, weight=2) actual = nx.from_numpy_matrix(A, parallel_edges=True, create_using=nx.DiGraph()) assert_graphs_equal(actual, expected) actual = nx.from_numpy_matrix(A, parallel_edges=False, create_using=nx.DiGraph()) assert_graphs_equal(actual, expected) # Now each integer entry in the adjacency matrix is interpreted as the # number of parallel edges in the graph if the appropriate keyword # argument is specified. edges = [(0, 0), (0, 1), (1, 0), (1, 1), (1, 1)] expected = nx.MultiDiGraph() expected.add_weighted_edges_from([(u, v, 1) for (u, v) in edges]) actual = nx.from_numpy_matrix(A, parallel_edges=True, create_using=nx.MultiDiGraph()) assert_graphs_equal(actual, expected) expected = nx.MultiDiGraph() expected.add_edges_from(set(edges), weight=1) # The sole self-loop (edge 0) on vertex 1 should have weight 2. expected[1][1][0]['weight'] = 2 actual = nx.from_numpy_matrix(A, parallel_edges=False, create_using=nx.MultiDiGraph()) assert_graphs_equal(actual, expected) def test_symmetric(self): """Tests that a symmetric matrix has edges added only once to an undirected multigraph when using :func:`networkx.from_numpy_matrix`. 
""" A = np.matrix([[0, 1], [1, 0]]) G = nx.from_numpy_matrix(A, create_using=nx.MultiGraph()) expected = nx.MultiGraph() expected.add_edge(0, 1, weight=1) assert_graphs_equal(G, expected) networkx-1.11/networkx/tests/test_exceptions.py0000644000175000017500000000144412637544450022025 0ustar aricaric00000000000000from nose.tools import raises import networkx as nx # smoke tests for exceptions @raises(nx.NetworkXException) def test_raises_networkx_exception(): raise nx.NetworkXException @raises(nx.NetworkXError) def test_raises_networkx_error(): raise nx.NetworkXError @raises(nx.NetworkXPointlessConcept) def test_raises_networkx_pointless_concept(): raise nx.NetworkXPointlessConcept @raises(nx.NetworkXAlgorithmError) def test_raises_networkx_algorithm_error(): raise nx.NetworkXAlgorithmError @raises(nx.NetworkXUnfeasible) def test_raises_networkx_unfeasible(): raise nx.NetworkXUnfeasible @raises(nx.NetworkXNoPath) def test_raises_networkx_no_path(): raise nx.NetworkXNoPath @raises(nx.NetworkXUnbounded) def test_raises_networkx_unbounded(): raise nx.NetworkXUnbounded networkx-1.11/networkx/relabel.py0000644000175000017500000001576312637544500017056 0ustar aricaric00000000000000# Copyright (C) 2006-2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx __author__ = """\n""".join(['Aric Hagberg ', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult (dschult@colgate.edu)']) __all__ = ['convert_node_labels_to_integers', 'relabel_nodes'] def relabel_nodes(G, mapping, copy=True): """Relabel the nodes of the graph G. Parameters ---------- G : graph A NetworkX graph mapping : dictionary A dictionary with the old labels as keys and new labels as values. A partial mapping is allowed. copy : bool (optional, default=True) If True return a copy, or if False relabel the nodes in place. 
    Examples
    --------
    >>> G=nx.path_graph(3)  # nodes 0-1-2
    >>> mapping={0:'a',1:'b',2:'c'}
    >>> H=nx.relabel_nodes(G,mapping)
    >>> print(sorted(H.nodes()))
    ['a', 'b', 'c']

    >>> G=nx.path_graph(26)  # nodes 0..25
    >>> mapping=dict(zip(G.nodes(),"abcdefghijklmnopqrstuvwxyz"))
    >>> H=nx.relabel_nodes(G,mapping)  # nodes a..z
    >>> mapping=dict(zip(G.nodes(),range(1,27)))
    >>> G1=nx.relabel_nodes(G,mapping)  # nodes 1..26

    Partial in-place mapping:

    >>> G=nx.path_graph(3)  # nodes 0-1-2
    >>> mapping={0:'a',1:'b'}  # 0->'a' and 1->'b'
    >>> G=nx.relabel_nodes(G,mapping, copy=False)
    >>> print(G.nodes())
    [2, 'b', 'a']

    Mapping as function:

    >>> G=nx.path_graph(3)
    >>> def mapping(x):
    ...    return x**2
    >>> H=nx.relabel_nodes(G,mapping)
    >>> print(H.nodes())
    [0, 1, 4]

    Notes
    -----
    Only the nodes specified in the mapping will be relabeled.

    The keyword setting copy=False modifies the graph in place.
    This is not always possible if the mapping is circular.
    In that case use copy=True.

    See Also
    --------
    convert_node_labels_to_integers
    """
    # you can pass a function f(old_label) -> new_label
    # but we'll just make a dictionary here regardless
    if not hasattr(mapping, "__getitem__"):
        m = dict((n, mapping(n)) for n in G)
    else:
        m = mapping
    if copy:
        return _relabel_copy(G, m)
    else:
        return _relabel_inplace(G, m)


def _relabel_inplace(G, mapping):
    old_labels = set(mapping.keys())
    new_labels = set(mapping.values())
    if len(old_labels & new_labels) > 0:
        # label sets overlap
        # can we topological sort and still do the relabeling?
        D = nx.DiGraph(list(mapping.items()))
        D.remove_edges_from(D.selfloop_edges())
        try:
            nodes = nx.topological_sort(D, reverse=True)
        except nx.NetworkXUnfeasible:
            raise nx.NetworkXUnfeasible('The node label sets are overlapping '
                                        'and no ordering can resolve the '
                                        'mapping. 
Use copy=True.') else: # non-overlapping label sets nodes = old_labels multigraph = G.is_multigraph() directed = G.is_directed() for old in nodes: try: new = mapping[old] except KeyError: continue if new == old: continue try: G.add_node(new, attr_dict=G.node[old]) except KeyError: raise KeyError("Node %s is not in the graph"%old) if multigraph: new_edges = [(new, new if old == target else target, key, data) for (_,target,key,data) in G.edges(old, data=True, keys=True)] if directed: new_edges += [(new if old == source else source, new, key, data) for (source, _, key,data) in G.in_edges(old, data=True, keys=True)] else: new_edges = [(new, new if old == target else target, data) for (_,target,data) in G.edges(old, data=True)] if directed: new_edges += [(new if old == source else source,new,data) for (source,_,data) in G.in_edges(old, data=True)] G.remove_node(old) G.add_edges_from(new_edges) return G def _relabel_copy(G, mapping): H = G.__class__() H.name = "(%s)" % G.name if G.is_multigraph(): H.add_edges_from( (mapping.get(n1, n1),mapping.get(n2, n2),k,d.copy()) for (n1,n2,k,d) in G.edges_iter(keys=True, data=True)) else: H.add_edges_from( (mapping.get(n1, n1),mapping.get(n2, n2),d.copy()) for (n1, n2, d) in G.edges_iter(data=True)) H.add_nodes_from(mapping.get(n, n) for n in G) H.node.update(dict((mapping.get(n, n), d.copy()) for n,d in G.node.items())) H.graph.update(G.graph.copy()) return H def convert_node_labels_to_integers(G, first_label=0, ordering="default", label_attribute=None): """Return a copy of the graph G with the nodes relabeled using consecutive integers. Parameters ---------- G : graph A NetworkX graph first_label : int, optional (default=0) An integer specifying the starting offset in numbering nodes. The new integer labels are numbered first_label, ..., n-1+first_label. 
ordering : string "default" : inherit node ordering from G.nodes() "sorted" : inherit node ordering from sorted(G.nodes()) "increasing degree" : nodes are sorted by increasing degree "decreasing degree" : nodes are sorted by decreasing degree label_attribute : string, optional (default=None) Name of node attribute to store old label. If None no attribute is created. Notes ----- Node and edge attribute data are copied to the new (relabeled) graph. See Also -------- relabel_nodes """ N = G.number_of_nodes()+first_label if ordering == "default": mapping = dict(zip(G.nodes(), range(first_label, N))) elif ordering == "sorted": nlist = G.nodes() nlist.sort() mapping = dict(zip(nlist, range(first_label, N))) elif ordering == "increasing degree": dv_pairs = [(d,n) for (n,d) in G.degree_iter()] dv_pairs.sort() # in-place sort from lowest to highest degree mapping = dict(zip([n for d,n in dv_pairs], range(first_label, N))) elif ordering == "decreasing degree": dv_pairs = [(d,n) for (n,d) in G.degree_iter()] dv_pairs.sort() # in-place sort from lowest to highest degree dv_pairs.reverse() mapping = dict(zip([n for d,n in dv_pairs], range(first_label, N))) else: raise nx.NetworkXError('Unknown node ordering: %s'%ordering) H = relabel_nodes(G, mapping) H.name = "("+G.name+")_with_int_labels" # create node attribute with the old label if label_attribute is not None: nx.set_node_attributes(H, label_attribute, dict((v,k) for k,v in mapping.items())) return H networkx-1.11/networkx/testing/0000755000175000017500000000000012653231454016536 5ustar aricaric00000000000000networkx-1.11/networkx/testing/utils.py0000644000175000017500000000361612637544450020263 0ustar aricaric00000000000000import operator from nose.tools import * __all__ = ['assert_nodes_equal', 'assert_edges_equal','assert_graphs_equal'] def assert_nodes_equal(nlist1, nlist2): # Assumes lists are either nodes, or (node,datadict) tuples, # and also that nodes are orderable/sortable. 
try: n1 = sorted(nlist1,key=operator.itemgetter(0)) n2 = sorted(nlist2,key=operator.itemgetter(0)) assert_equal(len(n1),len(n2)) for a,b in zip(n1,n2): assert_equal(a,b) except TypeError: assert_equal(set(nlist1),set(nlist2)) return def assert_edges_equal(elist1, elist2): # Assumes lists with u,v nodes either as # edge tuples (u,v) # edge tuples with data dicts (u,v,d) # edge tuples with keys and data dicts (u,v,k, d) # and also that nodes are orderable/sortable. e1 = sorted(elist1,key=lambda x: sorted(x[0:2])) e2 = sorted(elist2,key=lambda x: sorted(x[0:2])) assert_equal(len(e1),len(e2)) if len(e1) == 0: return True if len(e1[0]) == 2: for a,b in zip(e1,e2): assert_equal(set(a[0:2]),set(b[0:2])) elif len(e1[0]) == 3: for a,b in zip(e1,e2): assert_equal(set(a[0:2]),set(b[0:2])) assert_equal(a[2],b[2]) elif len(e1[0]) == 4: for a,b in zip(e1,e2): assert_equal(set(a[0:2]),set(b[0:2])) assert_equal(a[2],b[2]) assert_equal(a[3],b[3]) def assert_graphs_equal(graph1, graph2): if graph1.is_multigraph(): edges1 = graph1.edges(data=True,keys=True) else: edges1 = graph1.edges(data=True) if graph2.is_multigraph(): edges2 = graph2.edges(data=True,keys=True) else: edges2 = graph2.edges(data=True) assert_nodes_equal(graph1.nodes(data=True), graph2.nodes(data=True)) assert_edges_equal(edges1, edges2) assert_equal(graph1.graph,graph2.graph) return networkx-1.11/networkx/testing/__init__.py0000644000175000017500000000004512637544450020653 0ustar aricaric00000000000000from networkx.testing.utils import * networkx-1.11/networkx/testing/tests/0000755000175000017500000000000012653231454017700 5ustar aricaric00000000000000networkx-1.11/networkx/testing/tests/test_utils.py0000644000175000017500000000716512637544450022467 0ustar aricaric00000000000000from nose.tools import * import networkx as nx from networkx.testing import * # thanks to numpy for this GenericTest class (numpy/testing/test_utils.py) class _GenericTest(object): def _test_equal(self, a, b): self._assert_func(a, b) def 
_test_not_equal(self, a, b):
        try:
            self._assert_func(a, b)
        except AssertionError:
            pass
        else:
            raise AssertionError("a and b were found equal but should not be")


class TestNodesEqual(_GenericTest):
    def setUp(self):
        self._assert_func = assert_nodes_equal

    def test_nodes_equal(self):
        a = [1, 2, 5, 4]
        b = [4, 5, 1, 2]
        self._test_equal(a, b)

    def test_nodes_not_equal(self):
        a = [1, 2, 5, 4]
        b = [4, 5, 1, 3]
        self._test_not_equal(a, b)

    def test_nodes_with_data_equal(self):
        G = nx.Graph()
        G.add_nodes_from([1, 2, 3], color='red')
        H = nx.Graph()
        H.add_nodes_from([1, 2, 3], color='red')
        self._test_equal(G.nodes(data=True), H.nodes(data=True))

    def test_nodes_with_data_not_equal(self):
        G = nx.Graph()
        G.add_nodes_from([1, 2, 3], color='red')
        H = nx.Graph()
        H.add_nodes_from([1, 2, 3], color='blue')
        self._test_not_equal(G.nodes(data=True), H.nodes(data=True))


class TestEdgesEqual(_GenericTest):
    def setUp(self):
        self._assert_func = assert_edges_equal

    def test_edges_equal(self):
        a = [(1, 2), (5, 4)]
        b = [(4, 5), (1, 2)]
        self._test_equal(a, b)

    def test_edges_not_equal(self):
        a = [(1, 2), (5, 4)]
        b = [(4, 5), (1, 3)]
        self._test_not_equal(a, b)

    def test_edges_with_data_equal(self):
        G = nx.MultiGraph()
        G.add_path([0, 1, 2], weight=1)
        H = nx.MultiGraph()
        H.add_path([0, 1, 2], weight=1)
        self._test_equal(G.edges(data=True, keys=True),
                         H.edges(data=True, keys=True))

    def test_edges_with_data_not_equal(self):
        G = nx.MultiGraph()
        G.add_path([0, 1, 2], weight=1)
        H = nx.MultiGraph()
        H.add_path([0, 1, 2], weight=2)
        self._test_not_equal(G.edges(data=True, keys=True),
                             H.edges(data=True, keys=True))


class TestGraphsEqual(_GenericTest):
    def setUp(self):
        self._assert_func = assert_graphs_equal

    def test_graphs_equal(self):
        G = nx.path_graph(4)
        H = nx.Graph()
        H.add_path(range(4))
        H.name = 'path_graph(4)'
        self._test_equal(G, H)

    def test_digraphs_equal(self):
        G = nx.path_graph(4, create_using=nx.DiGraph())
        H = nx.DiGraph()
        H.add_path(range(4))
        H.name = 'path_graph(4)'
        self._test_equal(G, H)

    def test_multigraphs_equal(self):
        G = nx.path_graph(4, 
create_using=nx.MultiGraph())
        H = nx.MultiGraph()
        H.add_path(range(4))
        H.name = 'path_graph(4)'
        self._test_equal(G, H)

    def test_multidigraphs_equal(self):
        G = nx.path_graph(4, create_using=nx.MultiDiGraph())
        H = nx.MultiDiGraph()
        H.add_path(range(4))
        H.name = 'path_graph(4)'
        self._test_equal(G, H)

    def test_graphs_not_equal(self):
        G = nx.path_graph(4)
        H = nx.Graph()
        H.add_cycle(range(4))
        self._test_not_equal(G, H)

    def test_graphs_not_equal2(self):
        G = nx.path_graph(4)
        H = nx.Graph()
        H.add_path(range(3))
        H.name = 'path_graph(4)'
        self._test_not_equal(G, H)

    def test_graphs_not_equal3(self):
        G = nx.path_graph(4)
        H = nx.Graph()
        H.add_path(range(4))
        H.name = 'path_graph(foo)'
        self._test_not_equal(G, H)

networkx-1.11/networkx/generators/line.py

"""Functions for generating line graphs.
"""
# Copyright (C) 2013 by
#   Aric Hagberg
#   Dan Schult
#   Pieter Swart
# All rights reserved.
# BSD license.

__author__ = "\n".join(["Aric Hagberg (hagberg@lanl.gov)",
                        "Pieter Swart (swart@lanl.gov)",
                        "Dan Schult (dschult@colgate.edu)",
                        "chebee7i (chebee7i@gmail.com)"])

__all__ = ['line_graph']


def line_graph(G, create_using=None):
    """Returns the line graph of the graph or digraph ``G``.

    The line graph of a graph ``G`` has a node for each edge in ``G`` and an
    edge joining those nodes if the two edges in ``G`` share a common node.
    For directed graphs, nodes are adjacent exactly when the edges they
    represent form a directed path of length two.

    The nodes of the line graph are 2-tuples of nodes in the original graph
    (or 3-tuples for multigraphs, with the key of the edge as the third
    element).

    For information about self-loops and more discussion, see the **Notes**
    section below.

    Parameters
    ----------
    G : graph
        A NetworkX Graph, DiGraph, MultiGraph, or MultiDigraph.

    Returns
    -------
    L : graph
        The line graph of G.
Examples -------- >>> import networkx as nx >>> G = nx.star_graph(3) >>> L = nx.line_graph(G) >>> print(sorted(map(sorted, L.edges()))) # makes a 3-clique, K3 [[(0, 1), (0, 2)], [(0, 1), (0, 3)], [(0, 2), (0, 3)]] Notes ----- Graph, node, and edge data are not propagated to the new graph. For undirected graphs, the nodes in G must be sortable, otherwise the constructed line graph may not be correct. *Self-loops in undirected graphs* For an undirected graph `G` without multiple edges, each edge can be written as a set `\{u, v\}`. Its line graph `L` has the edges of `G` as its nodes. If `x` and `y` are two nodes in `L`, then `\{x, y\}` is an edge in `L` if and only if the intersection of `x` and `y` is nonempty. Thus, the set of all edges is determined by the set of all pairwise intersections of edges in `G`. Trivially, every edge in G would have a nonzero intersection with itself, and so every node in `L` should have a self-loop. This is not so interesting, and the original context of line graphs was with simple graphs, which had no self-loops or multiple edges. The line graph was also meant to be a simple graph and thus, self-loops in `L` are not part of the standard definition of a line graph. In a pairwise intersection matrix, this is analogous to excluding the diagonal entries from the line graph definition. Self-loops and multiple edges in `G` add nodes to `L` in a natural way, and do not require any fundamental changes to the definition. It might be argued that the self-loops we excluded before should now be included. However, the self-loops are still "trivial" in some sense and thus, are usually excluded. *Self-loops in directed graphs* For a directed graph `G` without multiple edges, each edge can be written as a tuple `(u, v)`. Its line graph `L` has the edges of `G` as its nodes. 
If `x` and `y` are two nodes in `L`, then `(x, y)` is an edge in `L` if and only if the tail of `x` matches the head of `y`, for example, if `x = (a, b)` and `y = (b, c)` for some vertices `a`, `b`, and `c` in `G`. Due to the directed nature of the edges, it is no longer the case that every edge in `G` should have a self-loop in `L`. Now, the only time self-loops arise is if a node in `G` itself has a self-loop. So such self-loops are no longer "trivial" but instead, represent essential features of the topology of `G`. For this reason, the historical development of line digraphs is such that self-loops are included. When the graph `G` has multiple edges, once again only superficial changes are required to the definition. References ---------- * Harary, Frank, and Norman, Robert Z., "Some properties of line digraphs", Rend. Circ. Mat. Palermo, II. Ser. 9 (1960), 161--168. * Hemminger, R. L.; Beineke, L. W. (1978), "Line graphs and line digraphs", in Beineke, L. W.; Wilson, R. J., Selected Topics in Graph Theory, Academic Press Inc., pp. 271--305. """ if G.is_directed(): L = _lg_directed(G, create_using=create_using) else: L = _lg_undirected(G, selfloops=False, create_using=create_using) return L def _node_func(G): """Returns a function which returns a sorted node for line graphs. When constructing a line graph for undirected graphs, we must normalize the ordering of nodes as they appear in the edge. """ if G.is_multigraph(): def sorted_node(u, v, key): return (u, v, key) if u <= v else (v, u, key) else: def sorted_node(u, v): return (u, v) if u <= v else (v, u) return sorted_node def _edge_func(G): """Returns the edges from G, handling keys for multigraphs as necessary. """ if G.is_multigraph(): def get_edges(nbunch=None): return G.edges_iter(nbunch, keys=True) else: def get_edges(nbunch=None): return G.edges_iter(nbunch) return get_edges def _sorted_edge(u, v): """Returns a sorted edge. 
During the construction of a line graph for undirected graphs, the data structure can be a multigraph even though the line graph will never have multiple edges between its nodes. For this reason, we must make sure not to add any edge more than once. This requires that we build up a list of edges to add and then remove all duplicates. And so, we must normalize the representation of the edges. """ return (u, v) if u <= v else (v, u) def _lg_directed(G, create_using=None): """Return the line graph L of the (multi)digraph G. Edges in G appear as nodes in L, represented as tuples of the form (u,v) or (u,v,key) if G is a multidigraph. A node in L corresponding to the edge (u,v) is connected to every node corresponding to an edge (v,w). Parameters ---------- G : digraph A directed graph or directed multigraph. create_using : None A digraph instance used to populate the line graph. """ if create_using is None: L = G.__class__() else: L = create_using # Create a graph specific edge function. get_edges = _edge_func(G) for from_node in get_edges(): # from_node is: (u,v) or (u,v,key) L.add_node(from_node) for to_node in get_edges(from_node[1]): L.add_edge(from_node, to_node) return L def _lg_undirected(G, selfloops=False, create_using=None): """Return the line graph L of the (multi)graph G. Edges in G appear as nodes in L, represented as sorted tuples of the form (u,v), or (u,v,key) if G is a multigraph. A node in L corresponding to the edge {u,v} is connected to every node corresponding to an edge that involves u or v. Parameters ---------- G : graph An undirected graph or multigraph. selfloops : bool If `True`, then self-loops are included in the line graph. If `False`, they are excluded. create_using : None A graph instance used to populate the line graph. Notes ----- The standard algorithm for line graphs of undirected graphs does not produce self-loops. 
""" if create_using is None: L = G.__class__() else: L = create_using # Graph specific functions for edges and sorted nodes. get_edges = _edge_func(G) sorted_node = _node_func(G) # Determine if we include self-loops or not. shift = 0 if selfloops else 1 edges = set([]) for u in G: # Label nodes as a sorted tuple of nodes in original graph. nodes = [ sorted_node(*x) for x in get_edges(u) ] if len(nodes) == 1: # Then the edge will be an isolated node in L. L.add_node(nodes[0]) # Add a clique of `nodes` to graph. To prevent double adding edges, # especially important for multigraphs, we store the edges in # canonical form in a set. for i, a in enumerate(nodes): edges.update([ _sorted_edge(a,b) for b in nodes[i+shift:] ]) L.add_edges_from(edges) return L networkx-1.11/networkx/generators/classic.py0000644000175000017500000004634712637544500021244 0ustar aricaric00000000000000""" Generators for some classic graphs. The typical graph generator is called as follows: >>> G=nx.complete_graph(100) returning the complete graph on n nodes labeled 0,..,99 as a simple graph. Except for empty_graph, all the generators in this module return a Graph class (i.e. a simple, undirected graph). """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
import itertools from networkx.algorithms.bipartite.generators import complete_bipartite_graph from networkx.utils import accumulate __author__ ="""Aric Hagberg (hagberg@lanl.gov)\nPieter Swart (swart@lanl.gov)""" __all__ = [ 'balanced_tree', 'barbell_graph', 'complete_graph', 'complete_multipartite_graph', 'circular_ladder_graph', 'circulant_graph', 'cycle_graph', 'dorogovtsev_goltsev_mendes_graph', 'empty_graph', 'full_rary_tree', 'grid_graph', 'grid_2d_graph', 'hypercube_graph', 'ladder_graph', 'lollipop_graph', 'null_graph', 'path_graph', 'star_graph', 'trivial_graph', 'wheel_graph'] #------------------------------------------------------------------- # Some Classic Graphs #------------------------------------------------------------------- import networkx as nx from networkx.utils import is_list_of_ints, flatten def _tree_edges(n,r): # helper function for trees # yields edges in rooted tree at 0 with n nodes and branching ratio r nodes=iter(range(n)) parents=[next(nodes)] # stack of max length r while parents: source=parents.pop(0) for i in range(r): try: target=next(nodes) parents.append(target) yield source,target except StopIteration: break def full_rary_tree(r, n, create_using=None): """Creates a full r-ary tree of n vertices. Sometimes called a k-ary, n-ary, or m-ary tree. "... all non-leaf vertices have exactly r children and all levels are full except for some rightmost position of the bottom level (if a leaf at the bottom level is missing, then so are all of the leaves to its right." [1]_ Parameters ---------- r : int branching factor of the tree n : int Number of nodes in the tree create_using : NetworkX graph type, optional Use specified type to construct graph (default = networkx.Graph) Returns ------- G : networkx Graph An r-ary tree with n nodes References ---------- .. [1] An introduction to data structures and algorithms, James Andrew Storer, Birkhauser Boston 2001, (page 225). 
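The `_tree_edges` helper above fills the tree breadth-first from a FIFO of parents, handing each parent up to r children until the node supply runs out. A standalone sketch of the same logic (the name `tree_edges` is illustrative):

```python
def tree_edges(n, r):
    # yields edges of the rooted r-ary tree on nodes 0..n-1,
    # mirroring the _tree_edges helper: a FIFO of parents, each
    # consuming up to r children from the node iterator
    nodes = iter(range(n))
    parents = [next(nodes)]
    while parents:
        source = parents.pop(0)
        for _ in range(r):
            try:
                target = next(nodes)
            except StopIteration:
                return
            parents.append(target)
            yield source, target

edges = list(tree_edges(7, 2))  # full binary tree on 7 nodes
print(len(edges))  # 6 -- a tree on n nodes has n-1 edges
```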
""" G=nx.empty_graph(n,create_using) G.add_edges_from(_tree_edges(n,r)) return G def balanced_tree(r, h, create_using=None): """Return the perfectly balanced r-tree of height h. Parameters ---------- r : int Branching factor of the tree h : int Height of the tree create_using : NetworkX graph type, optional Use specified type to construct graph (default = networkx.Graph) Returns ------- G : networkx Graph A tree with n nodes Notes ----- This is the rooted tree where all leaves are at distance h from the root. The root has degree r and all other internal nodes have degree r+1. Node labels are the integers 0 (the root) up to number_of_nodes - 1. Also refered to as a complete r-ary tree. """ # number of nodes is n=1+r+..+r^h if r==1: n=2 else: n = int((1-r**(h+1))/(1-r)) # sum of geometric series r!=1 G=nx.empty_graph(n,create_using) G.add_edges_from(_tree_edges(n,r)) return G return nx.full_rary_tree(r,n,create_using) def barbell_graph(m1,m2,create_using=None): """Return the Barbell Graph: two complete graphs connected by a path. For m1 > 1 and m2 >= 0. Two identical complete graphs K_{m1} form the left and right bells, and are connected by a path P_{m2}. The 2*m1+m2 nodes are numbered 0,...,m1-1 for the left barbell, m1,...,m1+m2-1 for the path, and m1+m2,...,2*m1+m2-1 for the right barbell. The 3 subgraphs are joined via the edges (m1-1,m1) and (m1+m2-1,m1+m2). If m2=0, this is merely two complete graphs joined together. This graph is an extremal example in David Aldous and Jim Fill's etext on Random Walks on Graphs. 
""" if create_using is not None and create_using.is_directed(): raise nx.NetworkXError("Directed Graph not supported") if m1<2: raise nx.NetworkXError(\ "Invalid graph description, m1 should be >=2") if m2<0: raise nx.NetworkXError(\ "Invalid graph description, m2 should be >=0") # left barbell G=complete_graph(m1,create_using) G.name="barbell_graph(%d,%d)"%(m1,m2) # connecting path G.add_nodes_from([v for v in range(m1,m1+m2-1)]) if m2>1: G.add_edges_from([(v,v+1) for v in range(m1,m1+m2-1)]) # right barbell G.add_edges_from( (u,v) for u in range(m1+m2,2*m1+m2) for v in range(u+1,2*m1+m2)) # connect it up G.add_edge(m1-1,m1) if m2>0: G.add_edge(m1+m2-1,m1+m2) return G def complete_graph(n,create_using=None): """ Return the complete graph K_n with n nodes. Node labels are the integers 0 to n-1. """ G=empty_graph(n,create_using) G.name="complete_graph(%d)"%(n) if n>1: if G.is_directed(): edges=itertools.permutations(range(n),2) else: edges=itertools.combinations(range(n),2) G.add_edges_from(edges) return G def circular_ladder_graph(n,create_using=None): """Return the circular ladder graph CL_n of length n. CL_n consists of two concentric n-cycles in which each of the n pairs of concentric nodes are joined by an edge. Node labels are the integers 0 to n-1 """ G=ladder_graph(n,create_using) G.name="circular_ladder_graph(%d)"%n G.add_edge(0,n-1) G.add_edge(n,2*n-1) return G def circulant_graph(n, offsets, create_using=None): """Generates the circulant graph Ci_n(x_1, x_2, ..., x_m) with n vertices. Returns ------- The graph Ci_n(x_1, ..., x_m) consisting of n vertices 0, ..., n-1 such that the vertex with label i is connected to the vertices labelled (i + x) and (i - x), for all x in x_1 up to x_m, with the indices taken modulo n. Parameters ---------- n : integer The number of vertices the generated graph is to contain. offsets : list of integers A list of vertex offsets, x_1 up to x_m, as described above. 
create_using : NetworkX graph type, optional Use specified type to construct graph (default = networkx.Graph) Examples -------- Many well-known graph families are subfamilies of the circulant graphs; for example, to generate the cycle graph on n points, we connect every vertex to every other at offset plus or minus one. For n = 10, >>> import networkx >>> G = networkx.generators.classic.circulant_graph(10, [1]) >>> edges = [ ... (0, 9), (0, 1), (1, 2), (2, 3), (3, 4), ... (4, 5), (5, 6), (6, 7), (7, 8), (8, 9)] ... >>> sorted(edges) == sorted(G.edges()) True Similarly, we can generate the complete graph on 5 points with the set of offsets [1, 2]: >>> G = networkx.generators.classic.circulant_graph(5, [1, 2]) >>> edges = [ ... (0, 1), (0, 2), (0, 3), (0, 4), (1, 2), ... (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)] ... >>> sorted(edges) == sorted(G.edges()) True """ G = empty_graph(n, create_using) template = 'circulant_graph(%d, [%s])' G.name = template % (n, ', '.join(str(j) for j in offsets)) for i in range(n): for j in offsets: G.add_edge(i, (i - j) % n) G.add_edge(i, (i + j) % n) return G def cycle_graph(n,create_using=None): """Return the cycle graph C_n over n nodes. C_n is the n-path with two end-nodes connected. Node labels are the integers 0 to n-1 If create_using is a DiGraph, the direction is in increasing order. """ G=path_graph(n,create_using) G.name="cycle_graph(%d)"%n if n>1: G.add_edge(n-1,0) return G def dorogovtsev_goltsev_mendes_graph(n,create_using=None): """Return the hierarchically constructed Dorogovtsev-Goltsev-Mendes graph. n is the generation. See: arXiv:/cond-mat/0112143 by Dorogovtsev, Goltsev and Mendes. 
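The circulant construction earlier adds both the (i - j) and (i + j) neighbors modulo n for every offset j; since the graph is simple, the double insertion collapses. Counting canonical edge pairs is an easy sanity check (standalone sketch, no NetworkX needed):

```python
def circulant_edges(n, offsets):
    # canonical edge set of Ci_n(offsets): for every vertex i and
    # offset j, connect i to (i-j) mod n and (i+j) mod n
    edges = set()
    for i in range(n):
        for j in offsets:
            for k in ((i - j) % n, (i + j) % n):
                edges.add((min(i, k), max(i, k)))
    return edges

print(len(circulant_edges(10, [1])))    # 10 -- the 10-cycle
print(len(circulant_edges(5, [1, 2])))  # 10 -- the complete graph K_5
```

These match the two doctest examples in the circulant_graph docstring above.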
""" if create_using is not None: if create_using.is_directed(): raise nx.NetworkXError("Directed Graph not supported") if create_using.is_multigraph(): raise nx.NetworkXError("Multigraph not supported") G=empty_graph(0,create_using) G.name="Dorogovtsev-Goltsev-Mendes Graph" G.add_edge(0,1) if n==0: return G new_node = 2 # next node to be added for i in range(1,n+1): #iterate over number of generations. last_generation_edges = G.edges() number_of_edges_in_last_generation = len(last_generation_edges) for j in range(0,number_of_edges_in_last_generation): G.add_edge(new_node,last_generation_edges[j][0]) G.add_edge(new_node,last_generation_edges[j][1]) new_node += 1 return G def empty_graph(n=0,create_using=None): """Return the empty graph with n nodes and zero edges. Node labels are the integers 0 to n-1 For example: >>> G=nx.empty_graph(10) >>> G.number_of_nodes() 10 >>> G.number_of_edges() 0 The variable create_using should point to a "graph"-like object that will be cleaned (nodes and edges will be removed) and refitted as an empty "graph" with n nodes with integer labels. This capability is useful for specifying the class-nature of the resulting empty "graph" (i.e. Graph, DiGraph, MyWeirdGraphClass, etc.). The variable create_using has two main uses: Firstly, the variable create_using can be used to create an empty digraph, network,etc. For example, >>> n=10 >>> G=nx.empty_graph(n,create_using=nx.DiGraph()) will create an empty digraph on n nodes. Secondly, one can pass an existing graph (digraph, pseudograph, etc.) via create_using. For example, if G is an existing graph (resp. digraph, pseudograph, etc.), then empty_graph(n,create_using=G) will empty G (i.e. delete all nodes and edges using G.clear() in base) and then add n nodes and zero edges, and return the modified graph (resp. digraph, pseudograph, etc.). See also create_empty_copy(G). 
""" if create_using is None: # default empty graph is a simple graph G=nx.Graph() else: G=create_using G.clear() G.add_nodes_from(range(n)) G.name="empty_graph(%d)"%n return G def grid_2d_graph(m,n,periodic=False,create_using=None): """ Return the 2d grid graph of mxn nodes, each connected to its nearest neighbors. Optional argument periodic=True will connect boundary nodes via periodic boundary conditions. """ G=empty_graph(0,create_using) G.name="grid_2d_graph" rows=range(m) columns=range(n) G.add_nodes_from( (i,j) for i in rows for j in columns ) G.add_edges_from( ((i,j),(i-1,j)) for i in rows for j in columns if i>0 ) G.add_edges_from( ((i,j),(i,j-1)) for i in rows for j in columns if j>0 ) if G.is_directed(): G.add_edges_from( ((i,j),(i+1,j)) for i in rows for j in columns if i2: G.add_edges_from( ((i,0),(i,n-1)) for i in rows ) if G.is_directed(): G.add_edges_from( ((i,n-1),(i,0)) for i in rows ) if m>2: G.add_edges_from( ((0,j),(m-1,j)) for j in columns ) if G.is_directed(): G.add_edges_from( ((m-1,j),(0,j)) for j in columns ) G.name="periodic_grid_2d_graph(%d,%d)"%(m,n) return G def grid_graph(dim,periodic=False): """ Return the n-dimensional grid graph. The dimension is the length of the list 'dim' and the size in each dimension is the value of the list element. E.g. G=grid_graph(dim=[2,3]) produces a 2x3 grid graph. If periodic=True then join grid edges with periodic boundary conditions. 
""" dlabel="%s"%dim if dim==[]: G=empty_graph(0) G.name="grid_graph(%s)"%dim return G if not is_list_of_ints(dim): raise nx.NetworkXError("dim is not a list of integers") if min(dim)<=0: raise nx.NetworkXError(\ "dim is not a list of strictly positive integers") if periodic: func=cycle_graph else: func=path_graph dim=list(dim) current_dim=dim.pop() G=func(current_dim) while len(dim)>0: current_dim=dim.pop() # order matters: copy before it is cleared during the creation of Gnew Gold=G.copy() Gnew=func(current_dim) # explicit: create_using=None # This is so that we get a new graph of Gnew's class. G=nx.cartesian_product(Gnew,Gold) # graph G is done but has labels of the form (1,(2,(3,1))) # so relabel H=nx.relabel_nodes(G, flatten) H.name="grid_graph(%s)"%dlabel return H def hypercube_graph(n): """Return the n-dimensional hypercube. Node labels are the integers 0 to 2**n - 1. """ dim=n*[2] G=grid_graph(dim) G.name="hypercube_graph_(%d)"%n return G def ladder_graph(n,create_using=None): """Return the Ladder graph of length n. This is two rows of n nodes, with each pair connected by a single edge. Node labels are the integers 0 to 2*n - 1. """ if create_using is not None and create_using.is_directed(): raise nx.NetworkXError("Directed Graph not supported") G=empty_graph(2*n,create_using) G.name="ladder_graph_(%d)"%n G.add_edges_from([(v,v+1) for v in range(n-1)]) G.add_edges_from([(v,v+1) for v in range(n,2*n-1)]) G.add_edges_from([(v,v+n) for v in range(n)]) return G def lollipop_graph(m,n,create_using=None): """Return the Lollipop Graph; `K_m` connected to `P_n`. This is the Barbell Graph without the right barbell. For m>1 and n>=0, the complete graph K_m is connected to the path P_n. The resulting m+n nodes are labelled 0,...,m-1 for the complete graph and m,...,m+n-1 for the path. The 2 subgraphs are joined via the edge (m-1,m). If n=0, this is merely a complete graph. Node labels are the integers 0 to number_of_nodes - 1. 
(This graph is an extremal example in David Aldous and Jim Fill's etext on Random Walks on Graphs.) """ if create_using is not None and create_using.is_directed(): raise nx.NetworkXError("Directed Graph not supported") if m<2: raise nx.NetworkXError(\ "Invalid graph description, m should be >=2") if n<0: raise nx.NetworkXError(\ "Invalid graph description, n should be >=0") # the ball G=complete_graph(m,create_using) # the stick G.add_nodes_from([v for v in range(m,m+n)]) if n>1: G.add_edges_from([(v,v+1) for v in range(m,m+n-1)]) # connect ball to stick if m>0: G.add_edge(m-1,m) G.name="lollipop_graph(%d,%d)"%(m,n) return G def null_graph(create_using=None): """Return the Null graph with no nodes or edges. See empty_graph for the use of create_using. """ G=empty_graph(0,create_using) G.name="null_graph()" return G def path_graph(n,create_using=None): """Return the Path graph P_n of n nodes linearly connected by n-1 edges. Node labels are the integers 0 to n - 1. If create_using is a DiGraph then the edges are directed in increasing order. """ G=empty_graph(n,create_using) G.name="path_graph(%d)"%n G.add_edges_from([(v,v+1) for v in range(n-1)]) return G def star_graph(n,create_using=None): """ Return the Star graph with n+1 nodes: one center node, connected to n outer nodes. Node labels are the integers 0 to n. """ G=complete_bipartite_graph(1,n,create_using) G.name="star_graph(%d)"%n return G def trivial_graph(create_using=None): """ Return the Trivial graph with one node (with integer label 0) and no edges. """ G=empty_graph(1,create_using) G.name="trivial_graph()" return G def wheel_graph(n,create_using=None): """ Return the wheel graph: a single hub node connected to each node of the (n-1)-node cycle graph. Node labels are the integers 0 to n - 1. 
""" if n == 0: return nx.empty_graph(n, create_using=create_using) G=star_graph(n-1,create_using) G.name="wheel_graph(%d)"%n G.add_edges_from([(v,v+1) for v in range(1,n-1)]) if n>2: G.add_edge(1,n-1) return G def complete_multipartite_graph(*block_sizes): """Returns the complete multipartite graph with the specified block sizes. Parameters ---------- block_sizes : tuple of integers The number of vertices in each block of the multipartite graph. The length of this tuple is the number of blocks. Returns ------- G : NetworkX Graph Returns the complete multipartite graph with the specified block sizes. For each node, the node attribute ``'block'`` is an integer indicating which block contains the node. Examples -------- Creating a complete tripartite graph, with blocks of one, two, and three vertices, respectively. >>> import networkx as nx >>> G = nx.complete_multipartite_graph(1, 2, 3) >>> [G.node[u]['block'] for u in G] [0, 1, 1, 2, 2, 2] >>> G.edges(0) [(0, 1), (0, 2), (0, 3), (0, 4), (0, 5)] >>> G.edges(2) [(2, 0), (2, 3), (2, 4), (2, 5)] >>> G.edges(4) [(4, 0), (4, 1), (4, 2)] Notes ----- This function generalizes several other graph generator functions. - If no block sizes are given, this returns the null graph. - If a single block size ``n`` is given, this returns the empty graph on ``n`` nodes. - If two block sizes ``m`` and ``n`` are given, this returns the complete bipartite graph on ``m + n`` nodes. - If block sizes ``1`` and ``n`` are given, this returns the star graph on ``n + 1`` nodes. See also -------- complete_bipartite_graph """ G = nx.empty_graph(sum(block_sizes)) # If block_sizes is (n1, n2, n3, ...), create pairs of the form (0, n1), # (n1, n1 + n2), (n1 + n2, n1 + n2 + n3), etc. extents = zip([0] + list(accumulate(block_sizes)), accumulate(block_sizes)) blocks = [range(start, end) for start, end in extents] for (i, block) in enumerate(blocks): G.add_nodes_from(block, block=i) # Across blocks, all vertices should be adjacent. 
    # We can use itertools.combinations() because the complete
    # multipartite graph is an undirected graph.
    for block1, block2 in itertools.combinations(blocks, 2):
        G.add_edges_from(itertools.product(block1, block2))
    G.name = 'complete_multipartite_graph{0}'.format(block_sizes)
    return G

networkx-1.11/networkx/generators/ego.py
"""
Ego graph.
"""
#    Copyright (C) 2010 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
__author__ = """\n""".join(['Drew Conway ',
                            'Aric Hagberg '])
__all__ = ['ego_graph']

import networkx as nx

def ego_graph(G,n,radius=1,center=True,undirected=False,distance=None):
    """Returns induced subgraph of neighbors centered at node n within
    a given radius.

    Parameters
    ----------
    G : graph
      A NetworkX Graph or DiGraph

    n : node
      A single node

    radius : number, optional
      Include all neighbors of distance<=radius from n.

    center : bool, optional
      If False, do not include center node in graph

    undirected : bool, optional
      If True use both in- and out-neighbors of directed graphs.

    distance : key, optional
      Use specified edge data key as distance.  For example, setting
      distance='weight' will use the edge weight to measure the
      distance from the node n.

    Notes
    -----
    For directed graphs D this produces the "out" neighborhood
    or successors.  If you want the neighborhood of predecessors
    first reverse the graph with D.reverse().  If you want both
    directions use the keyword argument undirected=True.

    Node, edge, and graph attributes are copied to the returned subgraph.
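Stepping back to the classic generators: complete_multipartite_graph computes block extents by zipping a shifted cumulative sum against itself. That bookkeeping, and the resulting edge count, can be exercised in isolation (the helper name below is illustrative, not part of NetworkX):

```python
from itertools import accumulate, combinations

def multipartite_edge_count(*block_sizes):
    # block boundaries, as in complete_multipartite_graph:
    # (0, n1), (n1, n1+n2), (n1+n2, n1+n2+n3), ...
    ends = list(accumulate(block_sizes))
    extents = zip([0] + ends[:-1], ends)
    blocks = [range(s, e) for s, e in extents]
    # across-block pairs only; within-block pairs are never joined
    return sum(len(b1) * len(b2) for b1, b2 in combinations(blocks, 2))

print(multipartite_edge_count(1, 2, 3))  # 11 = 1*2 + 1*3 + 2*3
```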
""" if undirected: if distance is not None: sp,_=nx.single_source_dijkstra(G.to_undirected(), n,cutoff=radius, weight=distance) else: sp=nx.single_source_shortest_path_length(G.to_undirected(), n,cutoff=radius) else: if distance is not None: sp,_=nx.single_source_dijkstra(G, n,cutoff=radius, weight=distance) else: sp=nx.single_source_shortest_path_length(G,n,cutoff=radius) H=G.subgraph(sp).copy() if not center: H.remove_node(n) return H networkx-1.11/networkx/generators/directed.py0000644000175000017500000002261112637544500021372 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Generators for some directed graphs, including growing network (GN) graphs and scale-free graphs. """ # Copyright (C) 2006-2009 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __author__ ="""Aric Hagberg (hagberg@lanl.gov)\nWillem Ligtenberg (W.P.A.Ligtenberg@tue.nl)""" __all__ = ['gn_graph', 'gnc_graph', 'gnr_graph','scale_free_graph'] import random import networkx as nx from networkx.generators.classic import empty_graph from networkx.utils import discrete_sequence def gn_graph(n, kernel=None, create_using=None, seed=None): """Return the growing network (GN) digraph with ``n`` nodes. The GN graph is built by adding nodes one at a time with a link to one previously added node. The target node for the link is chosen with probability based on degree. The default attachment kernel is a linear function of the degree of a node. The graph is always a (directed) tree. Parameters ---------- n : int The number of nodes for the generated graph. kernel : function The attachment kernel. create_using : graph, optional (default DiGraph) Return graph of this type. The instance will be cleared. seed : hashable object, optional The seed for the random number generator. 
Examples -------- To create the undirected GN graph, use the :meth:`~DiGraph.to_directed` method:: >>> D = nx.gn_graph(10) # the GN graph >>> G = D.to_undirected() # the undirected version To specify an attachment kernel, use the ``kernel`` keyword argument:: >>> D = nx.gn_graph(10, kernel=lambda x: x ** 1.5) # A_k = k^1.5 References ---------- .. [1] P. L. Krapivsky and S. Redner, Organization of Growing Random Networks, Phys. Rev. E, 63, 066123, 2001. """ if create_using is None: create_using = nx.DiGraph() elif not create_using.is_directed(): raise nx.NetworkXError("Directed Graph required in create_using") if kernel is None: kernel = lambda x: x if seed is not None: random.seed(seed) G=empty_graph(1,create_using) G.name="gn_graph(%s)"%(n) if n==1: return G G.add_edge(1,0) # get started ds=[1,1] # degree sequence for source in range(2,n): # compute distribution from kernel and degree dist=[kernel(d) for d in ds] # choose target from discrete distribution target=discrete_sequence(1,distribution=dist)[0] G.add_edge(source,target) ds.append(1) # the source has only one link (degree one) ds[target]+=1 # add one to the target link degree return G def gnr_graph(n, p, create_using=None, seed=None): """Return the growing network with redirection (GNR) digraph with ``n`` nodes and redirection probability ``p``. The GNR graph is built by adding nodes one at a time with a link to one previously added node. The previous target node is chosen uniformly at random. With probabiliy ``p`` the link is instead "redirected" to the successor node of the target. The graph is always a (directed) tree. Parameters ---------- n : int The number of nodes for the generated graph. p : float The redirection probability. create_using : graph, optional (default DiGraph) Return graph of this type. The instance will be cleared. seed : hashable object, optional The seed for the random number generator. 
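The gn_graph loop earlier maintains a running degree sequence and draws each new node's target with probability proportional to kernel(degree). A standalone sketch of that attachment step, using `random.choices` for the weighted draw instead of NetworkX's `discrete_sequence` (the name `gn_targets` is illustrative):

```python
import random

def gn_targets(n, kernel=lambda k: k, seed=1):
    # growing-network attachment: each new node links to one existing
    # node, chosen with probability proportional to kernel(degree)
    rng = random.Random(seed)
    ds = [1, 1]          # degrees after the initial edge 1 -> 0
    edges = [(1, 0)]
    for source in range(2, n):
        weights = [kernel(d) for d in ds]
        target = rng.choices(range(len(ds)), weights=weights)[0]
        edges.append((source, target))
        ds.append(1)     # the new source has degree one
        ds[target] += 1  # the chosen target gains one
    return edges

edges = gn_targets(10)
print(len(edges))  # 9 -- one edge per added node, i.e. a tree on 10 nodes
```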
Examples -------- To create the undirected GNR graph, use the :meth:`~DiGraph.to_directed` method:: >>> D = nx.gnr_graph(10, 0.5) # the GNR graph >>> G = D.to_undirected() # the undirected version References ---------- .. [1] P. L. Krapivsky and S. Redner, Organization of Growing Random Networks, Phys. Rev. E, 63, 066123, 2001. """ if create_using is None: create_using = nx.DiGraph() elif not create_using.is_directed(): raise nx.NetworkXError("Directed Graph required in create_using") if not seed is None: random.seed(seed) G=empty_graph(1,create_using) G.name="gnr_graph(%s,%s)"%(n,p) if n==1: return G for source in range(1,n): target=random.randrange(0,source) if random.random() < p and target !=0: target=G.successors(target)[0] G.add_edge(source,target) return G def gnc_graph(n, create_using=None, seed=None): """Return the growing network with copying (GNC) digraph with ``n`` nodes. The GNC graph is built by adding nodes one at a time with a link to one previously added node (chosen uniformly at random) and to all of that node's successors. Parameters ---------- n : int The number of nodes for the generated graph. create_using : graph, optional (default DiGraph) Return graph of this type. The instance will be cleared. seed : hashable object, optional The seed for the random number generator. References ---------- .. [1] P. L. Krapivsky and S. Redner, Network Growth by Copying, Phys. Rev. 
    E, 71, 036118, 2005.
    """
    if create_using is None:
        create_using = nx.DiGraph()
    elif not create_using.is_directed():
        raise nx.NetworkXError("Directed Graph required in create_using")

    if seed is not None:
        random.seed(seed)

    G=empty_graph(1,create_using)
    G.name="gnc_graph(%s)"%(n)

    if n==1:
        return G

    for source in range(1,n):
        target=random.randrange(0,source)
        for succ in G.successors(target):
            G.add_edge(source,succ)
        G.add_edge(source,target)
    return G


def scale_free_graph(n, alpha=0.41, beta=0.54, gamma=0.05, delta_in=0.2,
                     delta_out=0, create_using=None, seed=None):
    """Returns a scale-free directed graph.

    Parameters
    ----------
    n : integer
        Number of nodes in graph
    alpha : float
        Probability for adding a new node connected to an existing node
        chosen randomly according to the in-degree distribution.
    beta : float
        Probability for adding an edge between two existing nodes.
        One existing node is chosen randomly according the in-degree
        distribution and the other chosen randomly according to the
        out-degree distribution.
    gamma : float
        Probability for adding a new node connected to an existing node
        chosen randomly according to the out-degree distribution.
    delta_in : float
        Bias for choosing nodes from in-degree distribution.
    delta_out : float
        Bias for choosing nodes from out-degree distribution.
    create_using : graph, optional (default MultiDiGraph)
        Use this graph instance to start the process (default=3-cycle).
    seed : integer, optional
        Seed for random number generator

    Examples
    --------
    Create a scale-free graph on one hundred nodes::

    >>> G = nx.scale_free_graph(100)

    Notes
    -----
    The sum of ``alpha``, ``beta``, and ``gamma`` must be 1.

    References
    ----------
    .. [1] B. Bollobás, C. Borgs, J. Chayes, and O. Riordan,
           Directed scale-free graphs,
           Proceedings of the fourteenth annual ACM-SIAM Symposium on
           Discrete Algorithms, 132--139, 2003.
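In the Bollobás et al. model described above, each growth step is one of three events drawn with probabilities alpha, beta, and gamma. The sampling reduces to a single uniform draw split at alpha and alpha + beta; a standalone sketch (function name illustrative):

```python
import random

def pick_event(alpha, beta, gamma, rng):
    # one uniform draw, partitioned into the three event probabilities
    assert abs(alpha + beta + gamma - 1.0) < 1e-9
    r = rng.random()
    if r < alpha:
        return 'alpha'   # new node with an incoming-attachment edge
    elif r < alpha + beta:
        return 'beta'    # edge between two existing nodes
    return 'gamma'       # new node with an outgoing-attachment edge

rng = random.Random(42)
counts = {'alpha': 0, 'beta': 0, 'gamma': 0}
for _ in range(10000):
    counts[pick_event(0.41, 0.54, 0.05, rng)] += 1
print(counts['alpha'] > counts['gamma'])  # True: alpha events dominate gamma
```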
""" def _choose_node(G,distribution,delta): cumsum=0.0 # normalization psum=float(sum(distribution.values()))+float(delta)*len(distribution) r=random.random() for i in range(0,len(distribution)): cumsum+=(distribution[i]+delta)/psum if r < cumsum: break return i if create_using is None: # start with 3-cycle G = nx.MultiDiGraph() G.add_edges_from([(0,1),(1,2),(2,0)]) else: # keep existing graph structure? G = create_using if not (G.is_directed() and G.is_multigraph()): raise nx.NetworkXError(\ "MultiDiGraph required in create_using") if alpha <= 0: raise ValueError('alpha must be >= 0.') if beta <= 0: raise ValueError('beta must be >= 0.') if gamma <= 0: raise ValueError('beta must be >= 0.') if alpha+beta+gamma !=1.0: raise ValueError('alpha+beta+gamma must equal 1.') G.name="directed_scale_free_graph(%s,alpha=%s,beta=%s,gamma=%s,delta_in=%s,delta_out=%s)"%(n,alpha,beta,gamma,delta_in,delta_out) # seed random number generated (uses None as default) random.seed(seed) while len(G) # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import heapq from itertools import combinations, permutations import math from operator import itemgetter import random import networkx as nx from networkx.utils import random_weighted_sample __author__ = "\n".join(['Aric Hagberg ', 'Pieter Swart ', 'Dan Schult ' 'Joel Miller ', 'Nathan Lemons ' 'Brian Cloteaux ']) __all__ = ['configuration_model', 'directed_configuration_model', 'expected_degree_graph', 'havel_hakimi_graph', 'directed_havel_hakimi_graph', 'degree_sequence_tree', 'random_degree_sequence_graph'] def configuration_model(deg_sequence,create_using=None,seed=None): """Return a random graph with the given degree sequence. The configuration model generates a random pseudograph (graph with parallel edges and self loops) by randomly assigning edges to match the given degree sequence. Parameters ---------- deg_sequence : list of integers Each list entry corresponds to the degree of a node. 
create_using : graph, optional (default MultiGraph) Return graph of this type. The instance will be cleared. seed : hashable object, optional Seed for random number generator. Returns ------- G : MultiGraph A graph with the specified degree sequence. Nodes are labeled starting at 0 with an index corresponding to the position in deg_sequence. Raises ------ NetworkXError If the degree sequence does not have an even sum. See Also -------- is_valid_degree_sequence Notes ----- As described by Newman [1]_. A non-graphical degree sequence (not realizable by some simple graph) is allowed since this function returns graphs with self loops and parallel edges. An exception is raised if the degree sequence does not have an even sum. This configuration model construction process can lead to duplicate edges and loops. You can remove the self-loops and parallel edges (see below) which will likely result in a graph that doesn't have the exact degree sequence specified. The density of self-loops and parallel edges tends to decrease as the number of nodes increases. However, typically the number of self-loops will approach a Poisson distribution with a nonzero mean, and similarly for the number of parallel edges. Consider a node with k stubs. The probability of being joined to another stub of the same node is basically (k-1)/N where k is the degree and N is the number of nodes. So the probability of a self-loop scales like c/N for some constant c. As N grows, this means we expect c self-loops. Similarly for parallel edges. References ---------- .. [1] M.E.J. Newman, "The structure and function of complex networks", SIAM REVIEW 45-2, pp 167-256, 2003. 
Examples -------- >>> from networkx.utils import powerlaw_sequence >>> z=nx.utils.create_degree_sequence(100,powerlaw_sequence) >>> G=nx.configuration_model(z) To remove parallel edges: >>> G=nx.Graph(G) To remove self loops: >>> G.remove_edges_from(G.selfloop_edges()) """ if not sum(deg_sequence)%2 ==0: raise nx.NetworkXError('Invalid degree sequence') if create_using is None: create_using = nx.MultiGraph() elif create_using.is_directed(): raise nx.NetworkXError("Directed Graph not supported") if not seed is None: random.seed(seed) # start with empty N-node graph N=len(deg_sequence) # allow multiedges and selfloops G=nx.empty_graph(N,create_using) if N==0 or max(deg_sequence)==0: # done if no edges return G # build stublist, a list of available degree-repeated stubs # e.g. for deg_sequence=[3,2,1,1,1] # initially, stublist=[1,1,1,2,2,3,4,5] # i.e., node 1 has degree=3 and is repeated 3 times, etc. stublist=[] for n in G: for i in range(deg_sequence[n]): stublist.append(n) # shuffle stublist and assign pairs by removing 2 elements at a time random.shuffle(stublist) while stublist: n1 = stublist.pop() n2 = stublist.pop() G.add_edge(n1,n2) G.name="configuration_model %d nodes %d edges"%(G.order(),G.size()) return G def directed_configuration_model(in_degree_sequence, out_degree_sequence, create_using=None,seed=None): """Return a directed_random graph with the given degree sequences. The configuration model generates a random directed pseudograph (graph with parallel edges and self loops) by randomly assigning edges to match the given degree sequences. Parameters ---------- in_degree_sequence : list of integers Each list entry corresponds to the in-degree of a node. out_degree_sequence : list of integers Each list entry corresponds to the out-degree of a node. create_using : graph, optional (default MultiDiGraph) Return graph of this type. The instance will be cleared. seed : hashable object, optional Seed for random number generator. 
Returns ------- G : MultiDiGraph A graph with the specified degree sequences. Nodes are labeled starting at 0 with an index corresponding to the position in deg_sequence. Raises ------ NetworkXError If the degree sequences do not have the same sum. See Also -------- configuration_model Notes ----- Algorithm as described by Newman [1]_. A non-graphical degree sequence (not realizable by some simple graph) is allowed since this function returns graphs with self loops and parallel edges. An exception is raised if the degree sequences does not have the same sum. This configuration model construction process can lead to duplicate edges and loops. You can remove the self-loops and parallel edges (see below) which will likely result in a graph that doesn't have the exact degree sequence specified. This "finite-size effect" decreases as the size of the graph increases. References ---------- .. [1] Newman, M. E. J. and Strogatz, S. H. and Watts, D. J. Random graphs with arbitrary degree distributions and their applications Phys. Rev. E, 64, 026118 (2001) Examples -------- >>> D=nx.DiGraph([(0,1),(1,2),(2,3)]) # directed path graph >>> din=list(D.in_degree().values()) >>> dout=list(D.out_degree().values()) >>> din.append(1) >>> dout[0]=2 >>> D=nx.directed_configuration_model(din,dout) To remove parallel edges: >>> D=nx.DiGraph(D) To remove self loops: >>> D.remove_edges_from(D.selfloop_edges()) """ if not sum(in_degree_sequence) == sum(out_degree_sequence): raise nx.NetworkXError('Invalid degree sequences. 
' 'Sequences must have equal sums.') if create_using is None: create_using = nx.MultiDiGraph() if not seed is None: random.seed(seed) nin=len(in_degree_sequence) nout=len(out_degree_sequence) # pad in- or out-degree sequence with zeros to match lengths if nin>nout: out_degree_sequence.extend((nin-nout)*[0]) else: in_degree_sequence.extend((nout-nin)*[0]) # start with empty N-node graph N=len(in_degree_sequence) # allow multiedges and selfloops G=nx.empty_graph(N,create_using) if N==0 or max(in_degree_sequence)==0: # done if no edges return G # build stublists of available degree-repeated stubs # e.g. for degree_sequence=[3,2,1,1,1] # initially, stublist=[1,1,1,2,2,3,4,5] # i.e., node 1 has degree=3 and is repeated 3 times, etc. in_stublist=[] for n in G: for i in range(in_degree_sequence[n]): in_stublist.append(n) out_stublist=[] for n in G: for i in range(out_degree_sequence[n]): out_stublist.append(n) # shuffle stublists and assign pairs by removing 2 elements at a time random.shuffle(in_stublist) random.shuffle(out_stublist) while in_stublist and out_stublist: source = out_stublist.pop() target = in_stublist.pop() G.add_edge(source,target) G.name="directed configuration_model %d nodes %d edges"%(G.order(),G.size()) return G def expected_degree_graph(w, seed=None, selfloops=True): r"""Return a random graph with given expected degrees. Given a sequence of expected degrees `W=(w_0,w_1,\ldots,w_{n-1}`) of length `n` this algorithm assigns an edge between node `u` and node `v` with probability .. math:: p_{uv} = \frac{w_u w_v}{\sum_k w_k} . Parameters ---------- w : list The list of expected degrees. selfloops: bool (default=True) Set to False to remove the possibility of self-loop edges. seed : hashable object, optional The seed for the random number generator. Returns ------- Graph Examples -------- >>> z=[10 for i in range(100)] >>> G=nx.expected_degree_graph(z) Notes ----- The nodes have integer labels corresponding to index of expected degrees input sequence. 
The complexity of this algorithm is `\mathcal{O}(n+m)` where `n` is the number of nodes and `m` is the expected number of edges. The model in [1]_ includes the possibility of self-loop edges. Set selfloops=False to produce a graph without self loops. For finite graphs this model doesn't produce exactly the given expected degree sequence. Instead the expected degrees are as follows. For the case without self loops (selfloops=False), .. math:: E[deg(u)] = \sum_{v \ne u} p_{uv} = w_u \left( 1 - \frac{w_u}{\sum_k w_k} \right) . NetworkX uses the standard convention that a self-loop edge counts 2 in the degree of a node, so with self loops (selfloops=True), .. math:: E[deg(u)] = \sum_{v \ne u} p_{uv} + 2 p_{uu} = w_u \left( 1 + \frac{w_u}{\sum_k w_k} \right) . References ---------- .. [1] Fan Chung and L. Lu, Connected components in random graphs with given expected degree sequences, Ann. Combinatorics, 6, pp. 125-145, 2002. .. [2] Joel Miller and Aric Hagberg, Efficient generation of networks with given expected degrees, in Algorithms and Models for the Web-Graph (WAW 2011), Alan Frieze, Paul Horn, and Paweł Prałat (Eds), LNCS 6732, pp. 115-126, 2011. 
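The edge rule above can also be realized with a naive `O(n^2)` loop (an illustrative standalone sketch; the implementation that follows is faster but samples from the same distribution):

```python
import random

def naive_expected_degree_graph(w, seed=None, selfloops=True):
    """Test each pair independently with p_uv = w_u*w_v/sum(w), clipped to 1."""
    total = float(sum(w))
    if total == 0:
        return []  # no possible edges
    rng = random.Random(seed)
    n = len(w)
    edges = []
    for u in range(n):
        # include (u, u) only when self-loops are allowed
        for v in range(u if selfloops else u + 1, n):
            p = min(w[u] * w[v] / total, 1.0)
            if rng.random() < p:
                edges.append((u, v))
    return edges
```

With `w = [10, 10, 10, 10]` every pair probability clips to 1, so all pairs appear deterministically.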
""" n = len(w) G=nx.empty_graph(n) if n==0 or max(w)==0: # done if no edges return G if seed is not None: random.seed(seed) rho = 1/float(sum(w)) # sort weights, largest first # preserve order of weights for integer node label mapping order = sorted(enumerate(w),key=itemgetter(1),reverse=True) mapping = dict((c,uv[0]) for c,uv in enumerate(order)) seq = [v for u,v in order] last=n if not selfloops: last-=1 for u in range(last): v = u if not selfloops: v += 1 factor = seq[u] * rho p = seq[v]*factor if p>1: p = 1 while v0: if p != 1: r = random.random() v += int(math.floor(math.log(r)/math.log(1-p))) if v < n: q = seq[v]*factor if q>1: q = 1 if random.random() < q/p: G.add_edge(mapping[u],mapping[v]) v += 1 p = q return G def havel_hakimi_graph(deg_sequence,create_using=None): """Return a simple graph with given degree sequence constructed using the Havel-Hakimi algorithm. Parameters ---------- deg_sequence: list of integers Each integer corresponds to the degree of a node (need not be sorted). create_using : graph, optional (default Graph) Return graph of this type. The instance will be cleared. Directed graphs are not allowed. Raises ------ NetworkXException For a non-graphical degree sequence (i.e. one not realizable by some simple graph). Notes ----- The Havel-Hakimi algorithm constructs a simple graph by successively connecting the node of highest degree to other nodes of highest degree, resorting remaining nodes by degree, and repeating the process. The resulting graph has a high degree-associativity. Nodes are labeled 1,.., len(deg_sequence), corresponding to their position in deg_sequence. The basic algorithm is from Hakimi [1]_ and was generalized by Kleitman and Wang [2]_. References ---------- .. [1] Hakimi S., On Realizability of a Set of Integers as Degrees of the Vertices of a Linear Graph. I, Journal of SIAM, 10(3), pp. 496-506 (1962) .. [2] Kleitman D.J. and Wang D.L. 
Algorithms for Constructing Graphs and Digraphs with Given Valences and Factors Discrete Mathematics, 6(1), pp. 79-88 (1973) """ if not nx.is_valid_degree_sequence(deg_sequence): raise nx.NetworkXError('Invalid degree sequence') if create_using is not None: if create_using.is_directed(): raise nx.NetworkXError("Directed graphs are not supported") p = len(deg_sequence) G=nx.empty_graph(p,create_using) num_degs = [] for i in range(p): num_degs.append([]) dmax, dsum, n = 0, 0, 0 for d in deg_sequence: # Process only the non-zero integers if d>0: num_degs[d].append(n) dmax, dsum, n = max(dmax,d), dsum+d, n+1 # Return graph if no edges if n==0: return G modstubs = [(0,0)]*(dmax+1) # Successively reduce degree sequence by removing the maximum degree while n > 0: # Retrieve the maximum degree in the sequence while len(num_degs[dmax]) == 0: dmax -= 1; # If there are not enough stubs to connect to, then the sequence is # not graphical if dmax > n-1: raise nx.NetworkXError('Non-graphical integer sequence') # Remove largest stub in list source = num_degs[dmax].pop() n -= 1 # Reduce the next dmax largest stubs mslen = 0 k = dmax for i in range(dmax): while len(num_degs[k]) == 0: k -= 1 target = num_degs[k].pop() G.add_edge(source, target) n -= 1 if k > 1: modstubs[mslen] = (k-1,target) mslen += 1 # Add back to the list any nonzero stubs that were removed for i in range(mslen): (stubval, stubtarget) = modstubs[i] num_degs[stubval].append(stubtarget) n += 1 G.name="havel_hakimi_graph %d nodes %d edges"%(G.order(),G.size()) return G def directed_havel_hakimi_graph(in_deg_sequence, out_deg_sequence, create_using=None): """Return a directed graph with the given degree sequences. Parameters ---------- in_deg_sequence : list of integers Each list entry corresponds to the in-degree of a node. out_deg_sequence : list of integers Each list entry corresponds to the out-degree of a node. create_using : graph, optional (default DiGraph) Return graph of this type. 
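The reduction described above can be sketched without the stub bookkeeping (a compact, slower standalone variant; `havel_hakimi_edges` is an illustrative name, not a NetworkX function):

```python
def havel_hakimi_edges(deg_sequence):
    """Havel-Hakimi: connect the highest-degree node to the next-highest ones."""
    # Track (remaining degree, node) pairs, largest degree first.
    remaining = sorted(
        ((d, v) for v, d in enumerate(deg_sequence)), reverse=True)
    edges = []
    while remaining and remaining[0][0] > 0:
        d, source = remaining.pop(0)
        if d > len(remaining):
            raise ValueError("non-graphical integer sequence")
        # Connect source to the d nodes of next-highest degree.
        for i in range(d):
            dt, target = remaining[i]
            if dt == 0:
                raise ValueError("non-graphical integer sequence")
            edges.append((source, target))
            remaining[i] = (dt - 1, target)
        remaining.sort(reverse=True)  # re-sort before the next round
    return edges
```

Re-sorting on every round makes this quadratic-plus, which is why the implementation above keeps per-degree stub lists instead.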
The instance will be cleared. Returns ------- G : DiGraph A graph with the specified degree sequences. Nodes are labeled starting at 0 with an index corresponding to the position in deg_sequence. Raises ------ NetworkXError If the degree sequences are not digraphical. See Also -------- configuration_model Notes ----- Algorithm as described by Kleitman and Wang [1]_. References ---------- .. [1] D.J. Kleitman and D.L. Wang Algorithms for Constructing Graphs and Digraphs with Given Valences and Factors Discrete Mathematics, 6(1), pp. 79-88 (1973) """ assert(nx.utils.is_list_of_ints(in_deg_sequence)) assert(nx.utils.is_list_of_ints(out_deg_sequence)) if create_using is None: create_using = nx.DiGraph() # Process the sequences and form two heaps to store degree pairs with # either zero or nonzero in degrees sumin, sumout, nin, nout = 0, 0, len(in_deg_sequence), len(out_deg_sequence) maxn = max(nin, nout) G = nx.empty_graph(maxn,create_using) if maxn==0: return G maxin = 0 stubheap, zeroheap = [ ], [ ] for n in range(maxn): in_deg, out_deg = 0, 0 if n<nin: in_deg = in_deg_sequence[n] if n<nout: out_deg = out_deg_sequence[n] maxin = max(maxin, in_deg) sumin, sumout = sumin+in_deg, sumout+out_deg if in_deg > 0: stubheap.append((-1*out_deg, -1*in_deg,n)) elif out_deg > 0: zeroheap.append((-1*out_deg,n)) if sumin != sumout: raise nx.NetworkXError( 'Invalid degree sequences. 
Sequences must have equal sums.') heapq.heapify(stubheap) heapq.heapify(zeroheap) modstubs = [(0,0,0)]*(maxin+1) # Successively reduce degree sequence by removing the maximum while stubheap: # Remove first value in the sequence with a non-zero in degree (freeout, freein, target) = heapq.heappop(stubheap) freein *= -1 if freein > len(stubheap)+len(zeroheap): raise nx.NetworkXError('Non-digraphical integer sequence') # Attach arcs from the nodes with the most stubs mslen = 0 for i in range(freein): if zeroheap and (not stubheap or stubheap[0][0] > zeroheap[0][0]): (stubout, stubsource) = heapq.heappop(zeroheap) stubin = 0 else: (stubout, stubin, stubsource) = heapq.heappop(stubheap) if stubout == 0: raise nx.NetworkXError('Non-digraphical integer sequence') G.add_edge(stubsource, target) # Check if source is now totally connected if stubout+1<0 or stubin<0: modstubs[mslen] = (stubout+1, stubin, stubsource) mslen += 1 # Add the nodes back to the heaps that still have available stubs for i in range(mslen): stub = modstubs[i] if stub[1] < 0: heapq.heappush(stubheap, stub) else: heapq.heappush(zeroheap, (stub[0], stub[2])) if freeout<0: heapq.heappush(zeroheap, (freeout, target)) G.name="directed_havel_hakimi_graph %d nodes %d edges"%(G.order(),G.size()) return G def degree_sequence_tree(deg_sequence,create_using=None): """Make a tree for the given degree sequence. 
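The tree condition used by `degree_sequence_tree` (nodes minus edges equals 1, via the handshake lemma) can be checked with a small standalone helper; as in the function itself, this is the validity test applied, a necessary rather than fully sufficient condition on its own:

```python
def is_tree_sequence(deg_sequence):
    """A tree on n nodes has n - 1 edges; edges = sum(degrees) / 2."""
    return len(deg_sequence) - sum(deg_sequence) / 2.0 == 1.0
```

For example, a path on 4 nodes has degrees `[1, 2, 2, 1]` (4 nodes, 3 edges), while a cycle `[2, 2, 2, 2]` fails.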
A tree has #nodes-#edges=1 so the degree sequence must have len(deg_sequence)-sum(deg_sequence)/2=1 """ if not len(deg_sequence)-sum(deg_sequence)/2.0 == 1.0: raise nx.NetworkXError("Degree sequence invalid") if create_using is not None and create_using.is_directed(): raise nx.NetworkXError("Directed Graph not supported") # single node tree if len(deg_sequence)==1: G=nx.empty_graph(0,create_using) return G # all degrees greater than 1 deg=[s for s in deg_sequence if s>1] deg.sort(reverse=True) # make path graph as backbone n=len(deg)+2 G=nx.path_graph(n,create_using) last=n # add the leaves for source in range(1,n-1): nedges=deg.pop()-2 for target in range(last,last+nedges): G.add_edge(source, target) last+=nedges # in case we added one too many if len(G.degree())>len(deg_sequence): G.remove_node(0) return G def random_degree_sequence_graph(sequence, seed=None, tries=10): r"""Return a simple random graph with the given degree sequence. If the maximum degree `d_m` in the sequence is `O(m^{1/4})` then the algorithm produces almost uniform random graphs in `O(m d_m)` time where `m` is the number of edges. Parameters ---------- sequence : list of integers Sequence of degrees seed : hashable object, optional Seed for random number generator tries : int, optional Maximum number of tries to create a graph Returns ------- G : Graph A graph with the specified degree sequence. Nodes are labeled starting at 0 with an index corresponding to the position in the sequence. Raises ------ NetworkXUnfeasible If the degree sequence is not graphical. NetworkXError If a graph is not produced in specified number of tries See Also -------- is_valid_degree_sequence, configuration_model Notes ----- The generator algorithm [1]_ is not guaranteed to produce a graph. References ---------- .. [1] Moshen Bayati, Jeong Han Kim, and Amin Saberi, A sequential algorithm for generating random graphs. 
Algorithmica, Volume 58, Number 4, 860-910, DOI: 10.1007/s00453-009-9340-1 Examples -------- >>> sequence = [1, 2, 2, 3] >>> G = nx.random_degree_sequence_graph(sequence) >>> sorted(G.degree().values()) [1, 2, 2, 3] """ DSRG = DegreeSequenceRandomGraph(sequence, seed=seed) for try_n in range(tries): try: return DSRG.generate() except nx.NetworkXUnfeasible: pass raise nx.NetworkXError('failed to generate graph in %d tries'%tries) class DegreeSequenceRandomGraph(object): # class to generate random graphs with a given degree sequence # use random_degree_sequence_graph() def __init__(self, degree, seed=None): if not nx.is_valid_degree_sequence(degree): raise nx.NetworkXUnfeasible('degree sequence is not graphical') if seed is not None: random.seed(seed) self.degree = list(degree) # node labels are integers 0,...,n-1 self.m = sum(self.degree)/2.0 # number of edges try: self.dmax = max(self.degree) # maximum degree except ValueError: self.dmax = 0 def generate(self): # remaining_degree is mapping from int->remaining degree self.remaining_degree = dict(enumerate(self.degree)) # add all nodes to make sure we get isolated nodes self.graph = nx.Graph() self.graph.add_nodes_from(self.remaining_degree) # remove zero degree nodes for n,d in list(self.remaining_degree.items()): if d == 0: del self.remaining_degree[n] if len(self.remaining_degree) > 0: # build graph in three phases according to how many unmatched edges self.phase1() self.phase2() self.phase3() return self.graph def update_remaining(self, u, v, aux_graph=None): # decrement remaining nodes, modify auxiliary graph if in phase3 if aux_graph is not None: # remove edges from auxiliary graph aux_graph.remove_edge(u,v) if self.remaining_degree[u] == 1: del self.remaining_degree[u] if aux_graph is not None: aux_graph.remove_node(u) else: self.remaining_degree[u] -= 1 if self.remaining_degree[v] == 1: del self.remaining_degree[v] if aux_graph is not None: aux_graph.remove_node(v) else: self.remaining_degree[v] -= 1 def p(self,u,v): # degree probability return 1 - self.degree[u]*self.degree[v]/(4.0*self.m) def q(self,u,v): # remaining degree probability norm = float(max(self.remaining_degree.values()))**2 return self.remaining_degree[u]*self.remaining_degree[v]/norm def suitable_edge(self): # Check if there is a suitable edge that is not in the graph # True if an (arbitrary) remaining node has at least one possible # connection to another remaining node nodes = iter(self.remaining_degree) u = next(nodes) # one arbitrary node for v in nodes: # loop over all other remaining nodes if not self.graph.has_edge(u, v): return True return False def phase1(self): # choose node pairs from (degree) weighted distribution while sum(self.remaining_degree.values()) >= 2 * self.dmax**2: u,v = sorted(random_weighted_sample(self.remaining_degree, 2)) if self.graph.has_edge(u,v): continue if random.random() < self.p(u,v): # accept edge self.graph.add_edge(u,v) self.update_remaining(u,v) def phase2(self): # choose remaining nodes uniformly at random and use rejection sampling while len(self.remaining_degree) >= 2 * self.dmax: norm = float(max(self.remaining_degree.values()))**2 while True: u,v = sorted(random.sample(self.remaining_degree.keys(), 2)) if self.graph.has_edge(u,v): continue if random.random() < self.q(u,v): break if random.random() < self.p(u,v): # accept edge self.graph.add_edge(u,v) self.update_remaining(u,v) def phase3(self): # build potential remaining edges and choose with rejection sampling potential_edges = combinations(self.remaining_degree, 2) # build auxiliary graph of potential edges not already in graph H = nx.Graph([(u,v) for (u,v) in potential_edges if not self.graph.has_edge(u,v)]) while self.remaining_degree: if not self.suitable_edge(): raise nx.NetworkXUnfeasible('no suitable edges left') while True: u,v = sorted(random.choice(H.edges())) if random.random() < self.q(u,v): break if random.random() < self.p(u,v): # accept edge self.graph.add_edge(u,v)
self.update_remaining(u,v, aux_graph=H) networkx-1.11/networkx/generators/threshold.py0000644000175000017500000007033212637544500021606 0ustar aricaric00000000000000""" Threshold Graphs - Creation, manipulation and identification. """ __author__ = """Aric Hagberg (hagberg@lanl.gov)\nPieter Swart (swart@lanl.gov)\nDan Schult (dschult@colgate.edu)""" # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. # __all__=[] import random # for swap_d from math import sqrt import networkx def is_threshold_graph(G): """ Returns True if G is a threshold graph. """ return is_threshold_sequence(list(G.degree().values())) def is_threshold_sequence(degree_sequence): """ Returns True if the sequence is a threshold degree sequence. Uses the property that a threshold graph must be constructed by adding either dominating or isolated nodes. Thus, it can be deconstructed iteratively by removing a node of degree zero or a node that connects to the remaining nodes. If this deconstruction fails then the sequence is not a threshold sequence. """ ds=degree_sequence[:] # get a copy so we don't destroy original ds.sort() while ds: if ds[0]==0: # if isolated node ds.pop(0) # remove it continue if ds[-1]!=len(ds)-1: # is the largest degree node dominating? return False # no, not a threshold degree sequence ds.pop() # yes, largest is the dominating node ds=[ d-1 for d in ds ] # remove it and decrement all degrees return True def creation_sequence(degree_sequence,with_labels=False,compact=False): """ Determines the creation sequence for the given threshold degree sequence. The creation sequence is a list of single characters 'd' or 'i': 'd' for dominating or 'i' for isolated vertices. Dominating vertices are connected to all vertices present when they are added. The first node added is by convention 'd'.
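The deconstruction property described above can be exercised as a standalone helper (same logic as `is_threshold_sequence`, reimplemented here for illustration):

```python
def threshold_check(degree_sequence):
    """Repeatedly strip an isolated or dominating node; threshold iff empty."""
    ds = sorted(degree_sequence)
    while ds:
        if ds[0] == 0:                 # isolated node: drop it
            ds.pop(0)
            continue
        if ds[-1] != len(ds) - 1:      # largest degree must dominate
            return False
        ds.pop()                       # remove the dominating node...
        ds = [d - 1 for d in ds]       # ...and decrement the rest
    return True
```

The path P4 is the classic smallest non-threshold connected graph, so `[1, 2, 2, 1]` should fail while a star or P3 passes.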
This list can be converted to a string if desired using "".join(cs) If with_labels==True: Returns a list of 2-tuples containing the vertex number and a character 'd' or 'i' which describes the type of vertex. If compact==True: Returns the creation sequence in a compact form that is the number of 'i's and 'd's alternating. Examples: [1,2,2,3] represents d,i,i,d,d,i,i,i [3,1,2] represents d,d,d,i,d,d Notice that the first number is the first vertex to be used for construction and so is always 'd'. with_labels and compact cannot both be True. Returns None if the sequence is not a threshold sequence """ if with_labels and compact: raise ValueError("compact sequences cannot be labeled") # make an indexed copy if isinstance(degree_sequence,dict): # labeled degree sequence ds = [ [degree,label] for (label,degree) in degree_sequence.items() ] else: ds=[ [d,i] for i,d in enumerate(degree_sequence) ] ds.sort() cs=[] # creation sequence while ds: if ds[0][0]==0: # isolated node (d,v)=ds.pop(0) if len(ds)>0: # make sure we start with a d cs.insert(0,(v,'i')) else: cs.insert(0,(v,'d')) continue if ds[-1][0]!=len(ds)-1: # Not dominating node return None # not a threshold degree sequence (d,v)=ds.pop() cs.insert(0,(v,'d')) ds=[ [d[0]-1,d[1]] for d in ds ] # decrement due to removing node if with_labels: return cs if compact: return make_compact(cs) return [ v[1] for v in cs ] # not labeled def make_compact(creation_sequence): """ Returns the creation sequence in a compact form that is the number of 'i's and 'd's alternating. Examples: [1,2,2,3] represents d,i,i,d,d,i,i,i. [3,1,2] represents d,d,d,i,d,d. Notice that the first number is the first vertex to be used for construction and so is always 'd'. Labeled creation sequences lose their labels in the compact representation.
""" first=creation_sequence[0] if isinstance(first,str): # creation sequence cs = creation_sequence[:] elif isinstance(first,tuple): # labeled creation sequence cs = [ s[1] for s in creation_sequence ] elif isinstance(first,int): # compact creation sequence return creation_sequence else: raise TypeError("Not a valid creation sequence type") ccs=[] count=1 # count the run lengths of d's or i's. for i in range(1,len(cs)): if cs[i]==cs[i-1]: count+=1 else: ccs.append(count) count=1 ccs.append(count) # don't forget the last one return ccs def uncompact(creation_sequence): """ Converts a compact creation sequence for a threshold graph to a standard creation sequence (unlabeled). If the creation_sequence is already standard, return it. See creation_sequence. """ first=creation_sequence[0] if isinstance(first,str): # creation sequence return creation_sequence elif isinstance(first,tuple): # labeled creation sequence return creation_sequence elif isinstance(first,int): # compact creation sequence ccscopy=creation_sequence[:] else: raise TypeError("Not a valid creation sequence type") cs = [] while ccscopy: cs.extend(ccscopy.pop(0)*['d']) if ccscopy: cs.extend(ccscopy.pop(0)*['i']) return cs def creation_sequence_to_weights(creation_sequence): """ Returns a list of node weights which create the threshold graph designated by the creation sequence. The weights are scaled so that the threshold is 1.0. The order of the nodes is the same as that in the creation sequence. 
""" # Turn input sequence into a labeled creation sequence first=creation_sequence[0] if isinstance(first,str): # creation sequence if isinstance(creation_sequence,list): wseq = creation_sequence[:] else: wseq = list(creation_sequence) # string like 'ddidid' elif isinstance(first,tuple): # labeled creation sequence wseq = [ v[1] for v in creation_sequence] elif isinstance(first,int): # compact creation sequence wseq = uncompact(creation_sequence) else: raise TypeError("Not a valid creation sequence type") # pass through twice--first backwards wseq.reverse() w=0 prev='i' for j,s in enumerate(wseq): if s=='i': wseq[j]=w prev=s elif prev=='i': prev=s w+=1 wseq.reverse() # now pass through forwards for j,s in enumerate(wseq): if s=='d': wseq[j]=w prev=s elif prev=='d': prev=s w+=1 # Now scale weights if prev=='d': w+=1 wscale=1./float(w) return [ ww*wscale for ww in wseq] #return wseq def weights_to_creation_sequence(weights,threshold=1,with_labels=False,compact=False): """ Returns a creation sequence for a threshold graph determined by the weights and threshold given as input. If the sum of two node weights is greater than the threshold value, an edge is created between these nodes. The creation sequence is a list of single characters 'd' or 'i': 'd' for dominating or 'i' for isolated vertices. Dominating vertices are connected to all vertices present when it is added. The first node added is by convention 'd'. If with_labels==True: Returns a list of 2-tuples containing the vertex number and a character 'd' or 'i' which describes the type of vertex. If compact==True: Returns the creation sequence in a compact form that is the number of 'i's and 'd's alternating. Examples: [1,2,2,3] represents d,i,i,d,d,i,i,i [3,1,2] represents d,d,d,i,d,d Notice that the first number is the first vertex to be used for construction and so is always 'd'. with_labels and compact cannot both be True. 
""" if with_labels and compact: raise ValueError("compact sequences cannot be labeled") # make an indexed copy if isinstance(weights,dict): # labeled weights wseq = [ [w,label] for (label,w) in weights.items() ] else: wseq = [ [w,i] for i,w in enumerate(weights) ] wseq.sort() cs=[] # creation sequence cutoff=threshold-wseq[-1][0] while wseq: if wseq[0][0]0: # get new degree sequence on subgraph dsdict=H.degree() ds=[ [d,v] for v,d in dsdict.items() ] ds.sort() # Update threshold graph nodes if ds[-1][0]==0: # all are isolated cs.extend( zip( dsdict, ['i']*(len(ds)-1)+['d']) ) break # Done! # pull off isolated nodes while ds[0][0]==0: (d,iso)=ds.pop(0) cs.append((iso,'i')) # find new biggest node (d,bigv)=ds.pop() # add edges of star to t_g cs.append((bigv,'d')) # form subgraph of neighbors of big node H=H.subgraph(H.neighbors(bigv)) cs.reverse() return cs ### Properties of Threshold Graphs def triangles(creation_sequence): """ Compute number of triangles in the threshold graph with the given creation sequence. """ # shortcut algoritm that doesn't require computing number # of triangles at each node. cs=creation_sequence # alias dr=cs.count("d") # number of d's in sequence ntri=dr*(dr-1)*(dr-2)/6 # number of triangles in clique of nd d's # now add dr choose 2 triangles for every 'i' in sequence where # dr is the number of d's to the right of the current i for i,typ in enumerate(cs): if typ=="i": ntri+=dr*(dr-1)/2 else: dr-=1 return ntri def triangle_sequence(creation_sequence): """ Return triangle sequence for the given threshold graph creation sequence. 
""" cs=creation_sequence seq=[] dr=cs.count("d") # number of d's to the right of the current pos dcur=(dr-1)*(dr-2) // 2 # number of triangles through a node of clique dr irun=0 # number of i's in the last run drun=0 # number of d's in the last run for i,sym in enumerate(cs): if sym=="d": drun+=1 tri=dcur+(dr-1)*irun # new triangles at this d else: # cs[i]="i": if prevsym=="d": # new string of i's dcur+=(dr-1)*irun # accumulate shared shortest paths irun=0 # reset i run counter dr-=drun # reduce number of d's to right drun=0 # reset d run counter irun+=1 tri=dr*(dr-1) // 2 # new triangles at this i seq.append(tri) prevsym=sym return seq def cluster_sequence(creation_sequence): """ Return cluster sequence for the given threshold graph creation sequence. """ triseq=triangle_sequence(creation_sequence) degseq=degree_sequence(creation_sequence) cseq=[] for i,deg in enumerate(degseq): tri=triseq[i] if deg <= 1: # isolated vertex or single pair gets cc 0 cseq.append(0) continue max_size=(deg*(deg-1)) // 2 cseq.append(float(tri)/float(max_size)) return cseq def degree_sequence(creation_sequence): """ Return degree sequence for the threshold graph with the given creation sequence """ cs=creation_sequence # alias seq=[] rd=cs.count("d") # number of d to the right for i,sym in enumerate(cs): if sym=="d": rd-=1 seq.append(rd+i) else: seq.append(rd) return seq def density(creation_sequence): """ Return the density of the graph with this creation_sequence. The density is the fraction of possible edges present. """ N=len(creation_sequence) two_size=sum(degree_sequence(creation_sequence)) two_possible=N*(N-1) den=two_size/float(two_possible) return den def degree_correlation(creation_sequence): """ Return the degree-degree correlation over all edges. 
""" cs=creation_sequence s1=0 # deg_i*deg_j s2=0 # deg_i^2+deg_j^2 s3=0 # deg_i+deg_j m=0 # number of edges rd=cs.count("d") # number of d nodes to the right rdi=[ i for i,sym in enumerate(cs) if sym=="d"] # index of "d"s ds=degree_sequence(cs) for i,sym in enumerate(cs): if sym=="d": if i!=rdi[0]: print("Logic error in degree_correlation",i,rdi) raise ValueError rdi.pop(0) degi=ds[i] for dj in rdi: degj=ds[dj] s1+=degj*degi s2+=degi**2+degj**2 s3+=degi+degj m+=1 denom=(2*m*s2-s3*s3) numer=(4*m*s1-s3*s3) if denom==0: if numer==0: return 1 raise ValueError("Zero Denominator but Numerator is %s"%numer) return numer/float(denom) def shortest_path(creation_sequence,u,v): """ Find the shortest path between u and v in a threshold graph G with the given creation_sequence. For an unlabeled creation_sequence, the vertices u and v must be integers in (0,len(sequence)) refering to the position of the desired vertices in the sequence. For a labeled creation_sequence, u and v are labels of veritices. Use cs=creation_sequence(degree_sequence,with_labels=True) to convert a degree sequence to a creation sequence. Returns a list of vertices from u to v. 
Example: if they are neighbors, it returns [u,v] """ # Turn input sequence into a labeled creation sequence first=creation_sequence[0] if isinstance(first,str): # creation sequence cs = [(i,creation_sequence[i]) for i in range(len(creation_sequence))] elif isinstance(first,tuple): # labeled creation sequence cs = creation_sequence[:] elif isinstance(first,int): # compact creation sequence ci = uncompact(creation_sequence) cs = [(i,ci[i]) for i in range(len(ci))] else: raise TypeError("Not a valid creation sequence type") verts=[ s[0] for s in cs ] if v not in verts: raise ValueError("Vertex %s not in graph from creation_sequence"%v) if u not in verts: raise ValueError("Vertex %s not in graph from creation_sequence"%u) # Done checking if u==v: return [u] uindex=verts.index(u) vindex=verts.index(v) bigind=max(uindex,vindex) if cs[bigind][1]=='d': return [u,v] # must be that cs[bigind][1]=='i' cs=cs[bigind:] while cs: vert=cs.pop() if vert[1]=='d': return [u,vert[0],v] # All after u are type 'i' so no connection return -1 def shortest_path_length(creation_sequence,i): """ Return the shortest path length from indicated node to every other node for the threshold graph with the given creation sequence. Node is indicated by index i in creation_sequence unless creation_sequence is labeled, in which case i is taken to be the label of the node. Path lengths in threshold graphs are at most 2. Length to unreachable nodes is set to -1.
""" # Turn input sequence into a labeled creation sequence first=creation_sequence[0] if isinstance(first,str): # creation sequence if isinstance(creation_sequence,list): cs = creation_sequence[:] else: cs = list(creation_sequence) elif isinstance(first,tuple): # labeled creation sequence cs = [ v[1] for v in creation_sequence] i = [v[0] for v in creation_sequence].index(i) elif isinstance(first,int): # compact creation sequence cs = uncompact(creation_sequence) else: raise TypeError("Not a valid creation sequence type") # Compute N=len(cs) spl=[2]*N # length 2 to every node spl[i]=0 # except self which is 0 # 1 for all d's to the right for j in range(i+1,N): if cs[j]=="d": spl[j]=1 if cs[i]=='d': # 1 for all nodes to the left for j in range(i): spl[j]=1 # and -1 for any trailing i to indicate unreachable for j in range(N-1,0,-1): if cs[j]=="d": break spl[j]=-1 return spl def betweenness_sequence(creation_sequence,normalized=True): """ Return betweenness for the threshold graph with the given creation sequence. The result is unscaled. To scale the values to the iterval [0,1] divide by (n-1)*(n-2). 
""" cs=creation_sequence seq=[] # betweenness lastchar='d' # first node is always a 'd' dr=float(cs.count("d")) # number of d's to the right of curren pos irun=0 # number of i's in the last run drun=0 # number of d's in the last run dlast=0.0 # betweenness of last d for i,c in enumerate(cs): if c=='d': #cs[i]=="d": # betweennees = amt shared with eariler d's and i's # + new isolated nodes covered # + new paths to all previous nodes b=dlast + (irun-1)*irun/dr + 2*irun*(i-drun-irun)/dr drun+=1 # update counter else: # cs[i]="i": if lastchar=='d': # if this is a new run of i's dlast=b # accumulate betweenness dr-=drun # update number of d's to the right drun=0 # reset d counter irun=0 # reset i counter b=0 # isolated nodes have zero betweenness irun+=1 # add another i to the run seq.append(float(b)) lastchar=c # normalize by the number of possible shortest paths if normalized: order=len(cs) scale=1.0/((order-1)*(order-2)) seq=[ s*scale for s in seq ] return seq def eigenvectors(creation_sequence): """ Return a 2-tuple of Laplacian eigenvalues and eigenvectors for the threshold network with creation_sequence. The first value is a list of eigenvalues. The second value is a list of eigenvectors. The lists are in the same order so corresponding eigenvectors and eigenvalues are in the same position in the two lists. Notice that the order of the eigenvalues returned by eigenvalues(cs) may not correspond to the order of these eigenvectors. 
""" ccs=make_compact(creation_sequence) N=sum(ccs) vec=[0]*N val=vec[:] # get number of type d nodes to the right (all for first node) dr=sum(ccs[::2]) nn=ccs[0] vec[0]=[1./sqrt(N)]*N val[0]=0 e=dr dr-=nn type_d=True i=1 dd=1 while dd=0): raise ValueError("p must be in [0,1]") cs=['d'] # threshold sequences always start with a d for i in range(1,n): if random.random() < p: cs.append('d') else: cs.append('i') return cs # maybe *_d_threshold_sequence routines should # be (or be called from) a single routine with a more descriptive name # and a keyword parameter? def right_d_threshold_sequence(n,m): """ Create a skewed threshold graph with a given number of vertices (n) and a given number of edges (m). The routine returns an unlabeled creation sequence for the threshold graph. FIXME: describe algorithm """ cs=['d']+['i']*(n-1) # create sequence with n insolated nodes # m n*(n-1)/2: raise ValueError("Too many edges for this many nodes.") # connected case m >n-1 ind=n-1 sum=n-1 while sum n*(n-1)/2: raise ValueError("Too many edges for this many nodes.") # Connected case when M>N-1 cs[n-1]='d' sum=n-1 ind=1 while summ: # be sure not to change the first vertex cs[sum-m]='i' return cs def swap_d(cs,p_split=1.0,p_combine=1.0,seed=None): """ Perform a "swap" operation on a threshold sequence. The swap preserves the number of nodes and edges in the graph for the given sequence. The resulting sequence is still a threshold sequence. Perform one split and one combine operation on the 'd's of a creation sequence for a threshold graph. This operation maintains the number of nodes and edges in the graph, but shifts the edges from node to node maintaining the threshold quality of the graph. 
""" if not seed is None: random.seed(seed) # preprocess the creation sequence dlist= [ i for (i,node_type) in enumerate(cs[1:-1]) if node_type=='d' ] # split if random.random()>sys.stderr,"split at %s to %s and %s"%(choice,split_to,flip_side) # combine if random.random()= len(cs) or cs[target]=='d' or first_choice==second_choice: return cs # OK to combine cs[first_choice]='i' cs[second_choice]='i' cs[target]='d' # print >>sys.stderr,"combine %s and %s to make %s."%(first_choice,second_choice,target) return cs networkx-1.11/networkx/generators/geometric.py0000644000175000017500000002672612637544500021600 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Generators for geometric graphs. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Dan Schult (dschult@colgate.edu)', 'Ben Edwards (BJEdwards@gmail.com)']) __all__ = ['random_geometric_graph', 'waxman_graph', 'geographical_threshold_graph', 'navigable_small_world_graph'] from bisect import bisect_left from functools import reduce from itertools import product import math, random, sys import networkx as nx #--------------------------------------------------------------------------- # Random Geometric Graphs #--------------------------------------------------------------------------- def random_geometric_graph(n, radius, dim=2, pos=None): """Returns a random geometric graph in the unit cube. The random geometric graph model places ``n`` nodes uniformly at random in the unit cube. Two nodes are joined by an edge if the Euclidean distance between the nodes is at most ``radius``. Parameters ---------- n : int Number of nodes radius: float Distance threshold value dim : int, optional Dimension of graph pos : dict, optional A dictionary keyed by node with node positions as values. 
    Returns
    -------
    Graph

    Examples
    --------
    Create a random geometric graph on twenty nodes where nodes are joined
    by an edge if their distance is at most 0.1::

    >>> G = nx.random_geometric_graph(20, 0.1)

    Notes
    -----
    This algorithm currently only supports Euclidean distance.

    This uses an `O(n^2)` algorithm to build the graph.  A faster algorithm
    is possible using k-d trees.

    The ``pos`` keyword argument can be used to specify node positions so
    you can create an arbitrary distribution and domain for positions.  For
    example, to use a 2D Gaussian distribution of node positions with mean
    (0, 0) and standard deviation 2::

    >>> import random
    >>> n = 20
    >>> p = {i: (random.gauss(0, 2), random.gauss(0, 2)) for i in range(n)}
    >>> G = nx.random_geometric_graph(n, 0.2, pos=p)

    References
    ----------
    .. [1] Penrose, Mathew, Random Geometric Graphs,
       Oxford Studies in Probability, 5, 2003.

    """
    G=nx.Graph()
    G.name="Random Geometric Graph"
    G.add_nodes_from(range(n))
    if pos is None:
        # random positions
        for n in G:
            G.node[n]['pos']=[random.random() for i in range(0,dim)]
    else:
        nx.set_node_attributes(G,'pos',pos)
    # connect nodes within "radius" of each other
    # n^2 algorithm, could use a k-d tree implementation
    nodes = G.nodes(data=True)
    while nodes:
        u,du = nodes.pop()
        pu = du['pos']
        for v,dv in nodes:
            pv = dv['pos']
            d = sum(((a-b)**2 for a,b in zip(pu,pv)))
            if d <= radius**2:
                G.add_edge(u,v)
    return G

def geographical_threshold_graph(n, theta, alpha=2, dim=2, pos=None,
                                 weight=None):
    r"""Returns a geographical threshold graph.

    The geographical threshold graph model places ``n`` nodes uniformly at
    random in a rectangular domain.  Each node `u` is assigned a weight
    `w_u`. Two nodes `u` and `v` are joined by an edge if

    .. math::

       w_u + w_v \ge \theta r^{\alpha}

    where `r` is the Euclidean distance between `u` and `v`, and `\theta`,
    `\alpha` are parameters.
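The threshold rule above can be evaluated directly for a pair of nodes. A minimal sketch assuming the formula as stated; `joined` is an illustrative name, not a networkx function.

```python
import math

def joined(wu, wv, pu, pv, theta, alpha=2):
    """Apply the rule w_u + w_v >= theta * r**alpha for one pair."""
    r = math.sqrt(sum((a - b) ** 2 for a, b in zip(pu, pv)))
    return wu + wv >= theta * r ** alpha

# a nearby pair passes the threshold, a distant equal-weight pair does not
assert joined(1.0, 1.0, (0.0, 0.0), (0.1, 0.0), theta=50)
assert not joined(1.0, 1.0, (0.0, 0.0), (0.9, 0.0), theta=50)
```

Note that larger weights compensate for distance: heavy nodes connect over longer ranges, which is what produces the scale-free behaviour cited in the references.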
    Parameters
    ----------
    n : int
        Number of nodes
    theta: float
        Threshold value
    alpha: float, optional
        Exponent of distance function
    dim : int, optional
        Dimension of graph
    pos : dict
        Node positions as a dictionary of tuples keyed by node.
    weight : dict
        Node weights as a dictionary of numbers keyed by node.

    Returns
    -------
    Graph

    Examples
    --------
    >>> G = nx.geographical_threshold_graph(20, 50)

    Notes
    -----
    If weights are not specified they are assigned to nodes by drawing
    randomly from the exponential distribution with rate parameter
    `\lambda=1`.  To specify weights from a different distribution, use the
    ``weight`` keyword argument::

    >>> import random
    >>> n = 20
    >>> w = {i: random.expovariate(5.0) for i in range(n)}
    >>> G = nx.geographical_threshold_graph(20, 50, weight=w)

    If node positions are not specified they are randomly assigned from the
    uniform distribution.

    References
    ----------
    .. [1] Masuda, N., Miwa, H., Konno, N.:
       Geographical threshold graphs with small-world and scale-free
       properties.
       Physical Review E 71, 036108 (2005)
    .. [2] Milan Bradonjić, Aric Hagberg and Allon G. Percus,
       Giant component and connectivity in geographical threshold graphs,
       in Algorithms and Models for the Web-Graph (WAW 2007),
       Antony Bonato and Fan Chung (Eds), pp. 209--216, 2007
    """
    G=nx.Graph()
    # add n nodes
    G.add_nodes_from([v for v in range(n)])
    if weight is None:
        # choose weights from exponential distribution
        for n in G:
            G.node[n]['weight'] = random.expovariate(1.0)
    else:
        nx.set_node_attributes(G,'weight',weight)
    if pos is None:
        # random positions
        for n in G:
            G.node[n]['pos']=[random.random() for i in range(0,dim)]
    else:
        nx.set_node_attributes(G,'pos',pos)
    G.add_edges_from(geographical_threshold_edges(G, theta, alpha))
    return G

def geographical_threshold_edges(G, theta, alpha=2):
    """Generates edges for a geographical threshold graph given a graph
    with positions and weights assigned as node attributes ``'pos'`` and
    ``'weight'``.
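The edge generators in this file all walk unordered pairs the same way: pop one node and scan the remaining list. A standalone sketch of that iteration pattern; `all_pairs` is an illustrative name.

```python
def all_pairs(items):
    """Pop-based pair enumeration used by the edge generators above."""
    items = list(items)
    while items:
        u = items.pop()            # u never compared with itself
        for v in items:            # remaining items: each pair seen once
            yield (u, v)

pairs = list(all_pairs(range(4)))
assert len(pairs) == 6             # C(4,2) unordered pairs, each exactly once
```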
""" nodes = G.nodes(data=True) while nodes: u,du = nodes.pop() wu = du['weight'] pu = du['pos'] for v,dv in nodes: wv = dv['weight'] pv = dv['pos'] r = math.sqrt(sum(((a-b)**2 for a,b in zip(pu,pv)))) if wu+wv >= theta*r**alpha: yield(u,v) def waxman_graph(n, alpha=0.4, beta=0.1, L=None, domain=(0, 0, 1, 1)): r"""Return a Waxman random graph. The Waxman random graph model places ``n`` nodes uniformly at random in a rectangular domain. Each pair of nodes at Euclidean distance `d` is joined by an edge with probability .. math:: p = \alpha \exp(-d / \beta L). This function implements both Waxman models, using the ``L`` keyword argument. * Waxman-1: if ``L`` is not specified, it is set to be the maximum distance between any pair of nodes. * Waxman-2: if ``L`` is specified, the distance between a pair of nodes is chosen uniformly at random from the interval `[0, L]`. Parameters ---------- n : int Number of nodes alpha: float Model parameter beta: float Model parameter L : float, optional Maximum distance between nodes. If not specified, the actual distance is calculated. domain : four-tuple of numbers, optional Domain size, given as a tuple of the form `(x_min, y_min, x_max, y_max)`. Returns ------- G: Graph References ---------- .. [1] B. M. Waxman, Routing of multipoint connections. IEEE J. Select. Areas Commun. 6(9),(1988) 1617-1622. 
""" # build graph of n nodes with random positions in the unit square G = nx.Graph() G.add_nodes_from(range(n)) (xmin,ymin,xmax,ymax)=domain for n in G: G.node[n]['pos']=(xmin + ((xmax-xmin)*random.random()), ymin + ((ymax-ymin)*random.random())) if L is None: # find maximum distance L between two nodes l = 0 pos = list(nx.get_node_attributes(G,'pos').values()) while pos: x1,y1 = pos.pop() for x2,y2 in pos: r2 = (x1-x2)**2 + (y1-y2)**2 if r2 > l: l = r2 l=math.sqrt(l) else: # user specified maximum distance l = L nodes=G.nodes() if L is None: # Waxman-1 model # try all pairs, connect randomly based on euclidean distance while nodes: u = nodes.pop() x1,y1 = G.node[u]['pos'] for v in nodes: x2,y2 = G.node[v]['pos'] r = math.sqrt((x1-x2)**2 + (y1-y2)**2) if random.random() < alpha*math.exp(-r/(beta*l)): G.add_edge(u,v) else: # Waxman-2 model # try all pairs, connect randomly based on randomly chosen l while nodes: u = nodes.pop() for v in nodes: r = random.random()*l if random.random() < alpha*math.exp(-r/(beta*l)): G.add_edge(u,v) return G def navigable_small_world_graph(n, p=1, q=1, r=2, dim=2, seed=None): """Return a navigable small-world graph. A navigable small-world graph is a directed grid with additional long-range connections that are chosen randomly. [...] we begin with a set of nodes [...] that are identified with the set of lattice points in an `n \times n` square, `\{(i, j): i \in \{1, 2, \ldots, n\}, j \in \{1, 2, \ldots, n\}\}`, and we define the *lattice distance* between two nodes `(i, j)` and `(k, l)` to be the number of "lattice steps" separating them: `d((i, j), (k, l)) = |k - i| + |l - j|`. For a universal constant `p \geq 1`, the node `u` has a directed edge to every other node within lattice distance `p` --- these are its *local contacts*. 
      For universal constants `q \ge 0` and `r \ge 0` we also construct
      directed edges from `u` to `q` other nodes (the *long-range
      contacts*) using independent random trials; the `i`th directed edge
      from `u` has endpoint `v` with probability proportional to
      `[d(u,v)]^{-r}`.

      -- [1]_

    Parameters
    ----------
    n : int
        The number of nodes.
    p : int
        The diameter of short range connections. Each node is joined with
        every other node within this lattice distance.
    q : int
        The number of long-range connections for each node.
    r : float
        Exponent for decaying probability of connections.  The probability
        of connecting to a node at lattice distance `d` is `1/d^r`.
    dim : int
        Dimension of grid
    seed : int, optional
        Seed for random number generator (default=None).

    References
    ----------
    .. [1] J. Kleinberg. The small-world phenomenon: An algorithmic
       perspective. Proc. 32nd ACM Symposium on Theory of Computing, 2000.
    """
    if (p < 1):
        raise nx.NetworkXException("p must be >= 1")
    if (q < 0):
        raise nx.NetworkXException("q must be >= 0")
    if (r < 0):
        raise nx.NetworkXException("r must be >= 0")
    if not seed is None:
        random.seed(seed)
    G = nx.DiGraph()
    nodes = list(product(range(n),repeat=dim))
    for p1 in nodes:
        probs = [0]
        for p2 in nodes:
            if p1==p2:
                continue
            d = sum((abs(b-a) for a,b in zip(p1,p2)))
            if d <= p:
                G.add_edge(p1,p2)
            probs.append(d**-r)
        cdf = list(nx.utils.accumulate(probs))
        for _ in range(q):
            target = nodes[bisect_left(cdf,random.uniform(0, cdf[-1]))]
            G.add_edge(p1,target)
    return G
networkx-1.11/networkx/generators/__init__.py
"""
A package for generating various graphs in networkx.
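The long-range contacts are drawn by inverse-CDF lookup: accumulate the weights into a running sum, then bisect into it with a uniform draw. A standalone sketch using `itertools.accumulate` in place of the `nx.utils.accumulate` backport; `weighted_choice` is an illustrative name.

```python
import random
from bisect import bisect_left
from itertools import accumulate

def weighted_choice(weights, rng=random.random):
    """Return an index with probability proportional to weights[index]."""
    cdf = list(accumulate(weights))          # running sums, last = total
    return bisect_left(cdf, rng() * cdf[-1])  # first bin covering the draw

random.seed(0)
counts = [0, 0, 0]
for _ in range(3000):
    counts[weighted_choice([1, 0, 2])] += 1
assert counts[1] == 0            # zero-weight index is never chosen
assert counts[2] > counts[0]     # heavier weight is chosen more often
```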
""" from networkx.generators.classic import * from networkx.generators.degree_seq import * from networkx.generators.directed import * from networkx.generators.ego import * from networkx.generators.expanders import * from networkx.generators.geometric import * from networkx.generators.line import * from networkx.generators.random_graphs import * from networkx.generators.small import * from networkx.generators.stochastic import * from networkx.generators.social import * from networkx.generators.threshold import * from networkx.generators.intersection import * from networkx.generators.random_clustered import * from networkx.generators.community import * from networkx.generators.nonisomorphic_trees import * networkx-1.11/networkx/generators/atlas.py0000644000175000017500000064235312637544450020732 0ustar aricaric00000000000000""" Generators for the small graph atlas. See "An Atlas of Graphs" by Ronald C. Read and Robin J. Wilson, Oxford University Press, 1998. Because of its size, this module is not imported by default. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. __author__ = """Pieter Swart (swart@lanl.gov)""" __all__ = ['graph_atlas_g'] from networkx.generators.small import make_small_graph def graph_atlas_g(): """ Return the list [G0,G1,...,G1252] of graphs as named in the Graph Atlas. G0,G1,...,G1252 are all graphs with up to 7 nodes. The graphs are listed: 1. in increasing order of number of nodes; 2. for a fixed number of nodes, in increasing order of the number of edges; 3. for fixed numbers of nodes and edges, in increasing order of the degree sequence, for example 111223 < 112222; 4. for fixed degree sequence, in increasing number of automorphisms. 
Note that indexing is set up so that for GAG=graph_atlas_g(), then G123=GAG[123] and G[0]=empty_graph(0) """ descr_list=[ ['edgelist', 'G0', 0, []], ['edgelist', 'G1', 1, []], ['edgelist', 'G2', 2, []], ['edgelist', 'G3', 2, [[1, 2]]], ['edgelist', 'G4', 3, []], ['edgelist', 'G5', 3, [[2, 3]]], ['edgelist', 'G6', 3, [[1, 2], [1, 3]]], ['edgelist', 'G7', 3, [[1, 2], [1, 3], [2, 3]]], ['edgelist', 'G8', 4, []], ['edgelist', 'G9', 4, [[4, 3]]], ['edgelist', 'G10', 4, [[4, 3], [4, 2]]], ['edgelist', 'G11', 4, [[1, 2], [4, 3]]], ['edgelist', 'G12', 4, [[4, 3], [2, 3], [4, 2]]], ['edgelist', 'G13', 4, [[4, 1], [4, 2], [4, 3]]], ['edgelist', 'G14', 4, [[1, 2], [2, 3], [1, 4]]], ['edgelist', 'G15', 4, [[4, 3], [2, 3], [4, 2], [4, 1]]], ['edgelist', 'G16', 4, [[1, 2], [2, 3], [3, 4], [1, 4]]], ['edgelist', 'G17', 4, [[1, 2], [1, 3], [1, 4], [2, 3], [3, 4]]], ['edgelist', 'G18', 4, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3]]], ['edgelist', 'G19', 5, []], ['edgelist', 'G20', 5, [[5, 4]]], ['edgelist', 'G21', 5, [[2, 3], [1, 2]]], ['edgelist', 'G22', 5, [[1, 3], [5, 4]]], ['edgelist', 'G23', 5, [[2, 3], [1, 2], [3, 1]]], ['edgelist', 'G24', 5, [[5, 4], [4, 3], [4, 2]]], ['edgelist', 'G25', 5, [[4, 3], [5, 4], [1, 5]]], ['edgelist', 'G26', 5, [[2, 3], [1, 2], [5, 4]]], ['edgelist', 'G27', 5, [[5, 4], [2, 3], [4, 2], [4, 3]]], ['edgelist', 'G28', 5, [[1, 4], [2, 1], [3, 2], [4, 3]]], ['edgelist', 'G29', 5, [[5, 4], [5, 1], [5, 2], [5, 3]]], ['edgelist', 'G30', 5, [[5, 1], [4, 2], [5, 4], [4, 3]]], ['edgelist', 'G31', 5, [[3, 4], [2, 3], [1, 2], [5, 1]]], ['edgelist', 'G32', 5, [[2, 3], [1, 2], [3, 1], [5, 4]]], ['edgelist', 'G33', 5, [[1, 4], [3, 1], [4, 3], [2, 1], [3, 2]]], ['edgelist', 'G34', 5, [[5, 3], [5, 4], [3, 4], [5, 2], [5, 1]]], ['edgelist', 'G35', 5, [[1, 2], [2, 3], [3, 4], [1, 5], [1, 3]]], ['edgelist', 'G36', 5, [[5, 1], [2, 3], [5, 4], [4, 3], [4, 2]]], ['edgelist', 'G37', 5, [[2, 1], [5, 2], [3, 5], [4, 3], [2, 4]]], ['edgelist', 'G38', 5, [[1, 2], [2, 3], 
[3, 4], [4, 5], [1, 5]]], ['edgelist', 'G39', 5, [[2, 1], [5, 2], [5, 1], [1, 4], [2, 4], [4, 5]]], ['edgelist', 'G40', 5, [[2, 1], [5, 2], [3, 5], [4, 3], [2, 4], [3, 2]]], ['edgelist', 'G41', 5, [[2, 1], [5, 2], [3, 5], [4, 3], [2, 4], [4, 5]]], ['edgelist', 'G42', 5, [[1, 2], [5, 4], [3, 4], [5, 3], [5, 1], [5, 2]]], ['edgelist', 'G43', 5, [[1, 5], [4, 1], [5, 4], [3, 4], [2, 3], [1, 2]]], ['edgelist', 'G44', 5, [[3, 2], [1, 3], [4, 1], [2, 4], [5, 2], [1, 5]]], ['edgelist', 'G45', 5, [[5, 1], [2, 3], [5, 4], [4, 3], [4, 2], [5, 2], [3, 5]]], ['edgelist', 'G46', 5, [[5, 2], [3, 5], [4, 3], [2, 4], [4, 5], [1, 4], [5, 1]]], ['edgelist', 'G47', 5, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2]]], ['edgelist', 'G48', 5, [[3, 2], [1, 3], [4, 1], [2, 4], [5, 2], [1, 5], [3, 5]]], ['edgelist', 'G49', 5, [[2, 1], [5, 2], [3, 5], [4, 3], [2, 4], [5, 1], [4, 5], [1, 4]]], ['edgelist', 'G50', 5, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4]]], ['edgelist', 'G51', 5, [[1, 2], [4, 5], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5]]], ['edgelist', 'G52', 5, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 5]]], ['edgelist', 'G53', 6, []], ['edgelist', 'G54', 6, [[6, 5]]], ['edgelist', 'G55', 6, [[1, 4], [6, 5]]], ['edgelist', 'G56', 6, [[2, 4], [2, 3]]], ['edgelist', 'G57', 6, [[2, 4], [3, 2], [4, 3]]], ['edgelist', 'G58', 6, [[1, 4], [6, 1], [5, 1]]], ['edgelist', 'G59', 6, [[5, 4], [6, 5], [1, 6]]], ['edgelist', 'G60', 6, [[5, 4], [6, 2], [6, 3]]], ['edgelist', 'G61', 6, [[2, 3], [4, 1], [6, 5]]], ['edgelist', 'G62', 6, [[1, 4], [5, 1], [6, 5], [1, 6]]], ['edgelist', 'G63', 6, [[4, 1], [6, 4], [5, 6], [1, 5]]], ['edgelist', 'G64', 6, [[6, 2], [6, 4], [6, 3], [1, 6]]], ['edgelist', 'G65', 6, [[5, 4], [4, 2], [5, 1], [4, 3]]], ['edgelist', 'G66', 6, [[1, 3], [2, 4], [3, 2], [6, 4]]], ['edgelist', 'G67', 6, [[2, 4], [3, 2], [4, 3], [1, 6]]], ['edgelist', 'G68', 6, [[2, 3], [1, 4], [6, 1], [5, 1]]], ['edgelist', 
'G69', 6, [[5, 6], [2, 3], [1, 6], [4, 5]]], ['edgelist', 'G70', 6, [[1, 3], [5, 1], [4, 2], [6, 4]]], ['edgelist', 'G71', 6, [[4, 1], [6, 4], [5, 6], [1, 5], [6, 1]]], ['edgelist', 'G72', 6, [[6, 4], [4, 2], [4, 3], [5, 4], [5, 6]]], ['edgelist', 'G73', 6, [[6, 4], [6, 5], [3, 4], [4, 5], [1, 5]]], ['edgelist', 'G74', 6, [[5, 4], [2, 3], [5, 1], [4, 3], [4, 2]]], ['edgelist', 'G75', 6, [[2, 5], [4, 5], [5, 1], [3, 2], [4, 3]]], ['edgelist', 'G76', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5]]], ['edgelist', 'G77', 6, [[6, 4], [6, 5], [6, 1], [6, 2], [6, 3]]], ['edgelist', 'G78', 6, [[2, 5], [6, 2], [2, 1], [3, 2], [3, 4]]], ['edgelist', 'G79', 6, [[1, 2], [4, 5], [1, 3], [4, 1], [6, 4]]], ['edgelist', 'G80', 6, [[2, 1], [3, 2], [3, 5], [2, 4], [6, 4]]], ['edgelist', 'G81', 6, [[5, 4], [1, 6], [5, 1], [4, 3], [4, 2]]], ['edgelist', 'G82', 6, [[2, 3], [1, 2], [5, 6], [2, 4], [3, 4]]], ['edgelist', 'G83', 6, [[1, 2], [1, 6], [3, 4], [4, 5], [5, 6]]], ['edgelist', 'G84', 6, [[5, 4], [6, 2], [6, 3], [1, 4], [5, 1]]], ['edgelist', 'G85', 6, [[2, 3], [4, 1], [6, 4], [5, 6], [1, 5]]], ['edgelist', 'G86', 6, [[1, 4], [6, 1], [5, 6], [4, 5], [6, 4], [5, 1]]], ['edgelist', 'G87', 6, [[2, 5], [3, 5], [5, 1], [3, 4], [4, 2], [4, 5]]], ['edgelist', 'G88', 6, [[2, 5], [3, 5], [5, 1], [3, 2], [4, 2], [3, 4]]], ['edgelist', 'G89', 6, [[3, 1], [6, 5], [5, 4], [6, 4], [5, 1], [3, 5]]], ['edgelist', 'G90', 6, [[4, 3], [5, 4], [1, 5], [2, 1], [3, 2], [1, 4]]], ['edgelist', 'G91', 6, [[5, 2], [4, 2], [5, 3], [4, 3], [3, 1], [2, 1]]], ['edgelist', 'G92', 6, [[6, 3], [6, 4], [6, 5], [4, 5], [6, 2], [6, 1]]], ['edgelist', 'G93', 6, [[5, 4], [5, 3], [5, 1], [2, 5], [4, 1], [6, 4]]], ['edgelist', 'G94', 6, [[5, 4], [4, 6], [6, 5], [6, 2], [4, 3], [5, 1]]], ['edgelist', 'G95', 6, [[5, 3], [2, 3], [5, 4], [5, 2], [5, 1], [1, 6]]], ['edgelist', 'G96', 6, [[2, 3], [4, 2], [1, 4], [3, 1], [5, 1], [6, 1]]], ['edgelist', 'G97', 6, [[3, 1], [5, 3], [2, 5], [3, 2], [4, 2], [6, 4]]], ['edgelist', 
'G98', 6, [[2, 3], [4, 2], [1, 4], [3, 1], [5, 1], [6, 4]]], ['edgelist', 'G99', 6, [[6, 4], [3, 6], [3, 1], [5, 3], [5, 4], [4, 2]]], ['edgelist', 'G100', 6, [[1, 3], [4, 5], [2, 1], [6, 4], [5, 6], [4, 1]]], ['edgelist', 'G101', 6, [[2, 3], [4, 1], [6, 4], [5, 6], [1, 5], [6, 1]]], ['edgelist', 'G102', 6, [[5, 4], [2, 3], [5, 1], [4, 3], [4, 2], [6, 1]]], ['edgelist', 'G103', 6, [[2, 5], [3, 5], [5, 1], [1, 6], [4, 2], [3, 4]]], ['edgelist', 'G104', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 6]]], ['edgelist', 'G105', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6]]], ['edgelist', 'G106', 6, [[2, 4], [3, 2], [4, 3], [1, 5], [6, 1], [5, 6]]], ['edgelist', 'G107', 6, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [1, 6]]], ['edgelist', 'G108', 6, [[2, 5], [3, 5], [3, 2], [4, 2], [3, 4], [3, 1], [1, 2]]], ['edgelist', 'G109', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2]]], ['edgelist', 'G110', 6, [[1, 2], [4, 3], [1, 3], [4, 1], [4, 2], [6, 2], [6, 3]]], ['edgelist', 'G111', 6, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [4, 5]]], ['edgelist', 'G112', 6, [[2, 1], [5, 2], [3, 5], [4, 3], [6, 2], [3, 6], [2, 3]]], ['edgelist', 'G113', 6, [[1, 5], [3, 1], [2, 3], [4, 2], [6, 4], [4, 1], [3, 4]]], ['edgelist', 'G114', 6, [[2, 5], [3, 5], [3, 4], [3, 2], [4, 2], [5, 6], [1, 5]]], ['edgelist', 'G115', 6, [[2, 1], [5, 2], [3, 5], [4, 3], [6, 2], [3, 6], [5, 6]]], ['edgelist', 'G116', 6, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 5]]], ['edgelist', 'G117', 6, [[1, 6], [5, 1], [6, 5], [1, 3], [4, 1], [4, 3], [1, 2]]], ['edgelist', 'G118', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 6], [5, 2]]], ['edgelist', 'G119', 6, [[1, 2], [5, 1], [2, 5], [1, 3], [4, 1], [4, 3], [4, 6]]], ['edgelist', 'G120', 6, [[2, 5], [3, 5], [5, 1], [1, 6], [4, 2], [3, 4], [4, 5]]], ['edgelist', 'G121', 6, [[3, 1], [4, 3], [5, 4], [6, 5], [3, 6], [2, 3], [5, 2]]], ['edgelist', 'G122', 6, [[2, 6], [1, 2], [5, 1], [4, 5], [3, 4], [2, 3], [1, 4]]], 
['edgelist', 'G123', 6, [[2, 5], [3, 5], [5, 1], [1, 6], [4, 2], [3, 4], [3, 2]]], ['edgelist', 'G124', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [1, 3], [6, 2]]], ['edgelist', 'G125', 6, [[3, 1], [5, 2], [2, 3], [6, 5], [3, 6], [4, 2], [6, 4]]], ['edgelist', 'G126', 6, [[6, 1], [4, 6], [3, 4], [1, 3], [2, 4], [5, 2], [4, 5]]], ['edgelist', 'G127', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [3, 4]]], ['edgelist', 'G128', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4]]], ['edgelist', 'G129', 6, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4]]], ['edgelist', 'G130', 6, [[2, 3], [1, 2], [3, 1], [4, 1], [5, 4], [6, 5], [4, 6]]], ['edgelist', 'G131', 6, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2]]], ['edgelist', 'G132', 6, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4]]], ['edgelist', 'G133', 6, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 1], [1, 5]]], ['edgelist', 'G134', 6, [[2, 3], [4, 2], [1, 4], [2, 1], [3, 1], [4, 3], [6, 4], [5, 1]]], ['edgelist', 'G135', 6, [[1, 2], [3, 5], [1, 3], [6, 3], [4, 2], [4, 3], [3, 2], [5, 2]]], ['edgelist', 'G136', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [2, 6]]], ['edgelist', 'G137', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5]]], ['edgelist', 'G138', 6, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [3, 2], [6, 2]]], ['edgelist', 'G139', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 1]]], ['edgelist', 'G140', 6, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [4, 1], [6, 2]]], ['edgelist', 'G141', 6, [[3, 1], [4, 3], [5, 4], [6, 5], [3, 6], [2, 3], [5, 2], [6, 4]]], ['edgelist', 'G142', 6, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [1, 6], [6, 5]]], ['edgelist', 'G143', 6, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [6, 2], [6, 4]]], ['edgelist', 'G144', 6, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [4, 5]]], ['edgelist', 'G145', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 
5], [6, 1], [6, 3], [1, 3]]], ['edgelist', 'G146', 6, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4]]], ['edgelist', 'G147', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3]]], ['edgelist', 'G148', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [2, 5], [1, 2]]], ['edgelist', 'G149', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1]]], ['edgelist', 'G150', 6, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [3, 2]]], ['edgelist', 'G151', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [5, 6], [6, 4], [2, 6]]], ['edgelist', 'G152', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 2]]], ['edgelist', 'G153', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 6], [6, 3], [6, 1]]], ['edgelist', 'G154', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 2], [6, 3]]], ['edgelist', 'G155', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4]]], ['edgelist', 'G156', 6, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 3]]], ['edgelist', 'G157', 6, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [1, 5]]], ['edgelist', 'G158', 6, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [5, 6]]], ['edgelist', 'G159', 6, [[3, 1], [5, 2], [2, 3], [6, 5], [3, 6], [4, 2], [6, 4], [4, 3], [5, 4]]], ['edgelist', 'G160', 6, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 6]]], ['edgelist', 'G161', 6, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [5, 6]]], ['edgelist', 'G162', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [4, 1]]], ['edgelist', 'G163', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [1, 5], [2, 1], [5, 2]]], ['edgelist', 'G164', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [2, 1], [6, 2]]], ['edgelist', 'G165', 6, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 5], [5, 1], [6, 1]]], ['edgelist', 'G166', 6, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 
3], [1, 6], [6, 4], [1, 4], [2, 6]]], ['edgelist', 'G167', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1]]], ['edgelist', 'G168', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5]]], ['edgelist', 'G169', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2]]], ['edgelist', 'G170', 6, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [3, 1]]], ['edgelist', 'G171', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 5], [6, 3], [6, 4]]], ['edgelist', 'G172', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [6, 2]]], ['edgelist', 'G173', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [5, 3], [6, 3]]], ['edgelist', 'G174', 6, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [2, 1], [6, 3]]], ['edgelist', 'G175', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [6, 3], [5, 2]]], ['edgelist', 'G176', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [1, 3]]], ['edgelist', 'G177', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [5, 6]]], ['edgelist', 'G178', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [1, 6]]], ['edgelist', 'G179', 6, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [5, 6], [2, 1]]], ['edgelist', 'G180', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [4, 6], [2, 6]]], ['edgelist', 'G181', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1], [3, 5]]], ['edgelist', 'G182', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [6, 3]]], ['edgelist', 'G183', 6, [[2, 1], [5, 2], [1, 5], [6, 1], [5, 6], [4, 5], [2, 4], [6, 2], [3, 4], [2, 3]]], ['edgelist', 'G184', 6, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [6, 3]]], ['edgelist', 'G185', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [5, 2]]], 
['edgelist', 'G186', 6, [[1, 2], [3, 5], [1, 3], [5, 6], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4]]], ['edgelist', 'G187', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [6, 5]]], ['edgelist', 'G188', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [1, 3], [2, 4], [6, 2]]], ['edgelist', 'G189', 6, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [4, 3], [1, 4]]], ['edgelist', 'G190', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [6, 4], [3, 6], [2, 1]]], ['edgelist', 'G191', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [1, 3], [2, 6]]], ['edgelist', 'G192', 6, [[1, 2], [3, 5], [1, 3], [3, 2], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [1, 4]]], ['edgelist', 'G193', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 1], [5, 6]]], ['edgelist', 'G194', 6, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [1, 3]]], ['edgelist', 'G195', 6, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [6, 4], [3, 6], [2, 1], [6, 2]]], ['edgelist', 'G196', 6, [[2, 4], [5, 2], [4, 5], [3, 4], [1, 3], [5, 1], [6, 5], [3, 6], [5, 3], [1, 6], [2, 6]]], ['edgelist', 'G197', 6, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [1, 4], [2, 5], [1, 2]]], ['edgelist', 'G198', 6, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [1, 2], [3, 1], [4, 3]]], ['edgelist', 'G199', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [2, 5], [1, 4]]], ['edgelist', 'G200', 6, [[1, 2], [2, 3], [1, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [5, 4]]], ['edgelist', 'G201', 6, [[4, 3], [2, 4], [3, 2], [1, 3], [6, 1], [3, 6], [3, 5], [6, 2], [1, 4], [2, 5], [1, 2], [1, 5]]], ['edgelist', 'G202', 6, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [1, 2], [3, 1], [4, 3], [5, 6]]], ['edgelist', 'G203', 6, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 
1], [5, 6], [3, 5], [6, 2], [1, 4], [2, 5], [1, 2], [3, 4]]], ['edgelist', 'G204', 6, [[1, 2], [2, 3], [1, 3], [4, 3], [4, 2], [5, 1], [3, 5], [6, 2], [1, 6], [5, 6], [4, 5], [6, 4]]], ['edgelist', 'G205', 6, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [1, 4], [2, 5], [1, 2], [3, 4], [1, 5]]], ['edgelist', 'G206', 6, [[1, 2], [2, 3], [1, 3], [4, 3], [4, 2], [5, 1], [3, 5], [6, 2], [1, 6], [5, 6], [4, 5], [6, 4], [4, 1]]], ['edgelist', 'G207', 6, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [2, 5], [2, 4], [3, 1], [5, 1], [6, 4]]], ['edgelist', 'G208', 6, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [2, 3], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [4, 5], [4, 6], [5, 6]]], ['edgelist', 'G209', 7, []], ['edgelist', 'G210', 7, [[7, 6]]], ['edgelist', 'G211', 7, [[3, 4], [2, 3]]], ['edgelist', 'G212', 7, [[6, 5], [7, 1]]], ['edgelist', 'G213', 7, [[1, 5], [5, 3], [3, 1]]], ['edgelist', 'G214', 7, [[1, 2], [1, 7], [1, 6]]], ['edgelist', 'G215', 7, [[6, 5], [7, 1], [6, 7]]], ['edgelist', 'G216', 7, [[4, 3], [2, 3], [6, 7]]], ['edgelist', 'G217', 7, [[4, 2], [6, 7], [1, 5]]], ['edgelist', 'G218', 7, [[3, 6], [7, 3], [6, 7], [2, 3]]], ['edgelist', 'G219', 7, [[2, 3], [5, 2], [6, 5], [3, 6]]], ['edgelist', 'G220', 7, [[2, 1], [6, 2], [2, 3], [5, 2]]], ['edgelist', 'G221', 7, [[2, 1], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G222', 7, [[4, 5], [3, 4], [2, 3], [1, 2]]], ['edgelist', 'G223', 7, [[5, 3], [1, 5], [3, 1], [6, 7]]], ['edgelist', 'G224', 7, [[1, 2], [7, 1], [1, 6], [5, 3]]], ['edgelist', 'G225', 7, [[4, 2], [6, 5], [7, 6], [1, 7]]], ['edgelist', 'G226', 7, [[1, 5], [4, 1], [3, 6], [7, 3]]], ['edgelist', 'G227', 7, [[3, 4], [2, 3], [7, 1], [6, 5]]], ['edgelist', 'G228', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [2, 1]]], ['edgelist', 'G229', 7, [[3, 6], [7, 3], [6, 7], [5, 3], [4, 3]]], ['edgelist', 'G230', 7, [[5, 3], [5, 1], [3, 1], [6, 5], [7, 1]]], ['edgelist', 'G231', 7, [[3, 6], [7, 3], [6, 7], [2, 3], [1, 2]]], 
['edgelist', 'G232', 7, [[5, 2], [1, 5], [4, 1], [2, 4], [3, 2]]], ['edgelist', 'G233', 7, [[2, 3], [1, 2], [5, 1], [4, 5], [3, 4]]], ['edgelist', 'G234', 7, [[6, 2], [6, 1], [3, 6], [4, 6], [5, 6]]], ['edgelist', 'G235', 7, [[2, 6], [7, 2], [2, 1], [3, 2], [4, 3]]], ['edgelist', 'G236', 7, [[2, 6], [5, 2], [3, 4], [7, 3], [3, 2]]], ['edgelist', 'G237', 7, [[2, 6], [7, 2], [2, 3], [3, 4], [5, 4]]], ['edgelist', 'G238', 7, [[3, 2], [4, 3], [5, 4], [6, 5], [4, 7]]], ['edgelist', 'G239', 7, [[7, 6], [3, 7], [2, 3], [6, 3], [4, 5]]], ['edgelist', 'G240', 7, [[5, 4], [6, 5], [7, 6], [1, 7], [2, 1]]], ['edgelist', 'G241', 7, [[1, 5], [4, 1], [3, 6], [7, 3], [6, 7]]], ['edgelist', 'G242', 7, [[5, 2], [6, 3], [7, 6], [4, 7], [3, 4]]], ['edgelist', 'G243', 7, [[2, 5], [4, 2], [2, 1], [3, 2], [7, 6]]], ['edgelist', 'G244', 7, [[1, 5], [4, 1], [2, 1], [3, 2], [7, 6]]], ['edgelist', 'G245', 7, [[1, 5], [4, 1], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G246', 7, [[7, 6], [4, 5], [3, 4], [2, 3], [1, 2]]], ['edgelist', 'G247', 7, [[3, 4], [2, 3], [7, 1], [6, 7], [6, 5]]], ['edgelist', 'G248', 7, [[1, 2], [5, 7], [6, 5], [4, 3], [7, 6]]], ['edgelist', 'G249', 7, [[2, 6], [7, 2], [6, 7], [3, 6], [2, 3], [7, 3]]], ['edgelist', 'G250', 7, [[2, 5], [4, 2], [3, 4], [5, 3], [2, 1], [3, 2]]], ['edgelist', 'G251', 7, [[1, 5], [4, 1], [2, 4], [3, 2], [2, 5], [4, 5]]], ['edgelist', 'G252', 7, [[6, 3], [5, 6], [3, 5], [4, 3], [7, 4], [3, 7]]], ['edgelist', 'G253', 7, [[2, 3], [5, 2], [6, 5], [3, 6], [1, 2], [5, 1]]], ['edgelist', 'G254', 7, [[2, 3], [6, 2], [5, 6], [3, 5], [1, 3], [6, 1]]], ['edgelist', 'G255', 7, [[3, 6], [7, 3], [6, 7], [3, 5], [2, 3], [4, 3]]], ['edgelist', 'G256', 7, [[2, 5], [4, 2], [3, 4], [2, 3], [3, 6], [7, 3]]], ['edgelist', 'G257', 7, [[6, 5], [7, 6], [2, 7], [6, 2], [4, 7], [1, 2]]], ['edgelist', 'G258', 7, [[7, 6], [2, 7], [6, 2], [4, 2], [1, 4], [2, 5]]], ['edgelist', 'G259', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [3, 6], [7, 3]]], ['edgelist', 'G260', 7, [[2, 5], 
[4, 2], [3, 4], [2, 3], [3, 6], [7, 6]]], ['edgelist', 'G261', 7, [[3, 4], [2, 3], [4, 7], [6, 5], [7, 6], [6, 3]]], ['edgelist', 'G262', 7, [[3, 6], [7, 3], [6, 7], [2, 5], [4, 2], [3, 2]]], ['edgelist', 'G263', 7, [[5, 6], [1, 5], [4, 1], [3, 4], [5, 3], [7, 4]]], ['edgelist', 'G264', 7, [[1, 5], [4, 1], [2, 4], [7, 6], [2, 5], [2, 1]]], ['edgelist', 'G265', 7, [[2, 5], [4, 2], [3, 4], [6, 3], [7, 6], [3, 7]]], ['edgelist', 'G266', 7, [[7, 4], [6, 7], [5, 6], [2, 5], [3, 2], [6, 3]]], ['edgelist', 'G267', 7, [[2, 1], [4, 2], [7, 4], [6, 7], [5, 6], [2, 5]]], ['edgelist', 'G268', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6]]], ['edgelist', 'G269', 7, [[1, 5], [4, 1], [5, 4], [3, 6], [7, 3], [6, 7]]], ['edgelist', 'G270', 7, [[7, 4], [1, 7], [7, 3], [6, 7], [7, 2], [5, 7]]], ['edgelist', 'G271', 7, [[3, 5], [6, 3], [3, 4], [7, 3], [2, 3], [2, 1]]], ['edgelist', 'G272', 7, [[2, 1], [3, 2], [6, 3], [2, 5], [4, 2], [7, 3]]], ['edgelist', 'G273', 7, [[2, 1], [3, 2], [4, 7], [2, 4], [5, 2], [6, 5]]], ['edgelist', 'G274', 7, [[2, 1], [3, 2], [6, 3], [7, 6], [2, 5], [4, 2]]], ['edgelist', 'G275', 7, [[2, 1], [3, 5], [6, 3], [7, 6], [3, 7], [4, 3]]], ['edgelist', 'G276', 7, [[5, 1], [2, 5], [4, 2], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G277', 7, [[7, 6], [2, 3], [1, 2], [3, 1], [4, 3], [1, 5]]], ['edgelist', 'G278', 7, [[1, 5], [4, 1], [2, 1], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G279', 7, [[2, 1], [4, 2], [7, 4], [3, 7], [5, 2], [6, 5]]], ['edgelist', 'G280', 7, [[3, 6], [7, 3], [5, 3], [2, 5], [4, 2], [1, 4]]], ['edgelist', 'G281', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 3], [7, 6]]], ['edgelist', 'G282', 7, [[1, 5], [4, 1], [3, 2], [6, 3], [7, 6], [3, 7]]], ['edgelist', 'G283', 7, [[4, 5], [2, 1], [3, 2], [6, 3], [7, 6], [3, 7]]], ['edgelist', 'G284', 7, [[5, 6], [1, 5], [4, 1], [7, 4], [2, 1], [3, 2]]], ['edgelist', 'G285', 7, [[3, 6], [7, 3], [6, 7], [2, 5], [4, 2], [2, 1]]], ['edgelist', 'G286', 7, [[5, 6], [4, 5], [3, 4], [2, 3], [1, 2], [7, 1]]], 
['edgelist', 'G287', 7, [[7, 5], [6, 7], [5, 6], [3, 4], [2, 3], [1, 2]]], ['edgelist', 'G288', 7, [[1, 2], [5, 1], [3, 4], [6, 3], [7, 6], [4, 7]]], ['edgelist', 'G289', 7, [[2, 3], [1, 2], [5, 1], [4, 5], [3, 4], [7, 6]]], ['edgelist', 'G290', 7, [[2, 5], [4, 2], [3, 4], [5, 3], [2, 1], [3, 2], [4, 5]]], ['edgelist', 'G291', 7, [[2, 3], [6, 2], [5, 6], [3, 5], [1, 3], [6, 1], [6, 3]]], ['edgelist', 'G292', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2]]], ['edgelist', 'G293', 7, [[2, 3], [6, 2], [5, 6], [3, 5], [1, 3], [6, 1], [2, 1]]], ['edgelist', 'G294', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [3, 6], [7, 3], [3, 1]]], ['edgelist', 'G295', 7, [[2, 5], [4, 2], [3, 4], [5, 3], [2, 1], [3, 2], [3, 7]]], ['edgelist', 'G296', 7, [[2, 5], [4, 2], [3, 4], [5, 3], [2, 1], [4, 5], [7, 4]]], ['edgelist', 'G297', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [3, 6], [7, 3], [4, 5]]], ['edgelist', 'G298', 7, [[1, 5], [4, 1], [2, 4], [4, 7], [2, 5], [2, 1], [6, 5]]], ['edgelist', 'G299', 7, [[1, 5], [4, 1], [2, 4], [7, 6], [2, 5], [2, 1], [4, 5]]], ['edgelist', 'G300', 7, [[6, 3], [5, 6], [3, 5], [4, 3], [7, 4], [3, 7], [3, 2]]], ['edgelist', 'G301', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [1, 3], [3, 6]]], ['edgelist', 'G302', 7, [[6, 3], [5, 6], [3, 5], [4, 3], [7, 4], [3, 7], [4, 2]]], ['edgelist', 'G303', 7, [[2, 5], [4, 2], [3, 4], [5, 3], [3, 1], [3, 2], [7, 1]]], ['edgelist', 'G304', 7, [[2, 3], [6, 2], [5, 6], [3, 5], [1, 3], [6, 1], [4, 6]]], ['edgelist', 'G305', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [1, 3], [4, 6]]], ['edgelist', 'G306', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [1, 3], [2, 6]]], ['edgelist', 'G307', 7, [[4, 3], [5, 4], [4, 6], [3, 5], [6, 3], [7, 2], [7, 5]]], ['edgelist', 'G308', 7, [[2, 3], [6, 2], [5, 6], [3, 5], [1, 3], [6, 1], [1, 4]]], ['edgelist', 'G309', 7, [[4, 5], [2, 4], [3, 2], [7, 3], [6, 7], [2, 6], [5, 2]]], ['edgelist', 'G310', 7, [[1, 2], [5, 1], [2, 5], [3, 2], [4, 3], [6, 4], [5, 6]]], ['edgelist', 'G311', 7, [[7, 
4], [6, 7], [2, 6], [3, 2], [4, 3], [5, 3], [6, 5]]], ['edgelist', 'G312', 7, [[2, 3], [5, 2], [6, 5], [7, 6], [4, 7], [3, 4], [6, 3]]], ['edgelist', 'G313', 7, [[5, 2], [4, 5], [2, 4], [3, 2], [7, 3], [6, 7], [3, 6]]], ['edgelist', 'G314', 7, [[4, 1], [7, 4], [1, 7], [2, 1], [1, 3], [6, 1], [1, 5]]], ['edgelist', 'G315', 7, [[2, 6], [7, 2], [2, 3], [4, 2], [5, 4], [2, 5], [5, 1]]], ['edgelist', 'G316', 7, [[6, 1], [7, 6], [1, 7], [6, 3], [2, 6], [7, 4], [5, 7]]], ['edgelist', 'G317', 7, [[5, 2], [1, 5], [2, 1], [3, 2], [1, 4], [7, 1], [5, 6]]], ['edgelist', 'G318', 7, [[6, 3], [7, 6], [3, 7], [3, 5], [4, 3], [2, 1], [3, 2]]], ['edgelist', 'G319', 7, [[5, 2], [1, 5], [4, 1], [2, 4], [3, 2], [2, 6], [7, 2]]], ['edgelist', 'G320', 7, [[2, 1], [5, 2], [1, 5], [6, 5], [3, 2], [4, 3], [7, 2]]], ['edgelist', 'G321', 7, [[1, 2], [5, 1], [2, 5], [3, 2], [4, 3], [6, 5], [7, 5]]], ['edgelist', 'G322', 7, [[3, 4], [6, 3], [7, 6], [4, 7], [2, 3], [5, 6], [1, 6]]], ['edgelist', 'G323', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [2, 1], [3, 2], [7, 6]]], ['edgelist', 'G324', 7, [[3, 6], [7, 3], [6, 7], [5, 3], [2, 3], [1, 2], [4, 2]]], ['edgelist', 'G325', 7, [[3, 6], [7, 3], [5, 3], [2, 5], [4, 2], [3, 4], [1, 2]]], ['edgelist', 'G326', 7, [[7, 3], [6, 7], [3, 6], [2, 3], [1, 2], [5, 2], [4, 2]]], ['edgelist', 'G327', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [6, 5], [3, 2], [7, 4]]], ['edgelist', 'G328', 7, [[3, 6], [7, 3], [6, 7], [5, 6], [4, 7], [2, 3], [1, 2]]], ['edgelist', 'G329', 7, [[3, 6], [7, 3], [2, 5], [2, 3], [1, 2], [5, 1], [1, 4]]], ['edgelist', 'G330', 7, [[7, 6], [2, 3], [5, 2], [1, 5], [4, 1], [2, 4], [4, 5]]], ['edgelist', 'G331', 7, [[5, 2], [1, 5], [2, 1], [4, 7], [3, 4], [1, 3], [6, 1]]], ['edgelist', 'G332', 7, [[5, 2], [1, 5], [4, 1], [2, 4], [3, 2], [6, 3], [7, 2]]], ['edgelist', 'G333', 7, [[5, 2], [1, 5], [2, 1], [3, 4], [1, 3], [6, 1], [7, 6]]], ['edgelist', 'G334', 7, [[1, 2], [6, 1], [7, 6], [4, 7], [3, 4], [1, 3], [5, 1]]], ['edgelist', 'G335', 7, [[2, 1], [5, 
2], [3, 5], [4, 3], [5, 4], [1, 5], [7, 6]]], ['edgelist', 'G336', 7, [[4, 7], [3, 4], [2, 3], [1, 2], [5, 1], [2, 5], [6, 5]]], ['edgelist', 'G337', 7, [[2, 1], [6, 2], [7, 6], [3, 7], [2, 3], [4, 3], [5, 4]]], ['edgelist', 'G338', 7, [[3, 4], [2, 3], [1, 2], [5, 1], [6, 5], [7, 6], [5, 2]]], ['edgelist', 'G339', 7, [[6, 3], [7, 6], [3, 7], [2, 3], [5, 2], [1, 5], [4, 2]]], ['edgelist', 'G340', 7, [[3, 4], [2, 3], [1, 2], [5, 1], [6, 5], [7, 6], [6, 3]]], ['edgelist', 'G341', 7, [[2, 5], [1, 2], [3, 1], [4, 3], [6, 4], [1, 6], [7, 4]]], ['edgelist', 'G342', 7, [[3, 2], [4, 3], [7, 4], [6, 7], [1, 6], [3, 1], [6, 5]]], ['edgelist', 'G343', 7, [[6, 3], [7, 6], [3, 7], [2, 3], [1, 2], [5, 1], [4, 1]]], ['edgelist', 'G344', 7, [[5, 2], [1, 5], [4, 1], [2, 4], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G345', 7, [[2, 1], [3, 2], [6, 3], [5, 6], [1, 5], [5, 2], [7, 4]]], ['edgelist', 'G346', 7, [[3, 6], [7, 3], [1, 5], [4, 1], [2, 4], [5, 2], [2, 1]]], ['edgelist', 'G347', 7, [[7, 6], [1, 5], [4, 1], [2, 4], [5, 2], [3, 5], [4, 3]]], ['edgelist', 'G348', 7, [[3, 2], [6, 3], [5, 6], [1, 5], [4, 1], [7, 4], [3, 7]]], ['edgelist', 'G349', 7, [[5, 1], [4, 5], [2, 4], [3, 2], [6, 3], [7, 6], [3, 7]]], ['edgelist', 'G350', 7, [[7, 6], [3, 7], [2, 3], [5, 2], [1, 5], [4, 1], [2, 4]]], ['edgelist', 'G351', 7, [[5, 2], [1, 5], [3, 1], [4, 3], [7, 4], [6, 7], [1, 6]]], ['edgelist', 'G352', 7, [[1, 5], [4, 1], [5, 4], [3, 2], [6, 3], [7, 6], [3, 7]]], ['edgelist', 'G353', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [6, 7], [1, 7]]], ['edgelist', 'G354', 7, [[2, 1], [5, 2], [1, 5], [6, 3], [7, 6], [4, 7], [3, 4]]], ['edgelist', 'G355', 7, [[1, 2], [5, 1], [6, 5], [3, 6], [2, 3], [6, 2], [5, 2], [3, 5]]], ['edgelist', 'G356', 7, [[5, 2], [6, 5], [3, 6], [2, 3], [1, 2], [6, 1], [1, 5], [3, 1]]], ['edgelist', 'G357', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [2, 1], [4, 5], [6, 2], [7, 2]]], ['edgelist', 'G358', 7, [[5, 2], [6, 5], [3, 6], [2, 3], [6, 2], [7, 6], [3, 5], [4, 3]]], ['edgelist', 
'G359', 7, [[2, 4], [1, 2], [5, 1], [3, 5], [2, 3], [5, 2], [6, 5], [2, 6]]], ['edgelist', 'G360', 7, [[3, 1], [4, 3], [7, 4], [6, 7], [1, 6], [4, 1], [1, 7], [5, 1]]], ['edgelist', 'G361', 7, [[2, 1], [3, 2], [6, 3], [5, 6], [1, 5], [3, 1], [6, 1], [7, 6]]], ['edgelist', 'G362', 7, [[2, 1], [3, 2], [4, 3], [2, 4], [5, 4], [3, 5], [6, 3], [4, 6]]], ['edgelist', 'G363', 7, [[3, 1], [4, 3], [7, 4], [6, 7], [1, 6], [4, 1], [7, 1], [5, 6]]], ['edgelist', 'G364', 7, [[2, 1], [3, 2], [5, 4], [2, 6], [5, 2], [3, 5], [6, 3], [4, 6]]], ['edgelist', 'G365', 7, [[4, 6], [3, 2], [5, 4], [2, 6], [5, 2], [3, 5], [6, 3], [5, 7]]], ['edgelist', 'G366', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [2, 1], [4, 5], [3, 2], [6, 3]]], ['edgelist', 'G367', 7, [[4, 6], [3, 2], [5, 4], [2, 6], [5, 2], [3, 5], [6, 3], [1, 4]]], ['edgelist', 'G368', 7, [[5, 1], [3, 5], [1, 3], [4, 1], [3, 4], [6, 3], [7, 6], [3, 7]]], ['edgelist', 'G369', 7, [[4, 3], [7, 4], [6, 7], [3, 6], [1, 3], [6, 1], [5, 6], [3, 5]]], ['edgelist', 'G370', 7, [[1, 6], [5, 1], [3, 5], [6, 3], [2, 6], [5, 2], [4, 5], [6, 4]]], ['edgelist', 'G371', 7, [[3, 4], [2, 3], [5, 2], [6, 5], [2, 6], [6, 3], [7, 6], [4, 7]]], ['edgelist', 'G372', 7, [[6, 3], [5, 6], [1, 5], [4, 1], [7, 4], [3, 7], [5, 3], [4, 3]]], ['edgelist', 'G373', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [3, 5], [4, 3], [6, 5], [3, 6]]], ['edgelist', 'G374', 7, [[6, 7], [3, 6], [7, 3], [4, 3], [5, 4], [1, 5], [4, 1], [3, 5]]], ['edgelist', 'G375', 7, [[2, 1], [6, 1], [4, 3], [2, 4], [6, 3], [7, 2], [7, 3], [7, 6]]], ['edgelist', 'G376', 7, [[6, 5], [7, 6], [4, 7], [1, 4], [5, 1], [3, 5], [4, 3], [1, 3]]], ['edgelist', 'G377', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 3], [2, 6]]], ['edgelist', 'G378', 7, [[6, 1], [7, 3], [1, 7], [2, 1], [3, 2], [6, 3], [5, 6], [5, 7]]], ['edgelist', 'G379', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [2, 1], [3, 2], [2, 6], [7, 2]]], ['edgelist', 'G380', 7, [[1, 3], [5, 1], [2, 5], [1, 2], [4, 1], [2, 4], [6, 2], [7, 2]]], 
['edgelist', 'G381', 7, [[5, 3], [1, 5], [4, 1], [2, 4], [5, 2], [2, 1], [2, 6], [7, 2]]], ['edgelist', 'G382', 7, [[1, 5], [4, 1], [5, 4], [2, 5], [4, 2], [2, 6], [3, 2], [7, 2]]], ['edgelist', 'G383', 7, [[3, 2], [1, 3], [4, 1], [6, 4], [3, 6], [4, 3], [5, 4], [7, 6]]], ['edgelist', 'G384', 7, [[5, 3], [1, 5], [4, 1], [2, 4], [5, 2], [4, 5], [2, 6], [7, 2]]], ['edgelist', 'G385', 7, [[3, 2], [1, 3], [4, 1], [6, 4], [3, 6], [7, 6], [5, 4], [6, 1]]], ['edgelist', 'G386', 7, [[2, 1], [3, 2], [4, 3], [2, 4], [5, 3], [4, 5], [5, 6], [7, 5]]], ['edgelist', 'G387', 7, [[7, 6], [2, 3], [5, 2], [1, 5], [4, 1], [2, 4], [1, 2], [4, 5]]], ['edgelist', 'G388', 7, [[1, 2], [7, 6], [3, 4], [7, 5], [7, 4], [7, 3], [7, 1], [7, 2]]], ['edgelist', 'G389', 7, [[7, 5], [2, 3], [3, 4], [7, 6], [5, 6], [7, 3], [7, 1], [7, 2]]], ['edgelist', 'G390', 7, [[1, 2], [2, 3], [3, 4], [7, 6], [7, 5], [7, 4], [7, 1], [7, 3]]], ['edgelist', 'G391', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [2, 1], [7, 2], [6, 2], [3, 6]]], ['edgelist', 'G392', 7, [[4, 1], [3, 4], [5, 3], [1, 5], [2, 1], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G393', 7, [[3, 2], [4, 3], [7, 4], [6, 7], [1, 6], [3, 1], [6, 3], [5, 6]]], ['edgelist', 'G394', 7, [[2, 1], [3, 2], [4, 3], [5, 4], [6, 3], [2, 6], [7, 2], [3, 7]]], ['edgelist', 'G395', 7, [[3, 6], [5, 3], [2, 5], [4, 2], [1, 4], [2, 1], [3, 2], [7, 3]]], ['edgelist', 'G396', 7, [[5, 6], [1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [7, 4]]], ['edgelist', 'G397', 7, [[1, 2], [5, 1], [2, 5], [3, 2], [5, 3], [6, 5], [2, 6], [7, 4]]], ['edgelist', 'G398', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [2, 7], [6, 1]]], ['edgelist', 'G399', 7, [[5, 6], [1, 5], [2, 1], [5, 2], [4, 1], [2, 4], [7, 2], [3, 7]]], ['edgelist', 'G400', 7, [[3, 6], [5, 3], [1, 5], [2, 1], [5, 2], [4, 1], [2, 4], [7, 2]]], ['edgelist', 'G401', 7, [[2, 7], [3, 2], [1, 3], [2, 1], [5, 2], [4, 5], [3, 4], [5, 6]]], ['edgelist', 'G402', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [2, 7], [6, 
4]]], ['edgelist', 'G403', 7, [[1, 5], [4, 1], [5, 4], [2, 5], [4, 2], [6, 2], [7, 3], [2, 7]]], ['edgelist', 'G404', 7, [[3, 4], [2, 3], [1, 2], [6, 1], [5, 6], [1, 5], [3, 1], [7, 6]]], ['edgelist', 'G405', 7, [[5, 6], [1, 5], [4, 1], [2, 4], [5, 2], [3, 5], [4, 3], [7, 3]]], ['edgelist', 'G406', 7, [[3, 4], [2, 3], [1, 2], [5, 1], [6, 5], [5, 2], [3, 7], [6, 3]]], ['edgelist', 'G407', 7, [[1, 2], [2, 3], [3, 4], [7, 4], [5, 6], [7, 3], [7, 1], [7, 2]]], ['edgelist', 'G408', 7, [[5, 2], [1, 5], [4, 1], [2, 4], [1, 2], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G409', 7, [[1, 2], [2, 3], [3, 4], [7, 6], [5, 6], [7, 3], [7, 5], [7, 2]]], ['edgelist', 'G410', 7, [[1, 2], [5, 1], [1, 3], [6, 1], [7, 6], [4, 7], [3, 4], [6, 3]]], ['edgelist', 'G411', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [3, 6], [7, 3]]], ['edgelist', 'G412', 7, [[5, 6], [4, 5], [2, 4], [3, 2], [7, 3], [5, 7], [4, 3], [1, 2]]], ['edgelist', 'G413', 7, [[2, 1], [3, 7], [4, 3], [5, 4], [6, 3], [2, 6], [7, 2], [7, 6]]], ['edgelist', 'G414', 7, [[3, 4], [2, 3], [1, 2], [5, 1], [6, 5], [7, 6], [6, 3], [5, 2]]], ['edgelist', 'G415', 7, [[5, 2], [1, 5], [4, 1], [2, 4], [4, 5], [3, 2], [3, 6], [7, 3]]], ['edgelist', 'G416', 7, [[1, 7], [5, 1], [2, 5], [4, 2], [1, 4], [3, 5], [4, 3], [6, 3]]], ['edgelist', 'G417', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [3, 5], [4, 3], [2, 1], [7, 6]]], ['edgelist', 'G418', 7, [[1, 2], [5, 1], [4, 3], [7, 4], [6, 7], [3, 6], [7, 3], [4, 6]]], ['edgelist', 'G419', 7, [[6, 3], [7, 6], [3, 7], [5, 3], [1, 5], [4, 1], [3, 4], [2, 3]]], ['edgelist', 'G420', 7, [[3, 1], [2, 3], [1, 2], [6, 1], [5, 6], [1, 5], [7, 1], [4, 7]]], ['edgelist', 'G421', 7, [[1, 2], [3, 1], [4, 3], [3, 2], [2, 5], [6, 5], [6, 4], [2, 7]]], ['edgelist', 'G422', 7, [[2, 7], [3, 2], [1, 3], [2, 1], [5, 2], [4, 5], [3, 4], [6, 7]]], ['edgelist', 'G423', 7, [[7, 2], [1, 7], [2, 1], [6, 2], [1, 6], [3, 2], [4, 3], [5, 4]]], ['edgelist', 'G424', 7, [[7, 6], [3, 7], [2, 3], [5, 2], [4, 5], [1, 4], [5, 1], 
[3, 5]]], ['edgelist', 'G425', 7, [[2, 7], [1, 2], [6, 1], [2, 6], [4, 1], [5, 4], [3, 5], [1, 3]]], ['edgelist', 'G426', 7, [[3, 7], [5, 3], [1, 5], [2, 1], [5, 2], [4, 5], [6, 4], [3, 6]]], ['edgelist', 'G427', 7, [[2, 1], [3, 2], [7, 3], [6, 7], [2, 6], [5, 2], [4, 5], [3, 4]]], ['edgelist', 'G428', 7, [[7, 2], [5, 4], [2, 1], [6, 2], [4, 3], [3, 2], [5, 7], [6, 5]]], ['edgelist', 'G429', 7, [[5, 3], [1, 5], [2, 1], [5, 2], [4, 5], [7, 4], [6, 7], [4, 6]]], ['edgelist', 'G430', 7, [[5, 2], [3, 5], [1, 3], [7, 1], [4, 7], [1, 4], [6, 1], [5, 6]]], ['edgelist', 'G431', 7, [[6, 7], [5, 6], [1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2]]], ['edgelist', 'G432', 7, [[7, 4], [6, 7], [5, 6], [1, 5], [2, 1], [3, 2], [6, 3], [5, 2]]], ['edgelist', 'G433', 7, [[1, 2], [3, 1], [4, 3], [3, 2], [2, 5], [6, 5], [6, 4], [5, 7]]], ['edgelist', 'G434', 7, [[5, 1], [4, 5], [3, 4], [7, 3], [6, 7], [2, 6], [5, 2], [3, 2]]], ['edgelist', 'G435', 7, [[7, 2], [1, 7], [5, 4], [6, 2], [1, 6], [3, 2], [4, 3], [6, 7]]], ['edgelist', 'G436', 7, [[7, 3], [6, 7], [4, 6], [7, 4], [5, 4], [1, 5], [2, 1], [5, 2]]], ['edgelist', 'G437', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 7], [6, 2]]], ['edgelist', 'G438', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 7], [5, 3]]], ['edgelist', 'G439', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [6, 7], [1, 6]]], ['edgelist', 'G440', 7, [[5, 1], [3, 5], [4, 3], [7, 4], [6, 7], [5, 6], [2, 3], [6, 2]]], ['edgelist', 'G441', 7, [[6, 2], [3, 5], [4, 3], [1, 4], [6, 1], [5, 6], [2, 3], [1, 7]]], ['edgelist', 'G442', 7, [[6, 7], [3, 6], [5, 3], [1, 5], [4, 1], [3, 4], [2, 5], [4, 2]]], ['edgelist', 'G443', 7, [[1, 5], [2, 1], [5, 2], [4, 5], [6, 4], [7, 6], [3, 7], [5, 3]]], ['edgelist', 'G444', 7, [[1, 2], [7, 6], [3, 4], [4, 5], [7, 5], [1, 6], [7, 3], [7, 2]]], ['edgelist', 'G445', 7, [[2, 3], [1, 2], [5, 1], [6, 5], [3, 6], [4, 3], [7, 4], [6, 7]]], ['edgelist', 'G446', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 
7], [2, 7]]], ['edgelist', 'G447', 7, [[7, 3], [6, 7], [3, 6], [2, 3], [5, 2], [1, 5], [4, 1], [2, 4]]], ['edgelist', 'G448', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 6], [7, 2]]], ['edgelist', 'G449', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 4]]], ['edgelist', 'G450', 7, [[1, 5], [2, 1], [4, 3], [2, 5], [3, 6], [6, 4], [7, 5], [7, 4]]], ['edgelist', 'G451', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [2, 1], [7, 3], [6, 7], [3, 6]]], ['edgelist', 'G452', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4]]], ['edgelist', 'G453', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 3]]], ['edgelist', 'G454', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [1, 5]]], ['edgelist', 'G455', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [5, 6]]], ['edgelist', 'G456', 7, [[3, 1], [5, 2], [2, 3], [6, 5], [3, 6], [4, 2], [6, 4], [4, 3], [5, 4]]], ['edgelist', 'G457', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 6]]], ['edgelist', 'G458', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [5, 6]]], ['edgelist', 'G459', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [4, 1]]], ['edgelist', 'G460', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [1, 5], [2, 1], [5, 2]]], ['edgelist', 'G461', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [2, 1], [6, 2]]], ['edgelist', 'G462', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 5], [5, 1], [6, 1]]], ['edgelist', 'G463', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6]]], ['edgelist', 'G464', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1]]], ['edgelist', 'G465', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5]]], ['edgelist', 'G466', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2]]], ['edgelist', 'G467', 7, [[2, 6], [5, 2], [1, 
5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [3, 1]]], ['edgelist', 'G468', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 5], [6, 3], [6, 4]]], ['edgelist', 'G469', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [6, 2]]], ['edgelist', 'G470', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [5, 3], [6, 3]]], ['edgelist', 'G471', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [2, 1], [6, 3]]], ['edgelist', 'G472', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [6, 3], [5, 2]]], ['edgelist', 'G473', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 1], [1, 5], [1, 7]]], ['edgelist', 'G474', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 1], [1, 5], [3, 7]]], ['edgelist', 'G475', 7, [[2, 3], [4, 2], [1, 4], [2, 1], [3, 1], [4, 3], [6, 4], [5, 1], [2, 7]]], ['edgelist', 'G476', 7, [[1, 2], [3, 5], [1, 3], [4, 2], [4, 3], [3, 2], [5, 2], [6, 3], [3, 7]]], ['edgelist', 'G477', 7, [[1, 2], [3, 5], [1, 3], [6, 3], [4, 2], [4, 3], [3, 2], [5, 2], [2, 7]]], ['edgelist', 'G478', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [2, 6], [2, 7]]], ['edgelist', 'G479', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [2, 6], [5, 7]]], ['edgelist', 'G480', 7, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [3, 2], [6, 2], [2, 7]]], ['edgelist', 'G481', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [5, 7]]], ['edgelist', 'G482', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [4, 7]]], ['edgelist', 'G483', 7, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [3, 2], [6, 2], [1, 7]]], ['edgelist', 'G484', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 1], [2, 7]]], ['edgelist', 'G485', 7, [[3, 1], [4, 3], [5, 4], [6, 5], [3, 6], [2, 3], [5, 2], [6, 4], [3, 7]]], ['edgelist', 'G486', 7, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [4, 1], [6, 2], [1, 7]]], ['edgelist', 'G487', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], 
[4, 3], [6, 1], [1, 5], [6, 7]]], ['edgelist', 'G488', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 1], [5, 7]]], ['edgelist', 'G489', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [3, 7]]], ['edgelist', 'G490', 7, [[3, 1], [4, 3], [5, 4], [6, 5], [3, 6], [2, 3], [5, 2], [6, 4], [6, 7]]], ['edgelist', 'G491', 7, [[2, 3], [4, 2], [1, 4], [2, 1], [3, 1], [4, 3], [5, 1], [7, 6], [7, 4]]], ['edgelist', 'G492', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 1], [1, 7]]], ['edgelist', 'G493', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 7], [6, 5], [1, 4], [3, 5]]], ['edgelist', 'G494', 7, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [3, 2], [6, 2], [6, 7]]], ['edgelist', 'G495', 7, [[3, 1], [4, 3], [5, 4], [6, 5], [3, 6], [2, 3], [5, 2], [6, 4], [5, 7]]], ['edgelist', 'G496', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 7]]], ['edgelist', 'G497', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 1], [3, 7]]], ['edgelist', 'G498', 7, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [4, 1], [6, 2], [6, 7]]], ['edgelist', 'G499', 7, [[1, 2], [3, 6], [1, 3], [6, 5], [4, 2], [4, 3], [4, 1], [6, 2], [3, 7]]], ['edgelist', 'G500', 7, [[1, 2], [3, 6], [1, 3], [5, 1], [4, 2], [4, 3], [6, 2], [6, 4], [1, 7]]], ['edgelist', 'G501', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [1, 6], [6, 5], [6, 7]]], ['edgelist', 'G502', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 7]]], ['edgelist', 'G503', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [4, 5], [5, 7]]], ['edgelist', 'G504', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 3], [1, 3], [1, 7]]], ['edgelist', 'G505', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [4, 5], [4, 7]]], ['edgelist', 'G506', 7, [[1, 2], [3, 5], [1, 3], [6, 3], [4, 2], [4, 3], [3, 2], [5, 2], [6, 7]]], ['edgelist', 'G507', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 
4], [5, 7]]], ['edgelist', 'G508', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [3, 7]]], ['edgelist', 'G509', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [7, 6], [7, 2]]], ['edgelist', 'G510', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [4, 5], [3, 7]]], ['edgelist', 'G511', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [4, 5], [1, 7]]], ['edgelist', 'G512', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [4, 7], [2, 7], [1, 2], [2, 5]]], ['edgelist', 'G513', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [1, 7]]], ['edgelist', 'G514', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [3, 2], [5, 7]]], ['edgelist', 'G515', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [6, 7]]], ['edgelist', 'G516', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [5, 7]]], ['edgelist', 'G517', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [6, 7]]], ['edgelist', 'G518', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [4, 7], [2, 7], [1, 2], [1, 5]]], ['edgelist', 'G519', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 3], [1, 3], [2, 7]]], ['edgelist', 'G520', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 3], [1, 3], [5, 7]]], ['edgelist', 'G521', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [3, 2], [3, 7]]], ['edgelist', 'G522', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [4, 7]]], ['edgelist', 'G523', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [3, 7]]], ['edgelist', 'G524', 7, [[1, 2], [3, 6], [1, 3], [6, 2], [4, 2], [4, 3], [3, 2], [7, 1], [7, 5]]], ['edgelist', 'G525', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [2, 7]]], ['edgelist', 'G526', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 1], [6, 7]]], ['edgelist', 'G527', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 7]]], 
['edgelist', 'G528', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [1, 7]]], ['edgelist', 'G529', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [4, 7]]], ['edgelist', 'G530', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [4, 7], [2, 7], [1, 2], [3, 5]]], ['edgelist', 'G531', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [5, 6], [6, 4], [2, 6], [4, 7]]], ['edgelist', 'G532', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [2, 7]]], ['edgelist', 'G533', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 2], [5, 7]]], ['edgelist', 'G534', 7, [[1, 2], [3, 6], [1, 3], [6, 2], [4, 2], [4, 3], [4, 1], [7, 5], [7, 1]]], ['edgelist', 'G535', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 6], [6, 3], [6, 1], [2, 7]]], ['edgelist', 'G536', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [3, 2], [7, 1]]], ['edgelist', 'G537', 7, [[6, 4], [4, 3], [5, 4], [6, 5], [3, 6], [2, 3], [5, 2], [7, 1], [7, 3]]], ['edgelist', 'G538', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 6], [6, 3], [6, 1], [1, 7]]], ['edgelist', 'G539', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [5, 6], [6, 4], [2, 6], [6, 7]]], ['edgelist', 'G540', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [5, 7]]], ['edgelist', 'G541', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [4, 7], [2, 7], [1, 2], [6, 5]]], ['edgelist', 'G542', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 2], [6, 3], [6, 7]]], ['edgelist', 'G543', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [5, 6], [6, 4], [2, 6], [2, 7]]], ['edgelist', 'G544', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [3, 2], [4, 7]]], ['edgelist', 'G545', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [1, 6], [6, 5], [5, 7]]], ['edgelist', 'G546', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [5, 6], [6, 4], [2, 6], [1, 7]]], ['edgelist', 'G547', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 6], [6, 3], [6, 1], [5, 7]]], ['edgelist', 
'G548', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 2], [1, 7]]], ['edgelist', 'G549', 7, [[1, 2], [3, 6], [1, 3], [6, 4], [4, 2], [4, 3], [6, 2], [7, 5], [7, 1]]], ['edgelist', 'G550', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 2], [6, 3], [1, 7]]], ['edgelist', 'G551', 7, [[7, 4], [2, 3], [7, 6], [4, 5], [7, 5], [1, 6], [7, 1], [7, 2], [7, 3]]], ['edgelist', 'G552', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [7, 3], [6, 7], [3, 6]]], ['edgelist', 'G553', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [4, 5], [7, 6], [7, 1]]], ['edgelist', 'G554', 7, [[2, 5], [3, 5], [3, 4], [1, 5], [4, 2], [5, 6], [1, 6], [7, 5], [7, 4]]], ['edgelist', 'G555', 7, [[5, 2], [6, 5], [7, 6], [4, 7], [3, 4], [2, 3], [6, 3], [1, 6], [3, 1]]], ['edgelist', 'G556', 7, [[5, 2], [4, 2], [3, 4], [5, 1], [6, 1], [6, 3], [6, 5], [7, 5], [6, 7]]], ['edgelist', 'G557', 7, [[2, 1], [3, 2], [7, 3], [4, 7], [6, 4], [5, 6], [4, 5], [3, 4], [1, 3]]], ['edgelist', 'G558', 7, [[1, 3], [6, 1], [2, 6], [3, 2], [5, 3], [6, 5], [7, 6], [4, 7], [3, 4]]], ['edgelist', 'G559', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [6, 7], [1, 7], [2, 4], [5, 2]]], ['edgelist', 'G560', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 5], [6, 2], [7, 2], [1, 7]]], ['edgelist', 'G561', 7, [[1, 5], [2, 1], [5, 2], [4, 5], [3, 4], [7, 3], [6, 7], [2, 6], [3, 2]]], ['edgelist', 'G562', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [6, 4], [7, 6], [4, 7]]], ['edgelist', 'G563', 7, [[7, 6], [4, 7], [3, 4], [1, 5], [1, 6], [2, 1], [3, 1], [2, 3], [6, 5]]], ['edgelist', 'G564', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 4], [6, 2], [7, 2], [1, 7]]], ['edgelist', 'G565', 7, [[6, 3], [7, 6], [4, 7], [3, 4], [1, 3], [5, 1], [6, 5], [2, 6], [1, 2]]], ['edgelist', 'G566', 7, [[3, 5], [2, 3], [5, 2], [6, 5], [1, 6], [2, 1], [7, 5], [4, 7], [3, 4]]], ['edgelist', 'G567', 7, [[7, 3], [6, 7], [3, 6], [2, 3], [1, 2], [5, 1], [2, 5], [4, 2], [1, 4]]], ['edgelist', 'G568', 7, [[1, 6], 
[7, 1], [2, 7], [5, 2], [3, 5], [4, 3], [2, 4], [6, 2], [7, 6]]], ['edgelist', 'G569', 7, [[7, 6], [4, 7], [3, 4], [6, 3], [1, 6], [2, 1], [5, 2], [1, 5], [3, 1]]], ['edgelist', 'G570', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [3, 5], [7, 3], [6, 7], [3, 6], [4, 3]]], ['edgelist', 'G571', 7, [[2, 1], [5, 2], [6, 5], [1, 6], [7, 1], [4, 7], [3, 4], [1, 3], [4, 5]]], ['edgelist', 'G572', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 2], [7, 4]]], ['edgelist', 'G573', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 2], [6, 5], [6, 4], [7, 1], [7, 5]]], ['edgelist', 'G574', 7, [[1, 2], [5, 1], [2, 5], [3, 2], [6, 3], [5, 6], [7, 6], [4, 7], [3, 4]]], ['edgelist', 'G575', 7, [[2, 1], [7, 4], [1, 5], [6, 1], [4, 6], [6, 7], [2, 3], [2, 5], [7, 3]]], ['edgelist', 'G576', 7, [[7, 3], [6, 7], [3, 6], [2, 3], [1, 4], [5, 1], [2, 5], [4, 2], [4, 5]]], ['edgelist', 'G577', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 2], [6, 7], [7, 1]]], ['edgelist', 'G578', 7, [[1, 5], [2, 1], [3, 2], [4, 3], [1, 4], [3, 5], [6, 5], [7, 6], [4, 7]]], ['edgelist', 'G579', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 3], [7, 2], [6, 7]]], ['edgelist', 'G580', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 4], [6, 5], [7, 2], [7, 6]]], ['edgelist', 'G581', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 5], [7, 3]]], ['edgelist', 'G582', 7, [[1, 5], [4, 1], [5, 4], [7, 2], [6, 7], [2, 6], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G583', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [1, 3]]], ['edgelist', 'G584', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 5]]], ['edgelist', 'G585', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 1]]], ['edgelist', 'G586', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [5, 6], [2, 1]]], ['edgelist', 'G587', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [4, 6], [2, 6]]], ['edgelist', 
'G588', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1], [3, 5]]], ['edgelist', 'G589', 7, [[2, 1], [5, 2], [1, 5], [6, 1], [5, 6], [4, 5], [2, 4], [6, 2], [3, 4], [2, 3]]], ['edgelist', 'G590', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [6, 3]]], ['edgelist', 'G591', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [5, 2]]], ['edgelist', 'G592', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [6, 3]]], ['edgelist', 'G593', 7, [[1, 2], [3, 5], [1, 3], [5, 6], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4]]], ['edgelist', 'G594', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [6, 5]]], ['edgelist', 'G595', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [1, 3], [2, 4], [6, 2]]], ['edgelist', 'G596', 7, [[1, 2], [2, 3], [4, 5], [1, 3], [4, 1], [3, 5], [6, 3], [2, 6], [5, 2], [4, 6]]], ['edgelist', 'G597', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [6, 4], [3, 6], [2, 1]]], ['edgelist', 'G598', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 3], [3, 7]]], ['edgelist', 'G599', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 3], [2, 7]]], ['edgelist', 'G600', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [1, 5], [2, 7]]], ['edgelist', 'G601', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [1, 5], [1, 7]]], ['edgelist', 'G602', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [1, 5], [4, 7]]], ['edgelist', 'G603', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [5, 6], [5, 7]]], ['edgelist', 'G604', 7, [[3, 1], [5, 2], [2, 3], [6, 5], [3, 6], [4, 2], [6, 4], [4, 3], [5, 4], [4, 7]]], ['edgelist', 'G605', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 6], [3, 7]]], ['edgelist', 'G606', 7, [[3, 1], [5, 2], [2, 3], [6, 5], [3, 6], [4, 2], [6, 4], [4, 
3], [5, 4], [3, 7]]], ['edgelist', 'G607', 7, [[3, 4], [2, 3], [5, 2], [6, 5], [3, 6], [1, 3], [5, 1], [1, 2], [6, 1], [7, 6]]], ['edgelist', 'G608', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [1, 5], [6, 7]]], ['edgelist', 'G609', 7, [[3, 1], [5, 2], [2, 3], [6, 5], [3, 6], [4, 2], [6, 4], [4, 3], [5, 4], [5, 7]]], ['edgelist', 'G610', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [5, 6], [7, 6]]], ['edgelist', 'G611', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 7]]], ['edgelist', 'G612', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [5, 6], [7, 6]]], ['edgelist', 'G613', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [4, 1], [1, 7]]], ['edgelist', 'G614', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [4, 1], [3, 7]]], ['edgelist', 'G615', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [5, 6], [2, 7]]], ['edgelist', 'G616', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [4, 1], [4, 7]]], ['edgelist', 'G617', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [1, 5], [2, 1], [5, 2], [1, 7]]], ['edgelist', 'G618', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [2, 1], [6, 2], [2, 7]]], ['edgelist', 'G619', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 5], [5, 1], [6, 1], [1, 7]]], ['edgelist', 'G620', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [4, 7]]], ['edgelist', 'G621', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [4, 1], [6, 7]]], ['edgelist', 'G622', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1], [4, 7]]], ['edgelist', 'G623', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [2, 1], [6, 2], [1, 7]]], ['edgelist', 'G624', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [3, 1], [6, 4], [3, 4], [7, 3]]], ['edgelist', 'G625', 7, [[1, 2], [3, 6], [1, 3], 
[4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [7, 5], [7, 3]]], ['edgelist', 'G626', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [3, 7]]], ['edgelist', 'G627', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [2, 1], [6, 2], [6, 7]]], ['edgelist', 'G628', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [4, 1], [5, 7]]], ['edgelist', 'G629', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [4, 7]]], ['edgelist', 'G630', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 5], [5, 1], [6, 1], [3, 7]]], ['edgelist', 'G631', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [3, 1], [6, 7]]], ['edgelist', 'G632', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1], [3, 7]]], ['edgelist', 'G633', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [3, 1], [6, 4], [3, 4], [1, 7]]], ['edgelist', 'G634', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [2, 7]]], ['edgelist', 'G635', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [5, 7]]], ['edgelist', 'G636', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [7, 5], [7, 1]]], ['edgelist', 'G637', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [6, 7]]], ['edgelist', 'G638', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [1, 5], [2, 1], [5, 2], [6, 7]]], ['edgelist', 'G639', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [3, 1], [1, 7]]], ['edgelist', 'G640', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [2, 1], [6, 2], [3, 7]]], ['edgelist', 'G641', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [5, 3], [6, 3], [3, 7]]], ['edgelist', 'G642', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 5], [6, 3], [6, 4], [6, 7]]], ['edgelist', 'G643', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [6, 2], [1, 7]]], ['edgelist', 
'G644', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [6, 5], [5, 1], [6, 1], [6, 7]]], ['edgelist', 'G645', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [7, 6], [7, 5]]], ['edgelist', 'G646', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1], [2, 7]]], ['edgelist', 'G647', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [5, 7]]], ['edgelist', 'G648', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 5], [6, 3], [6, 4], [5, 7]]], ['edgelist', 'G649', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [3, 7]]], ['edgelist', 'G650', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 5], [6, 3], [6, 4], [1, 7]]], ['edgelist', 'G651', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [5, 3], [6, 3], [6, 7]]], ['edgelist', 'G652', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [6, 2], [2, 7]]], ['edgelist', 'G653', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [4, 7]]], ['edgelist', 'G654', 7, [[5, 4], [5, 2], [2, 3], [6, 5], [3, 6], [4, 2], [6, 4], [4, 3], [7, 1], [7, 3]]], ['edgelist', 'G655', 7, [[2, 1], [3, 2], [4, 3], [5, 4], [6, 5], [2, 6], [7, 2], [5, 7], [3, 7], [6, 3]]], ['edgelist', 'G656', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [5, 3], [6, 3], [1, 7]]], ['edgelist', 'G657', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [6, 2], [4, 7]]], ['edgelist', 'G658', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [6, 2], [3, 7]]], ['edgelist', 'G659', 7, [[1, 2], [3, 6], [1, 3], [4, 1], [4, 2], [4, 3], [3, 2], [6, 2], [7, 6], [7, 5]]], ['edgelist', 'G660', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [2, 7]]], ['edgelist', 'G661', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [6, 7]]], ['edgelist', 'G662', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 
4], [3, 1], [2, 7]]], ['edgelist', 'G663', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 5], [6, 3], [6, 4], [2, 7]]], ['edgelist', 'G664', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [5, 3], [6, 3], [2, 7]]], ['edgelist', 'G665', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 1], [6, 3], [6, 1], [6, 2], [5, 7]]], ['edgelist', 'G666', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [2, 1], [6, 3], [5, 7]]], ['edgelist', 'G667', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [6, 3], [5, 2], [1, 7]]], ['edgelist', 'G668', 7, [[5, 1], [2, 5], [4, 2], [3, 4], [2, 3], [7, 2], [1, 7], [6, 1], [2, 6], [1, 2]]], ['edgelist', 'G669', 7, [[4, 3], [7, 4], [6, 7], [1, 6], [3, 1], [6, 3], [2, 6], [3, 2], [5, 3], [6, 5]]], ['edgelist', 'G670', 7, [[3, 1], [2, 3], [4, 2], [1, 4], [7, 1], [2, 7], [6, 2], [1, 6], [5, 1], [2, 5]]], ['edgelist', 'G671', 7, [[7, 5], [2, 3], [7, 6], [4, 5], [5, 6], [1, 6], [7, 1], [7, 2], [7, 3], [7, 4]]], ['edgelist', 'G672', 7, [[1, 2], [7, 6], [3, 4], [4, 5], [7, 5], [1, 6], [7, 1], [7, 2], [7, 3], [7, 4]]], ['edgelist', 'G673', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [1, 6], [6, 3], [7, 2], [3, 7]]], ['edgelist', 'G674', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [6, 1], [6, 3], [7, 3], [1, 7]]], ['edgelist', 'G675', 7, [[7, 5], [2, 3], [7, 6], [4, 5], [5, 6], [1, 6], [7, 4], [7, 2], [7, 3], [1, 5]]], ['edgelist', 'G676', 7, [[2, 1], [3, 2], [1, 3], [4, 3], [5, 4], [3, 5], [6, 3], [5, 6], [7, 5], [2, 7]]], ['edgelist', 'G677', 7, [[1, 2], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 6], [3, 7]]], ['edgelist', 'G678', 7, [[1, 3], [6, 1], [5, 6], [3, 5], [2, 3], [6, 2], [7, 6], [4, 7], [3, 4], [3, 7]]], ['edgelist', 'G679', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [6, 2], [1, 6], [7, 1], [3, 7]]], ['edgelist', 'G680', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 3], [7, 5], [1, 3], [5, 1]]], ['edgelist', 'G681', 7, [[1, 5], [4, 1], [3, 4], 
[6, 3], [7, 6], [3, 7], [5, 3], [2, 5], [4, 2], [5, 4]]], ['edgelist', 'G682', 7, [[2, 7], [3, 2], [1, 3], [2, 1], [5, 2], [4, 5], [3, 4], [6, 7], [5, 6], [4, 2]]], ['edgelist', 'G683', 7, [[7, 6], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 2], [7, 3], [7, 5]]], ['edgelist', 'G684', 7, [[1, 2], [7, 6], [3, 4], [4, 5], [7, 5], [1, 6], [7, 1], [7, 2], [7, 3], [6, 4]]], ['edgelist', 'G685', 7, [[1, 2], [2, 3], [3, 4], [6, 5], [1, 5], [6, 1], [6, 4], [6, 3], [7, 6], [7, 2]]], ['edgelist', 'G686', 7, [[1, 4], [3, 1], [2, 3], [4, 2], [5, 4], [3, 5], [1, 5], [7, 1], [6, 7], [1, 6]]], ['edgelist', 'G687', 7, [[1, 4], [3, 1], [2, 3], [4, 2], [5, 4], [1, 6], [1, 5], [7, 1], [6, 7], [2, 5]]], ['edgelist', 'G688', 7, [[1, 2], [7, 6], [3, 4], [4, 5], [7, 5], [1, 6], [7, 1], [7, 2], [7, 3], [5, 3]]], ['edgelist', 'G689', 7, [[2, 3], [6, 2], [7, 6], [3, 7], [2, 7], [6, 3], [5, 2], [1, 5], [4, 1], [2, 4]]], ['edgelist', 'G690', 7, [[5, 3], [7, 3], [6, 4], [5, 2], [3, 1], [7, 4], [6, 3], [1, 2], [1, 5], [7, 1]]], ['edgelist', 'G691', 7, [[5, 3], [4, 7], [6, 4], [6, 2], [3, 1], [7, 1], [6, 3], [2, 5], [1, 5], [6, 5]]], ['edgelist', 'G692', 7, [[5, 1], [6, 5], [5, 2], [3, 2], [4, 3], [1, 4], [4, 5], [6, 4], [7, 2], [7, 6]]], ['edgelist', 'G693', 7, [[1, 5], [2, 1], [3, 2], [5, 3], [4, 5], [6, 4], [5, 6], [6, 3], [7, 4], [3, 7]]], ['edgelist', 'G694', 7, [[2, 7], [3, 2], [1, 3], [2, 1], [5, 2], [4, 5], [3, 4], [6, 7], [5, 6], [5, 7]]], ['edgelist', 'G695', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 4], [7, 2], [7, 6], [6, 2]]], ['edgelist', 'G696', 7, [[2, 1], [5, 2], [1, 5], [3, 1], [4, 3], [7, 4], [6, 7], [1, 6], [6, 3], [7, 3]]], ['edgelist', 'G697', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 4], [6, 2], [6, 5], [7, 2], [6, 7]]], ['edgelist', 'G698', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [5, 2], [7, 2], [7, 6]]], ['edgelist', 'G699', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [6, 4], [3, 6], [7, 2], [5, 7]]], ['edgelist', 
'G700', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 3], [6, 5], [7, 6], [7, 1], [1, 3]]], ['edgelist', 'G701', 7, [[3, 1], [6, 3], [2, 6], [1, 2], [4, 1], [6, 4], [7, 6], [5, 7], [1, 5], [5, 4]]], ['edgelist', 'G702', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 3], [2, 6], [7, 3], [7, 6]]], ['edgelist', 'G703', 7, [[6, 1], [7, 6], [3, 7], [4, 3], [1, 4], [5, 1], [3, 5], [5, 4], [2, 5], [4, 2]]], ['edgelist', 'G704', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [7, 4], [6, 7], [4, 6], [5, 6], [5, 7]]], ['edgelist', 'G705', 7, [[6, 3], [3, 2], [4, 3], [5, 4], [2, 5], [6, 1], [7, 2], [7, 1], [2, 6], [3, 7]]], ['edgelist', 'G706', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 3], [7, 5], [5, 3], [6, 2]]], ['edgelist', 'G707', 7, [[5, 3], [3, 4], [5, 2], [1, 2], [4, 1], [7, 5], [1, 7], [6, 1], [5, 6], [2, 6]]], ['edgelist', 'G708', 7, [[3, 2], [6, 3], [4, 6], [1, 4], [5, 1], [7, 5], [4, 7], [2, 4], [5, 2], [6, 5]]], ['edgelist', 'G709', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [7, 6], [7, 4]]], ['edgelist', 'G710', 7, [[1, 2], [5, 1], [2, 5], [3, 2], [6, 3], [5, 6], [7, 6], [4, 7], [3, 4], [6, 4]]], ['edgelist', 'G711', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 6], [7, 2], [7, 3], [5, 3]]], ['edgelist', 'G712', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 4], [6, 3], [7, 6], [7, 5]]], ['edgelist', 'G713', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 4], [7, 3], [5, 1]]], ['edgelist', 'G714', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 5], [7, 6], [7, 4]]], ['edgelist', 'G715', 7, [[1, 6], [7, 1], [2, 7], [1, 2], [2, 6], [3, 2], [4, 3], [5, 4], [7, 5], [5, 6]]], ['edgelist', 'G716', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 6], [7, 5], [3, 1]]], ['edgelist', 'G717', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 2], [7, 6], [7, 4]]], ['edgelist', 'G718', 7, [[3, 2], [3, 1], [4, 3], [5, 4], [2, 5], [6, 2], [6, 1], [7, 
1], [2, 7], [7, 6]]], ['edgelist', 'G719', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 4], [6, 2], [7, 2], [7, 5], [7, 6]]], ['edgelist', 'G720', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [6, 3], [5, 2], [7, 1], [6, 7]]], ['edgelist', 'G721', 7, [[4, 2], [1, 4], [6, 1], [2, 6], [3, 2], [7, 3], [1, 7], [1, 5], [5, 3], [5, 7]]], ['edgelist', 'G722', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 5], [7, 2], [7, 3], [7, 6]]], ['edgelist', 'G723', 7, [[1, 4], [3, 1], [2, 3], [4, 2], [5, 4], [3, 5], [6, 5], [6, 1], [7, 5], [7, 2]]], ['edgelist', 'G724', 7, [[1, 2], [7, 6], [3, 4], [4, 5], [7, 5], [1, 6], [7, 3], [7, 2], [5, 3], [6, 2]]], ['edgelist', 'G725', 7, [[6, 3], [7, 6], [3, 7], [5, 3], [1, 5], [4, 1], [3, 4], [2, 1], [2, 4], [5, 2]]], ['edgelist', 'G726', 7, [[4, 5], [2, 4], [5, 2], [1, 5], [4, 1], [2, 1], [3, 2], [6, 3], [7, 6], [3, 7]]], ['edgelist', 'G727', 7, [[6, 7], [3, 6], [7, 3], [4, 7], [1, 4], [5, 1], [6, 5], [2, 5], [4, 2], [3, 2]]], ['edgelist', 'G728', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 2], [7, 6], [5, 3]]], ['edgelist', 'G729', 7, [[2, 1], [3, 2], [4, 3], [1, 4], [6, 1], [2, 6], [5, 6], [7, 5], [4, 7], [3, 7]]], ['edgelist', 'G730', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 2], [3, 6], [7, 1], [4, 7]]], ['edgelist', 'G731', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [1, 3], [2, 6]]], ['edgelist', 'G732', 7, [[1, 2], [3, 5], [1, 3], [3, 2], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [1, 4]]], ['edgelist', 'G733', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 1], [5, 6]]], ['edgelist', 'G734', 7, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [1, 3]]], ['edgelist', 'G735', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [6, 2]]], ['edgelist', 'G736', 7, [[2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [5, 7], [6, 7]]], 
['edgelist', 'G737', 7, [[4, 7], [2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [3, 7], [6, 2], [1, 4], [2, 7], [1, 2]]], ['edgelist', 'G738', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [1, 2], [3, 1], [4, 3]]], ['edgelist', 'G739', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [2, 5], [1, 4]]], ['edgelist', 'G740', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 5], [7, 5]]], ['edgelist', 'G741', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [7, 5]]], ['edgelist', 'G742', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 3], [5, 7]]], ['edgelist', 'G743', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [3, 6], [7, 3]]], ['edgelist', 'G744', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 3], [1, 7]]], ['edgelist', 'G745', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 5], [7, 6]]], ['edgelist', 'G746', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [5, 6], [2, 1], [5, 7]]], ['edgelist', 'G747', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [4, 6], [2, 6], [2, 7]]], ['edgelist', 'G748', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [4, 6], [2, 6], [7, 5]]], ['edgelist', 'G749', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [5, 6], [2, 1], [2, 7]]], ['edgelist', 'G750', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1], [3, 5], [3, 7]]], ['edgelist', 'G751', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [4, 6], [2, 6], [6, 7]]], ['edgelist', 'G752', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [6, 3], [3, 7]]], ['edgelist', 'G753', 7, [[2, 1], [5, 2], [1, 5], [6, 1], [5, 6], [4, 5], [2, 4], [6, 2], [3, 4], [2, 3], [7, 2]]], ['edgelist', 'G754', 7, [[2, 4], [3, 2], [1, 
3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [6, 3], [4, 7]]], ['edgelist', 'G755', 7, [[2, 1], [5, 2], [1, 5], [6, 1], [5, 6], [4, 5], [2, 4], [6, 2], [3, 4], [2, 3], [7, 5]]], ['edgelist', 'G756', 7, [[1, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G757', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [6, 3], [1, 7]]], ['edgelist', 'G758', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [6, 3], [1, 7]]], ['edgelist', 'G759', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [5, 2], [2, 7]]], ['edgelist', 'G760', 7, [[2, 1], [5, 2], [1, 5], [6, 1], [5, 6], [4, 5], [2, 4], [6, 2], [3, 4], [2, 3], [6, 7]]], ['edgelist', 'G761', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [6, 5], [4, 6], [2, 6], [1, 7]]], ['edgelist', 'G762', 7, [[1, 2], [3, 5], [1, 3], [5, 6], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [3, 7]]], ['edgelist', 'G763', 7, [[2, 1], [5, 2], [1, 5], [6, 1], [5, 6], [4, 5], [2, 4], [6, 2], [3, 4], [2, 3], [4, 7]]], ['edgelist', 'G764', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [5, 2], [3, 7]]], ['edgelist', 'G765', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [6, 3], [6, 7]]], ['edgelist', 'G766', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [6, 3], [6, 7]]], ['edgelist', 'G767', 7, [[1, 2], [3, 5], [1, 3], [5, 6], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [6, 7]]], ['edgelist', 'G768', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 5], [6, 7]]], ['edgelist', 'G769', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [6, 3], [2, 7]]], ['edgelist', 'G770', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [5, 2], [5, 7]]], ['edgelist', 'G771', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], 
[6, 2], [5, 2], [6, 7]]], ['edgelist', 'G772', 7, [[1, 2], [3, 5], [1, 3], [5, 6], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [5, 7]]], ['edgelist', 'G773', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [5, 1], [3, 5], [2, 7]]], ['edgelist', 'G774', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [7, 6], [7, 3]]], ['edgelist', 'G775', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [6, 5], [6, 7]]], ['edgelist', 'G776', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [1, 3], [2, 4], [6, 2], [2, 7]]], ['edgelist', 'G777', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [6, 5], [2, 7]]], ['edgelist', 'G778', 7, [[2, 1], [5, 2], [1, 5], [6, 1], [5, 6], [4, 5], [2, 4], [6, 2], [3, 4], [2, 3], [3, 7]]], ['edgelist', 'G779', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [4, 3], [1, 4], [3, 5], [6, 3], [2, 7]]], ['edgelist', 'G780', 7, [[1, 7], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [6, 7]]], ['edgelist', 'G781', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [6, 4], [3, 6], [2, 1], [2, 7]]], ['edgelist', 'G782', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [1, 3], [2, 4], [6, 2], [6, 7]]], ['edgelist', 'G783', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [1, 3], [2, 4], [6, 2], [7, 4]]], ['edgelist', 'G784', 7, [[5, 4], [1, 5], [2, 1], [3, 2], [4, 3], [1, 6], [6, 4], [1, 4], [2, 6], [6, 3], [5, 7]]], ['edgelist', 'G785', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 6], [1, 6], [3, 1], [6, 2], [5, 2], [7, 4]]], ['edgelist', 'G786', 7, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [4, 3], [1, 4], [2, 7]]], ['edgelist', 'G787', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [6, 4], [3, 6], [2, 1], [7, 3]]], ['edgelist', 'G788', 7, [[1, 2], [3, 5], [1, 3], [5, 6], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [1, 7]]], ['edgelist', 
'G789', 7, [[1, 7], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6]]], ['edgelist', 'G790', 7, [[7, 6], [1, 7], [6, 1], [2, 6], [7, 2], [3, 7], [6, 3], [4, 6], [7, 4], [5, 7], [6, 5]]], ['edgelist', 'G791', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [6, 2], [3, 6], [7, 3], [2, 7], [4, 2]]], ['edgelist', 'G792', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 2], [4, 6], [7, 2], [5, 7], [2, 5], [4, 2]]], ['edgelist', 'G793', 7, [[2, 5], [3, 4], [5, 3], [1, 7], [5, 6], [7, 6], [4, 2], [7, 5], [4, 1], [4, 7], [5, 4]]], ['edgelist', 'G794', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 3], [7, 5], [3, 7], [6, 3], [1, 3]]], ['edgelist', 'G795', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [5, 1], [4, 1], [3, 1], [7, 1], [4, 7]]], ['edgelist', 'G796', 7, [[1, 2], [3, 1], [6, 3], [7, 6], [3, 7], [2, 3], [5, 2], [3, 5], [4, 3], [5, 4], [4, 2]]], ['edgelist', 'G797', 7, [[5, 6], [2, 5], [3, 2], [4, 3], [7, 4], [6, 7], [3, 6], [5, 3], [4, 6], [1, 6], [3, 1]]], ['edgelist', 'G798', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [5, 3], [6, 3], [6, 1], [7, 3], [5, 7], [6, 5]]], ['edgelist', 'G799', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [7, 2], [3, 7], [1, 3], [7, 1], [6, 3], [1, 6]]], ['edgelist', 'G800', 7, [[1, 6], [7, 1], [2, 7], [6, 2], [3, 6], [7, 3], [5, 4], [4, 3], [5, 6], [7, 5], [7, 6]]], ['edgelist', 'G801', 7, [[1, 6], [7, 1], [2, 7], [6, 2], [3, 6], [7, 3], [4, 7], [6, 4], [5, 6], [7, 5], [5, 4]]], ['edgelist', 'G802', 7, [[1, 6], [1, 7], [2, 3], [2, 7], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G803', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [1, 3], [3, 5], [6, 3], [5, 6], [7, 6], [7, 1]]], ['edgelist', 'G804', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [6, 7], [1, 7], [5, 3], [1, 5], [3, 1], [7, 5]]], ['edgelist', 'G805', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [6, 2], [6, 3], [7, 2], [3, 7], [5, 3], [6, 5]]], ['edgelist', 'G806', 7, [[1, 2], [2, 3], [3, 4], [4, 5], 
[5, 6], [1, 6], [6, 2], [3, 6], [5, 3], [7, 3], [5, 7]]], ['edgelist', 'G807', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 5], [7, 3], [1, 3], [5, 1]]], ['edgelist', 'G808', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [4, 2], [6, 4], [5, 6], [2, 5], [7, 6], [7, 2]]], ['edgelist', 'G809', 7, [[1, 5], [4, 1], [5, 4], [3, 5], [4, 3], [2, 4], [3, 2], [5, 2], [6, 3], [7, 6], [3, 7]]], ['edgelist', 'G810', 7, [[1, 6], [1, 7], [2, 5], [2, 7], [3, 4], [3, 6], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G811', 7, [[1, 2], [5, 1], [6, 5], [7, 6], [4, 7], [3, 4], [2, 3], [5, 2], [3, 5], [6, 3], [2, 6]]], ['edgelist', 'G812', 7, [[1, 5], [4, 1], [5, 4], [3, 5], [7, 3], [2, 7], [6, 2], [3, 6], [4, 3], [2, 4], [5, 2]]], ['edgelist', 'G813', 7, [[1, 2], [7, 6], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 2], [7, 3], [7, 4], [7, 5]]], ['edgelist', 'G814', 7, [[5, 2], [1, 5], [2, 1], [4, 2], [1, 4], [6, 2], [7, 6], [2, 7], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G815', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 5], [7, 6], [5, 7]]], ['edgelist', 'G816', 7, [[2, 1], [3, 2], [4, 3], [5, 4], [1, 5], [3, 1], [6, 3], [7, 6], [4, 7], [7, 1], [1, 6]]], ['edgelist', 'G817', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 3], [5, 1], [7, 5], [1, 7], [4, 7]]], ['edgelist', 'G818', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [3, 1], [6, 3], [7, 6], [5, 7], [1, 6], [7, 1]]], ['edgelist', 'G819', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [3, 7], [4, 7], [1, 4], [5, 1]]], ['edgelist', 'G820', 7, [[5, 7], [6, 5], [7, 6], [4, 7], [6, 4], [3, 6], [4, 3], [6, 1], [7, 1], [2, 1], [3, 2]]], ['edgelist', 'G821', 7, [[3, 1], [5, 3], [6, 5], [4, 6], [2, 4], [1, 2], [3, 2], [4, 3], [7, 4], [6, 7], [5, 4]]], ['edgelist', 'G822', 7, [[5, 4], [5, 3], [2, 5], [4, 2], [1, 4], [2, 1], [4, 3], [4, 6], [3, 6], [7, 1], [5, 7]]], ['edgelist', 'G823', 7, [[1, 2], [1, 3], [3, 4], [6, 2], [2, 4], [6, 3], [7, 4], [7, 1], [6, 4], [5, 
6], [4, 5]]], ['edgelist', 'G824', 7, [[5, 1], [2, 5], [7, 2], [1, 7], [4, 1], [2, 4], [6, 2], [1, 6], [7, 6], [3, 4], [1, 3]]], ['edgelist', 'G825', 7, [[1, 2], [6, 1], [5, 6], [2, 5], [3, 2], [4, 3], [5, 4], [5, 3], [7, 5], [3, 7], [4, 7]]], ['edgelist', 'G826', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [6, 2], [4, 6], [7, 4], [5, 7], [6, 7]]], ['edgelist', 'G827', 7, [[7, 4], [6, 7], [3, 6], [4, 3], [6, 4], [5, 6], [3, 5], [2, 3], [6, 2], [1, 2], [5, 1]]], ['edgelist', 'G828', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 2], [7, 6], [5, 7]]], ['edgelist', 'G829', 7, [[1, 5], [4, 1], [3, 4], [6, 3], [7, 6], [3, 7], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2]]], ['edgelist', 'G830', 7, [[6, 1], [1, 2], [4, 1], [6, 4], [3, 6], [7, 3], [5, 7], [6, 5], [2, 6], [7, 2], [4, 7]]], ['edgelist', 'G831', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [3, 2], [6, 2], [3, 6], [7, 5], [7, 3], [4, 7]]], ['edgelist', 'G832', 7, [[4, 3], [7, 4], [6, 7], [1, 6], [3, 1], [2, 3], [1, 2], [6, 2], [3, 6], [5, 3], [7, 5]]], ['edgelist', 'G833', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [5, 2], [6, 4], [6, 2], [7, 5], [7, 6], [4, 7]]], ['edgelist', 'G834', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 4], [7, 1], [7, 3], [7, 4], [6, 7]]], ['edgelist', 'G835', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [6, 2], [7, 6], [5, 7], [4, 7], [2, 7]]], ['edgelist', 'G836', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [6, 2], [7, 6], [2, 7], [3, 7], [5, 3]]], ['edgelist', 'G837', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 4], [5, 3], [7, 2], [7, 5], [6, 3], [4, 6]]], ['edgelist', 'G838', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [6, 2], [7, 2], [5, 7], [7, 6], [3, 7]]], ['edgelist', 'G839', 7, [[1, 4], [1, 7], [2, 3], [2, 6], [3, 5], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G840', 7, [[6, 2], [7, 6], [5, 7], [4, 5], [3, 4], [1, 3], [2, 1], [6, 1], [7, 1], [3, 7], [5, 3]]], ['edgelist', 'G841', 7, [[2, 
1], [3, 2], [4, 3], [5, 4], [1, 5], [6, 3], [4, 6], [7, 1], [7, 6], [7, 3], [4, 7]]], ['edgelist', 'G842', 7, [[1, 4], [5, 1], [3, 5], [4, 3], [2, 4], [5, 2], [6, 2], [7, 1], [7, 2], [6, 4], [5, 6]]], ['edgelist', 'G843', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [3, 4], [3, 6], [3, 7], [4, 5], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G844', 7, [[1, 3], [2, 1], [3, 2], [1, 4], [4, 2], [6, 5], [6, 4], [7, 5], [7, 3], [7, 1], [2, 7]]], ['edgelist', 'G845', 7, [[5, 2], [6, 5], [3, 6], [2, 3], [1, 2], [6, 1], [7, 6], [4, 7], [3, 4], [1, 3], [5, 1]]], ['edgelist', 'G846', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [7, 2], [7, 3], [6, 2], [4, 6], [6, 3], [5, 6]]], ['edgelist', 'G847', 7, [[1, 6], [1, 7], [2, 5], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6]]], ['edgelist', 'G848', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 4], [7, 5], [7, 3], [3, 1], [5, 1]]], ['edgelist', 'G849', 7, [[1, 3], [2, 1], [3, 2], [1, 4], [4, 2], [6, 5], [6, 4], [7, 5], [7, 3], [5, 1], [2, 5]]], ['edgelist', 'G850', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [7, 3], [1, 7], [2, 7]]], ['edgelist', 'G851', 7, [[1, 4], [5, 1], [2, 5], [4, 2], [5, 4], [1, 2], [3, 5], [7, 3], [6, 7], [3, 6], [2, 3]]], ['edgelist', 'G852', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 5], [6, 3], [6, 4], [7, 2], [7, 6]]], ['edgelist', 'G853', 7, [[5, 2], [6, 5], [3, 6], [2, 3], [1, 2], [5, 1], [6, 1], [7, 6], [4, 7], [3, 4], [6, 4]]], ['edgelist', 'G854', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 4], [5, 3], [6, 2], [6, 5], [7, 6], [5, 7]]], ['edgelist', 'G855', 7, [[1, 2], [2, 3], [3, 4], [6, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [7, 4], [7, 5]]], ['edgelist', 'G856', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [4, 5], [6, 2], [7, 6], [2, 7], [3, 2], [6, 3], [7, 3]]], ['edgelist', 'G857', 7, [[5, 2], [1, 5], [4, 1], [3, 6], [6, 5], [7, 6], [3, 7], [2, 3], [4, 2], [7, 4], [3, 4]]], ['edgelist', 'G858', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 3], 
[6, 5], [6, 4], [7, 1], [7, 6], [4, 7]]], ['edgelist', 'G859', 7, [[6, 3], [3, 5], [6, 4], [5, 2], [6, 5], [1, 2], [4, 1], [1, 3], [7, 3], [7, 4], [1, 7]]], ['edgelist', 'G860', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 5], [6, 3], [6, 4], [7, 2], [1, 7]]], ['edgelist', 'G861', 7, [[1, 4], [1, 5], [2, 3], [2, 6], [2, 7], [3, 5], [3, 7], [4, 6], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G862', 7, [[5, 1], [4, 5], [6, 4], [1, 6], [2, 1], [3, 2], [4, 3], [5, 2], [6, 3], [7, 5], [6, 7]]], ['edgelist', 'G863', 7, [[3, 4], [5, 3], [1, 5], [6, 1], [2, 6], [5, 2], [4, 5], [6, 4], [2, 1], [7, 6], [7, 3]]], ['edgelist', 'G864', 7, [[5, 2], [1, 5], [4, 1], [5, 4], [6, 5], [7, 6], [3, 7], [2, 3], [4, 2], [7, 4], [3, 6]]], ['edgelist', 'G865', 7, [[1, 4], [5, 1], [3, 5], [4, 3], [2, 4], [1, 2], [7, 1], [6, 7], [3, 6], [2, 6], [5, 2]]], ['edgelist', 'G866', 7, [[1, 4], [1, 5], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G867', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 4], [5, 3], [6, 2], [6, 5], [7, 2], [6, 7]]], ['edgelist', 'G868', 7, [[5, 2], [6, 5], [7, 6], [4, 7], [3, 4], [2, 3], [1, 2], [6, 1], [5, 1], [6, 3], [7, 3]]], ['edgelist', 'G869', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [5, 3], [4, 6], [5, 6], [4, 1], [7, 6], [7, 2]]], ['edgelist', 'G870', 7, [[1, 5], [2, 1], [5, 2], [4, 5], [3, 4], [2, 3], [7, 2], [6, 7], [4, 6], [6, 5], [3, 7]]], ['edgelist', 'G871', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [6, 2], [5, 3], [7, 3], [4, 7], [5, 7]]], ['edgelist', 'G872', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 3], [2, 7], [6, 3], [5, 2], [1, 4]]], ['edgelist', 'G873', 7, [[1, 4], [1, 5], [2, 3], [2, 6], [2, 7], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G874', 7, [[1, 2], [2, 3], [3, 4], [6, 5], [1, 5], [6, 4], [6, 2], [7, 4], [7, 5], [5, 3], [1, 4]]], ['edgelist', 'G875', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 6], [2, 7], [3, 4], [3, 5], [3, 7], [4, 7], [5, 6]]], 
['edgelist', 'G876', 7, [[5, 4], [3, 5], [4, 3], [1, 4], [3, 2], [6, 5], [6, 1], [7, 5], [7, 2], [2, 6], [1, 7]]], ['edgelist', 'G877', 7, [[7, 5], [4, 7], [2, 4], [5, 2], [1, 5], [3, 1], [4, 3], [1, 2], [6, 1], [7, 6], [6, 3]]], ['edgelist', 'G878', 7, [[7, 2], [3, 7], [2, 3], [1, 2], [4, 1], [5, 4], [6, 5], [4, 6], [3, 1], [5, 1], [6, 7]]], ['edgelist', 'G879', 7, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [5, 4], [1, 3]]], ['edgelist', 'G880', 7, [[4, 7], [2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [3, 7], [6, 2], [1, 4], [2, 7], [1, 2], [1, 7]]], ['edgelist', 'G881', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [6, 2], [3, 5]]], ['edgelist', 'G882', 7, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [1, 4], [2, 5], [1, 2], [3, 4]]], ['edgelist', 'G883', 7, [[1, 2], [2, 3], [1, 3], [4, 3], [4, 2], [5, 1], [3, 5], [6, 2], [1, 6], [5, 6], [4, 5], [6, 4]]], ['edgelist', 'G884', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [1, 3], [2, 6], [7, 2]]], ['edgelist', 'G885', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [1, 3], [5, 7], [6, 4]]], ['edgelist', 'G886', 7, [[1, 2], [3, 5], [1, 3], [3, 2], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [1, 4], [2, 7]]], ['edgelist', 'G887', 7, [[1, 2], [3, 5], [1, 3], [3, 2], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [1, 4], [4, 7]]], ['edgelist', 'G888', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 1], [5, 6], [5, 7]]], ['edgelist', 'G889', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 1], [5, 6], [7, 2]]], ['edgelist', 'G890', 7, [[1, 2], [3, 5], [1, 3], [3, 2], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [1, 4], [1, 7]]], ['edgelist', 'G891', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 1], [5, 6], [1, 7]]], ['edgelist', 'G892', 7, [[1, 2], [2, 3], [3, 4], [4, 
5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 1], [5, 6], [3, 7]]], ['edgelist', 'G893', 7, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [1, 3], [2, 7]]], ['edgelist', 'G894', 7, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [1, 3], [5, 7]]], ['edgelist', 'G895', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [1, 3], [7, 2], [7, 6]]], ['edgelist', 'G896', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 2], [6, 4], [3, 6], [2, 1], [5, 2], [2, 7]]], ['edgelist', 'G897', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [4, 6], [1, 4], [6, 7]]], ['edgelist', 'G898', 7, [[4, 7], [2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [3, 7], [6, 2], [1, 4], [2, 7], [1, 2], [2, 5]]], ['edgelist', 'G899', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [4, 6], [1, 4], [4, 7]]], ['edgelist', 'G900', 7, [[1, 2], [3, 5], [1, 3], [3, 2], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [1, 4], [5, 7]]], ['edgelist', 'G901', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 2], [6, 4], [3, 6], [2, 1], [5, 2], [3, 7]]], ['edgelist', 'G902', 7, [[4, 7], [2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [3, 7], [6, 2], [1, 4], [2, 7], [1, 2], [1, 5]]], ['edgelist', 'G903', 7, [[2, 4], [5, 2], [4, 5], [3, 4], [1, 3], [5, 1], [6, 5], [3, 6], [5, 3], [1, 6], [2, 6], [4, 7]]], ['edgelist', 'G904', 7, [[2, 4], [5, 2], [4, 5], [3, 4], [1, 3], [5, 1], [6, 5], [3, 6], [5, 3], [1, 6], [2, 6], [1, 7]]], ['edgelist', 'G905', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 1], [5, 6], [6, 7]]], ['edgelist', 'G906', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [2, 5], [1, 4], [6, 7]]], ['edgelist', 'G907', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [1, 2], [3, 1], [4, 3], [5, 7]]], ['edgelist', 'G908', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 
4], [1, 2], [3, 1], [4, 3], [1, 7]]], ['edgelist', 'G909', 7, [[4, 7], [2, 4], [3, 2], [1, 3], [6, 1], [7, 6], [3, 7], [6, 2], [1, 4], [2, 7], [1, 2], [5, 6]]], ['edgelist', 'G910', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [1, 2], [3, 1], [4, 3], [4, 7]]], ['edgelist', 'G911', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [2, 5], [1, 4], [1, 7]]], ['edgelist', 'G912', 7, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [1, 3], [6, 7]]], ['edgelist', 'G913', 7, [[1, 4], [7, 1], [6, 7], [4, 6], [2, 4], [7, 2], [5, 7], [4, 5], [3, 4], [7, 3], [4, 7], [6, 5]]], ['edgelist', 'G914', 7, [[1, 2], [5, 1], [6, 5], [2, 6], [5, 2], [3, 5], [2, 3], [7, 2], [5, 7], [3, 7], [4, 3], [5, 4]]], ['edgelist', 'G915', 7, [[5, 2], [4, 3], [4, 1], [5, 3], [6, 2], [6, 1], [4, 6], [5, 4], [6, 5], [7, 6], [4, 7], [5, 7]]], ['edgelist', 'G916', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [5, 3], [4, 5], [6, 4], [1, 6], [7, 4], [2, 7]]], ['edgelist', 'G917', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [7, 4], [1, 7], [6, 1], [2, 6], [5, 2], [3, 5]]], ['edgelist', 'G918', 7, [[7, 3], [6, 7], [4, 6], [3, 4], [2, 3], [5, 2], [6, 5], [3, 6], [5, 3], [1, 5], [2, 1], [6, 2]]], ['edgelist', 'G919', 7, [[6, 5], [7, 6], [4, 7], [5, 4], [1, 5], [4, 1], [2, 4], [1, 2], [5, 2], [4, 6], [3, 4], [5, 3]]], ['edgelist', 'G920', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2], [6, 1], [2, 6], [7, 2], [1, 7]]], ['edgelist', 'G921', 7, [[2, 3], [1, 2], [3, 1], [4, 3], [1, 4], [2, 4], [5, 3], [1, 5], [6, 5], [3, 6], [7, 3], [2, 7]]], ['edgelist', 'G922', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 3], [5, 6], [7, 5], [4, 7]]], ['edgelist', 'G923', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2], [6, 1], [2, 6], [7, 2], [3, 7]]], ['edgelist', 'G924', 7, [[2, 3], [1, 2], [3, 1], [4, 3], [1, 4], [2, 4], [5, 3], [1, 5], [7, 5], [3, 7], [6, 3], [5, 
6]]], ['edgelist', 'G925', 7, [[2, 1], [3, 2], [1, 3], [4, 1], [5, 4], [1, 5], [6, 1], [4, 6], [5, 6], [7, 5], [4, 7], [7, 1]]], ['edgelist', 'G926', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [5, 2], [1, 5], [6, 1], [3, 6], [7, 6], [3, 7]]], ['edgelist', 'G927', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [7, 5], [1, 7], [6, 1], [4, 6]]], ['edgelist', 'G928', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [2, 6], [7, 2], [1, 7], [6, 1], [5, 6]]], ['edgelist', 'G929', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2], [7, 2], [1, 7], [6, 1], [3, 6]]], ['edgelist', 'G930', 7, [[6, 5], [4, 6], [5, 4], [7, 5], [4, 7], [3, 4], [5, 3], [1, 5], [4, 1], [2, 1], [3, 2], [7, 3]]], ['edgelist', 'G931', 7, [[5, 2], [4, 3], [4, 1], [5, 3], [6, 2], [6, 1], [4, 6], [5, 4], [6, 5], [7, 6], [1, 7], [4, 7]]], ['edgelist', 'G932', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [7, 2], [1, 7], [6, 1], [2, 6]]], ['edgelist', 'G933', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [4, 2], [6, 2], [7, 6], [5, 7]]], ['edgelist', 'G934', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2], [6, 5], [4, 6], [7, 4], [5, 7]]], ['edgelist', 'G935', 7, [[2, 1], [3, 2], [4, 3], [1, 4], [5, 4], [2, 5], [5, 1], [6, 5], [1, 6], [4, 6], [7, 1], [2, 7]]], ['edgelist', 'G936', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2], [7, 3], [5, 7], [6, 1], [2, 6]]], ['edgelist', 'G937', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [6, 5], [3, 6], [2, 6], [7, 2], [6, 7]]], ['edgelist', 'G938', 7, [[1, 3], [2, 1], [3, 2], [1, 4], [4, 2], [5, 3], [6, 4], [7, 2], [7, 5], [5, 1], [4, 5], [2, 6]]], ['edgelist', 'G939', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 3], [5, 4], [5, 2], [1, 5], [6, 3], [6, 5], [7, 1], [4, 7]]], ['edgelist', 'G940', 7, [[6, 1], [3, 6], [7, 3], [4, 7], [3, 4], [2, 3], [1, 2], [5, 1], [2, 5], [6, 2], [7, 6], [1, 3]]], ['edgelist', 'G941', 7, 
[[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2], [7, 5], [3, 7], [6, 3], [4, 6]]], ['edgelist', 'G942', 7, [[1, 3], [2, 1], [6, 2], [4, 6], [7, 4], [3, 7], [5, 3], [4, 5], [6, 5], [3, 6], [2, 3], [5, 2]]], ['edgelist', 'G943', 7, [[1, 3], [2, 1], [3, 2], [1, 4], [4, 2], [5, 1], [2, 5], [5, 3], [4, 5], [6, 5], [7, 6], [4, 7]]], ['edgelist', 'G944', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [4, 2], [4, 3], [7, 2], [3, 7], [5, 7], [4, 5], [6, 4], [7, 6]]], ['edgelist', 'G945', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2], [6, 1], [7, 6], [1, 7], [4, 5]]], ['edgelist', 'G946', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 2], [3, 6], [7, 1], [4, 7]]], ['edgelist', 'G947', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 4], [6, 2], [6, 5], [7, 4], [5, 7], [2, 7], [7, 6]]], ['edgelist', 'G948', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G949', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 3], [7, 6], [1, 7], [7, 3], [1, 6], [2, 6], [7, 2]]], ['edgelist', 'G950', 7, [[1, 2], [7, 6], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 2], [7, 3], [7, 4], [7, 5], [6, 2]]], ['edgelist', 'G951', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [6, 5], [7, 2], [6, 7]]], ['edgelist', 'G952', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 2], [5, 6], [7, 5], [6, 7]]], ['edgelist', 'G953', 7, [[3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 2], [6, 4], [3, 6], [2, 1], [5, 2], [7, 4], [7, 2]]], ['edgelist', 'G954', 7, [[1, 5], [1, 7], [2, 4], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G955', 7, [[1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G956', 7, [[1, 2], [3, 5], [1, 3], [3, 2], [5, 7], [5, 2], [6, 2], [6, 3], [6, 4], [1, 4], [7, 2], [3, 7]]], ['edgelist', 'G957', 7, [[1, 2], [2, 3], [3, 4], [6, 
5], [1, 5], [6, 4], [6, 2], [7, 4], [7, 5], [5, 3], [1, 4], [5, 4]]], ['edgelist', 'G958', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [7, 2], [7, 6]]], ['edgelist', 'G959', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 2], [7, 6], [1, 7], [2, 7]]], ['edgelist', 'G960', 7, [[1, 4], [5, 1], [3, 5], [4, 3], [2, 4], [5, 2], [2, 1], [6, 2], [6, 3], [7, 2], [3, 7], [5, 7]]], ['edgelist', 'G961', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 3], [6, 1], [6, 5], [5, 2], [2, 7], [6, 4], [2, 6], [7, 3]]], ['edgelist', 'G962', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 3], [6, 1], [6, 5], [5, 2], [2, 6], [6, 4], [7, 2], [5, 7]]], ['edgelist', 'G963', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 3], [6, 1], [6, 5], [5, 2], [2, 6], [6, 4], [7, 2], [1, 7]]], ['edgelist', 'G964', 7, [[5, 4], [2, 3], [1, 2], [1, 4], [5, 1], [7, 5], [5, 3], [6, 5], [7, 3], [7, 4], [4, 3], [6, 2]]], ['edgelist', 'G965', 7, [[3, 4], [5, 3], [1, 5], [7, 1], [7, 6], [5, 6], [2, 4], [6, 2], [1, 6], [7, 2], [4, 7], [6, 4]]], ['edgelist', 'G966', 7, [[1, 4], [1, 6], [2, 3], [2, 6], [2, 7], [3, 5], [3, 7], [4, 5], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G967', 7, [[1, 4], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G968', 7, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [5, 4], [6, 4], [7, 2], [4, 7], [7, 3], [1, 7], [5, 7]]], ['edgelist', 'G969', 7, [[1, 2], [3, 5], [1, 3], [7, 2], [4, 2], [4, 3], [5, 2], [6, 2], [6, 3], [6, 4], [1, 4], [5, 7]]], ['edgelist', 'G970', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [7, 4], [2, 7]]], ['edgelist', 'G971', 7, [[5, 4], [2, 3], [6, 1], [1, 4], [5, 1], [5, 2], [5, 3], [6, 2], [7, 3], [7, 4], [4, 3], [7, 5]]], ['edgelist', 'G972', 7, [[3, 4], [5, 3], [6, 5], [1, 6], [7, 1], [2, 7], [4, 2], [7, 4], [6, 4], [2, 6], [5, 1], [4, 1]]], ['edgelist', 'G973', 7, [[1, 4], [1, 6], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 
7], [4, 5], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G974', 7, [[4, 3], [2, 3], [6, 1], [1, 4], [5, 1], [5, 2], [7, 5], [6, 2], [7, 3], [7, 4], [7, 2], [1, 7]]], ['edgelist', 'G975', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [2, 7], [3, 4], [3, 5], [3, 7], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G976', 7, [[1, 4], [1, 6], [2, 3], [2, 5], [2, 7], [3, 5], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G977', 7, [[1, 4], [1, 7], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [5, 7], [6, 7]]], ['edgelist', 'G978', 7, [[1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 7]]], ['edgelist', 'G979', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 3], [6, 1], [6, 5], [5, 2], [4, 5], [6, 4], [3, 7], [7, 2]]], ['edgelist', 'G980', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 3], [6, 1], [6, 5], [5, 2], [4, 5], [6, 4], [7, 2], [7, 6]]], ['edgelist', 'G981', 7, [[1, 3], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G982', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [2, 7], [3, 4], [3, 5], [3, 6], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G983', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 2], [3, 5], [6, 3], [1, 6], [5, 4], [7, 6], [7, 5], [4, 6]]], ['edgelist', 'G984', 7, [[1, 3], [1, 7], [2, 3], [2, 5], [2, 6], [3, 4], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G985', 7, [[1, 3], [1, 7], [2, 4], [2, 5], [2, 6], [3, 5], [3, 6], [4, 5], [4, 6], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G986', 7, [[1, 3], [1, 7], [2, 4], [2, 5], [2, 6], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G987', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [2, 7], [3, 4], [3, 5], [3, 7], [4, 5], [4, 6], [5, 6], [6, 7]]], ['edgelist', 'G988', 7, [[1, 6], [1, 7], [2, 3], [2, 6], [2, 7], [3, 4], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G989', 7, [[4, 1], [3, 4], [5, 3], [1, 5], [6, 2], [6, 3], [7, 2], [7, 1], [4, 7], [6, 4], [5, 6], [7, 
5]]], ['edgelist', 'G990', 7, [[1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G991', 7, [[1, 2], [1, 3], [2, 6], [2, 7], [3, 4], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G992', 7, [[4, 1], [3, 4], [5, 3], [1, 5], [6, 2], [6, 3], [7, 2], [7, 1], [4, 7], [6, 4], [7, 5], [2, 4]]], ['edgelist', 'G993', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G994', 7, [[3, 4], [5, 3], [6, 3], [5, 2], [7, 1], [4, 1], [4, 2], [7, 4], [6, 7], [2, 6], [5, 1], [4, 5]]], ['edgelist', 'G995', 7, [[3, 4], [5, 3], [5, 2], [3, 6], [7, 1], [7, 5], [4, 2], [7, 4], [1, 4], [2, 6], [5, 1], [6, 4]]], ['edgelist', 'G996', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 7], [3, 4], [3, 7], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G997', 7, [[4, 1], [3, 4], [5, 3], [1, 5], [6, 2], [6, 3], [7, 2], [7, 1], [4, 7], [2, 4], [7, 5], [6, 5]]], ['edgelist', 'G998', 7, [[7, 4], [2, 3], [3, 4], [1, 4], [5, 3], [6, 1], [1, 7], [5, 2], [4, 5], [7, 6], [6, 2], [1, 5]]], ['edgelist', 'G999', 7, [[1, 4], [1, 6], [1, 7], [2, 3], [2, 6], [2, 7], [3, 5], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7]]], ['edgelist', 'G1000', 7, [[1, 4], [1, 6], [1, 7], [2, 3], [2, 5], [2, 7], [3, 4], [3, 6], [4, 5], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1001', 7, [[1, 4], [1, 6], [1, 7], [2, 3], [2, 6], [2, 7], [3, 4], [3, 5], [4, 5], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1002', 7, [[1, 5], [4, 1], [2, 4], [5, 2], [2, 1], [5, 6], [3, 5], [7, 3], [6, 7], [3, 6], [4, 3], [7, 4]]], ['edgelist', 'G1003', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [6, 3], [6, 5], [7, 5], [7, 4], [7, 3], [6, 4]]], ['edgelist', 'G1004', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [5, 6], [5, 7]]], ['edgelist', 'G1005', 7, [[4, 1], [5, 3], [4, 2], [5, 1], [6, 3], [6, 2], [5, 4], [6, 5], [4, 6], [7, 2], [1, 7], [3, 7]]], ['edgelist', 
'G1006', 7, [[2, 1], [5, 2], [1, 5], [6, 1], [7, 6], [2, 7], [4, 5], [6, 4], [3, 4], [6, 3], [7, 4], [3, 7]]], ['edgelist', 'G1007', 7, [[1, 2], [3, 1], [3, 4], [4, 5], [1, 5], [1, 6], [7, 2], [5, 7], [7, 6], [3, 7], [4, 2], [6, 4]]], ['edgelist', 'G1008', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [7, 1], [7, 2], [7, 3], [7, 4], [7, 5], [7, 6]]], ['edgelist', 'G1009', 7, [[4, 7], [2, 3], [1, 7], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 2], [3, 6], [5, 6], [7, 5]]], ['edgelist', 'G1010', 7, [[2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1011', 7, [[2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1012', 7, [[1, 7], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1013', 7, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [5, 4], [1, 3], [7, 5]]], ['edgelist', 'G1014', 7, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [1, 4], [2, 5], [1, 2], [1, 5], [2, 7]]], ['edgelist', 'G1015', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1016', 7, [[1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1017', 7, [[1, 4], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1018', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1019', 7, [[1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1020', 7, [[1, 7], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1021', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 
6], [5, 3], [4, 5], [6, 4], [1, 2], [3, 1], [4, 3], [5, 6], [2, 7]]], ['edgelist', 'G1022', 7, [[1, 2], [2, 3], [3, 4], [5, 6], [1, 5], [2, 4], [5, 2], [3, 5], [1, 4], [6, 4], [5, 4], [1, 3], [6, 7]]], ['edgelist', 'G1023', 7, [[1, 6], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G1024', 7, [[1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1025', 7, [[6, 7], [1, 6], [7, 1], [5, 7], [6, 5], [2, 6], [7, 2], [4, 7], [6, 4], [3, 6], [7, 3], [2, 1], [3, 2]]], ['edgelist', 'G1026', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 7]]], ['edgelist', 'G1027', 7, [[4, 5], [1, 4], [5, 1], [2, 5], [4, 2], [3, 4], [5, 3], [2, 1], [3, 2], [7, 1], [4, 7], [6, 4], [5, 6]]], ['edgelist', 'G1028', 7, [[4, 5], [1, 4], [5, 1], [2, 5], [4, 2], [3, 4], [5, 3], [2, 1], [3, 2], [7, 1], [4, 7], [6, 4], [1, 6]]], ['edgelist', 'G1029', 7, [[4, 5], [1, 4], [5, 1], [2, 5], [4, 2], [3, 4], [5, 3], [2, 1], [3, 2], [7, 5], [1, 7], [6, 1], [4, 6]]], ['edgelist', 'G1030', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1031', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 6], [5, 7]]], ['edgelist', 'G1032', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 5], [6, 2], [7, 6], [2, 7]]], ['edgelist', 'G1033', 7, [[1, 5], [1, 7], [2, 4], [2, 6], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1034', 7, [[1, 6], [1, 7], [2, 5], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1035', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1036', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], 
[3, 4], [3, 5], [4, 5], [6, 4], [7, 6], [5, 7]]], ['edgelist', 'G1037', 7, [[1, 6], [1, 7], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1038', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [7, 2], [7, 6], [6, 2]]], ['edgelist', 'G1039', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 5], [1, 6], [7, 1], [4, 7], [7, 5]]], ['edgelist', 'G1040', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [7, 2], [6, 2], [3, 7]]], ['edgelist', 'G1041', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [6, 7]]], ['edgelist', 'G1042', 7, [[2, 1], [3, 2], [5, 3], [2, 5], [4, 2], [1, 4], [3, 4], [6, 3], [2, 6], [1, 6], [7, 1], [2, 7], [3, 7]]], ['edgelist', 'G1043', 7, [[3, 6], [7, 3], [6, 7], [5, 6], [4, 5], [1, 4], [5, 1], [2, 5], [4, 2], [7, 4], [3, 2], [5, 3], [4, 3]]], ['edgelist', 'G1044', 7, [[1, 4], [1, 7], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1045', 7, [[3, 5], [4, 3], [2, 4], [5, 2], [1, 5], [4, 1], [7, 4], [2, 7], [6, 2], [5, 6], [7, 5], [4, 6], [2, 3]]], ['edgelist', 'G1046', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 5], [3, 6], [4, 6], [4, 7]]], ['edgelist', 'G1047', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 6], [6, 7]]], ['edgelist', 'G1048', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 4], [3, 6], [4, 7], [5, 6]]], ['edgelist', 'G1049', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 7], [3, 5], [3, 6], [4, 5], [4, 6]]], ['edgelist', 'G1050', 7, [[1, 3], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1051', 7, [[3, 6], [2, 3], [6, 2], [5, 6], [4, 5], [1, 4], [5, 1], [4, 3], [5, 3], [2, 
4], [7, 4], [3, 7], [2, 7]]], ['edgelist', 'G1052', 7, [[1, 5], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1053', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [1, 5], [7, 2], [5, 7]]], ['edgelist', 'G1054', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [2, 1], [6, 3], [6, 1], [7, 6], [2, 7], [5, 1]]], ['edgelist', 'G1055', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [2, 4], [4, 3], [5, 1], [3, 6], [4, 5], [6, 4], [1, 6], [7, 5], [7, 2]]], ['edgelist', 'G1056', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [2, 1], [6, 3], [7, 3], [6, 7], [1, 6], [2, 3]]], ['edgelist', 'G1057', 7, [[6, 5], [7, 3], [7, 5], [5, 4], [6, 1], [4, 2], [4, 3], [7, 4], [6, 7], [5, 1], [2, 5], [6, 2], [1, 4]]], ['edgelist', 'G1058', 7, [[1, 3], [1, 7], [2, 4], [2, 5], [2, 6], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1059', 7, [[2, 6], [5, 2], [1, 5], [6, 1], [3, 6], [5, 3], [4, 5], [6, 4], [1, 2], [3, 1], [4, 3], [7, 6], [7, 5]]], ['edgelist', 'G1060', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [1, 5], [1, 7], [5, 7]]], ['edgelist', 'G1061', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [6, 1], [5, 6], [2, 6], [7, 2], [1, 7], [4, 7]]], ['edgelist', 'G1062', 7, [[1, 6], [1, 7], [2, 3], [2, 6], [2, 7], [3, 4], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1063', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [5, 2], [6, 4], [3, 6], [2, 1], [7, 4], [5, 7], [6, 2]]], ['edgelist', 'G1064', 7, [[6, 3], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [2, 1], [7, 3], [7, 4], [2, 3], [4, 2], [5, 1]]], ['edgelist', 'G1065', 7, [[2, 1], [3, 2], [1, 3], [1, 4], [4, 3], [7, 3], [2, 7], [6, 2], [7, 6], [5, 7], [6, 5], [1, 6], [5, 1]]], ['edgelist', 'G1066', 7, [[1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], 
[5, 7]]], ['edgelist', 'G1067', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1068', 7, [[1, 2], [2, 3], [5, 2], [4, 2], [1, 5], [3, 4], [1, 4], [3, 1], [6, 1], [7, 6], [5, 7], [4, 6], [5, 3]]], ['edgelist', 'G1069', 7, [[1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G1070', 7, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [1, 4], [1, 5], [3, 4], [7, 6], [1, 7]]], ['edgelist', 'G1071', 7, [[6, 3], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [2, 1], [7, 3], [7, 4], [3, 4], [6, 1], [5, 1]]], ['edgelist', 'G1072', 7, [[1, 2], [2, 3], [3, 4], [6, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [7, 4], [7, 5], [5, 3], [1, 4]]], ['edgelist', 'G1073', 7, [[1, 2], [1, 7], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G1074', 7, [[1, 2], [1, 7], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1075', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [7, 3], [1, 7], [6, 1], [3, 6], [6, 4], [5, 6], [7, 5], [4, 7]]], ['edgelist', 'G1076', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [7, 6], [1, 7], [1, 3], [3, 6], [6, 4], [5, 6], [7, 5], [4, 7]]], ['edgelist', 'G1077', 7, [[4, 5], [1, 4], [5, 1], [4, 7], [4, 2], [3, 4], [5, 3], [2, 1], [3, 2], [6, 3], [4, 6], [7, 3], [6, 7]]], ['edgelist', 'G1078', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [6, 5], [7, 6], [4, 7], [5, 7]]], ['edgelist', 'G1079', 7, [[2, 1], [3, 2], [7, 1], [2, 5], [4, 2], [1, 4], [3, 4], [2, 7], [2, 6], [3, 7], [5, 4], [6, 5], [7, 6]]], ['edgelist', 'G1080', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 6], [3, 7], [4, 6], [5, 7]]], ['edgelist', 'G1081', 7, [[1, 7], [2, 3], [3, 4], [1, 4], [5, 3], [6, 1], [7, 4], [5, 2], [4, 5], [7, 6], [2, 6], [4, 6], [2, 4]]], 
['edgelist', 'G1082', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [7, 5], [5, 4], [6, 5], [6, 3], [7, 1], [4, 7], [4, 6]]], ['edgelist', 'G1083', 7, [[1, 2], [2, 3], [3, 4], [4, 7], [1, 5], [7, 6], [1, 7], [7, 5], [3, 6], [6, 4], [5, 6], [2, 6], [7, 2]]], ['edgelist', 'G1084', 7, [[1, 5], [1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1085', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1086', 7, [[3, 4], [6, 3], [7, 6], [4, 7], [5, 4], [6, 5], [1, 6], [3, 1], [2, 3], [1, 2], [6, 2], [5, 3], [7, 5]]], ['edgelist', 'G1087', 7, [[3, 2], [1, 6], [7, 1], [5, 7], [6, 5], [2, 6], [7, 2], [4, 7], [6, 4], [3, 6], [7, 3], [2, 1], [4, 5]]], ['edgelist', 'G1088', 7, [[1, 2], [3, 1], [3, 4], [4, 5], [1, 5], [1, 6], [7, 2], [5, 7], [7, 6], [3, 7], [4, 2], [6, 4], [7, 1]]], ['edgelist', 'G1089', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 6], [2, 7], [3, 4], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1090', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [5, 7], [6, 2], [5, 6], [4, 2], [6, 3], [7, 1], [7, 2], [3, 2], [5, 2]]], ['edgelist', 'G1091', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 4], [6, 5], [6, 3], [6, 2], [7, 6], [2, 7], [3, 7]]], ['edgelist', 'G1092', 7, [[1, 5], [1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 7]]], ['edgelist', 'G1093', 7, [[4, 1], [3, 4], [5, 3], [1, 5], [6, 2], [6, 3], [7, 2], [7, 1], [4, 7], [2, 4], [7, 5], [6, 5], [6, 4]]], ['edgelist', 'G1094', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 5], [2, 6], [3, 4], [3, 7], [4, 5], [4, 6], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G1095', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [4, 5], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G1096', 7, [[1, 3], [1, 6], [1, 7], [2, 3], [2, 5], [2, 7], [3, 4], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 
'G1097', 7, [[4, 5], [6, 1], [4, 6], [1, 7], [7, 5], [3, 4], [5, 3], [2, 1], [3, 2], [2, 7], [6, 2], [3, 6], [7, 3]]], ['edgelist', 'G1098', 7, [[1, 3], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1099', 7, [[4, 1], [3, 4], [5, 3], [1, 5], [6, 4], [6, 3], [6, 5], [2, 4], [2, 1], [5, 2], [7, 1], [4, 7], [2, 7]]], ['edgelist', 'G1100', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [7, 1], [2, 7], [7, 4], [5, 7], [2, 3], [6, 1]]], ['edgelist', 'G1101', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7]]], ['edgelist', 'G1102', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6]]], ['edgelist', 'G1103', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [6, 4], [6, 5], [7, 5], [7, 3], [7, 6], [6, 3], [4, 7]]], ['edgelist', 'G1104', 7, [[1, 2], [1, 6], [1, 7], [2, 4], [2, 5], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1105', 7, [[1, 2], [1, 6], [1, 7], [2, 4], [2, 5], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1106', 7, [[1, 2], [3, 1], [3, 4], [4, 5], [1, 5], [1, 6], [7, 2], [5, 7], [7, 6], [3, 7], [4, 2], [6, 4], [3, 2]]], ['edgelist', 'G1107', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [2, 5], [2, 4], [3, 1], [5, 1], [6, 4]]], ['edgelist', 'G1108', 7, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [1, 4], [2, 5], [1, 2], [3, 4], [1, 5], [2, 7]]], ['edgelist', 'G1109', 7, [[1, 7], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1110', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G1111', 7, [[1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], 
[6, 7]]], ['edgelist', 'G1112', 7, [[1, 4], [2, 3], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1113', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 5], [4, 7], [6, 4], [5, 6], [7, 5]]], ['edgelist', 'G1114', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 5], [7, 3], [2, 7], [6, 2], [1, 6]]], ['edgelist', 'G1115', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 5], [6, 3], [4, 6], [7, 5], [1, 7]]], ['edgelist', 'G1116', 7, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [3, 5], [6, 2], [1, 4], [2, 5], [1, 2], [1, 5], [7, 5], [1, 7]]], ['edgelist', 'G1117', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [4, 5], [4, 6], [5, 7]]], ['edgelist', 'G1118', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 7], [3, 6], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1119', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1120', 7, [[4, 5], [2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [1, 2], [6, 2], [1, 5], [2, 5], [6, 4], [3, 6], [7, 5], [1, 7]]], ['edgelist', 'G1121', 7, [[2, 4], [3, 2], [1, 3], [6, 1], [5, 6], [4, 5], [6, 4], [3, 6], [2, 1], [5, 2], [7, 2], [6, 2], [3, 7], [1, 5]]], ['edgelist', 'G1122', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 7], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1123', 7, [[3, 4], [5, 3], [7, 4], [5, 1], [7, 1], [4, 5], [4, 2], [6, 5], [6, 1], [1, 4], [2, 6], [6, 4], [7, 5], [7, 2]]], ['edgelist', 'G1124', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 5], [6, 4], [7, 6], [5, 7], [6, 5]]], ['edgelist', 'G1125', 7, [[4, 2], [2, 5], [3, 4], [4, 5], [1, 5], [2, 6], [1, 2], [1, 3], [3, 6], [6, 4], [5, 6], [2, 3], [7, 3], [6, 7]]], ['edgelist', 'G1126', 7, [[1, 
4], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1127', 7, [[1, 4], [1, 7], [2, 3], [2, 5], [2, 6], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1128', 7, [[1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1129', 7, [[1, 2], [1, 7], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1130', 7, [[1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1131', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [5, 7], [6, 3], [2, 6], [1, 6], [7, 4]]], ['edgelist', 'G1132', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [6, 1], [6, 2], [6, 3], [6, 4], [6, 5], [5, 3], [4, 1], [7, 2], [6, 7]]], ['edgelist', 'G1133', 7, [[1, 5], [1, 7], [2, 3], [2, 4], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1134', 7, [[1, 5], [1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1135', 7, [[1, 6], [1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1136', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [2, 3], [6, 3], [5, 1], [4, 2], [6, 1], [7, 6], [7, 5], [1, 2]]], ['edgelist', 'G1137', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [6, 3], [7, 1], [7, 2], [6, 1], [2, 3], [4, 2], [5, 1]]], ['edgelist', 'G1138', 7, [[6, 7], [1, 6], [7, 1], [5, 7], [6, 5], [2, 6], [7, 2], [4, 7], [6, 4], [3, 6], [7, 3], [2, 1], [3, 2], [4, 5]]], ['edgelist', 'G1139', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 6], [2, 7], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1140', 7, [[1, 2], [3, 1], [3, 4], [4, 5], [1, 5], [1, 
6], [7, 2], [5, 7], [7, 6], [3, 7], [4, 2], [6, 4], [7, 1], [4, 7]]], ['edgelist', 'G1141', 7, [[4, 2], [5, 3], [5, 6], [5, 1], [2, 5], [1, 4], [6, 1], [6, 3], [7, 2], [4, 7], [7, 1], [6, 7], [7, 3], [5, 7]]], ['edgelist', 'G1142', 7, [[1, 5], [4, 1], [3, 4], [5, 3], [2, 5], [4, 2], [2, 1], [3, 2], [6, 5], [2, 6], [7, 2], [4, 7], [1, 6], [7, 1]]], ['edgelist', 'G1143', 7, [[4, 5], [5, 3], [2, 6], [5, 1], [2, 5], [6, 4], [4, 1], [6, 3], [7, 5], [1, 7], [4, 7], [3, 7], [6, 7], [2, 7]]], ['edgelist', 'G1144', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 5], [2, 6], [3, 4], [3, 7], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1145', 7, [[3, 4], [5, 3], [7, 4], [5, 1], [5, 6], [4, 5], [4, 2], [6, 3], [2, 7], [6, 7], [7, 1], [6, 4], [7, 5], [1, 2]]], ['edgelist', 'G1146', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 6], [2, 7], [3, 4], [3, 5], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1147', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 6], [2, 7], [3, 4], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1148', 7, [[3, 4], [5, 3], [7, 4], [5, 1], [2, 5], [7, 1], [4, 2], [6, 3], [5, 6], [6, 7], [2, 6], [6, 4], [7, 5], [4, 1]]], ['edgelist', 'G1149', 7, [[4, 2], [5, 3], [1, 4], [5, 1], [2, 5], [6, 4], [6, 1], [6, 3], [7, 5], [2, 7], [7, 4], [1, 7], [7, 6], [3, 7]]], ['edgelist', 'G1150', 7, [[1, 2], [5, 3], [4, 1], [5, 1], [5, 6], [6, 4], [2, 4], [6, 3], [7, 5], [3, 7], [7, 6], [4, 7], [7, 2], [1, 7]]], ['edgelist', 'G1151', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 7], [3, 6], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1152', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 6], [2, 7], [3, 4], [3, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1153', 7, [[3, 4], [5, 3], [7, 4], [5, 1], [2, 5], [7, 2], [4, 2], [6, 3], [6, 1], [6, 7], [5, 6], [6, 4], [7, 1], [4, 1]]], ['edgelist', 'G1154', 7, [[3, 4], [5, 3], [4, 1], [5, 1], [5, 6], [4, 5], [4, 2], [6, 3], [1, 2], [6, 7], [7, 
1], [6, 4], [7, 5], [2, 7]]], ['edgelist', 'G1155', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1156', 7, [[1, 4], [1, 5], [1, 7], [2, 3], [2, 5], [2, 6], [3, 4], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1157', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1158', 7, [[1, 2], [1, 6], [1, 7], [2, 5], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1159', 7, [[1, 2], [1, 5], [1, 7], [2, 4], [2, 6], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1160', 7, [[3, 4], [5, 3], [7, 4], [5, 1], [2, 5], [5, 6], [4, 2], [6, 3], [6, 1], [7, 2], [1, 7], [6, 4], [7, 5], [4, 1]]], ['edgelist', 'G1161', 7, [[1, 2], [1, 6], [1, 7], [2, 4], [2, 5], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1162', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [3, 4], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1163', 7, [[1, 5], [1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [6, 7]]], ['edgelist', 'G1164', 7, [[3, 4], [5, 3], [7, 4], [5, 1], [5, 6], [4, 6], [4, 2], [6, 3], [4, 1], [2, 5], [7, 1], [2, 7], [7, 5], [1, 2]]], ['edgelist', 'G1165', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 7]]], ['edgelist', 'G1166', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1167', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 6], [2, 7], [3, 4], [3, 5], [3, 7], [4, 5], [4, 6], [5, 7], [6, 7]]], ['edgelist', 'G1168', 7, [[1, 4], [1, 5], [1, 6], [2, 3], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 7], [5, 7], [6, 7]]], 
['edgelist', 'G1169', 7, [[1, 4], [1, 5], [1, 6], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 7], [5, 7], [6, 7]]], ['edgelist', 'G1170', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [6, 7], [1, 7], [1, 3], [6, 1], [4, 6], [2, 4], [7, 2], [5, 7], [3, 5]]], ['edgelist', 'G1171', 7, [[1, 4], [1, 5], [1, 6], [1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 7], [5, 6]]], ['edgelist', 'G1172', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [2, 3], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [4, 5], [4, 6], [5, 6]]], ['edgelist', 'G1173', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [2, 5], [2, 4], [3, 1], [5, 1], [6, 4], [2, 7]]], ['edgelist', 'G1174', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [3, 5], [6, 3], [2, 6], [2, 5], [2, 4], [3, 1], [5, 1], [6, 4], [1, 7]]], ['edgelist', 'G1175', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [4, 5], [4, 6], [4, 7], [6, 7]]], ['edgelist', 'G1176', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 6], [2, 7], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1177', 7, [[4, 5], [5, 6], [1, 4], [1, 5], [1, 6], [1, 7], [4, 7], [5, 7], [6, 7], [2, 6], [4, 6], [3, 4], [3, 5], [2, 7], [1, 2]]], ['edgelist', 'G1178', 7, [[4, 5], [5, 6], [1, 4], [1, 5], [1, 6], [1, 7], [4, 7], [2, 4], [2, 5], [2, 6], [4, 6], [3, 4], [3, 5], [7, 2], [5, 7]]], ['edgelist', 'G1179', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 2], [5, 3], [5, 4], [6, 2], [1, 6], [6, 3], [4, 6], [5, 6], [7, 2], [6, 7]]], ['edgelist', 'G1180', 7, [[5, 4], [5, 6], [6, 4], [1, 2], [1, 6], [1, 4], [3, 5], [2, 6], [2, 4], [7, 6], [4, 7], [7, 1], [2, 7], [7, 3], [5, 7]]], ['edgelist', 'G1181', 7, [[4, 5], [5, 6], [6, 7], [1, 5], [1, 6], [1, 7], [4, 7], [2, 4], [5, 7], [2, 6], [4, 6], [3, 4], [3, 5], [2, 7], [1, 2]]], ['edgelist', 'G1182', 7, [[1, 3], [1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], 
[4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1183', 7, [[7, 2], [5, 6], [1, 4], [1, 5], [1, 6], [1, 7], [4, 7], [2, 4], [2, 5], [2, 6], [4, 6], [3, 4], [3, 5], [6, 7], [5, 7]]], ['edgelist', 'G1184', 7, [[4, 5], [5, 6], [1, 4], [1, 5], [1, 6], [5, 7], [4, 7], [2, 4], [2, 5], [2, 6], [4, 6], [3, 4], [3, 5], [6, 7], [6, 3]]], ['edgelist', 'G1185', 7, [[4, 5], [5, 6], [1, 4], [1, 5], [7, 1], [5, 7], [4, 7], [2, 4], [2, 5], [2, 6], [4, 6], [3, 4], [3, 5], [6, 7], [6, 3]]], ['edgelist', 'G1186', 7, [[1, 2], [2, 3], [1, 3], [4, 1], [2, 4], [3, 4], [6, 2], [4, 6], [5, 4], [3, 5], [7, 3], [4, 7], [7, 2], [1, 6], [5, 1]]], ['edgelist', 'G1187', 7, [[1, 2], [3, 1], [4, 3], [5, 4], [2, 5], [4, 2], [5, 3], [1, 5], [4, 1], [7, 4], [5, 7], [6, 5], [4, 6], [7, 3], [6, 2]]], ['edgelist', 'G1188', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [2, 3], [2, 4], [2, 5], [3, 4], [3, 5], [4, 5], [6, 4], [5, 6], [7, 5], [6, 7], [7, 4]]], ['edgelist', 'G1189', 7, [[1, 2], [2, 3], [3, 4], [1, 4], [5, 1], [5, 4], [5, 3], [7, 2], [3, 7], [6, 3], [5, 6], [7, 5], [1, 7], [7, 6], [4, 7]]], ['edgelist', 'G1190', 7, [[1, 2], [6, 4], [2, 4], [1, 5], [4, 1], [5, 4], [3, 5], [6, 3], [5, 6], [7, 5], [3, 7], [7, 6], [4, 7], [7, 2], [1, 7]]], ['edgelist', 'G1191', 7, [[6, 3], [5, 6], [4, 2], [1, 5], [1, 6], [1, 4], [3, 5], [2, 6], [2, 5], [7, 4], [2, 7], [7, 1], [6, 7], [7, 3], [5, 7]]], ['edgelist', 'G1192', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [5, 6], [1, 6], [1, 4], [3, 1], [4, 2], [7, 5], [6, 7], [7, 4], [1, 7], [7, 3], [2, 7]]], ['edgelist', 'G1193', 7, [[6, 3], [4, 1], [6, 4], [1, 5], [1, 6], [5, 4], [3, 5], [2, 6], [2, 5], [7, 5], [3, 7], [7, 1], [6, 7], [7, 4], [2, 7]]], ['edgelist', 'G1194', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 7], [3, 4], [3, 5], [3, 6], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1195', 7, [[7, 2], [5, 6], [1, 4], [1, 5], [1, 6], [5, 7], [4, 7], [2, 4], [2, 5], [1, 7], [4, 6], [3, 4], [3, 5], [6, 7], [6, 3]]], ['edgelist', 'G1196', 7, 
[[4, 5], [1, 2], [1, 4], [1, 5], [1, 6], [5, 7], [4, 7], [2, 4], [2, 5], [7, 1], [2, 7], [3, 4], [3, 5], [6, 7], [6, 3]]], ['edgelist', 'G1197', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 7]]], ['edgelist', 'G1198', 7, [[6, 3], [5, 6], [6, 4], [1, 5], [1, 2], [2, 4], [3, 5], [4, 1], [2, 5], [7, 5], [2, 7], [7, 1], [4, 7], [7, 3], [6, 7]]], ['edgelist', 'G1199', 7, [[6, 1], [5, 4], [6, 4], [6, 3], [1, 2], [2, 4], [3, 5], [4, 1], [2, 5], [7, 3], [6, 7], [7, 5], [4, 7], [7, 1], [2, 7]]], ['edgelist', 'G1200', 7, [[4, 5], [5, 6], [1, 4], [5, 7], [1, 2], [2, 7], [4, 7], [2, 4], [2, 5], [7, 1], [1, 6], [3, 4], [3, 5], [6, 7], [6, 3]]], ['edgelist', 'G1201', 7, [[1, 3], [1, 4], [1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1202', 7, [[1, 5], [1, 6], [1, 7], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1203', 7, [[4, 5], [6, 1], [1, 4], [1, 5], [5, 7], [2, 7], [4, 7], [2, 4], [2, 5], [7, 1], [2, 6], [3, 4], [3, 5], [6, 7], [6, 3]]], ['edgelist', 'G1204', 7, [[7, 5], [6, 3], [1, 4], [1, 5], [3, 5], [2, 7], [4, 7], [2, 4], [2, 5], [7, 1], [4, 6], [3, 4], [1, 2], [6, 7], [5, 6]]], ['edgelist', 'G1205', 7, [[1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 6], [2, 7], [3, 4], [3, 5], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1206', 7, [[1, 2], [1, 3], [1, 4], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1207', 7, [[3, 4], [1, 3], [4, 1], [5, 4], [2, 5], [6, 2], [5, 6], [2, 1], [6, 3], [7, 3], [6, 7], [7, 2], [1, 7], [7, 5], [4, 7]]], ['edgelist', 'G1208', 7, [[4, 1], [4, 6], [4, 5], [3, 1], [3, 6], [3, 5], [2, 5], [2, 6], [2, 1], [7, 1], [2, 7], [7, 6], [4, 7], [7, 5], [3, 7]]], ['edgelist', 'G1209', 7, [[1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 5], [2, 6], [2, 7], [3, 4], 
[3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1210', 7, [[4, 5], [7, 3], [1, 4], [1, 5], [6, 1], [2, 7], [4, 7], [2, 4], [2, 5], [7, 1], [5, 6], [3, 4], [3, 5], [6, 2], [6, 3]]], ['edgelist', 'G1211', 7, [[4, 5], [7, 3], [1, 4], [1, 5], [6, 1], [6, 7], [4, 7], [2, 4], [2, 5], [1, 2], [5, 7], [3, 4], [3, 5], [6, 2], [6, 3]]], ['edgelist', 'G1212', 7, [[1, 2], [2, 3], [3, 4], [4, 5], [1, 5], [7, 3], [2, 7], [7, 1], [5, 7], [6, 5], [1, 6], [4, 6], [7, 4], [6, 3], [2, 6]]], ['edgelist', 'G1213', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [2, 3], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [4, 5], [4, 6], [5, 6], [3, 7]]], ['edgelist', 'G1214', 7, [[4, 1], [5, 2], [5, 4], [2, 4], [5, 1], [3, 6], [7, 3], [6, 7], [2, 6], [5, 6], [4, 6], [1, 6], [1, 7], [4, 7], [5, 7], [7, 2]]], ['edgelist', 'G1215', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 7], [3, 4], [3, 5], [3, 6], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1216', 7, [[1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1217', 7, [[4, 5], [6, 2], [1, 4], [1, 5], [6, 1], [5, 7], [4, 7], [2, 4], [2, 5], [7, 1], [4, 6], [3, 4], [3, 5], [6, 7], [5, 6], [3, 6]]], ['edgelist', 'G1218', 7, [[3, 5], [4, 2], [4, 1], [5, 4], [5, 1], [6, 3], [5, 6], [6, 1], [4, 6], [6, 2], [7, 6], [2, 7], [4, 7], [7, 1], [5, 7], [7, 3]]], ['edgelist', 'G1219', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 6], [2, 7], [3, 4], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1220', 7, [[3, 5], [5, 2], [4, 1], [4, 2], [5, 1], [6, 3], [5, 6], [6, 1], [4, 6], [7, 6], [2, 6], [7, 2], [4, 7], [5, 7], [7, 3], [7, 1]]], ['edgelist', 'G1221', 7, [[1, 2], [1, 4], [1, 6], [1, 7], [2, 4], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1222', 7, [[3, 6], [1, 2], [5, 6], [2, 4], [6, 1], [5, 4], [6, 4], [3, 5], [2, 5], [4, 
1], [7, 4], [3, 7], [7, 5], [6, 7], [7, 2], [1, 7]]], ['edgelist', 'G1223', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 4], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1224', 7, [[3, 6], [6, 2], [4, 2], [1, 5], [6, 1], [5, 4], [6, 4], [3, 5], [2, 5], [4, 1], [7, 3], [5, 7], [6, 7], [7, 2], [1, 7], [4, 7]]], ['edgelist', 'G1225', 7, [[2, 7], [1, 2], [1, 4], [1, 5], [6, 1], [5, 7], [4, 7], [2, 4], [2, 5], [7, 1], [4, 6], [3, 4], [3, 5], [6, 7], [5, 6], [3, 6]]], ['edgelist', 'G1226', 7, [[4, 5], [6, 2], [1, 4], [1, 5], [6, 1], [5, 7], [4, 7], [2, 4], [2, 5], [7, 1], [2, 7], [3, 4], [3, 5], [6, 7], [1, 2], [3, 6]]], ['edgelist', 'G1227', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6]]], ['edgelist', 'G1228', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 4], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [6, 7]]], ['edgelist', 'G1229', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 5], [2, 6], [2, 7], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [6, 7]]], ['edgelist', 'G1230', 7, [[3, 6], [6, 2], [4, 6], [1, 5], [1, 2], [5, 4], [4, 3], [3, 5], [2, 5], [1, 6], [7, 5], [3, 7], [7, 4], [6, 7], [7, 2], [1, 7]]], ['edgelist', 'G1231', 7, [[6, 7], [6, 2], [1, 4], [1, 5], [1, 2], [5, 7], [4, 7], [2, 4], [2, 5], [7, 1], [4, 6], [3, 4], [3, 5], [7, 3], [5, 6], [3, 6]]], ['edgelist', 'G1232', 7, [[4, 5], [6, 2], [1, 4], [1, 5], [1, 2], [5, 7], [4, 7], [2, 4], [2, 5], [7, 1], [6, 1], [3, 4], [3, 5], [7, 3], [7, 6], [3, 6]]], ['edgelist', 'G1233', 7, [[6, 1], [6, 2], [1, 4], [1, 5], [7, 2], [5, 7], [4, 7], [2, 4], [2, 5], [7, 1], [4, 6], [3, 4], [3, 5], [7, 3], [5, 6], [3, 6]]], ['edgelist', 'G1234', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [2, 3], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [4, 5], [4, 6], [5, 6], [7, 3], [2, 7]]], ['edgelist', 'G1235', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], 
[1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [5, 6], [5, 7]]], ['edgelist', 'G1236', 7, [[5, 1], [5, 4], [1, 2], [4, 1], [3, 5], [4, 2], [6, 4], [5, 6], [6, 3], [7, 6], [6, 1], [2, 6], [7, 2], [1, 7], [7, 5], [3, 7], [4, 7]]], ['edgelist', 'G1237', 7, [[1, 2], [6, 2], [6, 4], [1, 5], [6, 1], [5, 4], [4, 2], [3, 6], [2, 5], [4, 1], [3, 5], [7, 3], [6, 7], [7, 4], [2, 7], [7, 1], [5, 7]]], ['edgelist', 'G1238', 7, [[4, 5], [6, 2], [1, 4], [1, 5], [5, 6], [5, 7], [4, 7], [2, 4], [2, 5], [1, 2], [4, 6], [3, 4], [3, 5], [3, 6], [6, 1], [6, 7], [7, 3]]], ['edgelist', 'G1239', 7, [[4, 3], [5, 2], [1, 2], [4, 1], [3, 5], [5, 4], [6, 2], [5, 6], [6, 3], [1, 6], [6, 4], [7, 6], [3, 7], [7, 4], [1, 7], [2, 7], [5, 7]]], ['edgelist', 'G1240', 7, [[4, 3], [5, 2], [5, 1], [4, 1], [3, 5], [4, 2], [6, 3], [5, 6], [6, 1], [4, 6], [6, 2], [7, 6], [3, 7], [7, 1], [4, 7], [7, 5], [2, 7]]], ['edgelist', 'G1241', 7, [[4, 3], [6, 2], [6, 1], [1, 5], [1, 2], [5, 4], [6, 4], [3, 6], [2, 5], [4, 1], [3, 5], [7, 5], [6, 7], [7, 3], [4, 7], [7, 1], [2, 7]]], ['edgelist', 'G1242', 7, [[4, 3], [6, 2], [6, 1], [1, 5], [5, 6], [1, 2], [4, 2], [3, 6], [2, 5], [4, 1], [3, 5], [7, 1], [4, 7], [7, 2], [6, 7], [7, 3], [5, 7]]], ['edgelist', 'G1243', 7, [[1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 6], [4, 7], [5, 6], [5, 7]]], ['edgelist', 'G1244', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [2, 3], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [4, 5], [4, 6], [5, 6], [7, 2], [1, 7], [6, 7]]], ['edgelist', 'G1245', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7]]], ['edgelist', 'G1246', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1247', 7, [[1, 2], [1, 3], [1, 4], [1, 6], [1, 7], [2, 3], 
[2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1248', 7, [[5, 1], [5, 6], [4, 1], [4, 6], [3, 1], [3, 6], [2, 4], [2, 5], [2, 6], [2, 1], [3, 4], [3, 5], [7, 1], [6, 7], [7, 2], [3, 7], [7, 5], [4, 7]]], ['edgelist', 'G1249', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1250', 7, [[1, 2], [1, 3], [1, 4], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1251', 7, [[1, 2], [1, 3], [1, 4], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]], ['edgelist', 'G1252', 7, [[1, 2], [1, 3], [1, 4], [1, 5], [1, 6], [1, 7], [2, 3], [2, 4], [2, 5], [2, 6], [2, 7], [3, 4], [3, 5], [3, 6], [3, 7], [4, 5], [4, 6], [4, 7], [5, 6], [5, 7], [6, 7]]]] return [make_small_graph(G) for G in descr_list] networkx-1.11/networkx/generators/intersection.py0000644000175000017500000000751712637544450022331 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Generators for random intersection graphs. """ # Copyright (C) 2011 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import random import networkx as nx from networkx.algorithms import bipartite __author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)']) __all__ = ['uniform_random_intersection_graph', 'k_random_intersection_graph', 'general_random_intersection_graph', ] def uniform_random_intersection_graph(n, m, p, seed=None): """Return a uniform random intersection graph. 
Parameters ---------- n : int The number of nodes in the first bipartite set (nodes) m : int The number of nodes in the second bipartite set (attributes) p : float Probability of connecting nodes between bipartite sets seed : int, optional Seed for random number generator (default=None). See Also -------- gnp_random_graph References ---------- .. [1] K.B. Singer-Cohen, Random Intersection Graphs, 1995, PhD thesis, Johns Hopkins University .. [2] Fill, J. A., Scheinerman, E. R., and Singer-Cohen, K. B., Random intersection graphs when m = ω(n): An equivalence theorem relating the evolution of the G(n, m, p) and G(n, p) models. Random Struct. Algorithms 16, 2 (2000), 156–176. """ G=bipartite.random_graph(n, m, p, seed=seed) return nx.projected_graph(G, range(n)) def k_random_intersection_graph(n,m,k): """Return an intersection graph with randomly chosen attribute sets for each node that are of equal size (k). Parameters ---------- n : int The number of nodes in the first bipartite set (nodes) m : int The number of nodes in the second bipartite set (attributes) k : int Size of attribute set to assign to each node. See Also -------- gnp_random_graph, uniform_random_intersection_graph References ---------- .. [1] Godehardt, E., and Jaworski, J. Two models of random intersection graphs and their applications. Electronic Notes in Discrete Mathematics 10 (2001), 129--132. """ G = nx.empty_graph(n + m) mset = range(n,n+m) for v in range(n): targets = random.sample(mset, k) G.add_edges_from(zip([v]*len(targets), targets)) return nx.projected_graph(G, range(n)) def general_random_intersection_graph(n,m,p): """Return a random intersection graph with independent probabilities for connections between node and attribute sets.
Parameters ---------- n : int The number of nodes in the first bipartite set (nodes) m : int The number of nodes in the second bipartite set (attributes) p : list of floats of length m Probabilities for connecting nodes to each attribute See Also -------- gnp_random_graph, uniform_random_intersection_graph References ---------- .. [1] Nikoletseas, S. E., Raptopoulos, C., and Spirakis, P. G. The existence and efficient construction of large independent sets in general random intersection graphs. In ICALP (2004), J. Díaz, J. Karhumäki, A. Lepistö, and D. Sannella, Eds., vol. 3142 of Lecture Notes in Computer Science, Springer, pp. 1029–1040. """ if len(p)!=m: raise ValueError("Probability list p must have m elements.") G = nx.empty_graph(n + m) mset = range(n,n+m) for u in range(n): for v,q in zip(mset,p): if random.random()<q: G.add_edge(u,v) return nx.projected_graph(G, range(n)) networkx-1.11/networkx/generators/expanders.py """Provides explicit constructions of expander graphs. """ import itertools import networkx as nx __all__ = ['margulis_gabber_galil_graph', 'chordal_cycle_graph'] # Other discrete torus expanders can be constructed by using the following edge # sets. For more information, see Chapter 4, "Expander Graphs", in # "Pseudorandomness", by Salil Vadhan. # # For a directed expander, add edges from (x, y) to: # # (x, y), # ((x + 1) % n, y), # (x, (y + 1) % n), # (x, (x + y) % n), # (-y % n, x) # # For an undirected expander, add the reverse edges. # # Also appearing in the paper of Gabber and Galil: # # (x, y), # (x, (x + y) % n), # (x, (x + y + 1) % n), # ((x + y) % n, y), # ((x + y + 1) % n, y) # # and: # # (x, y), # ((x + 2*y) % n, y), # ((x + (2*y + 1)) % n, y), # ((x + (2*y + 2)) % n, y), # (x, (y + 2*x) % n), # (x, (y + (2*x + 1)) % n), # (x, (y + (2*x + 2)) % n), # def margulis_gabber_galil_graph(n, create_using=None): """Return the Margulis-Gabber-Galil undirected MultiGraph on `n^2` nodes. The undirected MultiGraph is regular with degree `8`. Nodes are integer pairs.
The second-largest eigenvalue of the adjacency matrix of the graph is at most `5 \sqrt{2}`, regardless of `n`. Parameters ---------- n : int Determines the number of nodes in the graph: `n^2`. create_using : graph-like A graph-like object that receives the constructed edges. If ``None``, then a :class:`~networkx.MultiGraph` instance is used. Returns ------- G : graph The constructed undirected multigraph. Raises ------ NetworkXError If the graph is directed or not a multigraph. """ if create_using is None: create_using = nx.MultiGraph() elif create_using.is_directed() or not create_using.is_multigraph(): msg = "`create_using` must be an undirected multigraph." raise nx.NetworkXError(msg) G = create_using G.clear() for (x, y) in itertools.product(range(n), repeat=2): for (u, v) in (((x + 2 * y) % n, y), ((x + (2 * y + 1)) % n, y), (x, (y + 2 * x) % n), (x, (y + (2 * x + 1)) % n)): G.add_edge((x, y), (u, v)) G.graph['name'] = "margulis_gabber_galil_graph({0})".format(n) return G def chordal_cycle_graph(p, create_using=None): """Return the chordal cycle graph on `p` nodes. The returned graph is a cycle graph on `p` nodes with chords joining each vertex `x` to its inverse modulo `p`. This graph is a (mildly explicit) 3-regular expander [1]_. ``p`` *must* be a prime number. Parameters ---------- p : a prime number The number of vertices in the graph. This also indicates where the chordal edges in the cycle will be created. create_using : graph-like A graph-like object that receives the constructed edges. If ``None``, then a :class:`~networkx.MultiGraph` instance is used. Returns ------- G : graph The constructed undirected multigraph. Raises ------ NetworkXError If the graph provided in ``create_using`` is directed or not a multigraph. References ---------- .. [1] Theorem 4.4.2 in A. Lubotzky. "Discrete groups, expanding graphs and invariant measures", volume 125 of Progress in Mathematics. Birkhäuser Verlag, Basel, 1994. 
""" if create_using is None: create_using = nx.MultiGraph() elif create_using.is_directed() or not create_using.is_multigraph(): msg = "`create_using` must be an undirected multigraph." raise nx.NetworkXError(msg) G = create_using G.clear() for x in range(p): left = (x - 1) % p right = (x + 1) % p # Here we apply Fermat's Little Theorem to compute the multiplicative # inverse of x in Z/pZ. By Fermat's Little Theorem, # # x^p = x (mod p) # # Therefore, # # x * x^(p - 2) = 1 (mod p) # # The number 0 is a special case: we just let its inverse be itself. chord = pow(x, p - 2, p) if x > 0 else 0 for y in (left, right, chord): G.add_edge(x, y) G.graph['name'] = "chordal_cycle_graph({0})".format(p) return G networkx-1.11/networkx/generators/tests/0000755000175000017500000000000012653231454020374 5ustar aricaric00000000000000networkx-1.11/networkx/generators/tests/test_small.py0000644000175000017500000001437112637544500023124 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from networkx import * from networkx.algorithms.isomorphism.isomorph import graph_could_be_isomorphic is_isomorphic=graph_could_be_isomorphic """Generators - Small ===================== Some small graphs """ null=null_graph() class TestGeneratorsSmall(): def test_make_small_graph(self): d=["adjacencylist","Bull Graph",5,[[2,3],[1,3,4],[1,2,5],[2],[3]]] G=make_small_graph(d) assert_true(is_isomorphic(G, bull_graph())) def test__LCF_graph(self): # If n<=0, then return the null_graph G=LCF_graph(-10,[1,2],100) assert_true(is_isomorphic(G,null)) G=LCF_graph(0,[1,2],3) assert_true(is_isomorphic(G,null)) G=LCF_graph(0,[1,2],10) assert_true(is_isomorphic(G,null)) # Test that LCF(n,[],0) == cycle_graph(n) for a, b, c in [(5, [], 0), (10, [], 0), (5, [], 1), (10, [], 10)]: G=LCF_graph(a, b, c) assert_true(is_isomorphic(G,cycle_graph(a))) # Generate the utility graph K_{3,3} G=LCF_graph(6,[3,-3],3) utility_graph=complete_bipartite_graph(3,3) assert_true(is_isomorphic(G, utility_graph)) 
def test_properties_named_small_graphs(self): G=bull_graph() assert_equal(G.number_of_nodes(), 5) assert_equal(G.number_of_edges(), 5) assert_equal(sorted(G.degree().values()), [1, 1, 2, 3, 3]) assert_equal(diameter(G), 3) assert_equal(radius(G), 2) G=chvatal_graph() assert_equal(G.number_of_nodes(), 12) assert_equal(G.number_of_edges(), 24) assert_equal(list(G.degree().values()), 12 * [4]) assert_equal(diameter(G), 2) assert_equal(radius(G), 2) G=cubical_graph() assert_equal(G.number_of_nodes(), 8) assert_equal(G.number_of_edges(), 12) assert_equal(list(G.degree().values()), 8*[3]) assert_equal(diameter(G), 3) assert_equal(radius(G), 3) G=desargues_graph() assert_equal(G.number_of_nodes(), 20) assert_equal(G.number_of_edges(), 30) assert_equal(list(G.degree().values()), 20*[3]) G=diamond_graph() assert_equal(G.number_of_nodes(), 4) assert_equal(sorted(G.degree().values()), [2, 2, 3, 3]) assert_equal(diameter(G), 2) assert_equal(radius(G), 1) G=dodecahedral_graph() assert_equal(G.number_of_nodes(), 20) assert_equal(G.number_of_edges(), 30) assert_equal(list(G.degree().values()), 20*[3]) assert_equal(diameter(G), 5) assert_equal(radius(G), 5) G=frucht_graph() assert_equal(G.number_of_nodes(), 12) assert_equal(G.number_of_edges(), 18) assert_equal(list(G.degree().values()), 12*[3]) assert_equal(diameter(G), 4) assert_equal(radius(G), 3) G=heawood_graph() assert_equal(G.number_of_nodes(), 14) assert_equal(G.number_of_edges(), 21) assert_equal(list(G.degree().values()), 14*[3]) assert_equal(diameter(G), 3) assert_equal(radius(G), 3) G=house_graph() assert_equal(G.number_of_nodes(), 5) assert_equal(G.number_of_edges(), 6) assert_equal(sorted(G.degree().values()), [2, 2, 2, 3, 3]) assert_equal(diameter(G), 2) assert_equal(radius(G), 2) G=house_x_graph() assert_equal(G.number_of_nodes(), 5) assert_equal(G.number_of_edges(), 8) assert_equal(sorted(G.degree().values()), [2, 3, 3, 4, 4]) assert_equal(diameter(G), 2) assert_equal(radius(G), 1) G=icosahedral_graph() 
assert_equal(G.number_of_nodes(), 12) assert_equal(G.number_of_edges(), 30) assert_equal(list(G.degree().values()), [5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5]) assert_equal(diameter(G), 3) assert_equal(radius(G), 3) G=krackhardt_kite_graph() assert_equal(G.number_of_nodes(), 10) assert_equal(G.number_of_edges(), 18) assert_equal(sorted(G.degree().values()), [1, 2, 3, 3, 3, 4, 4, 5, 5, 6]) G=moebius_kantor_graph() assert_equal(G.number_of_nodes(), 16) assert_equal(G.number_of_edges(), 24) assert_equal(list(G.degree().values()), 16*[3]) assert_equal(diameter(G), 4) G=octahedral_graph() assert_equal(G.number_of_nodes(), 6) assert_equal(G.number_of_edges(), 12) assert_equal(list(G.degree().values()), 6*[4]) assert_equal(diameter(G), 2) assert_equal(radius(G), 2) G=pappus_graph() assert_equal(G.number_of_nodes(), 18) assert_equal(G.number_of_edges(), 27) assert_equal(list(G.degree().values()), 18*[3]) assert_equal(diameter(G), 4) G=petersen_graph() assert_equal(G.number_of_nodes(), 10) assert_equal(G.number_of_edges(), 15) assert_equal(list(G.degree().values()), 10*[3]) assert_equal(diameter(G), 2) assert_equal(radius(G), 2) G=sedgewick_maze_graph() assert_equal(G.number_of_nodes(), 8) assert_equal(G.number_of_edges(), 10) assert_equal(sorted(G.degree().values()), [1, 2, 2, 2, 3, 3, 3, 4]) G=tetrahedral_graph() assert_equal(G.number_of_nodes(), 4) assert_equal(G.number_of_edges(), 6) assert_equal(list(G.degree().values()), [3, 3, 3, 3]) assert_equal(diameter(G), 1) assert_equal(radius(G), 1) G=truncated_cube_graph() assert_equal(G.number_of_nodes(), 24) assert_equal(G.number_of_edges(), 36) assert_equal(list(G.degree().values()), 24*[3]) G=truncated_tetrahedron_graph() assert_equal(G.number_of_nodes(), 12) assert_equal(G.number_of_edges(), 18) assert_equal(list(G.degree().values()), 12*[3]) G=tutte_graph() assert_equal(G.number_of_nodes(), 46) assert_equal(G.number_of_edges(), 69) assert_equal(list(G.degree().values()), 46*[3]) # Test create_using with directed or multigraphs 
on small graphs assert_raises(networkx.exception.NetworkXError, tutte_graph, create_using=DiGraph()) MG=tutte_graph(create_using=MultiGraph()) assert_equal(MG.edges(), G.edges()) networkx-1.11/networkx/generators/tests/test_ego.py0000644000175000017500000000250312637544500022560 0ustar aricaric00000000000000#!/usr/bin/env python """ ego graph --------- """ from nose.tools import assert_true, assert_equal import networkx as nx class TestGeneratorEgo(): def test_ego(self): G=nx.star_graph(3) H=nx.ego_graph(G,0) assert_true(nx.is_isomorphic(G,H)) G.add_edge(1,11) G.add_edge(2,22) G.add_edge(3,33) H=nx.ego_graph(G,0) assert_true(nx.is_isomorphic(nx.star_graph(3),H)) G=nx.path_graph(3) H=nx.ego_graph(G,0) assert_equal(H.edges(), [(0, 1)]) H=nx.ego_graph(G,0,undirected=True) assert_equal(H.edges(), [(0, 1)]) H=nx.ego_graph(G,0,center=False) assert_equal(H.edges(), []) def test_ego_distance(self): G=nx.Graph() G.add_edge(0,1,weight=2,distance=1) G.add_edge(1,2,weight=2,distance=2) G.add_edge(2,3,weight=2,distance=1) assert_equal(sorted(nx.ego_graph(G,0,radius=3).nodes()),[0,1,2,3]) eg=nx.ego_graph(G,0,radius=3,distance='weight') assert_equal(sorted(eg.nodes()),[0,1]) eg=nx.ego_graph(G,0,radius=3,distance='weight',undirected=True) assert_equal(sorted(eg.nodes()),[0,1]) eg=nx.ego_graph(G,0,radius=3,distance='distance') assert_equal(sorted(eg.nodes()),[0,1,2]) networkx-1.11/networkx/generators/tests/test_expanders.py0000644000175000017500000000470112637544450024005 0ustar aricaric00000000000000# Copyright 2014 "cheebee7i". # Copyright 2014 "alexbrc". # Copyright 2014 Jeffrey Finkelstein . """Unit tests for the :mod:`networkx.generators.expanders` module. 
""" try: import scipy is_scipy_available = True except: is_scipy_available = False import networkx as nx from networkx import adjacency_matrix from networkx import number_of_nodes from networkx.generators.expanders import chordal_cycle_graph from networkx.generators.expanders import margulis_gabber_galil_graph from nose import SkipTest from nose.tools import assert_equal from nose.tools import assert_less from nose.tools import assert_raises from nose.tools import assert_true def test_margulis_gabber_galil_graph(): try: # Scipy is required for conversion to an adjacency matrix. # We also use scipy for computing the eigenvalues, # but this second use could be done using only numpy. import numpy as np import scipy.linalg has_scipy = True except ImportError as e: has_scipy = False for n in 2, 3, 5, 6, 10: g = margulis_gabber_galil_graph(n) assert_equal(number_of_nodes(g), n*n) for node in g: assert_equal(g.degree(node), 8) assert_equal(len(node), 2) for i in node: assert_equal(int(i), i) assert_true(0 <= i < n) if has_scipy: # Eigenvalues are already sorted using the scipy eigvalsh, # but the implementation in numpy does not guarantee order. w = sorted(scipy.linalg.eigvalsh(adjacency_matrix(g).A)) assert_less(w[-2], 5*np.sqrt(2)) def test_chordal_cycle_graph(): """Test for the :func:`networkx.chordal_cycle_graph` function.""" if not is_scipy_available: raise SkipTest('SciPy is not available') primes = [3, 5, 7, 11] for p in primes: G = chordal_cycle_graph(p) assert_equal(len(G), p) # TODO The second largest eigenvalue should be smaller than a constant, # independent of the number of nodes in the graph: # # eigs = sorted(scipy.linalg.eigvalsh(adjacency_matrix(G).A)) # assert_less(eigs[-2], ...) 
# def test_margulis_gabber_galil_graph_badinput(): assert_raises(nx.NetworkXError, margulis_gabber_galil_graph, 3, nx.DiGraph()) assert_raises(nx.NetworkXError, margulis_gabber_galil_graph, 3, nx.Graph()) networkx-1.11/networkx/generators/tests/test_random_graphs.py0000755000175000017500000001113612637544500024637 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * from networkx import * from networkx.generators.random_graphs import * class TestGeneratorsRandom(): def smoke_test_random_graph(self): seed = 42 G=gnp_random_graph(100,0.25,seed) G=binomial_graph(100,0.25,seed) G=erdos_renyi_graph(100,0.25,seed) G=fast_gnp_random_graph(100,0.25,seed) G=gnm_random_graph(100,20,seed) G=dense_gnm_random_graph(100,20,seed) G=watts_strogatz_graph(10,2,0.25,seed) assert_equal(len(G), 10) assert_equal(G.number_of_edges(), 10) G=connected_watts_strogatz_graph(10,2,0.1,seed) assert_equal(len(G), 10) assert_equal(G.number_of_edges(), 10) G=watts_strogatz_graph(10,4,0.25,seed) assert_equal(len(G), 10) assert_equal(G.number_of_edges(), 20) G=newman_watts_strogatz_graph(10,2,0.0,seed) assert_equal(len(G), 10) assert_equal(G.number_of_edges(), 10) G=newman_watts_strogatz_graph(10,4,0.25,seed) assert_equal(len(G), 10) assert_true(G.number_of_edges() >= 20) G=barabasi_albert_graph(100,1,seed) G=barabasi_albert_graph(100,3,seed) assert_equal(G.number_of_edges(),(97*3)) G=powerlaw_cluster_graph(100,1,1.0,seed) G=powerlaw_cluster_graph(100,3,0.0,seed) assert_equal(G.number_of_edges(),(97*3)) G=duplication_divergence_graph(100,1.0,seed) assert_equal(len(G), 100) assert_raises(networkx.exception.NetworkXError, duplication_divergence_graph, 100, 2) assert_raises(networkx.exception.NetworkXError, duplication_divergence_graph, 100, -1) G=random_regular_graph(10,20,seed) assert_raises(networkx.exception.NetworkXError, random_regular_graph, 3, 21) constructor=[(10,20,0.8),(20,40,0.8)] G=random_shell_graph(constructor,seed) G=nx.random_lobster(10,0.1,0.5,seed) def 
test_random_zero_regular_graph(self): """Tests that a 0-regular graph has the correct number of nodes and edges. """ G = random_regular_graph(0, 10) assert_equal(len(G), 10) assert_equal(len(G.edges()), 0) def test_gnp(self): for generator in [gnp_random_graph, binomial_graph, erdos_renyi_graph, fast_gnp_random_graph]: G = generator(10, -1.1) assert_equal(len(G), 10) assert_equal(len(G.edges()), 0) G = generator(10, 0.1) assert_equal(len(G), 10) G = generator(10, 0.1, seed=42) assert_equal(len(G), 10) G = generator(10, 1.1) assert_equal(len(G), 10) assert_equal(len(G.edges()), 45) G = generator(10, -1.1, directed=True) assert_true(G.is_directed()) assert_equal(len(G), 10) assert_equal(len(G.edges()), 0) G = generator(10, 0.1, directed=True) assert_true(G.is_directed()) assert_equal(len(G), 10) G = generator(10, 1.1, directed=True) assert_true(G.is_directed()) assert_equal(len(G), 10) assert_equal(len(G.edges()), 90) # assert that random graphs generate all edges for p close to 1 edges = 0 runs = 100 for i in range(runs): edges += len(generator(10, 0.99999, directed=True).edges()) assert_almost_equal(edges/float(runs), 90, delta=runs*2.0/100) def test_gnm(self): G=gnm_random_graph(10,3) assert_equal(len(G),10) assert_equal(len(G.edges()),3) G=gnm_random_graph(10,3,seed=42) assert_equal(len(G),10) assert_equal(len(G.edges()),3) G=gnm_random_graph(10,100) assert_equal(len(G),10) assert_equal(len(G.edges()),45) G=gnm_random_graph(10,100,directed=True) assert_equal(len(G),10) assert_equal(len(G.edges()),90) G=gnm_random_graph(10,-1.1) assert_equal(len(G),10) assert_equal(len(G.edges()),0) def test_watts_strogatz_big_k(self): assert_raises(networkx.exception.NetworkXError, watts_strogatz_graph, 10, 10, 0.25) assert_raises(networkx.exception.NetworkXError, newman_watts_strogatz_graph, 10, 10, 0.25) # could create an infinite loop, now doesn't # infinite loop used to occur when a node has degree n-1 and needs to rewire watts_strogatz_graph(10, 9, 0.25, seed=0) 
newman_watts_strogatz_graph(10, 9, 0.5, seed=0) networkx-1.11/networkx/generators/tests/test_line.py0000644000175000017500000000446512637544450022752 0ustar aricaric00000000000000import networkx as nx from nose.tools import * import networkx.generators.line as line def test_node_func(): # graph G = nx.Graph() G.add_edge(1,2) nf = line._node_func(G) assert_equal(nf(1,2), (1,2)) assert_equal(nf(2,1), (1,2)) # multigraph G = nx.MultiGraph() G.add_edge(1,2) G.add_edge(1,2) nf = line._node_func(G) assert_equal(nf(1,2,0), (1,2,0)) assert_equal(nf(2,1,0), (1,2,0)) def test_edge_func(): # graph G = nx.Graph() G.add_edge(1,2) G.add_edge(2,3) ef = line._edge_func(G) expected = [(1,2),(2,3)] result = sorted(ef()) assert_equal(expected, result) # digraph G = nx.MultiDiGraph() G.add_edge(1,2) G.add_edge(2,3) G.add_edge(2,3) ef = line._edge_func(G) expected = [(1,2,0),(2,3,0),(2,3,1)] result = sorted(ef()) assert_equal(expected, result) def test_sorted_edge(): assert_equal( (1,2), line._sorted_edge(1,2) ) assert_equal( (1,2), line._sorted_edge(2,1) ) class TestGeneratorLine(): def test_star(self): G = nx.star_graph(5) L = nx.line_graph(G) assert_true(nx.is_isomorphic(L, nx.complete_graph(5))) def test_path(self): G = nx.path_graph(5) L = nx.line_graph(G) assert_true(nx.is_isomorphic(L, nx.path_graph(4))) def test_cycle(self): G = nx.cycle_graph(5) L = nx.line_graph(G) assert_true(nx.is_isomorphic(L, G)) def test_digraph1(self): G = nx.DiGraph() G.add_edges_from([(0,1),(0,2),(0,3)]) L = nx.line_graph(G) # no edge graph, but with nodes assert_equal(L.adj, {(0,1):{}, (0,2):{}, (0,3):{}}) def test_digraph2(self): G = nx.DiGraph() G.add_edges_from([(0,1),(1,2),(2,3)]) L = nx.line_graph(G) assert_equal(sorted(L.edges()), [((0, 1), (1, 2)), ((1, 2), (2, 3))]) def test_create1(self): G = nx.DiGraph() G.add_edges_from([(0,1),(1,2),(2,3)]) L = nx.line_graph(G, create_using=nx.Graph()) assert_equal(sorted(L.edges()), [((0, 1), (1, 2)), ((1, 2), (2, 3))]) def test_create2(self): G = 
nx.Graph() G.add_edges_from([(0,1),(1,2),(2,3)]) L = nx.line_graph(G, create_using=nx.DiGraph()) assert_equal(sorted(L.edges()), [((0, 1), (1, 2)), ((1, 2), (2, 3))]) networkx-1.11/networkx/generators/tests/test_nonisomorphic_trees.py0000644000175000017500000000352112637544500026100 0ustar aricaric00000000000000#!/usr/bin/env python """ ==================== Generators - Non Isomorphic Trees ==================== Unit tests for WROM algorithm generator in generators/nonisomorphic_trees.py """ from nose.tools import * from networkx import * class TestGeneratorNonIsomorphicTrees(): def test_tree_structure(self): # test for tree structure for nx.nonisomorphic_trees() f = lambda x: list(nx.nonisomorphic_trees(x)) for i in f(6): assert_true(nx.is_tree(i)) for i in f(8): assert_true(nx.is_tree(i)) def test_nonisomorphism(self): # test for nonisomorphism of trees for nx.nonisomorphic_trees() f = lambda x: list(nx.nonisomorphic_trees(x)) trees = f(6) for i in range(len(trees)): for j in range(i + 1, len(trees)): assert_false(nx.is_isomorphic(trees[i], trees[j])) trees = f(8) for i in range(len(trees)): for j in range(i + 1, len(trees)): assert_false(nx.is_isomorphic(trees[i], trees[j])) def test_number_of_nonisomorphic_trees(self): # http://oeis.org/A000055 assert_equal(nx.number_of_nonisomorphic_trees(2), 1) assert_equal(nx.number_of_nonisomorphic_trees(3), 1) assert_equal(nx.number_of_nonisomorphic_trees(4), 2) assert_equal(nx.number_of_nonisomorphic_trees(5), 3) assert_equal(nx.number_of_nonisomorphic_trees(6), 6) assert_equal(nx.number_of_nonisomorphic_trees(7), 11) assert_equal(nx.number_of_nonisomorphic_trees(8), 23) def test_nonisomorphic_trees(self): f = lambda x: list(nx.nonisomorphic_trees(x)) assert_equal(sorted(f(3)[0].edges()), [(0, 1), (0, 2)]) assert_equal(sorted(f(4)[0].edges()), [(0, 1), (0, 3), (1, 2)]) assert_equal(sorted(f(4)[1].edges()), [(0, 1), (0, 2), (0, 3)]) 
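The OEIS A000055 assertions in test_nonisomorphic_trees.py above can be cross-checked without NetworkX: enumerate every labeled tree on n nodes through its Prüfer sequence, then deduplicate with an AHU canonical form rooted at the tree's center(s). A brute-force sketch under those ideas (all function names here are illustrative, not NetworkX APIs):

```python
from collections import defaultdict
from itertools import product

def prufer_to_edges(seq):
    """Decode a Prüfer sequence into the edge list of a labeled tree
    on n = len(seq) + 2 nodes (labeled 0..n-1)."""
    n = len(seq) + 2
    degree = [1] * n
    for x in seq:
        degree[x] += 1
    edges = []
    for x in seq:
        # Attach the smallest remaining leaf to x, then retire it.
        leaf = min(i for i in range(n) if degree[i] == 1)
        edges.append((leaf, x))
        degree[leaf] -= 1
        degree[x] -= 1
    u, v = (i for i in range(n) if degree[i] == 1)
    return edges + [(u, v)]

def ahu_encoding(root, parent, adj):
    """Canonical string for a rooted tree (AHU): sort child encodings."""
    children = sorted(ahu_encoding(c, root, adj)
                      for c in adj[root] if c != parent)
    return '(' + ''.join(children) + ')'

def tree_certificate(n, edges):
    """Isomorphism certificate for an unrooted tree: root it at its
    one or two centers and take the smaller AHU encoding."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    # Find the center(s) by repeatedly stripping all current leaves.
    nodes = set(range(n))
    deg = {u: len(adj[u]) for u in nodes}
    while len(nodes) > 2:
        leaves = {u for u in nodes if deg[u] == 1}
        for u in leaves:
            for v in adj[u]:
                if v in nodes and v not in leaves:
                    deg[v] -= 1
        nodes -= leaves
    return min(ahu_encoding(r, None, adj) for r in nodes)

def count_nonisomorphic_trees(n):
    """Count unlabeled trees on n >= 2 nodes by brute force over all
    n^(n-2) Prüfer sequences, deduplicating by certificate."""
    certs = {tree_certificate(n, prufer_to_edges(list(seq)))
             for seq in product(range(n), repeat=n - 2)}
    return len(certs)

# Matches the test assertions (OEIS A000055):
print([count_nonisomorphic_trees(n) for n in range(2, 8)])  # → [1, 1, 2, 3, 6, 11]
```

This is far slower than the WROM generator the tests exercise, but it makes a useful independent oracle for small n.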
networkx-1.11/networkx/generators/tests/test_atlas.py0000644000175000017500000000360312637544450023120 0ustar aricaric00000000000000from nose.tools import * from nose import SkipTest class TestAtlas(object): @classmethod def setupClass(cls): global atlas import platform if platform.python_implementation()=='Jython': raise SkipTest('graph atlas not available under Jython.') import networkx.generators.atlas as atlas def setUp(self): self.GAG=atlas.graph_atlas_g() def test_sizes(self): G=self.GAG[0] assert_equal(G.number_of_nodes(),0) assert_equal(G.number_of_edges(),0) G=self.GAG[7] assert_equal(G.number_of_nodes(),3) assert_equal(G.number_of_edges(),3) def test_names(self): i=0 for g in self.GAG: name=g.name assert_equal(int(name[1:]),i) i+=1 def test_monotone_nodes(self): # check for monotone increasing number of nodes previous=self.GAG[0] for g in self.GAG: assert_false(len(g)-len(previous) > 1) previous=g.copy() def test_monotone_edges(self): # check for monotone increasing number of edges # (for fixed number of nodes) previous=self.GAG[0] for g in self.GAG: if len(g)==len(previous): assert_false(g.size()-previous.size() > 1) previous=g.copy() def test_monotone_degree_sequence(self): # check for monotone increasing degree sequence # (for fixed number of nodes and edges) # note that 111223 < 112222 previous=self.GAG[0] for g in self.GAG: if len(g)==0: continue if len(g)==len(previous) and g.size()==previous.size(): deg_seq=sorted(g.degree().values()) previous_deg_seq=sorted(previous.degree().values()) assert_true(previous_deg_seq < deg_seq) previous=g.copy() networkx-1.11/networkx/generators/tests/test_threshold.py0000644000175000017500000001477112637544500024004 0ustar aricaric00000000000000#!/usr/bin/env python """Threshold Graphs ================ """ from nose.tools import * from nose import SkipTest from nose.plugins.attrib import attr import networkx as nx import networkx.generators.threshold as nxt from networkx.algorithms.isomorphism.isomorph import 
graph_could_be_isomorphic cnlti = nx.convert_node_labels_to_integers class TestGeneratorThreshold(): def test_threshold_sequence_graph_test(self): G=nx.star_graph(10) assert_true(nxt.is_threshold_graph(G)) assert_true(nxt.is_threshold_sequence(list(G.degree().values()))) G=nx.complete_graph(10) assert_true(nxt.is_threshold_graph(G)) assert_true(nxt.is_threshold_sequence(list(G.degree().values()))) deg=[3,2,2,1,1,1] assert_false(nxt.is_threshold_sequence(deg)) deg=[3,2,2,1] assert_true(nxt.is_threshold_sequence(deg)) G=nx.generators.havel_hakimi_graph(deg) assert_true(nxt.is_threshold_graph(G)) def test_creation_sequences(self): deg=[3,2,2,1] G=nx.generators.havel_hakimi_graph(deg) cs0=nxt.creation_sequence(deg) H0=nxt.threshold_graph(cs0) assert_equal(''.join(cs0), 'ddid') cs1=nxt.creation_sequence(deg, with_labels=True) H1=nxt.threshold_graph(cs1) assert_equal(cs1, [(1, 'd'), (2, 'd'), (3, 'i'), (0, 'd')]) cs2=nxt.creation_sequence(deg, compact=True) H2=nxt.threshold_graph(cs2) assert_equal(cs2, [2, 1, 1]) assert_equal(''.join(nxt.uncompact(cs2)), 'ddid') assert_true(graph_could_be_isomorphic(H0,G)) assert_true(graph_could_be_isomorphic(H0,H1)) assert_true(graph_could_be_isomorphic(H0,H2)) def test_shortest_path(self): deg=[3,2,2,1] G=nx.generators.havel_hakimi_graph(deg) cs1=nxt.creation_sequence(deg, with_labels=True) for n, m in [(3, 0), (0, 3), (0, 2), (0, 1), (1, 3), (3, 1), (1, 2), (2, 3)]: assert_equal(nxt.shortest_path(cs1,n,m), nx.shortest_path(G, n, m)) spl=nxt.shortest_path_length(cs1,3) spl2=nxt.shortest_path_length([ t for v,t in cs1],2) assert_equal(spl, spl2) spld={} for j,pl in enumerate(spl): n=cs1[j][0] spld[n]=pl assert_equal(spld, nx.single_source_shortest_path_length(G, 3)) def test_weights_thresholds(self): wseq=[3,4,3,3,5,6,5,4,5,6] cs=nxt.weights_to_creation_sequence(wseq,threshold=10) wseq=nxt.creation_sequence_to_weights(cs) cs2=nxt.weights_to_creation_sequence(wseq) assert_equal(cs, cs2) 
wseq=nxt.creation_sequence_to_weights(nxt.uncompact([3,1,2,3,3,2,3])) assert_equal(wseq, [s*0.125 for s in [4,4,4,3,5,5,2,2,2,6,6,6,1,1,7,7,7]]) wseq=nxt.creation_sequence_to_weights([3,1,2,3,3,2,3]) assert_equal(wseq, [s*0.125 for s in [4,4,4,3,5,5,2,2,2,6,6,6,1,1,7,7,7]]) wseq=nxt.creation_sequence_to_weights(list(enumerate('ddidiiidididi'))) assert_equal(wseq, [s*0.1 for s in [5,5,4,6,3,3,3,7,2,8,1,9,0]]) wseq=nxt.creation_sequence_to_weights('ddidiiidididi') assert_equal(wseq, [s*0.1 for s in [5,5,4,6,3,3,3,7,2,8,1,9,0]]) wseq=nxt.creation_sequence_to_weights('ddidiiidididid') ws=[s/float(12) for s in [6,6,5,7,4,4,4,8,3,9,2,10,1,11]] assert_true(sum([abs(c-d) for c,d in zip(wseq,ws)]) < 1e-14) def test_finding_routines(self): G=nx.Graph({1:[2],2:[3],3:[4],4:[5],5:[6]}) G.add_edge(2,4) G.add_edge(2,5) G.add_edge(2,7) G.add_edge(3,6) G.add_edge(4,6) # Alternating 4 cycle assert_equal(nxt.find_alternating_4_cycle(G), [1, 2, 3, 6]) # Threshold graph TG=nxt.find_threshold_graph(G) assert_true(nxt.is_threshold_graph(TG)) assert_equal(sorted(TG.nodes()), [1, 2, 3, 4, 5, 7]) cs=nxt.creation_sequence(TG.degree(),with_labels=True) assert_equal(nxt.find_creation_sequence(G), cs) def test_fast_versions_properties_threshold_graphs(self): cs='ddiiddid' G=nxt.threshold_graph(cs) assert_equal(nxt.density('ddiiddid'), nx.density(G)) assert_equal(sorted(nxt.degree_sequence(cs)), sorted(G.degree().values())) ts=nxt.triangle_sequence(cs) assert_equal(ts, list(nx.triangles(G).values())) assert_equal(sum(ts) // 3, nxt.triangles(cs)) c1=nxt.cluster_sequence(cs) c2=list(nx.clustering(G).values()) assert_almost_equal(sum([abs(c-d) for c,d in zip(c1,c2)]), 0) b1=nx.betweenness_centrality(G).values() b2=nxt.betweenness_sequence(cs) assert_true(sum([abs(c-d) for c,d in zip(b1,b2)]) < 1e-14) assert_equal(nxt.eigenvalues(cs), [0, 1, 3, 3, 5, 7, 7, 8]) # Degree Correlation assert_true(abs(nxt.degree_correlation(cs)+0.593038821954) < 1e-12) assert_equal(nxt.degree_correlation('diiiddi'), 
-0.8) assert_equal(nxt.degree_correlation('did'), -1.0) assert_equal(nxt.degree_correlation('ddd'), 1.0) assert_equal(nxt.eigenvalues('dddiii'), [0, 0, 0, 0, 3, 3]) assert_equal(nxt.eigenvalues('dddiiid'), [0, 1, 1, 1, 4, 4, 7]) def test_tg_creation_routines(self): s=nxt.left_d_threshold_sequence(5,7) s=nxt.right_d_threshold_sequence(5,7) s1=nxt.swap_d(s,1.0,1.0) @attr('numpy') def test_eigenvectors(self): try: import numpy as N eigenval=N.linalg.eigvals import scipy except ImportError: raise SkipTest('SciPy not available.') cs='ddiiddid' G=nxt.threshold_graph(cs) (tgeval,tgevec)=nxt.eigenvectors(cs) dot=N.dot assert_equal([ abs(dot(lv,lv)-1.0)<1e-9 for lv in tgevec ], [True]*8) lapl=nx.laplacian_matrix(G) # tgev=[ dot(lv,dot(lapl,lv)) for lv in tgevec ] # assert_true(sum([abs(c-d) for c,d in zip(tgev,tgeval)]) < 1e-9) # tgev.sort() # lev=list(eigenval(lapl)) # lev.sort() # assert_true(sum([abs(c-d) for c,d in zip(tgev,lev)]) < 1e-9) def test_create_using(self): cs='ddiiddid' G=nxt.threshold_graph(cs) assert_raises(nx.exception.NetworkXError, nxt.threshold_graph, cs, create_using=nx.DiGraph()) MG=nxt.threshold_graph(cs,create_using=nx.MultiGraph()) assert_equal(MG.edges(), G.edges()) networkx-1.11/networkx/generators/tests/test_intersection.py0000644000175000017500000000120712637544450024520 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestIntersectionGraph(): def test_random_intersection_graph(self): G=nx.uniform_random_intersection_graph(10,5,0.5) assert_equal(len(G),10) def test_k_random_intersection_graph(self): G=nx.k_random_intersection_graph(10,5,2) assert_equal(len(G),10) def test_general_random_intersection_graph(self): G=nx.general_random_intersection_graph(10,5,[0.1,0.2,0.2,0.1,0.1]) assert_equal(len(G),10) assert_raises(ValueError, nx.general_random_intersection_graph,10,5, [0.1,0.2,0.2,0.1]) 
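test_intersection.py above exercises three intersection-graph models. The uniform model is simple enough to sketch standalone: each of n vertices keeps each of m attributes independently with probability p, and two vertices are adjacent exactly when their attribute sets intersect. The function below mirrors what `nx.uniform_random_intersection_graph(n, m, p)` constructs, but it is an illustrative sketch returning a plain (nodes, edges) pair, not the package implementation.

```python
import random

def uniform_random_intersection_sketch(n, m, p, seed=None):
    """Uniform random intersection graph G(n, m, p) as (nodes, edges)."""
    rng = random.Random(seed)
    # Each vertex independently keeps each attribute with probability p.
    attrs = [{a for a in range(m) if rng.random() < p} for _ in range(n)]
    # Two vertices are adjacent iff their attribute sets intersect.
    edges = {(u, v) for u in range(n) for v in range(u + 1, n)
             if attrs[u] & attrs[v]}
    return set(range(n)), edges
```

With p=1 every pair shares all m attributes and the graph is complete; with p=0 it has no edges — the two deterministic boundary cases a smoke test can pin down without fixing a seed.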
networkx-1.11/networkx/generators/tests/test_degree_seq.py0000644000175000017500000001314612637544500024116 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx from networkx import * from networkx.generators.degree_seq import * from networkx.utils import uniform_sequence,powerlaw_sequence def test_configuration_model_empty(): # empty graph has empty degree sequence deg_seq=[] G=configuration_model(deg_seq) assert_equal(G.degree(), {}) def test_configuration_model(): deg_seq=[5,3,3,3,3,2,2,2,1,1,1] G=configuration_model(deg_seq,seed=12345678) assert_equal(sorted(G.degree().values(),reverse=True), [5, 3, 3, 3, 3, 2, 2, 2, 1, 1, 1]) assert_equal(sorted(G.degree(range(len(deg_seq))).values(), reverse=True), [5, 3, 3, 3, 3, 2, 2, 2, 1, 1, 1]) # test that fixed seed delivers the same graph deg_seq=[3,3,3,3,3,3,3,3,3,3,3,3] G1=configuration_model(deg_seq,seed=1000) G2=configuration_model(deg_seq,seed=1000) assert_true(is_isomorphic(G1,G2)) G1=configuration_model(deg_seq,seed=10) G2=configuration_model(deg_seq,seed=10) assert_true(is_isomorphic(G1,G2)) @raises(NetworkXError) def test_configuration_raise(): z=[5,3,3,3,3,2,2,2,1,1,1] G = configuration_model(z, create_using=DiGraph()) @raises(NetworkXError) def test_configuration_raise_odd(): z=[5,3,3,3,3,2,2,2,1,1] G = configuration_model(z, create_using=DiGraph()) @raises(NetworkXError) def test_directed_configuration_raise_unequal(): zin = [5,3,3,3,3,2,2,2,1,1] zout = [5,3,3,3,3,2,2,2,1,2] G = directed_configuration_model(zin, zout) def test_directed_configuration_model(): G = directed_configuration_model([],[],seed=0) assert_equal(len(G),0) def test_expected_degree_graph_empty(): # empty graph has empty degree sequence deg_seq=[] G=expected_degree_graph(deg_seq) assert_equal(G.degree(), {}) def test_expected_degree_graph(): # test that fixed seed delivers the same graph deg_seq=[3,3,3,3,3,3,3,3,3,3,3,3] G1=expected_degree_graph(deg_seq,seed=1000) 
G2=expected_degree_graph(deg_seq,seed=1000) assert_true(is_isomorphic(G1,G2)) G1=expected_degree_graph(deg_seq,seed=10) G2=expected_degree_graph(deg_seq,seed=10) assert_true(is_isomorphic(G1,G2)) def test_expected_degree_graph_selfloops(): deg_seq=[3,3,3,3,3,3,3,3,3,3,3,3] G1=expected_degree_graph(deg_seq,seed=1000, selfloops=False) G2=expected_degree_graph(deg_seq,seed=1000, selfloops=False) assert_true(is_isomorphic(G1,G2)) def test_expected_degree_graph_skew(): deg_seq=[10,2,2,2,2] G1=expected_degree_graph(deg_seq,seed=1000) G2=expected_degree_graph(deg_seq,seed=1000) assert_true(is_isomorphic(G1,G2)) def test_havel_hakimi_construction(): G = havel_hakimi_graph([]) assert_equal(len(G),0) z=[1000,3,3,3,3,2,2,2,1,1,1] assert_raises(networkx.exception.NetworkXError, havel_hakimi_graph, z) z=["A",3,3,3,3,2,2,2,1,1,1] assert_raises(networkx.exception.NetworkXError, havel_hakimi_graph, z) z=[5,4,3,3,3,2,2,2] G=havel_hakimi_graph(z) G=configuration_model(z) z=[6,5,4,4,2,1,1,1] assert_raises(networkx.exception.NetworkXError, havel_hakimi_graph, z) z=[10,3,3,3,3,2,2,2,2,2,2] G=havel_hakimi_graph(z) assert_raises(networkx.exception.NetworkXError, havel_hakimi_graph, z, create_using=DiGraph()) def test_directed_havel_hakimi(): # Test range of valid directed degree sequences n, r = 100, 10 p = 1.0 / r for i in range(r): G1 = nx.erdos_renyi_graph(n,p*(i+1),None,True) din = list(G1.in_degree().values()) dout = list(G1.out_degree().values()) G2 = nx.directed_havel_hakimi_graph(din, dout) assert_true(din == list(G2.in_degree().values())) assert_true(dout == list(G2.out_degree().values())) # Test non-graphical sequence dout = [1000,3,3,3,3,2,2,2,1,1,1] din=[103,102,102,102,102,102,102,102,102,102] assert_raises(nx.exception.NetworkXError, nx.directed_havel_hakimi_graph, din, dout) # Test valid sequences dout=[1, 1, 1, 1, 1, 2, 2, 2, 3, 4] din=[2, 2, 2, 2, 2, 2, 2, 2, 0, 2] G2 = nx.directed_havel_hakimi_graph(din, dout) assert_true(din == list(G2.in_degree().values())) 
assert_true(dout == list(G2.out_degree().values())) # Test unequal sums din=[2, 2, 2, 2, 2, 2, 2, 2, 2, 2] assert_raises(nx.exception.NetworkXError, nx.directed_havel_hakimi_graph, din, dout) # Test for negative values din=[2, 2, 2, 2, 2, 2, 2, 2, 2, 2, -2] assert_raises(nx.exception.NetworkXError, nx.directed_havel_hakimi_graph, din, dout) def test_degree_sequence_tree(): z=[1, 1, 1, 1, 1, 2, 2, 2, 3, 4] G=degree_sequence_tree(z) assert_true(len(G.nodes())==len(z)) assert_true(len(G.edges())==sum(z)/2) assert_raises(networkx.exception.NetworkXError, degree_sequence_tree, z, create_using=DiGraph()) z=[1, 1, 1, 1, 1, 1, 2, 2, 2, 3, 4] assert_raises(networkx.exception.NetworkXError, degree_sequence_tree, z) def test_random_degree_sequence_graph(): d=[1,2,2,3] G = nx.random_degree_sequence_graph(d) assert_equal(d, list(G.degree().values())) def test_random_degree_sequence_graph_raise(): z=[1, 1, 1, 1, 1, 1, 2, 2, 2, 3, 4] assert_raises(networkx.exception.NetworkXUnfeasible, random_degree_sequence_graph, z) def test_random_degree_sequence_large(): G = nx.fast_gnp_random_graph(100,0.1) d = G.degree().values() G = nx.random_degree_sequence_graph(d, seed=0) assert_equal(sorted(d), sorted(list(G.degree().values()))) networkx-1.11/networkx/generators/tests/test_geometric.py0000644000175000017500000000201412637544450023765 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx as nx class TestGeneratorsGeometric(): def test_random_geometric_graph(self): G=nx.random_geometric_graph(50,0.25) assert_equal(len(G),50) def test_geographical_threshold_graph(self): G=nx.geographical_threshold_graph(50,100) assert_equal(len(G),50) def test_waxman_graph(self): G=nx.waxman_graph(50,0.5,0.1) assert_equal(len(G),50) G=nx.waxman_graph(50,0.5,0.1,L=1) assert_equal(len(G),50) def test_navigable_small_world(self): G = nx.navigable_small_world_graph(5,p=1,q=0) gg = nx.grid_2d_graph(5,5).to_directed() assert_true(nx.is_isomorphic(G,gg)) G = 
nx.navigable_small_world_graph(5,p=1,q=0,dim=3) gg = nx.grid_graph([5,5,5]).to_directed() assert_true(nx.is_isomorphic(G,gg)) G = nx.navigable_small_world_graph(5,p=1,q=0,dim=1) gg = nx.grid_graph([5]).to_directed() assert_true(nx.is_isomorphic(G,gg)) networkx-1.11/networkx/generators/tests/test_stochastic.py0000644000175000017500000000201312637544500024146 0ustar aricaric00000000000000from nose.tools import assert_true, assert_equal, raises import networkx as nx def test_stochastic(): G=nx.DiGraph() G.add_edge(0,1) G.add_edge(0,2) S=nx.stochastic_graph(G) assert_true(nx.is_isomorphic(G,S)) assert_equal(sorted(S.edges(data=True)), [(0, 1, {'weight': 0.5}), (0, 2, {'weight': 0.5})]) S=nx.stochastic_graph(G,copy=True) assert_equal(sorted(S.edges(data=True)), [(0, 1, {'weight': 0.5}), (0, 2, {'weight': 0.5})]) def test_stochastic_ints(): G=nx.DiGraph() G.add_edge(0,1,weight=1) G.add_edge(0,2,weight=1) S=nx.stochastic_graph(G) assert_equal(sorted(S.edges(data=True)), [(0, 1, {'weight': 0.5}), (0, 2, {'weight': 0.5})]) @raises(nx.NetworkXNotImplemented) def test_stochastic_graph_input(): S = nx.stochastic_graph(nx.Graph()) @raises(nx.NetworkXNotImplemented) def test_stochastic_multigraph_input(): S = nx.stochastic_graph(nx.MultiGraph()) networkx-1.11/networkx/generators/tests/test_directed.py0000644000175000017500000000244112637544500023572 0ustar aricaric00000000000000#!/usr/bin/env python """Generators - Directed Graphs ---------------------------- """ from nose.tools import * from networkx import * from networkx.generators.directed import * class TestGeneratorsDirected(): def test_smoke_test_random_graphs(self): G=gn_graph(100) G=gnr_graph(100,0.5) G=gnc_graph(100) G=scale_free_graph(100) def test_create_using_keyword_arguments(self): assert_raises(networkx.exception.NetworkXError, gn_graph, 100, create_using=Graph()) assert_raises(networkx.exception.NetworkXError, gnr_graph, 100, 0.5, create_using=Graph()) assert_raises(networkx.exception.NetworkXError, gnc_graph, 
100, create_using=Graph()) assert_raises(networkx.exception.NetworkXError, scale_free_graph, 100, create_using=Graph()) G=gn_graph(100,seed=1) MG=gn_graph(100,create_using=MultiDiGraph(),seed=1) assert_equal(G.edges(), MG.edges()) G=gnr_graph(100,0.5,seed=1) MG=gnr_graph(100,0.5,create_using=MultiDiGraph(),seed=1) assert_equal(G.edges(), MG.edges()) G=gnc_graph(100,seed=1) MG=gnc_graph(100,create_using=MultiDiGraph(),seed=1) assert_equal(G.edges(), MG.edges()) networkx-1.11/networkx/generators/tests/test_random_clustered.py0000644000175000017500000000166212637544450025351 0ustar aricaric00000000000000#!/usr/bin/env python from nose.tools import * import networkx class TestRandomClusteredGraph: def test_valid(self): node=[1,1,1,2,1,2,0,0] tri=[0,0,0,0,0,1,1,1] joint_degree_sequence=zip(node,tri) G = networkx.random_clustered_graph(joint_degree_sequence) assert_equal(G.number_of_nodes(),8) assert_equal(G.number_of_edges(),7) def test_valid2(self): G = networkx.random_clustered_graph(\ [(1,2),(2,1),(1,1),(1,1),(1,1),(2,0)]) assert_equal(G.number_of_nodes(),6) assert_equal(G.number_of_edges(),10) def test_invalid1(self): assert_raises((TypeError,networkx.NetworkXError), networkx.random_clustered_graph,[[1,1],[2,1],[0,1]]) def test_invalid2(self): assert_raises((TypeError,networkx.NetworkXError), networkx.random_clustered_graph,[[1,1],[1,2],[0,1]]) networkx-1.11/networkx/generators/tests/test_community.py0000644000175000017500000000765612637544500024050 0ustar aricaric00000000000000import networkx as nx from nose.tools import * def test_random_partition_graph(): G = nx.random_partition_graph([3,3,3],1,0) C = G.graph['partition'] assert_equal(C,[set([0,1,2]), set([3,4,5]), set([6,7,8])]) assert_equal(len(G),9) assert_equal(len(G.edges()),9) G = nx.random_partition_graph([3,3,3],0,1) C = G.graph['partition'] assert_equal(C,[set([0,1,2]), set([3,4,5]), set([6,7,8])]) assert_equal(len(G),9) assert_equal(len(G.edges()),27) G = 
nx.random_partition_graph([3,3,3],1,0,directed=True) C = G.graph['partition'] assert_equal(C,[set([0,1,2]), set([3,4,5]), set([6,7,8])]) assert_equal(len(G),9) assert_equal(len(G.edges()),18) G = nx.random_partition_graph([3,3,3],0,1,directed=True) C = G.graph['partition'] assert_equal(C,[set([0,1,2]), set([3,4,5]), set([6,7,8])]) assert_equal(len(G),9) assert_equal(len(G.edges()),54) G = nx.random_partition_graph([1,2,3,4,5], 0.5, 0.1) C = G.graph['partition'] assert_equal(C,[set([0]), set([1,2]), set([3,4,5]), set([6,7,8,9]), set([10,11,12,13,14])]) assert_equal(len(G),15) assert_raises(nx.NetworkXError, nx.random_partition_graph,[1,2,3],1.1,0.1) assert_raises(nx.NetworkXError, nx.random_partition_graph,[1,2,3],-0.1,0.1) assert_raises(nx.NetworkXError, nx.random_partition_graph,[1,2,3],0.1,1.1) assert_raises(nx.NetworkXError, nx.random_partition_graph,[1,2,3],0.1,-0.1) def test_planted_partition_graph(): G = nx.planted_partition_graph(4,3,1,0) C = G.graph['partition'] assert_equal(len(C),4) assert_equal(len(G),12) assert_equal(len(G.edges()),12) G = nx.planted_partition_graph(4,3,0,1) C = G.graph['partition'] assert_equal(len(C),4) assert_equal(len(G),12) assert_equal(len(G.edges()),54) G = nx.planted_partition_graph(10,4,.5,.1,seed=42) C = G.graph['partition'] assert_equal(len(C),10) assert_equal(len(G),40) assert_equal(len(G.edges()),108) G = nx.planted_partition_graph(4,3,1,0,directed=True) C = G.graph['partition'] assert_equal(len(C),4) assert_equal(len(G),12) assert_equal(len(G.edges()),24) G = nx.planted_partition_graph(4,3,0,1,directed=True) C = G.graph['partition'] assert_equal(len(C),4) assert_equal(len(G),12) assert_equal(len(G.edges()),108) G = nx.planted_partition_graph(10,4,.5,.1,seed=42,directed=True) C = G.graph['partition'] assert_equal(len(C),10) assert_equal(len(G),40) assert_equal(len(G.edges()),218) assert_raises(nx.NetworkXError, nx.planted_partition_graph, 3, 3, 1.1, 0.1) assert_raises(nx.NetworkXError, nx.planted_partition_graph, 3, 3,-0.1, 
0.1) assert_raises(nx.NetworkXError, nx.planted_partition_graph, 3, 3, 0.1, 1.1) assert_raises(nx.NetworkXError, nx.planted_partition_graph, 3, 3, 0.1,-0.1) def test_relaxed_caveman_graph(): G = nx.relaxed_caveman_graph(4,3,0) assert_equal(len(G),12) assert_equal(len(G.nodes()),12) G = nx.relaxed_caveman_graph(4,3,1) assert_equal(len(G),12) assert_equal(len(G.nodes()),12) G = nx.relaxed_caveman_graph(4,3,0.5) assert_equal(len(G),12) assert_equal(len(G.edges()),12) def test_connected_caveman_graph(): G = nx.connected_caveman_graph(4,3) assert_equal(len(G),12) assert_equal(len(G.nodes()),12) G = nx.connected_caveman_graph(1,5) K5 = nx.complete_graph(5) K5.remove_edge(3,4) assert_true(nx.is_isomorphic(G,K5)) def test_caveman_graph(): G = nx.caveman_graph(4,3) assert_equal(len(G),12) assert_equal(len(G.nodes()),12) G = nx.caveman_graph(1,5) K5 = nx.complete_graph(5) assert_true(nx.is_isomorphic(G,K5)) def test_gaussian_random_partition_graph(): G = nx.gaussian_random_partition_graph(100, 10, 10, 0.3, 0.01) assert_equal(len(G),100) assert_raises(nx.NetworkXError, nx.gaussian_random_partition_graph, 100, 101, 10, 1, 0) networkx-1.11/networkx/generators/tests/test_classic.py0000644000175000017500000004006012637544500023427 0ustar aricaric00000000000000#!/usr/bin/env python """ ==================== Generators - Classic ==================== Unit tests for various classic graph generators in generators/classic.py """ import itertools from nose.tools import * import networkx as nx from networkx import * from networkx.algorithms.isomorphism.isomorph import graph_could_be_isomorphic from networkx.testing import assert_edges_equal from networkx.testing import assert_nodes_equal is_isomorphic=graph_could_be_isomorphic class TestGeneratorClassic(): def test_balanced_tree(self): # balanced_tree(r,h) is a tree with (r**(h+1)-1)/(r-1) edges for r,h in [(2,2),(3,3),(6,2)]: t=balanced_tree(r,h) order=t.order() assert_true(order==(r**(h+1)-1)/(r-1)) assert_true(is_connected(t)) 
assert_true(t.size()==order-1) dh = degree_histogram(t) assert_equal(dh[0],0) # no nodes of degree 0 assert_equal(dh[1],r**h) # nodes of degree 1 are leaves assert_equal(dh[r],1) # root is degree r assert_equal(dh[r+1],order-r**h-1)# everyone else is degree r+1 assert_equal(len(dh),r+2) def test_balanced_tree_star(self): # balanced_tree(r,1) is the r-star t=balanced_tree(r=2,h=1) assert_true(is_isomorphic(t,star_graph(2))) t=balanced_tree(r=5,h=1) assert_true(is_isomorphic(t,star_graph(5))) t=balanced_tree(r=10,h=1) assert_true(is_isomorphic(t,star_graph(10))) def test_full_rary_tree(self): r=2 n=9 t=full_rary_tree(r,n) assert_equal(t.order(),n) assert_true(is_connected(t)) dh = degree_histogram(t) assert_equal(dh[0],0) # no nodes of degree 0 assert_equal(dh[1],5) # nodes of degree 1 are leaves assert_equal(dh[r],1) # root is degree r assert_equal(dh[r+1],9-5-1) # everyone else is degree r+1 assert_equal(len(dh),r+2) def test_full_rary_tree_balanced(self): t=full_rary_tree(2,15) th=balanced_tree(2,3) assert_true(is_isomorphic(t,th)) def test_full_rary_tree_path(self): t=full_rary_tree(1,10) assert_true(is_isomorphic(t,path_graph(10))) def test_full_rary_tree_empty(self): t=full_rary_tree(0,10) assert_true(is_isomorphic(t,empty_graph(10))) t=full_rary_tree(3,0) assert_true(is_isomorphic(t,empty_graph(0))) def test_full_rary_tree_3_20(self): t=full_rary_tree(3,20) assert_equal(t.order(),20) def test_barbell_graph(self): # number of nodes = 2*m1 + m2 (2 m1-complete graphs + m2-path + 2 edges) # number of edges = 2*number_of_edges(m1-complete graph) + m2 + 1 m1=3; m2=5 b=barbell_graph(m1,m2) assert_true(number_of_nodes(b)==2*m1+m2) assert_true(number_of_edges(b)==m1*(m1-1) + m2 + 1) assert_equal(b.name, 'barbell_graph(3,5)') m1=4; m2=10 b=barbell_graph(m1,m2) assert_true(number_of_nodes(b)==2*m1+m2) assert_true(number_of_edges(b)==m1*(m1-1) + m2 + 1) assert_equal(b.name, 'barbell_graph(4,10)') m1=3; m2=20 b=barbell_graph(m1,m2) assert_true(number_of_nodes(b)==2*m1+m2) 
assert_true(number_of_edges(b)==m1*(m1-1) + m2 + 1) assert_equal(b.name, 'barbell_graph(3,20)') # Raise NetworkXError if m1<2 m1=1; m2=20 assert_raises(networkx.exception.NetworkXError, barbell_graph, m1, m2) # Raise NetworkXError if m2<0 m1=5; m2=-2 assert_raises(networkx.exception.NetworkXError, barbell_graph, m1, m2) # barbell_graph(2,m) = path_graph(m+4) m1=2; m2=5 b=barbell_graph(m1,m2) assert_true(is_isomorphic(b, path_graph(m2+4))) m1=2; m2=10 b=barbell_graph(m1,m2) assert_true(is_isomorphic(b, path_graph(m2+4))) m1=2; m2=20 b=barbell_graph(m1,m2) assert_true(is_isomorphic(b, path_graph(m2+4))) assert_raises(networkx.exception.NetworkXError, barbell_graph, m1, m2, create_using=DiGraph()) mb=barbell_graph(m1, m2, create_using=MultiGraph()) assert_true(mb.edges()==b.edges()) def test_complete_graph(self): # complete_graph(m) is a connected graph with # m nodes and m*(m-1)/2 edges for m in [0, 1, 3, 5]: g = complete_graph(m) assert_true(number_of_nodes(g) == m) assert_true(number_of_edges(g) == m * (m - 1) // 2) mg=complete_graph(m, create_using=MultiGraph()) assert_true(mg.edges()==g.edges()) def test_complete_digraph(self): # complete_graph(m, create_using=DiGraph()) is a complete digraph with # m nodes and m*(m-1) edges for m in [0, 1, 3, 5]: g = complete_graph(m,create_using=nx.DiGraph()) assert_true(number_of_nodes(g) == m) assert_true(number_of_edges(g) == m * (m - 1)) def test_circular_ladder_graph(self): G=circular_ladder_graph(5) assert_raises(networkx.exception.NetworkXError, circular_ladder_graph, 5, create_using=DiGraph()) mG=circular_ladder_graph(5, create_using=MultiGraph()) assert_equal(mG.edges(), G.edges()) def test_circulant_graph(self): # Ci_n(1) is the cycle graph for all n Ci6_1 = circulant_graph(6, [1]) C6 = cycle_graph(6) assert_equal(Ci6_1.edges(), C6.edges()) # Ci_n(1, 2, ..., n div 2) is the complete graph for all n Ci7 = circulant_graph(7, [1, 2, 3]) K7 = complete_graph(7) assert_equal(Ci7.edges(), K7.edges()) # Ci_6(1, 3) is K_3,3 i.e. 
the utility graph Ci6_1_3 = circulant_graph(6, [1, 3]) K3_3 = complete_bipartite_graph(3, 3) assert_true(is_isomorphic(Ci6_1_3, K3_3)) def test_cycle_graph(self): G=cycle_graph(4) assert_equal(sorted(G.edges()), [(0, 1), (0, 3), (1, 2), (2, 3)]) mG=cycle_graph(4, create_using=MultiGraph()) assert_equal(sorted(mG.edges()), [(0, 1), (0, 3), (1, 2), (2, 3)]) G=cycle_graph(4, create_using=DiGraph()) assert_false(G.has_edge(2,1)) assert_true(G.has_edge(1,2)) def test_dorogovtsev_goltsev_mendes_graph(self): G=dorogovtsev_goltsev_mendes_graph(0) assert_equal(G.edges(), [(0, 1)]) assert_equal(G.nodes(), [0, 1]) G=dorogovtsev_goltsev_mendes_graph(1) assert_equal(G.edges(), [(0, 1), (0, 2), (1, 2)]) assert_equal(average_clustering(G), 1.0) assert_equal(list(triangles(G).values()), [1, 1, 1]) G=dorogovtsev_goltsev_mendes_graph(10) assert_equal(number_of_nodes(G), 29526) assert_equal(number_of_edges(G), 59049) assert_equal(G.degree(0), 1024) assert_equal(G.degree(1), 1024) assert_equal(G.degree(2), 1024) assert_raises(networkx.exception.NetworkXError, dorogovtsev_goltsev_mendes_graph, 7, create_using=DiGraph()) assert_raises(networkx.exception.NetworkXError, dorogovtsev_goltsev_mendes_graph, 7, create_using=MultiGraph()) def test_empty_graph(self): G=empty_graph() assert_equal(number_of_nodes(G), 0) G=empty_graph(42) assert_equal(number_of_nodes(G), 42) assert_equal(number_of_edges(G), 0) assert_equal(G.name, 'empty_graph(42)') # create empty digraph G=empty_graph(42,create_using=DiGraph(name="duh")) assert_equal(number_of_nodes(G), 42) assert_equal(number_of_edges(G), 0) assert_equal(G.name, 'empty_graph(42)') assert_true(isinstance(G,DiGraph)) # create empty multigraph G=empty_graph(42,create_using=MultiGraph(name="duh")) assert_equal(number_of_nodes(G), 42) assert_equal(number_of_edges(G), 0) assert_equal(G.name, 'empty_graph(42)') assert_true(isinstance(G,MultiGraph)) # create empty graph from another pete=petersen_graph() G=empty_graph(42,create_using=pete) 
assert_equal(number_of_nodes(G), 42) assert_equal(number_of_edges(G), 0) assert_equal(G.name, 'empty_graph(42)') assert_true(isinstance(G,Graph)) def test_grid_2d_graph(self): n=5;m=6 G=grid_2d_graph(n,m) assert_equal(number_of_nodes(G), n*m) assert_equal(degree_histogram(G), [0,0,4,2*(n+m)-8,(n-2)*(m-2)]) DG=grid_2d_graph(n,m, create_using=DiGraph()) assert_equal(DG.succ, G.adj) assert_equal(DG.pred, G.adj) MG=grid_2d_graph(n,m, create_using=MultiGraph()) assert_equal(MG.edges(), G.edges()) def test_grid_graph(self): """grid_graph([n,m]) is a connected simple graph with the following properties: number_of_nodes=n*m degree_histogram=[0,0,4,2*(n+m)-8,(n-2)*(m-2)] """ for n, m in [(3, 5), (5, 3), (4, 5), (5, 4)]: dim=[n,m] g=grid_graph(dim) assert_equal(number_of_nodes(g), n*m) assert_equal(degree_histogram(g), [0,0,4,2*(n+m)-8,(n-2)*(m-2)]) assert_equal(dim,[n,m]) for n, m in [(1, 5), (5, 1)]: dim=[n,m] g=grid_graph(dim) assert_equal(number_of_nodes(g), n*m) assert_true(is_isomorphic(g,path_graph(5))) assert_equal(dim,[n,m]) # mg=grid_graph([n,m], create_using=MultiGraph()) # assert_equal(mg.edges(), g.edges()) def test_hypercube_graph(self): for n, G in [(0, null_graph()), (1, path_graph(2)), (2, cycle_graph(4)), (3, cubical_graph())]: g=hypercube_graph(n) assert_true(is_isomorphic(g, G)) g=hypercube_graph(4) assert_equal(degree_histogram(g), [0, 0, 0, 0, 16]) g=hypercube_graph(5) assert_equal(degree_histogram(g), [0, 0, 0, 0, 0, 32]) g=hypercube_graph(6) assert_equal(degree_histogram(g), [0, 0, 0, 0, 0, 0, 64]) # mg=hypercube_graph(6, create_using=MultiGraph()) # assert_equal(mg.edges(), g.edges()) def test_ladder_graph(self): for i, G in [(0, empty_graph(0)), (1, path_graph(2)), (2, hypercube_graph(2)), (10, grid_graph([2,10]))]: assert_true(is_isomorphic(ladder_graph(i), G)) assert_raises(networkx.exception.NetworkXError, ladder_graph, 2, create_using=DiGraph()) g = ladder_graph(2) mg=ladder_graph(2, create_using=MultiGraph()) assert_equal(mg.edges(), g.edges()) 
def test_lollipop_graph(self): # number of nodes = m1 + m2 # number of edges = number_of_edges(complete_graph(m1)) + m2 for m1, m2 in [(3, 5), (4, 10), (3, 20)]: b=lollipop_graph(m1,m2) assert_equal(number_of_nodes(b), m1+m2) assert_equal(number_of_edges(b), m1*(m1-1)/2 + m2) assert_equal(b.name, 'lollipop_graph(' + str(m1) + ',' + str(m2) + ')') # Raise NetworkXError if m<2 assert_raises(networkx.exception.NetworkXError, lollipop_graph, 1, 20) # Raise NetworkXError if n<0 assert_raises(networkx.exception.NetworkXError, lollipop_graph, 5, -2) # lollipop_graph(2,m) = path_graph(m+2) for m1, m2 in [(2, 5), (2, 10), (2, 20)]: b=lollipop_graph(m1,m2) assert_true(is_isomorphic(b, path_graph(m2+2))) assert_raises(networkx.exception.NetworkXError, lollipop_graph, m1, m2, create_using=DiGraph()) mb=lollipop_graph(m1, m2, create_using=MultiGraph()) assert_true(mb.edges(), b.edges()) def test_null_graph(self): assert_equal(number_of_nodes(null_graph()), 0) def test_path_graph(self): p=path_graph(0) assert_true(is_isomorphic(p, null_graph())) assert_equal(p.name, 'path_graph(0)') p=path_graph(1) assert_true(is_isomorphic( p, empty_graph(1))) assert_equal(p.name, 'path_graph(1)') p=path_graph(10) assert_true(is_connected(p)) assert_equal(sorted(list(p.degree().values())), [1, 1, 2, 2, 2, 2, 2, 2, 2, 2]) assert_equal(p.order()-1, p.size()) dp=path_graph(3, create_using=DiGraph()) assert_true(dp.has_edge(0,1)) assert_false(dp.has_edge(1,0)) mp=path_graph(10, create_using=MultiGraph()) assert_true(mp.edges()==p.edges()) def test_periodic_grid_2d_graph(self): g=grid_2d_graph(0,0, periodic=True) assert_equal(g.degree(), {}) for m, n, G in [(2, 2, cycle_graph(4)), (1, 7, cycle_graph(7)), (7, 1, cycle_graph(7)), (2, 5, circular_ladder_graph(5)), (5, 2, circular_ladder_graph(5)), (2, 4, cubical_graph()), (4, 2, cubical_graph())]: g=grid_2d_graph(m,n, periodic=True) assert_true(is_isomorphic(g, G)) DG=grid_2d_graph(4, 2, periodic=True, create_using=DiGraph()) 
assert_equal(DG.succ,g.adj) assert_equal(DG.pred,g.adj) MG=grid_2d_graph(4, 2, periodic=True, create_using=MultiGraph()) assert_equal(MG.edges(),g.edges()) def test_star_graph(self): assert_true(is_isomorphic(star_graph(0), empty_graph(1))) assert_true(is_isomorphic(star_graph(1), path_graph(2))) assert_true(is_isomorphic(star_graph(2), path_graph(3))) s=star_graph(10) assert_equal(sorted(list(s.degree().values())), [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 10]) assert_raises(networkx.exception.NetworkXError, star_graph, 10, create_using=DiGraph()) ms=star_graph(10, create_using=MultiGraph()) assert_true(ms.edges()==s.edges()) def test_trivial_graph(self): assert_equal(number_of_nodes(trivial_graph()), 1) def test_wheel_graph(self): for n, G in [(0, null_graph()), (1, empty_graph(1)), (2, path_graph(2)), (3, complete_graph(3)), (4, complete_graph(4))]: g=wheel_graph(n) assert_true(is_isomorphic( g, G)) assert_equal(g.name, 'wheel_graph(4)') g=wheel_graph(10) assert_equal(sorted(list(g.degree().values())), [3, 3, 3, 3, 3, 3, 3, 3, 3, 9]) assert_raises(networkx.exception.NetworkXError, wheel_graph, 10, create_using=DiGraph()) mg=wheel_graph(10, create_using=MultiGraph()) assert_equal(mg.edges(), g.edges()) def test_complete_0_partite_graph(self): """Tests that the complete 0-partite graph is the null graph.""" G = nx.complete_multipartite_graph() H = nx.null_graph() assert_nodes_equal(G, H) assert_edges_equal(G.edges(), H.edges()) def test_complete_1_partite_graph(self): """Tests that the complete 1-partite graph is the empty graph.""" G = nx.complete_multipartite_graph(3) H = nx.empty_graph(3) assert_nodes_equal(G, H) assert_edges_equal(G.edges(), H.edges()) def test_complete_2_partite_graph(self): """Tests that the complete 2-partite graph is the complete bipartite graph. 
""" G = nx.complete_multipartite_graph(2, 3) H = nx.complete_bipartite_graph(2, 3) assert_nodes_equal(G, H) assert_edges_equal(G.edges(), H.edges()) def test_complete_multipartite_graph(self): """Tests for generating the complete multipartite graph.""" G = nx.complete_multipartite_graph(2, 3, 4) blocks = [(0, 1), (2, 3, 4), (5, 6, 7, 8)] # Within each block, no two vertices should be adjacent. for block in blocks: for u, v in itertools.combinations_with_replacement(block, 2): assert_true(v not in G[u]) assert_equal(G.node[u], G.node[v]) # Across blocks, all vertices should be adjacent. for (block1, block2) in itertools.combinations(blocks, 2): for u, v in itertools.product(block1, block2): assert_true(v in G[u]) assert_not_equal(G.node[u], G.node[v]) networkx-1.11/networkx/generators/social.py0000644000175000017500000002516712637544450021076 0ustar aricaric00000000000000""" Famous social networks. """ import networkx as nx __author__ = """\n""".join(['Jordi Torrents ', 'Katy Bold ', 'Aric Hagberg >> import networkx as nx >>> G = nx.karate_club_graph() >>> G.node[5]['club'] 'Mr. Hi' >>> G.node[9]['club'] 'Officer' References ---------- .. [1] Zachary, Wayne W. "An Information Flow Model for Conflict and Fission in Small Groups." *Journal of Anthropological Research*, 33, 452--473, (1977). .. [2] Data file from: http://vlado.fmf.uni-lj.si/pub/networks/data/Ucinet/UciData.htm """ # Create the set of all members, and the members of each club. 
all_members = set(range(34)) club1 = {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 11, 12, 13, 16, 17, 19, 21} # club2 = all_members - club1 G = nx.Graph() G.add_nodes_from(all_members) G.name = "Zachary's Karate Club" zacharydat = """\ 0 1 1 1 1 1 1 1 1 0 1 1 1 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 1 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 1 0 0 0 1 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 1 1 0 0 0 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 1 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 1 0 1 0 1 1 0 0 0 0 0 1 1 1 0 1 0 0 0 0 0 0 0 0 1 1 0 0 0 1 1 1 0 0 1 1 1 0 1 1 0 0 1 1 1 1 1 1 1 0""" for row, line in enumerate(zacharydat.split('\n')): thisrow = [int(b) for b in line.split()] for col, entry in enumerate(thisrow): if entry == 1: G.add_edge(row, col) # Add the name of each member's club as a node attribute. for v in G: G.node[v]['club'] = 'Mr. Hi' if v in club1 else 'Officer' return G def davis_southern_women_graph(): """Return Davis Southern women social network. This is a bipartite graph. References ---------- .. [1] A. Davis, Gardner, B. B., Gardner, M. R., 1941. Deep South. University of Chicago Press, Chicago, IL. 
""" G = nx.Graph() # Top nodes women = ["Evelyn Jefferson", "Laura Mandeville", "Theresa Anderson", "Brenda Rogers", "Charlotte McDowd", "Frances Anderson", "Eleanor Nye", "Pearl Oglethorpe", "Ruth DeSand", "Verne Sanderson", "Myra Liddel", "Katherina Rogers", "Sylvia Avondale", "Nora Fayette", "Helen Lloyd", "Dorothy Murchison", "Olivia Carleton", "Flora Price"] G.add_nodes_from(women, bipartite=0) # Bottom nodes events = ["E1", "E2", "E3", "E4", "E5", "E6", "E7", "E8", "E9", "E10", "E11", "E12", "E13", "E14"] G.add_nodes_from(events, bipartite=1) G.add_edges_from([("Evelyn Jefferson","E1"), ("Evelyn Jefferson","E2"), ("Evelyn Jefferson","E3"), ("Evelyn Jefferson","E4"), ("Evelyn Jefferson","E5"), ("Evelyn Jefferson","E6"), ("Evelyn Jefferson","E8"), ("Evelyn Jefferson","E9"), ("Laura Mandeville","E1"), ("Laura Mandeville","E2"), ("Laura Mandeville","E3"), ("Laura Mandeville","E5"), ("Laura Mandeville","E6"), ("Laura Mandeville","E7"), ("Laura Mandeville","E8"), ("Theresa Anderson","E2"), ("Theresa Anderson","E3"), ("Theresa Anderson","E4"), ("Theresa Anderson","E5"), ("Theresa Anderson","E6"), ("Theresa Anderson","E7"), ("Theresa Anderson","E8"), ("Theresa Anderson","E9"), ("Brenda Rogers","E1"), ("Brenda Rogers","E3"), ("Brenda Rogers","E4"), ("Brenda Rogers","E5"), ("Brenda Rogers","E6"), ("Brenda Rogers","E7"), ("Brenda Rogers","E8"), ("Charlotte McDowd","E3"), ("Charlotte McDowd","E4"), ("Charlotte McDowd","E5"), ("Charlotte McDowd","E7"), ("Frances Anderson","E3"), ("Frances Anderson","E5"), ("Frances Anderson","E6"), ("Frances Anderson","E8"), ("Eleanor Nye","E5"), ("Eleanor Nye","E6"), ("Eleanor Nye","E7"), ("Eleanor Nye","E8"), ("Pearl Oglethorpe","E6"), ("Pearl Oglethorpe","E8"), ("Pearl Oglethorpe","E9"), ("Ruth DeSand","E5"), ("Ruth DeSand","E7"), ("Ruth DeSand","E8"), ("Ruth DeSand","E9"), ("Verne Sanderson","E7"), ("Verne Sanderson","E8"), ("Verne Sanderson","E9"), ("Verne Sanderson","E12"), ("Myra Liddel","E8"), ("Myra Liddel","E9"), ("Myra 
Liddel","E10"), ("Myra Liddel","E12"), ("Katherina Rogers","E8"), ("Katherina Rogers","E9"), ("Katherina Rogers","E10"), ("Katherina Rogers","E12"), ("Katherina Rogers","E13"), ("Katherina Rogers","E14"), ("Sylvia Avondale","E7"), ("Sylvia Avondale","E8"), ("Sylvia Avondale","E9"), ("Sylvia Avondale","E10"), ("Sylvia Avondale","E12"), ("Sylvia Avondale","E13"), ("Sylvia Avondale","E14"), ("Nora Fayette","E6"), ("Nora Fayette","E7"), ("Nora Fayette","E9"), ("Nora Fayette","E10"), ("Nora Fayette","E11"), ("Nora Fayette","E12"), ("Nora Fayette","E13"), ("Nora Fayette","E14"), ("Helen Lloyd","E7"), ("Helen Lloyd","E8"), ("Helen Lloyd","E10"), ("Helen Lloyd","E11"), ("Helen Lloyd","E12"), ("Dorothy Murchison","E8"), ("Dorothy Murchison","E9"), ("Olivia Carleton","E9"), ("Olivia Carleton","E11"), ("Flora Price","E9"), ("Flora Price","E11")]) G.graph['top'] = women G.graph['bottom'] = events return G def florentine_families_graph(): """Return Florentine families graph. References ---------- .. [1] Ronald L. Breiger and Philippa E. 
       Pattison, Cumulated social roles: The duality of persons and
       their algebras, Social Networks, Volume 8, Issue 3, September
       1986, Pages 215--256
    """
    G=nx.Graph()
    G.add_edge('Acciaiuoli','Medici')
    G.add_edge('Castellani','Peruzzi')
    G.add_edge('Castellani','Strozzi')
    G.add_edge('Castellani','Barbadori')
    G.add_edge('Medici','Barbadori')
    G.add_edge('Medici','Ridolfi')
    G.add_edge('Medici','Tornabuoni')
    G.add_edge('Medici','Albizzi')
    G.add_edge('Medici','Salviati')
    G.add_edge('Salviati','Pazzi')
    G.add_edge('Peruzzi','Strozzi')
    G.add_edge('Peruzzi','Bischeri')
    G.add_edge('Strozzi','Ridolfi')
    G.add_edge('Strozzi','Bischeri')
    G.add_edge('Ridolfi','Tornabuoni')
    G.add_edge('Tornabuoni','Guadagni')
    G.add_edge('Albizzi','Ginori')
    G.add_edge('Albizzi','Guadagni')
    G.add_edge('Bischeri','Guadagni')
    G.add_edge('Guadagni','Lamberteschi')
    return G

networkx-1.11/networkx/generators/small.py

# -*- coding: utf-8 -*-
"""
Various small and named graphs, together with some compact generators.
"""
__author__ ="""Aric Hagberg (hagberg@lanl.gov)\nPieter Swart (swart@lanl.gov)"""
#    Copyright (C) 2004-2015 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
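The Florentine families generator above is just a fixed edge list. As a quick, NetworkX-free sanity check of that data, here is a stdlib-only sketch (not part of the library) that tallies each family's degree from the same pairs; the `edges` list is copied from `florentine_families_graph`:

```python
from collections import Counter

# Edge list copied from florentine_families_graph above.
edges = [('Acciaiuoli', 'Medici'), ('Castellani', 'Peruzzi'),
         ('Castellani', 'Strozzi'), ('Castellani', 'Barbadori'),
         ('Medici', 'Barbadori'), ('Medici', 'Ridolfi'),
         ('Medici', 'Tornabuoni'), ('Medici', 'Albizzi'),
         ('Medici', 'Salviati'), ('Salviati', 'Pazzi'),
         ('Peruzzi', 'Strozzi'), ('Peruzzi', 'Bischeri'),
         ('Strozzi', 'Ridolfi'), ('Strozzi', 'Bischeri'),
         ('Ridolfi', 'Tornabuoni'), ('Tornabuoni', 'Guadagni'),
         ('Albizzi', 'Ginori'), ('Albizzi', 'Guadagni'),
         ('Bischeri', 'Guadagni'), ('Guadagni', 'Lamberteschi')]

# Degree of each family = number of edge endpoints naming it.
degree = Counter(v for e in edges for v in e)
```

With 20 edges over 15 families, the Medici come out as the hub with degree 6, the classic observation about this network.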
__all__ = ['make_small_graph', 'LCF_graph', 'bull_graph', 'chvatal_graph',
           'cubical_graph', 'desargues_graph', 'diamond_graph',
           'dodecahedral_graph', 'frucht_graph', 'heawood_graph',
           'house_graph', 'house_x_graph', 'icosahedral_graph',
           'krackhardt_kite_graph', 'moebius_kantor_graph',
           'octahedral_graph', 'pappus_graph', 'petersen_graph',
           'sedgewick_maze_graph', 'tetrahedral_graph',
           'truncated_cube_graph', 'truncated_tetrahedron_graph',
           'tutte_graph']

import networkx as nx
from networkx.generators.classic import empty_graph, cycle_graph, path_graph, complete_graph
from networkx.exception import NetworkXError

#------------------------------------------------------------------------------
#   Tools for creating small graphs
#------------------------------------------------------------------------------

def make_small_undirected_graph(graph_description, create_using=None):
    """
    Return a small undirected graph described by graph_description.

    See make_small_graph.
    """
    if create_using is not None and create_using.is_directed():
        raise NetworkXError("Directed Graph not supported")
    return make_small_graph(graph_description, create_using)


def make_small_graph(graph_description, create_using=None):
    """
    Return the small graph described by graph_description.

    graph_description is a list of the form [ltype,name,n,xlist]

    Here ltype is one of "adjacencylist" or "edgelist",
    name is the name of the graph and n the number of nodes.
    This constructs a graph of n nodes with integer labels 0,..,n-1.

    If ltype="adjacencylist" then xlist is an adjacency list
    with exactly n entries, in which the j'th entry (which can be empty)
    specifies the nodes connected to vertex j.
    e.g.
the "square" graph C_4 can be obtained by >>> G=nx.make_small_graph(["adjacencylist","C_4",4,[[2,4],[1,3],[2,4],[1,3]]]) or, since we do not need to add edges twice, >>> G=nx.make_small_graph(["adjacencylist","C_4",4,[[2,4],[3],[4],[]]]) If ltype="edgelist" then xlist is an edge list written as [[v1,w2],[v2,w2],...,[vk,wk]], where vj and wj integers in the range 1,..,n e.g. the "square" graph C_4 can be obtained by >>> G=nx.make_small_graph(["edgelist","C_4",4,[[1,2],[3,4],[2,3],[4,1]]]) Use the create_using argument to choose the graph class/type. """ ltype=graph_description[0] name=graph_description[1] n=graph_description[2] G=empty_graph(n, create_using) nodes=G.nodes() if ltype=="adjacencylist": adjlist=graph_description[3] if len(adjlist) != n: raise NetworkXError("invalid graph_description") G.add_edges_from([(u-1,v) for v in nodes for u in adjlist[v]]) elif ltype=="edgelist": edgelist=graph_description[3] for e in edgelist: v1=e[0]-1 v2=e[1]-1 if v1<0 or v1>n-1 or v2<0 or v2>n-1: raise NetworkXError("invalid graph_description") else: G.add_edge(v1,v2) G.name=name return G def LCF_graph(n,shift_list,repeats,create_using=None): """ Return the cubic graph specified in LCF notation. LCF notation (LCF=Lederberg-Coxeter-Fruchte) is a compressed notation used in the generation of various cubic Hamiltonian graphs of high symmetry. See, for example, dodecahedral_graph, desargues_graph, heawood_graph and pappus_graph below. n (number of nodes) The starting graph is the n-cycle with nodes 0,...,n-1. (The null graph is returned if n < 0.) shift_list = [s1,s2,..,sk], a list of integer shifts mod n, repeats integer specifying the number of times that shifts in shift_list are successively applied to each v_current in the n-cycle to generate an edge between v_current and v_current+shift mod n. 
    As v1 cycles through the n-cycle (a total of k*repeats steps), the
    shift cycles through shift_list repeats times, and v1 is connected
    to v1+shift mod n.

    The utility graph K_{3,3}

    >>> G=nx.LCF_graph(6,[3,-3],3)

    The Heawood graph

    >>> G=nx.LCF_graph(14,[5,-5],7)

    See http://mathworld.wolfram.com/LCFNotation.html for a description
    and references.
    """
    if create_using is not None and create_using.is_directed():
        raise NetworkXError("Directed Graph not supported")

    if n <= 0:
        return empty_graph(0, create_using)

    # start with the n-cycle
    G=cycle_graph(n, create_using)
    G.name="LCF_graph"
    nodes=G.nodes()

    n_extra_edges=repeats*len(shift_list)
    # edges are added n_extra_edges times
    # (not all of these need be new)
    if n_extra_edges < 1:
        return G

    for i in range(n_extra_edges):
        shift=shift_list[i%len(shift_list)] # cycle through shift_list
        v1=nodes[i%n] # cycle repeatedly through nodes
        v2=nodes[(i + shift)%n]
        G.add_edge(v1, v2)
    return G


#-------------------------------------------------------------------------------
#   Various small and named graphs
#-------------------------------------------------------------------------------

def bull_graph(create_using=None):
    """Return the Bull graph.
""" description=[ "adjacencylist", "Bull Graph", 5, [[2,3],[1,3,4],[1,2,5],[2],[3]] ] G=make_small_undirected_graph(description, create_using) return G def chvatal_graph(create_using=None): """Return the Chvátal graph.""" description=[ "adjacencylist", "Chvatal Graph", 12, [[2,5,7,10],[3,6,8],[4,7,9],[5,8,10], [6,9],[11,12],[11,12],[9,12], [11],[11,12],[],[]] ] G=make_small_undirected_graph(description, create_using) return G def cubical_graph(create_using=None): """Return the 3-regular Platonic Cubical graph.""" description=[ "adjacencylist", "Platonic Cubical Graph", 8, [[2,4,5],[1,3,8],[2,4,7],[1,3,6], [1,6,8],[4,5,7],[3,6,8],[2,5,7]] ] G=make_small_undirected_graph(description, create_using) return G def desargues_graph(create_using=None): """ Return the Desargues graph.""" G=LCF_graph(20, [5,-5,9,-9], 5, create_using) G.name="Desargues Graph" return G def diamond_graph(create_using=None): """Return the Diamond graph. """ description=[ "adjacencylist", "Diamond Graph", 4, [[2,3],[1,3,4],[1,2,4],[2,3]] ] G=make_small_undirected_graph(description, create_using) return G def dodecahedral_graph(create_using=None): """ Return the Platonic Dodecahedral graph. """ G=LCF_graph(20, [10,7,4,-4,-7,10,-4,7,-7,4], 2, create_using) G.name="Dodecahedral Graph" return G def frucht_graph(create_using=None): """Return the Frucht Graph. The Frucht Graph is the smallest cubical graph whose automorphism group consists only of the identity element. """ G=cycle_graph(7, create_using) G.add_edges_from([[0,7],[1,7],[2,8],[3,9],[4,9],[5,10],[6,10], [7,11],[8,11],[8,9],[10,11]]) G.name="Frucht Graph" return G def heawood_graph(create_using=None): """ Return the Heawood graph, a (3,6) cage. 
""" G=LCF_graph(14, [5,-5], 7, create_using) G.name="Heawood Graph" return G def house_graph(create_using=None): """Return the House graph (square with triangle on top).""" description=[ "adjacencylist", "House Graph", 5, [[2,3],[1,4],[1,4,5],[2,3,5],[3,4]] ] G=make_small_undirected_graph(description, create_using) return G def house_x_graph(create_using=None): """Return the House graph with a cross inside the house square.""" description=[ "adjacencylist", "House-with-X-inside Graph", 5, [[2,3,4],[1,3,4],[1,2,4,5],[1,2,3,5],[3,4]] ] G=make_small_undirected_graph(description, create_using) return G def icosahedral_graph(create_using=None): """Return the Platonic Icosahedral graph.""" description=[ "adjacencylist", "Platonic Icosahedral Graph", 12, [[2,6,8,9,12],[3,6,7,9],[4,7,9,10],[5,7,10,11], [6,7,11,12],[7,12],[],[9,10,11,12], [10],[11],[12],[]] ] G=make_small_undirected_graph(description, create_using) return G def krackhardt_kite_graph(create_using=None): """ Return the Krackhardt Kite Social Network. A 10 actor social network introduced by David Krackhardt to illustrate: degree, betweenness, centrality, closeness, etc. The traditional labeling is: Andre=1, Beverley=2, Carol=3, Diane=4, Ed=5, Fernando=6, Garth=7, Heather=8, Ike=9, Jane=10. 
""" description=[ "adjacencylist", "Krackhardt Kite Social Network", 10, [[2,3,4,6],[1,4,5,7],[1,4,6],[1,2,3,5,6,7],[2,4,7], [1,3,4,7,8],[2,4,5,6,8],[6,7,9],[8,10],[9]] ] G=make_small_undirected_graph(description, create_using) return G def moebius_kantor_graph(create_using=None): """Return the Moebius-Kantor graph.""" G=LCF_graph(16, [5,-5], 8, create_using) G.name="Moebius-Kantor Graph" return G def octahedral_graph(create_using=None): """Return the Platonic Octahedral graph.""" description=[ "adjacencylist", "Platonic Octahedral Graph", 6, [[2,3,4,5],[3,4,6],[5,6],[5,6],[6],[]] ] G=make_small_undirected_graph(description, create_using) return G def pappus_graph(): """ Return the Pappus graph.""" G=LCF_graph(18,[5,7,-7,7,-7,-5],3) G.name="Pappus Graph" return G def petersen_graph(create_using=None): """Return the Petersen graph.""" description=[ "adjacencylist", "Petersen Graph", 10, [[2,5,6],[1,3,7],[2,4,8],[3,5,9],[4,1,10],[1,8,9],[2,9,10], [3,6,10],[4,6,7],[5,7,8]] ] G=make_small_undirected_graph(description, create_using) return G def sedgewick_maze_graph(create_using=None): """ Return a small maze with a cycle. This is the maze used in Sedgewick,3rd Edition, Part 5, Graph Algorithms, Chapter 18, e.g. Figure 18.2 and following. 
    Nodes are numbered 0,..,7
    """
    G=empty_graph(0, create_using)
    G.add_nodes_from(range(8))
    G.add_edges_from([[0,2],[0,7],[0,5]])
    G.add_edges_from([[1,7],[2,6]])
    G.add_edges_from([[3,4],[3,5]])
    G.add_edges_from([[4,5],[4,7],[4,6]])
    G.name="Sedgewick Maze"
    return G


def tetrahedral_graph(create_using=None):
    """Return the 3-regular Platonic Tetrahedral graph."""
    G=complete_graph(4, create_using)
    G.name="Platonic Tetrahedral graph"
    return G


def truncated_cube_graph(create_using=None):
    """Return the skeleton of the truncated cube."""
    description=[
        "adjacencylist",
        "Truncated Cube Graph",
        24,
        [[2,3,5],[12,15],[4,5],[7,9],
         [6],[17,19],[8,9],[11,13],
         [10],[18,21],[12,13],[15],
         [14],[22,23],[16],[20,24],
         [18,19],[21],[20],[24],
         [22],[23],[24],[]]
        ]
    G=make_small_undirected_graph(description, create_using)
    return G


def truncated_tetrahedron_graph(create_using=None):
    """Return the skeleton of the truncated Platonic tetrahedron."""
    G=path_graph(12, create_using)
    # G.add_edges_from([(1,3),(1,10),(2,7),(4,12),(5,12),(6,8),(9,11)])
    G.add_edges_from([(0,2),(0,9),(1,6),(3,11),(4,11),(5,7),(8,10)])
    G.name="Truncated Tetrahedron Graph"
    return G


def tutte_graph(create_using=None):
    """Return the Tutte graph."""
    description=[
        "adjacencylist",
        "Tutte's Graph",
        46,
        [[2,3,4],[5,27],[11,12],[19,20],[6,34],
         [7,30],[8,28],[9,15],[10,39],[11,38],
         [40],[13,40],[14,36],[15,16],[35],
         [17,23],[18,45],[19,44],[46],[21,46],
         [22,42],[23,24],[41],[25,28],[26,33],
         [27,32],[34],[29],[30,33],[31],
         [32,34],[33],[],[],[36,39],
         [37],[38,40],[39],[],[],
         [42,45],[43],[44,46],[45],[],[]]
        ]
    G=make_small_undirected_graph(description, create_using)
    return G

networkx-1.11/networkx/generators/random_clustered.py

# -*- coding: utf-8 -*-
"""Generate graphs with given degree and triangle sequence.
"""
#    Copyright (C) 2004-2015 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
#    All rights reserved.
#    BSD license.
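Several generators in small.py (desargues, dodecahedral, heawood, moebius_kantor, pappus) delegate to `LCF_graph`, which expands LCF notation by walking the n-cycle and adding one chord per step. A stdlib-only sketch of that expansion (a hypothetical `lcf_edges` helper, not part of NetworkX), checked against the utility-graph example `LCF_graph(6, [3, -3], 3)` from its docstring:

```python
def lcf_edges(n, shifts, repeats):
    """Edge set of the LCF expansion: an n-cycle plus one chord per step."""
    edges = {frozenset((v, (v + 1) % n)) for v in range(n)}  # the n-cycle
    for i in range(repeats * len(shifts)):
        v = i % n                                  # cycle through the nodes
        s = shifts[i % len(shifts)]                # cycle through the shifts
        edges.add(frozenset((v, (v + s) % n)))     # add chord v -- v+s (mod n)
    return edges

# LCF(6, [3, -3], 3) is the utility graph K_{3,3}: 9 edges, 3-regular.
k33 = lcf_edges(6, [3, -3], 3)
```

Duplicated chords collapse in the set, which mirrors how repeated `add_edge` calls are no-ops in a simple `Graph`.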
import random
import networkx as nx

__author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)',
                        'Joel Miller (joel.c.miller.research@gmail.com)'])

__all__ = ['random_clustered_graph']


def random_clustered_graph(joint_degree_sequence, create_using=None,
                           seed=None):
    """Generate a random graph with the given joint independent edge degree
    and triangle degree sequence.

    This uses a configuration model-like approach to generate a random graph
    (with parallel edges and self-loops) by randomly assigning edges to
    match the given joint degree sequence.

    The joint degree sequence is a list of pairs of integers of the form
    `[(d_{1,i}, d_{1,t}), \dotsc, (d_{n,i}, d_{n,t})]`. According to this
    list, vertex `u` is a member of `d_{u,t}` triangles and has `d_{u,i}`
    other edges. The number `d_{u,t}` is the *triangle degree* of `u` and
    the number `d_{u,i}` is the *independent edge degree*.

    Parameters
    ----------
    joint_degree_sequence : list of integer pairs
        Each list entry corresponds to the independent edge degree and
        triangle degree of a node.
    create_using : graph, optional (default MultiGraph)
        Return graph of this type. The instance will be cleared.
    seed : hashable object, optional
        The seed for the random number generator.

    Returns
    -------
    G : MultiGraph
        A graph with the specified degree sequence. Nodes are labeled
        starting at 0 with an index corresponding to the position in
        joint_degree_sequence.

    Raises
    ------
    NetworkXError
        If the independent edge degree sequence sum is not even or the
        triangle degree sequence sum is not divisible by 3.

    Notes
    -----
    As described by Miller [1]_ (see also Newman [2]_ for an equivalent
    description).

    A non-graphical degree sequence (not realizable by some simple graph)
    is allowed since this function returns graphs with self loops and
    parallel edges. An exception is raised if the independent degree
    sequence does not have an even sum or the triangle degree sequence sum
    is not divisible by 3.
    This configuration model-like construction process can lead to
    duplicate edges and loops. You can remove the self-loops and parallel
    edges (see below) which will likely result in a graph that doesn't have
    the exact degree sequence specified. This "finite-size effect"
    decreases as the size of the graph increases.

    References
    ----------
    .. [1] Joel C. Miller. "Percolation and epidemics in random clustered
       networks". In: Physical review. E, Statistical, nonlinear, and soft
       matter physics 80 (2 Part 1 August 2009).
    .. [2] M. E. J. Newman. "Random Graphs with Clustering". In: Physical
       Review Letters 103 (5 July 2009)

    Examples
    --------
    >>> deg = [(1, 0), (1, 0), (1, 0), (2, 0), (1, 0), (2, 1), (0, 1), (0, 1)]
    >>> G = nx.random_clustered_graph(deg)

    To remove parallel edges:

    >>> G = nx.Graph(G)

    To remove self loops:

    >>> G.remove_edges_from(G.selfloop_edges())

    """
    if create_using is None:
        create_using = nx.MultiGraph()
    elif create_using.is_directed():
        raise nx.NetworkXError("Directed Graph not supported")

    if seed is not None:
        random.seed(seed)

    # In Python 3, zip() returns an iterator. Make this into a list.
    joint_degree_sequence = list(joint_degree_sequence)

    N = len(joint_degree_sequence)
    G = nx.empty_graph(N,create_using)

    ilist = []
    tlist = []
    for n in G:
        degrees = joint_degree_sequence[n]
        for icount in range(degrees[0]):
            ilist.append(n)
        for tcount in range(degrees[1]):
            tlist.append(n)

    if len(ilist)%2 != 0 or len(tlist)%3 != 0:
        raise nx.NetworkXError('Invalid degree sequence')

    random.shuffle(ilist)
    random.shuffle(tlist)
    while ilist:
        G.add_edge(ilist.pop(),ilist.pop())
    while tlist:
        n1 = tlist.pop()
        n2 = tlist.pop()
        n3 = tlist.pop()
        G.add_edges_from([(n1,n2),(n1,n3),(n2,n3)])
    G.name = "random_clustered %d nodes %d edges"%(G.order(),G.size())
    return G

networkx-1.11/networkx/generators/stochastic.py

"""Functions for generating stochastic graphs from a given weighted
directed graph.
""" # Copyright (C) 2010-2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from __future__ import division import warnings import networkx as nx from networkx.utils import not_implemented_for __author__ = "Aric Hagberg " __all__ = ['stochastic_graph'] @not_implemented_for('multigraph') @not_implemented_for('undirected') def stochastic_graph(G, copy=True, weight='weight'): """Returns a right-stochastic representation of the directed graph ``G``. A right-stochastic graph is a weighted digraph in which for each node, the sum of the weights of all the out-edges of that node is 1. If the graph is already weighted (for example, via a ``'weight'`` edge attribute), the reweighting takes that into account. Parameters ---------- G : directed graph A :class:`~networkx.DiGraph` or :class:`~networkx.MultiDiGraph`. copy : boolean, optional If this is ``True``, then this function returns a new instance of :class:`networkx.Digraph`. Otherwise, the original graph is modified in-place (and also returned, for convenience). weight : edge attribute key (optional, default='weight') Edge attribute key used for reading the existing weight and setting the new weight. If no attribute with this key is found for an edge, then the edge weight is assumed to be 1. If an edge has a weight, it must be a a positive number. """ if copy: W = nx.DiGraph(G) else: # Reference the original graph, don't make a copy. W = G degree = W.out_degree(weight=weight) for (u, v, d) in W.edges(data=True): if degree[u] == 0: warnings.warn('zero out-degree for node %s' % u) d[weight] = 0 else: d[weight] = d.get(weight, 1) / degree[u] return W networkx-1.11/networkx/generators/random_graphs.py0000644000175000017500000007524212637544500022443 0ustar aricaric00000000000000# -*- coding: utf-8 -*- """ Generators for random graphs. """ # Copyright (C) 2004-2015 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
__author__ = "\n".join(['Aric Hagberg (hagberg@lanl.gov)', 'Pieter Swart (swart@lanl.gov)', 'Dan Schult (dschult@colgate.edu)']) import itertools import random import math import networkx as nx from networkx.generators.classic import empty_graph, path_graph, complete_graph from collections import defaultdict __all__ = ['fast_gnp_random_graph', 'gnp_random_graph', 'dense_gnm_random_graph', 'gnm_random_graph', 'erdos_renyi_graph', 'binomial_graph', 'newman_watts_strogatz_graph', 'watts_strogatz_graph', 'connected_watts_strogatz_graph', 'random_regular_graph', 'barabasi_albert_graph', 'powerlaw_cluster_graph', 'duplication_divergence_graph', 'random_lobster', 'random_shell_graph', 'random_powerlaw_tree', 'random_powerlaw_tree_sequence'] #------------------------------------------------------------------------- # Some Famous Random Graphs #------------------------------------------------------------------------- def fast_gnp_random_graph(n, p, seed=None, directed=False): """Returns a `G_{n,p}` random graph, also known as an Erdős-Rényi graph or a binomial graph. Parameters ---------- n : int The number of nodes. p : float Probability for edge creation. seed : int, optional Seed for random number generator (default=None). directed : bool, optional (default=False) If ``True``, this function returns a directed graph. Notes ----- The `G_{n,p}` graph algorithm chooses each of the `[n (n - 1)] / 2` (undirected) or `n (n - 1)` (directed) possible edges with probability `p`. This algorithm runs in `O(n + m)` time, where `m` is the expected number of edges, which equals `p n (n - 1) / 2`. This should be faster than :func:`gnp_random_graph` when `p` is small and the expected number of edges is small (that is, the graph is sparse). See Also -------- gnp_random_graph References ---------- .. [1] Vladimir Batagelj and Ulrik Brandes, "Efficient generation of large random networks", Phys. Rev. E, 71, 036113, 2005. 
""" G = empty_graph(n) G.name="fast_gnp_random_graph(%s,%s)"%(n,p) if not seed is None: random.seed(seed) if p <= 0 or p >= 1: return nx.gnp_random_graph(n,p,directed=directed) w = -1 lp = math.log(1.0 - p) if directed: G = nx.DiGraph(G) # Nodes in graph are from 0,n-1 (start with v as the first node index). v = 0 while v < n: lr = math.log(1.0 - random.random()) w = w + 1 + int(lr/lp) if v == w: # avoid self loops w = w + 1 while w >= n and v < n: w = w - n v = v + 1 if v == w: # avoid self loops w = w + 1 if v < n: G.add_edge(v, w) else: # Nodes in graph are from 0,n-1 (start with v as the second node index). v = 1 while v < n: lr = math.log(1.0 - random.random()) w = w + 1 + int(lr/lp) while w >= v and v < n: w = w - v v = v + 1 if v < n: G.add_edge(v, w) return G def gnp_random_graph(n, p, seed=None, directed=False): """Returns a `G_{n,p}` random graph, also known as an Erdős-Rényi graph or a binomial graph. The `G_{n,p}` model chooses each of the possible edges with probability ``p``. The functions :func:`binomial_graph` and :func:`erdos_renyi_graph` are aliases of this function. Parameters ---------- n : int The number of nodes. p : float Probability for edge creation. seed : int, optional Seed for random number generator (default=None). directed : bool, optional (default=False) If ``True``, this function returns a directed graph. See Also -------- fast_gnp_random_graph Notes ----- This algorithm runs in `O(n^2)` time. For sparse graphs (that is, for small values of `p`), :func:`fast_gnp_random_graph` is a faster algorithm. References ---------- .. [1] P. Erdős and A. Rényi, On Random Graphs, Publ. Math. 6, 290 (1959). .. [2] E. N. Gilbert, Random Graphs, Ann. Math. Stat., 30, 1141 (1959). 
""" if directed: G=nx.DiGraph() else: G=nx.Graph() G.add_nodes_from(range(n)) G.name="gnp_random_graph(%s,%s)"%(n,p) if p<=0: return G if p>=1: return complete_graph(n,create_using=G) if not seed is None: random.seed(seed) if G.is_directed(): edges=itertools.permutations(range(n),2) else: edges=itertools.combinations(range(n),2) for e in edges: if random.random() < p: G.add_edge(*e) return G # add some aliases to common names binomial_graph=gnp_random_graph erdos_renyi_graph=gnp_random_graph def dense_gnm_random_graph(n, m, seed=None): """Returns a `G_{n,m}` random graph. In the `G_{n,m}` model, a graph is chosen uniformly at random from the set of all graphs with `n` nodes and `m` edges. This algorithm should be faster than :func:`gnm_random_graph` for dense graphs. Parameters ---------- n : int The number of nodes. m : int The number of edges. seed : int, optional Seed for random number generator (default=None). See Also -------- gnm_random_graph() Notes ----- Algorithm by Keith M. Briggs Mar 31, 2006. Inspired by Knuth's Algorithm S (Selection sampling technique), in section 3.4.2 of [1]_. References ---------- .. [1] Donald E. Knuth, The Art of Computer Programming, Volume 2/Seminumerical algorithms, Third Edition, Addison-Wesley, 1997. """ mmax=n*(n-1)/2 if m>=mmax: G=complete_graph(n) else: G=empty_graph(n) G.name="dense_gnm_random_graph(%s,%s)"%(n,m) if n==1 or m>=mmax: return G if seed is not None: random.seed(seed) u=0 v=1 t=0 k=0 while True: if random.randrange(mmax-t)=max_edges: return complete_graph(n,create_using=G) nlist=G.nodes() edge_count=0 while edge_count < m: # generate random edge,u,v u = random.choice(nlist) v = random.choice(nlist) if u==v or G.has_edge(u,v): continue else: G.add_edge(u,v) edge_count=edge_count+1 return G def newman_watts_strogatz_graph(n, k, p, seed=None): """Return a Newman–Watts–Strogatz small-world graph. Parameters ---------- n : int The number of nodes. 
    k : int
        Each node is joined with its ``k`` nearest neighbors in a ring
        topology.
    p : float
        The probability of adding a new edge for each edge.
    seed : int, optional
        The seed for the random number generator (the default is ``None``).

    Notes
    -----
    First create a ring over ``n`` nodes.  Then each node in the ring is
    connected with its ``k`` nearest neighbors (or ``k - 1`` neighbors if
    ``k`` is odd).  Then shortcuts are created by adding new edges as
    follows: for each edge ``(u, v)`` in the underlying "``n``-ring with
    ``k`` nearest neighbors" with probability ``p`` add a new edge
    ``(u, w)`` with randomly-chosen existing node ``w``.  In contrast with
    :func:`watts_strogatz_graph`, no edges are removed.

    See Also
    --------
    watts_strogatz_graph()

    References
    ----------
    .. [1] M. E. J. Newman and D. J. Watts,
       Renormalization group analysis of the small-world network model,
       Physics Letters A, 263, 341, 1999.
       http://dx.doi.org/10.1016/S0375-9601(99)00757-4
    """
    if seed is not None:
        random.seed(seed)
    if k>=n:
        raise nx.NetworkXError("k>=n, choose smaller k or larger n")
    G=empty_graph(n)
    G.name="newman_watts_strogatz_graph(%s,%s,%s)"%(n,k,p)
    nlist = G.nodes()
    fromv = nlist
    # connect the k/2 neighbors
    for j in range(1, k // 2+1):
        tov = fromv[j:] + fromv[0:j] # the first j are now last
        for i in range(len(fromv)):
            G.add_edge(fromv[i], tov[i])
    # for each edge u-v, with probability p, randomly select existing
    # node w and add new edge u-w
    e = G.edges()
    for (u, v) in e:
        if random.random() < p:
            w = random.choice(nlist)
            # no self-loops and reject if edge u-w exists
            # is that the correct NWS model?
            while w == u or G.has_edge(u, w):
                w = random.choice(nlist)
                if G.degree(u) >= n-1:
                    break # skip this rewiring
            else:
                G.add_edge(u,w)
    return G


def watts_strogatz_graph(n, k, p, seed=None):
    """Return a Watts–Strogatz small-world graph.

    Parameters
    ----------
    n : int
        The number of nodes
    k : int
        Each node is joined with its ``k`` nearest neighbors in a ring
        topology.
p : float The probability of rewiring each edge seed : int, optional Seed for random number generator (default=None) See Also -------- newman_watts_strogatz_graph() connected_watts_strogatz_graph() Notes ----- First create a ring over ``n`` nodes. Then each node in the ring is joined to its ``k`` nearest neighbors (or ``k - 1`` neighbors if ``k`` is odd). Then shortcuts are created by replacing some edges as follows: for each edge ``(u, v)`` in the underlying "``n``-ring with ``k`` nearest neighbors" with probability ``p`` replace it with a new edge ``(u, w)`` with uniformly random choice of existing node ``w``. In contrast with :func:`newman_watts_strogatz_graph`, the random rewiring does not increase the number of edges. The rewired graph is not guaranteed to be connected as in :func:`connected_watts_strogatz_graph`. References ---------- .. [1] Duncan J. Watts and Steven H. Strogatz, Collective dynamics of small-world networks, Nature, 393, pp. 440--442, 1998. """ if k>=n: raise nx.NetworkXError("k>=n, choose smaller k or larger n") if seed is not None: random.seed(seed) G = nx.Graph() G.name="watts_strogatz_graph(%s,%s,%s)"%(n,k,p) nodes = list(range(n)) # nodes are labeled 0 to n-1 # connect each node to k/2 neighbors for j in range(1, k // 2+1): targets = nodes[j:] + nodes[0:j] # first j nodes are now last in list G.add_edges_from(zip(nodes,targets)) # rewire edges from each node # loop over all nodes in order (label) and neighbors in order (distance) # no self loops or multiple edges allowed for j in range(1, k // 2+1): # outer loop is neighbors targets = nodes[j:] + nodes[0:j] # first j nodes are now last in list # inner loop in node order for u,v in zip(nodes,targets): if random.random() < p: w = random.choice(nodes) # Enforce no self-loops or multiple edges while w == u or G.has_edge(u, w): w = random.choice(nodes) if G.degree(u) >= n-1: break # skip this rewiring else: G.remove_edge(u,v) G.add_edge(u,w) return G def connected_watts_strogatz_graph(n, k, 
p, tries=100, seed=None): """Returns a connected Watts–Strogatz small-world graph. Attempts to generate a connected graph by repeated generation of Watts–Strogatz small-world graphs. An exception is raised if the maximum number of tries is exceeded. Parameters ---------- n : int The number of nodes k : int Each node is joined with its ``k`` nearest neighbors in a ring topology. p : float The probability of rewiring each edge tries : int Number of attempts to generate a connected graph. seed : int, optional The seed for random number generator. See Also -------- newman_watts_strogatz_graph() watts_strogatz_graph() """ G = watts_strogatz_graph(n, k, p, seed) t=1 while not nx.is_connected(G): G = watts_strogatz_graph(n, k, p, seed) t=t+1 if t>tries: raise nx.NetworkXError("Maximum number of tries exceeded") return G def random_regular_graph(d, n, seed=None): """Returns a random ``d``-regular graph on ``n`` nodes. The resulting graph has no self-loops or parallel edges. Parameters ---------- d : int The degree of each node. n : integer The number of nodes. The value of ``n * d`` must be even. seed : hashable object The seed for random number generator. Notes ----- The nodes are numbered from ``0`` to ``n - 1``. Kim and Vu's paper [2]_ shows that this algorithm samples in an asymptotically uniform way from the space of random graphs when `d = O(n^{1 / 3 - \epsilon})`. Raises ------ NetworkXError If ``n * d`` is odd or ``d`` is greater than or equal to ``n``. References ---------- .. [1] A. Steger and N. Wormald, Generating random regular graphs quickly, Probability and Computing 8 (1999), 377-396, 1999. http://citeseer.ist.psu.edu/steger99generating.html .. [2] Jeong Han Kim and Van H. Vu, Generating random regular graphs, Proceedings of the thirty-fifth ACM symposium on Theory of computing, San Diego, CA, USA, pp 213--222, 2003. 
http://portal.acm.org/citation.cfm?id=780542.780576 """ if (n * d) % 2 != 0: raise nx.NetworkXError("n * d must be even") if not 0 <= d < n: raise nx.NetworkXError("the 0 <= d < n inequality must be satisfied") if d == 0: return empty_graph(n) if seed is not None: random.seed(seed) def _suitable(edges, potential_edges): # Helper subroutine to check if there are suitable edges remaining # If False, the generation of the graph has failed if not potential_edges: return True for s1 in potential_edges: for s2 in potential_edges: # Two iterators on the same dictionary are guaranteed # to visit it in the same order if there are no # intervening modifications. if s1 == s2: # Only need to consider s1-s2 pair one time break if s1 > s2: s1, s2 = s2, s1 if (s1, s2) not in edges: return True return False def _try_creation(): # Attempt to create an edge set edges = set() stubs = list(range(n)) * d while stubs: potential_edges = defaultdict(lambda: 0) random.shuffle(stubs) stubiter = iter(stubs) for s1, s2 in zip(stubiter, stubiter): if s1 > s2: s1, s2 = s2, s1 if s1 != s2 and ((s1, s2) not in edges): edges.add((s1, s2)) else: potential_edges[s1] += 1 potential_edges[s2] += 1 if not _suitable(edges, potential_edges): return None # failed to find suitable edge set stubs = [node for node, potential in potential_edges.items() for _ in range(potential)] return edges # Even though a suitable edge set exists, # the generation of such a set is not guaranteed. # Try repeatedly to find one. edges = _try_creation() while edges is None: edges = _try_creation() G = nx.Graph() G.name = "random_regular_graph(%s, %s)" % (d, n) G.add_edges_from(edges) return G def _random_subset(seq,m): """ Return m unique elements from seq. This differs from random.sample which can return repeated elements if seq holds repeated elements. 
""" targets=set() while len(targets)=n: raise nx.NetworkXError("Barabási–Albert network must have m >= 1" " and m < n, m = %d, n = %d" % (m, n)) if seed is not None: random.seed(seed) # Add m initial nodes (m0 in barabasi-speak) G=empty_graph(m) G.name="barabasi_albert_graph(%s,%s)"%(n,m) # Target nodes for new edges targets=list(range(m)) # List of existing nodes, with nodes repeated once for each adjacent edge repeated_nodes=[] # Start adding the other n-m nodes. The first node is m. source=m while source1 and m 1 or p < 0: raise nx.NetworkXError(\ "NetworkXError p must be in [0,1], p=%f"%(p)) if seed is not None: random.seed(seed) G=empty_graph(m) # add m initial nodes (m0 in barabasi-speak) G.name="Powerlaw-Cluster Graph" repeated_nodes=G.nodes() # list of existing nodes to sample from # with nodes repeated once for each adjacent edge source=m # next node is m while source 1 or p < 0: msg = "NetworkXError p={0} is not in [0,1].".format(p) raise nx.NetworkXError(msg) if n < 2: msg = 'n must be greater than or equal to 2' raise nx.NetworkXError(msg) if seed is not None: random.seed(seed) G = nx.Graph() G.graph['name'] = "Duplication-Divergence Graph" # Initialize the graph with two connected nodes. G.add_edge(0,1) i = 2 while i < n: # Choose a random node from current graph to duplicate. random_node = random.choice(G.nodes()) # Make the replica. G.add_node(i) # flag indicates whether at least one edge is connected on the replica. flag=False for nbr in G.neighbors(random_node): if random.random() < p: # Link retention step. G.add_edge(i, nbr) flag = True if not flag: # Delete replica if no edges retained. G.remove_node(i) else: # Successful duplication. i += 1 return G def random_lobster(n, p1, p2, seed=None): """Returns a random lobster graph. A lobster is a tree that reduces to a caterpillar when pruning all leaf nodes. A caterpillar is a tree that reduces to a path graph when pruning all leaf nodes; setting ``p2`` to zero produces a caterillar. 
Parameters ---------- n : int The expected number of nodes in the backbone p1 : float Probability of adding an edge to the backbone p2 : float Probability of adding an edge one level beyond backbone seed : int, optional Seed for random number generator (default=None). """ # a necessary ingredient in any self-respecting graph library if seed is not None: random.seed(seed) llen=int(2*random.random()*n + 0.5) L=path_graph(llen) L.name="random_lobster(%d,%s,%s)"%(n,p1,p2) # build caterpillar: add edges to path graph with probability p1 current_node=llen-1 for n in range(llen): if random.random()>> constructor = [(10, 20, 0.8), (20, 40, 0.8)] >>> G = nx.random_shell_graph(constructor) """ G=empty_graph(0) G.name="random_shell_graph(constructor)" if seed is not None: random.seed(seed) glist=[] intra_edges=[] nnodes=0 # create gnm graphs for each shell for (n,m,d) in constructor: inter_edges=int(m*d) intra_edges.append(m-inter_edges) g=nx.convert_node_labels_to_integers( gnm_random_graph(n,inter_edges), first_label=nnodes) glist.append(g) nnodes+=n G=nx.operators.union(G,g) # connect the shells randomly for gi in range(len(glist)-1): nlist1=glist[gi].nodes() nlist2=glist[gi+1].nodes() total_edges=intra_edges[gi] edge_count=0 while edge_count < total_edges: u = random.choice(nlist1) v = random.choice(nlist2) if u==v or G.has_edge(u,v): continue else: G.add_edge(u,v) edge_count=edge_count+1 return G def random_powerlaw_tree(n, gamma=3, seed=None, tries=100): """Returns a tree with a power law degree distribution. Parameters ---------- n : int The number of nodes. gamma : float Exponent of the power law. seed : int, optional Seed for random number generator (default=None). tries : int Number of attempts to adjust the sequence to make it a tree. Raises ------ NetworkXError If no valid sequence is found within the maximum number of attempts. 
Notes ----- A trial power law degree sequence is chosen and then elements are swapped with new elements from a powerlaw distribution until the sequence makes a tree (by checking, for example, that the number of edges is one smaller than the number of nodes). """ from networkx.generators.degree_seq import degree_sequence_tree try: s=random_powerlaw_tree_sequence(n, gamma=gamma, seed=seed, tries=tries) except: raise nx.NetworkXError(\ "Exceeded max (%d) attempts for a valid tree sequence."%tries) G=degree_sequence_tree(s) G.name="random_powerlaw_tree(%s,%s)"%(n,gamma) return G def random_powerlaw_tree_sequence(n, gamma=3, seed=None, tries=100): """Returns a degree sequence for a tree with a power law distribution. Parameters ---------- n : int, The number of nodes. gamma : float Exponent of the power law. seed : int, optional Seed for random number generator (default=None). tries : int Number of attempts to adjust the sequence to make it a tree. Raises ------ NetworkXError If no valid sequence is found within the maximum number of attempts. Notes ----- A trial power law degree sequence is chosen and then elements are swapped with new elements from a power law distribution until the sequence makes a tree (by checking, for example, that the number of edges is one smaller than the number of nodes). 
""" if seed is not None: random.seed(seed) # get trial sequence z=nx.utils.powerlaw_sequence(n,exponent=gamma) # round to integer values in the range [0,n] zseq=[min(n, max( int(round(s)),0 )) for s in z] # another sequence to swap values from z=nx.utils.powerlaw_sequence(tries,exponent=gamma) # round to integer values in the range [0,n] swap=[min(n, max( int(round(s)),0 )) for s in z] for deg in swap: if n-sum(zseq)/2.0 == 1.0: # is a tree, return sequence return zseq index=random.randint(0,n-1) zseq[index]=swap.pop() raise nx.NetworkXError(\ "Exceeded max (%d) attempts for a valid tree sequence."%tries) return False networkx-1.11/networkx/generators/nonisomorphic_trees.py0000644000175000017500000001270612637544500023704 0ustar aricaric00000000000000""" Implementation of the Wright, Richmond, Odlyzko and McKay (WROM) algorithm for the enumeration of all non-isomorphic free trees of a given order. Rooted trees are represented by level sequences, i.e., lists in which the i-th element specifies the distance of vertex i to the root. """ # Copyright (C) 2013 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
__author__ = "\n".join(["Aric Hagberg (hagberg@lanl.gov)", "Mridul Seth (seth.mridul@gmail.com)"]) __all__ = ['nonisomorphic_trees', 'number_of_nonisomorphic_trees'] import networkx as nx def nonisomorphic_trees(order, create="graph"): """Returns a generator of nonisomorphic trees Parameters ---------- order : int order of the desired tree(s) create : graph or matrix (default="graph") If graph is selected a list of trees will be returned, if matrix is selected a list of adjacency matrices will be returned Returns ------- G : Generator of NetworkX Graphs M : Generator of adjacency matrices References ---------- """ if order < 2: raise ValueError("order must be at least 2") # start at the path graph rooted at its center layout = list(range(order // 2 + 1)) + list(range(1, (order + 1) // 2)) while layout is not None: layout = _next_tree(layout) if layout is not None: if create == "graph": yield _layout_to_graph(layout) elif create == "matrix": yield _layout_to_matrix(layout) layout = _next_rooted_tree(layout) def number_of_nonisomorphic_trees(order): """Returns the number of nonisomorphic trees Parameters ---------- order : int order of the desired tree(s) Returns ------- length : Number of nonisomorphic graphs for the given order References ---------- """ return sum(1 for _ in nonisomorphic_trees(order)) def _next_rooted_tree(predecessor, p=None): """One iteration of the Beyer-Hedetniemi algorithm.""" if p is None: p = len(predecessor) - 1 while predecessor[p] == 1: p -= 1 if p == 0: return None q = p - 1 while predecessor[q] != predecessor[p] - 1: q -= 1 result = list(predecessor) for i in range(p, len(result)): result[i] = result[i - p + q] return result def _next_tree(candidate): """One iteration of the Wright, Richmond, Odlyzko and McKay algorithm.""" # valid representation of a free tree if: # there are at least two vertices at layer 1 # (this is always the case because we start at the path graph) left, rest = _split_tree(candidate) # and the left subtree of the root # is less high than
the tree with the left subtree removed left_height = max(left) rest_height = max(rest) valid = rest_height >= left_height if valid and rest_height == left_height: # and, if left and rest are of the same height, # if left does not encompass more vertices if len(left) > len(rest): valid = False # and, if they have the same number or vertices, # if left does not come after rest lexicographically elif len(left) == len(rest) and left > rest: valid = False if valid: return candidate else: # jump to the next valid free tree p = len(left) new_candidate = _next_rooted_tree(candidate, p) if candidate[p] > 2: new_left, new_rest = _split_tree(new_candidate) new_left_height = max(new_left) suffix = range(1, new_left_height + 2) new_candidate[-len(suffix):] = suffix return new_candidate def _split_tree(layout): """Return a tuple of two layouts, one containing the left subtree of the root vertex, and one containing the original tree with the left subtree removed.""" one_found = False m = None for i in range(len(layout)): if layout[i] == 1: if one_found: m = i break else: one_found = True if m is None: m = len(layout) left = [layout[i] - 1 for i in range(1, m)] rest = [0] + [layout[i] for i in range(m, len(layout))] return (left, rest) def _layout_to_matrix(layout): """Create the adjacency matrix for the tree specified by the given layout (level sequence).""" result = [[0] * len(layout) for i in range(len(layout))] stack = [] for i in range(len(layout)): i_level = layout[i] if stack: j = stack[-1] j_level = layout[j] while j_level >= i_level: stack.pop() j = stack[-1] j_level = layout[j] result[i][j] = result[j][i] = 1 stack.append(i) return result def _layout_to_graph(layout): """Create a NetworkX Graph for the tree specified by the given layout(level sequence)""" result = [[0] * len(layout) for i in range(len(layout))] G = nx.Graph() stack = [] for i in range(len(layout)): i_level = layout[i] if stack: j = stack[-1] j_level = layout[j] while j_level >= i_level: stack.pop() j = 
stack[-1] j_level = layout[j] G.add_edge(i, j) stack.append(i) return G networkx-1.11/networkx/generators/community.py0000644000175000017500000002727312637544500021644 0ustar aricaric00000000000000"""Generators for classes of graphs used in studying social networks.""" import itertools import math import random import networkx as nx # Copyright(C) 2011 by # Ben Edwards # Aric Hagberg # All rights reserved. # BSD license. __author__ = """\n""".join(['Ben Edwards (bedwards@cs.unm.edu)', 'Aric Hagberg (hagberg@lanl.gov)']) __all__ = ['caveman_graph', 'connected_caveman_graph', 'relaxed_caveman_graph', 'random_partition_graph', 'planted_partition_graph', 'gaussian_random_partition_graph'] def caveman_graph(l, k): """Returns a caveman graph of ``l`` cliques of size ``k``. Parameters ---------- l : int Number of cliques k : int Size of cliques Returns ------- G : NetworkX Graph caveman graph Notes ----- This returns an undirected graph, it can be converted to a directed graph using :func:`nx.to_directed`, or a multigraph using ``nx.MultiGraph(nx.caveman_graph(l, k))``. Only the undirected version is described in [1]_ and it is unclear which of the directed generalizations is most useful. Examples -------- >>> G = nx.caveman_graph(3, 3) See also -------- connected_caveman_graph References ---------- .. [1] Watts, D. J. 'Networks, Dynamics, and the Small-World Phenomenon.' Amer. J. Soc. 105, 493-527, 1999. """ # l disjoint cliques of size k G = nx.empty_graph(l*k) G.name = "caveman_graph(%s,%s)" % (l*k, k) if k > 1: for start in range(0, l*k, k): edges = itertools.combinations(range(start, start+k), 2) G.add_edges_from(edges) return G def connected_caveman_graph(l, k): """Returns a connected caveman graph of ``l`` cliques of size ``k``. The connected caveman graph is formed by creating ``n`` cliques of size ``k``, then a single edge in each clique is rewired to a node in an adjacent clique. 
Parameters ---------- l : int number of cliques k : int size of cliques Returns ------- G : NetworkX Graph connected caveman graph Notes ----- This returns an undirected graph; it can be converted to a directed graph using :func:`nx.to_directed`, or a multigraph using ``nx.MultiGraph(nx.caveman_graph(l, k))``. Only the undirected version is described in [1]_ and it is unclear which of the directed generalizations is most useful. Examples -------- >>> G = nx.connected_caveman_graph(3, 3) References ---------- .. [1] Watts, D. J. 'Networks, Dynamics, and the Small-World Phenomenon.' Amer. J. Soc. 105, 493-527, 1999. """ G = nx.caveman_graph(l, k) G.name = "connected_caveman_graph(%s,%s)" % (l, k) for start in range(0, l*k, k): G.remove_edge(start, start+1) G.add_edge(start, (start-1) % (l*k)) return G def relaxed_caveman_graph(l, k, p, seed=None): """Return a relaxed caveman graph. A relaxed caveman graph starts with ``l`` cliques of size ``k``. Edges are then randomly rewired with probability ``p`` to link different cliques. Parameters ---------- l : int Number of groups k : int Size of cliques p : float Probability of rewiring each edge. seed : int, optional Seed for random number generator (default=None) Returns ------- G : NetworkX Graph Relaxed Caveman Graph Raises ------ NetworkXError: If p is not in [0,1] Examples -------- >>> G = nx.relaxed_caveman_graph(2, 3, 0.1, seed=42) References ---------- .. [1] Santo Fortunato, Community Detection in Graphs, Physics Reports Volume 486, Issues 3-5, February 2010, Pages 75-174.
http://arxiv.org/abs/0906.0612 """ if not seed is None: random.seed(seed) G = nx.caveman_graph(l, k) nodes = G.nodes() G.name = "relaxed_caveman_graph (%s,%s,%s)" % (l, k, p) for (u, v) in G.edges(): if random.random() < p: # rewire the edge x = random.choice(nodes) if G.has_edge(u, x): continue G.remove_edge(u, v) G.add_edge(u, x) return G def random_partition_graph(sizes, p_in, p_out, seed=None, directed=False): """Return the random partition graph with a partition of sizes. A partition graph is a graph of communities with sizes defined by s in sizes. Nodes in the same group are connected with probability p_in and nodes of different groups are connected with probability p_out. Parameters ---------- sizes : list of ints Sizes of groups p_in : float probability of edges with in groups p_out : float probability of edges between groups directed : boolean optional, default=False Whether to create a directed graph seed : int optional, default None A seed for the random number generator Returns ------- G : NetworkX Graph or DiGraph random partition graph of size sum(gs) Raises ------ NetworkXError If p_in or p_out is not in [0,1] Examples -------- >>> G = nx.random_partition_graph([10,10,10],.25,.01) >>> len(G) 30 >>> partition = G.graph['partition'] >>> len(partition) 3 Notes ----- This is a generalization of the planted-l-partition described in [1]_. It allows for the creation of groups of any size. The partition is store as a graph attribute 'partition'. References ---------- .. [1] Santo Fortunato 'Community Detection in Graphs' Physical Reports Volume 486, Issue 3-5 p. 75-174. 
http://arxiv.org/abs/0906.0612 """ # Use geometric method for O(n+m) complexity algorithm # partition=nx.community_sets(nx.get_node_attributes(G,'affiliation')) if seed is not None: random.seed(seed) if not 0.0 <= p_in <= 1.0: raise nx.NetworkXError("p_in must be in [0,1]") if not 0.0 <= p_out <= 1.0: raise nx.NetworkXError("p_out must be in [0,1]") if directed: G = nx.DiGraph() else: G = nx.Graph() G.graph['partition'] = [] n = sum(sizes) G.add_nodes_from(range(n)) # start with len(sizes) groups of gnp random graphs with parameter p_in # graphs are unioned together with node labels starting at # 0, sizes[0], sizes[0]+sizes[1], ... next_group = {} # maps node key (int) to first node in next group start = 0 group = 0 for n in sizes: edges = ((u+start, v+start) for u, v in nx.fast_gnp_random_graph(n, p_in, directed=directed).edges()) G.add_edges_from(edges) next_group.update(dict.fromkeys(range(start, start+n), start+n)) G.graph['partition'].append(set(range(start, start+n))) group += 1 start += n # handle edge cases if p_out == 0: return G if p_out == 1: for n in next_group: targets = range(next_group[n], len(G)) G.add_edges_from(zip([n]*len(targets), targets)) if directed: G.add_edges_from(zip(targets, [n]*len(targets))) return G # connect each node in group randomly with the nodes not in group # use geometric method like fast_gnp_random_graph() lp = math.log(1.0 - p_out) n = len(G) if directed: for u in range(n): v = 0 while v < n: lr = math.log(1.0 - random.random()) v += int(lr/lp) # skip over nodes in the same group as v, including self loops if next_group.get(v, n) == next_group[u]: v = next_group[u] if v < n: G.add_edge(u, v) v += 1 else: for u in range(n-1): v = next_group[u] # start with next node not in this group while v < n: lr = math.log(1.0 - random.random()) v += int(lr/lp) if v < n: G.add_edge(u, v) v += 1 return G def planted_partition_graph(l, k, p_in, p_out, seed=None, directed=False): """Return the planted
l-partition graph. This model partitions a graph with n=l*k vertices in l groups with k vertices each. Vertices of the same group are linked with a probability p_in, and vertices of different groups are linked with probability p_out. Parameters ---------- l : int Number of groups k : int Number of vertices in each group p_in : float probability of connecting vertices within a group p_out : float probability of connecting vertices between groups seed : int, optional Seed for random number generator (default=None) directed : bool, optional (default=False) If True return a directed graph Returns ------- G : NetworkX Graph or DiGraph planted l-partition graph Raises ------ NetworkXError: If p_in or p_out is not in [0,1] Examples -------- >>> G = nx.planted_partition_graph(4, 3, 0.5, 0.1, seed=42) See Also -------- random_partition_graph References ---------- .. [1] A. Condon, R.M. Karp, Algorithms for graph partitioning on the planted partition model, Random Struct. Algor. 18 (2001) 116-140. .. [2] Santo Fortunato 'Community Detection in Graphs' Physical Reports Volume 486, Issue 3-5 p. 75-174. http://arxiv.org/abs/0906.0612 """ return random_partition_graph([k]*l, p_in, p_out, seed, directed) def gaussian_random_partition_graph(n, s, v, p_in, p_out, directed=False, seed=None): """Generate a Gaussian random partition graph. A Gaussian random partition graph is created by creating k partitions each with a size drawn from a normal distribution with mean s and variance s/v. Nodes are connected within clusters with probability p_in and between clusters with probability p_out [1]_. Parameters ---------- n : int Number of nodes in the graph s : float Mean cluster size v : float Shape parameter. The variance of cluster size distribution is s/v. p_in : float Probability of intra cluster connection. p_out : float Probability of inter cluster connection.
directed : boolean, optional default=False Whether to create a directed graph or not seed : int Seed value for random number generator Returns ------- G : NetworkX Graph or DiGraph gaussian random partition graph Raises ------ NetworkXError If s is > n If p_in or p_out is not in [0,1] Notes ----- Note the number of partitions is dependent on s,v and n, and that the last partition may be considerably smaller, as it is sized to simply fill out the nodes [1]_. See Also -------- random_partition_graph Examples -------- >>> G = nx.gaussian_random_partition_graph(100,10,10,.25,.1) >>> len(G) 100 References ---------- .. [1] Ulrik Brandes, Marco Gaertler, Dorothea Wagner, Experiments on Graph Clustering Algorithms, In the proceedings of the 11th Europ. Symp. Algorithms, 2003. """ if s > n: raise nx.NetworkXError("s must be <= n") assigned = 0 sizes = [] while True: size = int(random.normalvariate(s, float(s) / v + 0.5)) if size < 1: # how to handle 0 or negative sizes? continue if assigned + size >= n: sizes.append(n-assigned) break assigned += size sizes.append(size) # pass seed and directed by keyword: random_partition_graph's signature is # (sizes, p_in, p_out, seed=None, directed=False), so positional (directed, # seed) would swap the two arguments return random_partition_graph(sizes, p_in, p_out, seed=seed, directed=directed) networkx-1.11/networkx/version.py0000644000175000017500000000113212653165414017117 0ustar aricaric00000000000000""" Version information for NetworkX, created during installation. Do not add this file to the repository. """ import datetime version = '1.11' date = 'Sat Jan 30 09:55:40 2016' # Was NetworkX built from a development version? If so, remember that the major # and minor versions reference the "target" (rather than "current") release.
dev = False # Format: (name, major, min, revision) version_info = ('networkx', '1', '11', None) # Format: a 'datetime.datetime' instance date_info = datetime.datetime(2016, 1, 30, 9, 55, 40, 165755) # Format: (vcs, vcs_tuple) vcs_info = (None, (None, None)) networkx-1.11/LICENSE.txt0000644000175000017500000000335312653165361015032 0ustar aricaric00000000000000License ======= NetworkX is distributed with the BSD license. :: Copyright (C) 2004-2016, NetworkX Developers Aric Hagberg Dan Schult Pieter Swart All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of the NetworkX Developers nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
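The cluster-size sampling loop inside `gaussian_random_partition_graph` above can be sketched in isolation. This is a hypothetical standalone helper, not part of the NetworkX source: it draws sizes from a normal distribution and truncates the last cluster so the sizes sum exactly to `n`. It uses a standard deviation of `sqrt(s/v)`, matching the documented variance `s/v`; note the library code itself passes `float(s) / v + 0.5` as the spread argument to `normalvariate`, which is a standard deviation, not a variance.

```python
import random

def draw_cluster_sizes(n, s, v, seed=None):
    """Draw cluster sizes ~ Normal(s, s/v) until n nodes are assigned."""
    rng = random.Random(seed)
    sizes, assigned = [], 0
    while True:
        size = int(rng.normalvariate(s, (float(s) / v) ** 0.5))
        if size < 1:
            continue  # reject zero or negative draws
        if assigned + size >= n:
            sizes.append(n - assigned)  # final cluster truncated to fit
            return sizes
        assigned += size
        sizes.append(size)

sizes = draw_cluster_sizes(100, 10, 10, seed=1)
assert sum(sizes) == 100 and all(sz >= 1 for sz in sizes)
```

Because `assigned` only grows while it is still below `n`, the truncated final cluster always has at least one node, so every size in the returned list is positive.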
networkx-1.11/MANIFEST.in0000644000175000017500000000135712637544450014751 0ustar aricaric00000000000000include MANIFEST.in include setup.py include INSTALL.txt include LICENSE.txt include README.rst recursive-include examples *.py *.edgelist *.mbox *.gz *.bz2 *.zip recursive-include doc *.py *.rst Makefile *.html *.png *.txt *.css *.inc include scripts/* include networkx/tests/*.txt include networkx/tests/*.py include networkx/*/tests/*.txt include networkx/*/tests/*.py include networkx/*/*/tests/*.txt include networkx/*/*/tests/*.py include networkx/*/*/tests/*.A99 include networkx/*/*/tests/*.B99 include networkx/*/*/tests/*.bz2 global-exclude *~ global-exclude *.pyc global-exclude .svn prune doc/build prune doc/source/reference/generated prune doc/source/examples prune doc/source/static/examples prune doc/source/templates/gallery.html networkx-1.11/doc/0000755000175000017500000000000012653231454013745 5ustar aricaric00000000000000networkx-1.11/doc/gitwash_dumper.py0000644000175000017500000001732412637544450017355 0ustar aricaric00000000000000#!/usr/bin/env python ''' Checkout gitwash repo into directory and do search replace on name ''' from __future__ import print_function import os from os.path import join as pjoin import shutil import sys import re import glob import fnmatch import tempfile from subprocess import call from optparse import OptionParser verbose = False def clone_repo(url, branch): cwd = os.getcwd() tmpdir = tempfile.mkdtemp() try: cmd = 'git clone %s %s' % (url, tmpdir) call(cmd, shell=True) os.chdir(tmpdir) cmd = 'git checkout %s' % branch call(cmd, shell=True) except: shutil.rmtree(tmpdir) raise finally: os.chdir(cwd) return tmpdir def cp_files(in_path, globs, out_path): try: os.makedirs(out_path) except OSError: pass out_fnames = [] for in_glob in globs: in_glob_path = pjoin(in_path, in_glob) for in_fname in glob.glob(in_glob_path): out_fname = in_fname.replace(in_path, out_path) pth, _ = os.path.split(out_fname) if not os.path.isdir(pth): 
os.makedirs(pth) shutil.copyfile(in_fname, out_fname) out_fnames.append(out_fname) return out_fnames def filename_search_replace(sr_pairs, filename, backup=False): ''' Search and replace for expressions in files ''' in_txt = open(filename, 'rt').read(-1) out_txt = in_txt[:] for in_exp, out_exp in sr_pairs: in_exp = re.compile(in_exp) out_txt = in_exp.sub(out_exp, out_txt) if in_txt == out_txt: return False open(filename, 'wt').write(out_txt) if backup: open(filename + '.bak', 'wt').write(in_txt) return True def copy_replace(replace_pairs, repo_path, out_path, cp_globs=('*',), rep_globs=('*',), renames = ()): out_fnames = cp_files(repo_path, cp_globs, out_path) renames = [(re.compile(in_exp), out_exp) for in_exp, out_exp in renames] fnames = [] for rep_glob in rep_globs: fnames += fnmatch.filter(out_fnames, rep_glob) if verbose: print('\n'.join(fnames)) for fname in fnames: filename_search_replace(replace_pairs, fname, False) for in_exp, out_exp in renames: new_fname, n = in_exp.subn(out_exp, fname) if n: os.rename(fname, new_fname) break def make_link_targets(proj_name, user_name, repo_name, known_link_fname, out_link_fname, url=None, ml_url=None): """ Check and make link targets If url is None or ml_url is None, check if there are links present for these in `known_link_fname`. If not, raise error. The check is: Look for a target `proj_name`. Look for a target `proj_name` + ' mailing list' Also, look for a target `proj_name` + 'github'. If this exists, don't write this target into the new file below. If we are writing any of the url, ml_url, or github address, then write new file with these links, of form: .. _`proj_name` .. _`proj_name`: url .. 
_`proj_name` mailing list: url """ link_contents = open(known_link_fname, 'rt').readlines() have_url = not url is None have_ml_url = not ml_url is None have_gh_url = None for line in link_contents: if not have_url: match = re.match(r'..\s+_`%s`:\s+' % proj_name, line) if match: have_url = True if not have_ml_url: match = re.match(r'..\s+_`%s mailing list`:\s+' % proj_name, line) if match: have_ml_url = True if not have_gh_url: match = re.match(r'..\s+_`%s github`:\s+' % proj_name, line) if match: have_gh_url = True if not have_url or not have_ml_url: raise RuntimeError('Need command line or known project ' 'and / or mailing list URLs') lines = [] if not url is None: lines.append('.. _`%s`: %s\n' % (proj_name, url)) if not have_gh_url: gh_url = 'http://github.com/%s/%s\n' % (user_name, repo_name) lines.append('.. _`%s github`: %s\n' % (proj_name, gh_url)) if not ml_url is None: lines.append('.. _`%s mailing list`: %s\n' % (proj_name, ml_url)) if len(lines) == 0: # Nothing to do return # A neat little header line lines = ['.. %s\n' % proj_name] + lines out_links = open(out_link_fname, 'wt') out_links.writelines(lines) out_links.close() USAGE = ''' If not set with options, the repository name is the same as the If not set with options, the main github user is the same as the repository name.''' GITWASH_CENTRAL = 'git://github.com/matthew-brett/gitwash.git' GITWASH_BRANCH = 'master' def main(): parser = OptionParser() parser.set_usage(parser.get_usage().strip() + USAGE) parser.add_option("--repo-name", dest="repo_name", help="repository name - e.g. 
nitime", metavar="REPO_NAME") parser.add_option("--github-user", dest="main_gh_user", help="github username for main repo - e.g fperez", metavar="MAIN_GH_USER") parser.add_option("--gitwash-url", dest="gitwash_url", help="URL to gitwash repository - default %s" % GITWASH_CENTRAL, default=GITWASH_CENTRAL, metavar="GITWASH_URL") parser.add_option("--gitwash-branch", dest="gitwash_branch", help="branch in gitwash repository - default %s" % GITWASH_BRANCH, default=GITWASH_BRANCH, metavar="GITWASH_BRANCH") parser.add_option("--source-suffix", dest="source_suffix", help="suffix of ReST source files - default '.rst'", default='.rst', metavar="SOURCE_SUFFIX") parser.add_option("--project-url", dest="project_url", help="URL for project web pages", default=None, metavar="PROJECT_URL") parser.add_option("--project-ml-url", dest="project_ml_url", help="URL for project mailing list", default=None, metavar="PROJECT_ML_URL") (options, args) = parser.parse_args() if len(args) < 2: parser.print_help() sys.exit() out_path, project_name = args if options.repo_name is None: options.repo_name = project_name if options.main_gh_user is None: options.main_gh_user = options.repo_name repo_path = clone_repo(options.gitwash_url, options.gitwash_branch) try: copy_replace((('PROJECTNAME', project_name), ('REPONAME', options.repo_name), ('MAIN_GH_USER', options.main_gh_user)), repo_path, out_path, cp_globs=(pjoin('gitwash', '*'),), rep_globs=('*.rst',), renames=(('\.rst$', options.source_suffix),)) make_link_targets(project_name, options.main_gh_user, options.repo_name, pjoin(out_path, 'gitwash', 'known_projects.inc'), pjoin(out_path, 'gitwash', 'this_project.inc'), options.project_url, options.project_ml_url) finally: shutil.rmtree(repo_path) if __name__ == '__main__': main() networkx-1.11/doc/rst_templates/0000755000175000017500000000000012653231454016633 5ustar aricaric00000000000000networkx-1.11/doc/rst_templates/autosummary/0000755000175000017500000000000012653231454021221 5ustar 
aricaric00000000000000networkx-1.11/doc/rst_templates/autosummary/function.rst0000644000175000017500000000013612637544450023605 0ustar aricaric00000000000000{{ name }} {{ underline }} .. currentmodule:: {{ module }} .. autofunction:: {{ objname }} networkx-1.11/doc/rst_templates/autosummary/module.rst0000644000175000017500000000116312637544450023246 0ustar aricaric00000000000000{{ name }} {{ underline }} .. automodule:: {{ name }} {% block functions %} {% if functions %} .. rubric:: Functions .. autosummary:: {% for item in functions %} {{ item }} {%- endfor %} {% endif %} {% endblock %} {% block classes %} {% if classes %} .. rubric:: Classes .. autosummary:: {% for item in classes %} {{ item }} {%- endfor %} {% endif %} {% endblock %} {% block exceptions %} {% if exceptions %} .. rubric:: Exceptions .. autosummary:: {% for item in exceptions %} {{ item }} {%- endfor %} {% endif %} {% endblock %} networkx-1.11/doc/rst_templates/autosummary/class.rst0000644000175000017500000000101312637544450023060 0ustar aricaric00000000000000{{ name }} {{ underline }} .. currentmodule:: {{ module }} .. autoclass:: {{ objname }} {% block methods %} .. automethod:: __init__ {% if methods %} .. rubric:: Methods .. autosummary:: {% for item in methods %} ~{{ name }}.{{ item }} {%- endfor %} {% endif %} {% endblock %} {% block attributes %} {% if attributes %} .. rubric:: Attributes .. autosummary:: {% for item in attributes %} ~{{ name }}.{{ item }} {%- endfor %} {% endif %} {% endblock %} networkx-1.11/doc/rst_templates/autosummary/base.rst0000644000175000017500000000014212637544450022667 0ustar aricaric00000000000000{{ name }} {{ underline }} .. currentmodule:: {{ module }} .. 
auto{{ objtype }}:: {{ objname }}
networkx-1.11/doc/requirements.txt0000644000175000017500000000025412637544450017237 0ustar aricaric00000000000000ipython[nbconvert]
numpy >= 1.6
matplotlib
pandas
pydotplus
-e git://github.com/sphinx-doc/sphinx.git@stable#egg=Sphinx-origin_stable
sphinxcontrib-bibtex
sphinx_rtd_theme
networkx-1.11/doc/make_examples_rst.py0000755000175000017500000001252512637544450020037 0ustar aricaric00000000000000"""
generate the rst files for the examples by iterating over the networkx examples
"""
# This code was developed from the Matplotlib gen_rst.py module
# and is distributed with the same license as Matplotlib
from __future__ import print_function

import os
import glob
import re
import sys

#fileList = []
#rootdir = '../../examples'

def out_of_date(original, derived):
    """
    Returns True if derived is out-of-date wrt original, both of which are
    full file paths.

    TODO: this check isn't adequate in some cases.  E.g., if we discover a
    bug when building the examples, the original and derived will be
    unchanged but we still want to force a rebuild.  We can manually remove
    from _static, but we may need another solution
    """
    return (not os.path.exists(derived) or
            os.stat(derived).st_mtime < os.stat(original).st_mtime)

def main(exampledir, sourcedir):
    noplot_regex = re.compile(r"#\s*-\*-\s*noplot\s*-\*-")

    datad = {}
    for root, subFolders, files in os.walk(exampledir):
        for fname in files:
            if (fname.startswith('.') or fname.startswith('#') or
                    fname.startswith('_') or fname.find('.svn') >= 0 or
                    not fname.endswith('.py')):
                continue
            fullpath = os.path.join(root, fname)
            contents = file(fullpath).read()
            # indent
            relpath = os.path.split(root)[-1]
            datad.setdefault(relpath, []).append((fullpath, fname, contents))

    subdirs = datad.keys()
    subdirs.sort()

    output_dir = os.path.join(sourcedir, 'examples')
    if not os.path.exists(output_dir):
        os.makedirs(output_dir)
    fhindex = file(os.path.join(sourcedir, 'examples', 'index.rst'), 'w')
    fhindex.write("""\
..
_examples-index: ***************** NetworkX Examples ***************** .. only:: html :Release: |version| :Date: |today| .. toctree:: :maxdepth: 2 """) for subdir in subdirs: output_dir= os.path.join(sourcedir,'examples',subdir) if not os.path.exists(output_dir): os.makedirs(output_dir) static_dir = os.path.join(sourcedir, 'static', 'examples') if not os.path.exists(static_dir): os.makedirs(static_dir) subdirIndexFile = os.path.join(subdir, 'index.rst') fhsubdirIndex = file(os.path.join(output_dir,'index.rst'), 'w') fhindex.write(' %s\n\n'%subdirIndexFile) #thumbdir = '../_static/plot_directive/mpl_examples/%s/thumbnails/'%subdir #for thumbname in glob.glob(os.path.join(thumbdir,'*.png')): # fhindex.write(' %s\n'%thumbname) fhsubdirIndex.write("""\ .. _%s-examples-index: ############################################## %s ############################################## .. only:: html :Release: |version| :Date: |today| .. toctree:: :maxdepth: 1 """%(subdir, subdir.title())) data = datad[subdir] data.sort() #parts = os.path.split(static_dir) #thumb_dir = ('../'*(len(parts)-1)) + os.path.join(static_dir, 'thumbnails') for fullpath, fname, contents in data: basename, ext = os.path.splitext(fname) static_file = os.path.join(static_dir, fname) #thumbfile = os.path.join(thumb_dir, '%s.png'%basename) #print ' static_dir=%s, basename=%s, fullpath=%s, fname=%s, thumb_dir=%s, thumbfile=%s'%(static_dir, basename, fullpath, fname, thumb_dir, thumbfile) rstfile = '%s.rst'%basename outfile = os.path.join(output_dir, rstfile) fhsubdirIndex.write(' %s\n'%rstfile) if (not out_of_date(fullpath, static_file) and not out_of_date(fullpath, outfile)): continue print('%s/%s' % (subdir,fname)) fhstatic = file(static_file, 'w') fhstatic.write(contents) fhstatic.close() fh = file(outfile, 'w') fh.write('.. 
_%s-%s:\n\n'%(subdir, basename)) base=fname.partition('.')[0] title = '%s'%(base.replace('_',' ').title()) #title = ' %s example code: %s'%(thumbfile, subdir, fname) fh.write(title + '\n') fh.write('='*len(title) + '\n\n') pngname=base+".png" png=os.path.join(static_dir,pngname) linkname = os.path.join('..', '..', 'static', 'examples') if os.path.exists(png): fh.write('.. image:: %s \n\n'%os.path.join(linkname,pngname)) linkname = os.path.join('..', '..', '_static', 'examples') fh.write("[`source code <%s>`_]\n\n::\n\n" % os.path.join(linkname,fname)) # indent the contents contents = '\n'.join([' %s'%row.rstrip() for row in contents.split('\n')]) fh.write(contents) # fh.write('\n\nKeywords: python, matplotlib, pylab, example, codex (see :ref:`how-to-search-examples`)') fh.close() fhsubdirIndex.close() fhindex.close() if __name__ == '__main__': import sys try: arg0,arg1,arg2=sys.argv[:3] except: arg0=sys.argv[0] print(""" Usage: %s exampledir sourcedir exampledir: a directory containing the python code for the examples. sourcedir: a directory to put the generated documentation source for these examples. 
""" % (arg0)) else: main(arg1,arg2) networkx-1.11/doc/source/0000755000175000017500000000000012653231454015245 5ustar aricaric00000000000000networkx-1.11/doc/source/download.rst0000644000175000017500000000075412637544500017615 0ustar aricaric00000000000000-------- Download -------- Software ~~~~~~~~ Source and binary releases: http://cheeseshop.python.org/pypi/networkx/ Github (latest development): https://github.com/networkx/networkx/ Documentation ~~~~~~~~~~~~~ *PDF* http://networkx.github.io/documentation/latest/_downloads/networkx_tutorial.pdf http://networkx.github.io/documentation/latest/_downloads/networkx_reference.pdf *HTML in zip file* http://networkx.github.io/documentation/latest/_downloads/networkx-documentation.zip networkx-1.11/doc/source/developer/0000755000175000017500000000000012653231454017232 5ustar aricaric00000000000000networkx-1.11/doc/source/developer/index.rst0000644000175000017500000000014012637544450021073 0ustar aricaric00000000000000.. _developer: Developer Guide *************** .. toctree:: :maxdepth: 2 gitwash/index networkx-1.11/doc/source/developer/gitwash/0000755000175000017500000000000012653231454020700 5ustar aricaric00000000000000networkx-1.11/doc/source/developer/gitwash/following_latest.rst0000644000175000017500000000363212637544450025017 0ustar aricaric00000000000000.. _following-latest: ============================= Following the latest source ============================= These are the instructions if you just want to follow the latest *networkx* source, but you don't need to do any development for now. The steps are: * :ref:`install-git` * get local copy of the `networkx github`_ git repository * update local copy from time to time Get the local copy of the code ============================== From the command line:: git clone git://github.com/networkx/networkx.git You now have a copy of the code tree in the new ``networkx`` directory. 
Updating the code
=================

From time to time you may want to pull down the latest code.  To do this, add
the main networkx repository as a remote named ``upstream``::

   git remote add upstream https://github.com/networkx/networkx.git

Now git knows where to fetch updates from::

   cd networkx
   git fetch upstream

The tree in ``networkx`` will now have the latest changes from the initial
repository, unless you have made local changes in the meantime.  In this
case, you have to merge::

   git merge upstream/master

It is also possible to update your local fork directly from GitHub:

1. Open your fork on GitHub.
2. Click on 'Pull Requests'.
3. Click on 'New Pull Request'.  By default, GitHub will compare the original
   with your fork.  If you didn’t make any changes, there is nothing to
   compare.
4. Click on 'Switching the base' or click 'Edit' and switch the base
   manually.  Now GitHub will compare your fork with the original, and you
   should see all the latest changes.
5. Click on 'Click to create a pull request for this comparison' and name
   your pull request.
6. Click on 'Send pull request'.
7. Scroll down and click 'Merge pull request' and finally 'Confirm merge'.
   The merge will go through automatically unless there are conflicting
   changes in your local repo.

.. include:: links.inc
networkx-1.11/doc/source/developer/gitwash/maintainer_workflow.rst0000644000175000017500000000601412637544450025521 0ustar aricaric00000000000000.. _maintainer-workflow:

###################
Maintainer workflow
###################

This page is for maintainers |emdash| those of us who merge our own or other
people's changes into the upstream repository.

As a maintainer, you are completely on top of the basic stuff in
:ref:`development-workflow`.

The instructions in :ref:`linking-to-upstream` add a remote that has read-only
access to the upstream repo.  Being a maintainer, you've got read-write
access.
It's good to have your upstream remote have a scary name, to remind you that
it's a read-write remote::

    git remote add upstream-rw git@github.com:networkx/networkx.git
    git fetch upstream-rw

*******************
Integrating changes
*******************

Let's say you have some changes that need to go into trunk
(``upstream-rw/master``).

The changes are in some branch that you are currently on.  For example, you
are looking at someone's changes like this::

    git remote add someone git://github.com/someone/networkx.git
    git fetch someone
    git branch cool-feature --track someone/cool-feature
    git checkout cool-feature

So now you are on the branch with the changes to be incorporated upstream.
The rest of this section assumes you are on this branch.

A few commits
=============

If there are only a few commits, consider rebasing to upstream::

    # Fetch upstream changes
    git fetch upstream-rw
    # rebase
    git rebase upstream-rw/master

Remember that, if you do a rebase, and push that, you'll have to close any
github pull requests manually, because github will not be able to detect the
changes have already been merged.

A long series of commits
========================

If there is a longer series of related commits, consider a merge instead::

    git fetch upstream-rw
    git merge --no-ff upstream-rw/master

The merge will be detected by github, and should close any related pull
requests automatically.

Note the ``--no-ff`` above.  This forces git to make a merge commit, rather
than doing a fast-forward, so that this set of commits branches off trunk
then rejoins the main history with a merge, rather than appearing to have
been made directly on top of trunk.

Check the history
=================

Now, in either case, you should check that the history is sensible and you
have the right commits::

    git log --oneline --graph
    git log -p upstream-rw/master..

The first line above just shows the history in a compact way, with a text
representation of the history graph.
The second line shows the log of commits excluding those that can be reached from trunk (``upstream-rw/master``), and including those that can be reached from current HEAD (implied with the ``..`` at the end). So, it shows the commits unique to this branch compared to trunk. The ``-p`` option shows the diff for these commits in patch form. Push to trunk ============= :: git push upstream-rw my-new-feature:master This pushes the ``my-new-feature`` branch in this repository to the ``master`` branch in the ``upstream-rw`` repository. .. include:: links.inc networkx-1.11/doc/source/developer/gitwash/forking_hell.rst0000644000175000017500000000224512637544450024105 0ustar aricaric00000000000000.. _forking: ====================================================== Making your own copy (fork) of networkx ====================================================== You need to do this only once. The instructions here are very similar to the instructions at https://help.github.com/articles/fork-a-repo/ |emdash| please see that page for more detail. We're repeating some of it here just to give the specifics for the `networkx`_ project, and to suggest some default names. Set up and configure a github account ===================================== If you don't have a github account, go to the github page, and make one. You then need to configure your account to allow write access |emdash| see the ``Generating SSH keys`` help on `github help`_. Create your own forked copy of `networkx`_ ====================================================== #. Log into your github account. #. Go to the `networkx`_ github home at `networkx github`_. #. Click on the *fork* button: .. image:: forking_button.png Now, after a short pause and some 'Hardcore forking action', you should find yourself at the home page for your own forked copy of `networkx`_. .. 
include:: links.inc
networkx-1.11/doc/source/developer/gitwash/branch_dropdown.png0000644000175000017500000003766712637544450024577 0ustar aricaric00000000000000[binary PNG image data omitted: screenshot of the GitHub 'Switch Branches' dropdown referenced by the development workflow page]
In git version > 1.7 you can ensure that the link is correctly set by using
the ``--set-upstream`` option::

   git push --set-upstream origin my-new-feature

From now on git will know that ``my-new-feature`` is related to the
``my-new-feature`` branch in the github repo.

.. _edit-flow:

The editing workflow
====================

Overview
--------

::

   # hack hack
   git add my_new_file
   git commit -am 'NF - some message'
   git push

In more detail
--------------

#. Make some changes

#. See which files have changed with ``git status`` (see `git status`_).
   You'll see a listing like this one::

     # On branch my-new-feature
     # Changed but not updated:
     #   (use "git add <file>..." to update what will be committed)
     #   (use "git checkout -- <file>..." to discard changes in working directory)
     #
     #  modified:   README
     #
     # Untracked files:
     #   (use "git add <file>..." to include in what will be committed)
     #
     #  INSTALL
     no changes added to commit (use "git add" and/or "git commit -a")

#. Check what the actual changes are with ``git diff`` (`git diff`_).

#. Add any new files to version control with ``git add new_file_name`` (see
   `git add`_).

#. To commit all modified files into the local copy of your repo, do
   ``git commit -am 'A commit message'``.  Note the ``-am`` options to
   ``commit``.  The ``m`` flag just signals that you're going to type a
   message on the command line.  The ``a`` flag |emdash| you can just take on
   faith |emdash| or see `why the -a flag?`_ |emdash| and the helpful
   use-case description in the `tangled working copy problem`_.  The
   `git commit`_ manual page might also be useful.

#. To push the changes up to your forked repo on github, do a ``git push``
   (see `git push`_).

Ask for your changes to be reviewed or merged
=============================================

When you are ready to ask for someone to review your code and consider a
merge:

#. Go to the URL of your forked repo, say
   ``http://github.com/your-user-name/networkx``.

#. Use the 'Switch Branches' dropdown menu near the top left of the page to
   select the branch with your changes:

   .. image:: branch_dropdown.png

#. Click on the 'Pull request' button:

   .. image:: pull_button.png

   Enter a title for the set of changes, and some explanation of what you've
   done.  Say if there is anything you'd like particular attention for - like
   a complicated change or some code you are not happy with.

   If you don't think your request is ready to be merged, just say so in your
   pull request message.
This is still a good way of getting some preliminary code review.

Some other things you might want to do
======================================

Delete a branch on github
-------------------------

::

   git checkout master
   # delete branch locally
   git branch -D my-unwanted-branch
   # delete branch on github
   git push origin :my-unwanted-branch

(Note the colon ``:`` before ``my-unwanted-branch``.  See also:
http://github.com/guides/remove-a-remote-branch)

Several people sharing a single repository
------------------------------------------

If you want to work on some stuff with other people, where you are all
committing into the same repository, or even the same branch, then just share
it via github.

First fork networkx into your account, as from :ref:`forking`.

Then, go to your forked repository github page, say
``http://github.com/your-user-name/networkx``

Click on the 'Admin' button, and add anyone else to the repo as a
collaborator:

   .. image:: pull_button.png

Now all those people can do::

    git clone git@github.com:your-user-name/networkx.git

Remember that links starting with ``git@`` use the ssh protocol and are
read-write; links starting with ``git://`` are read-only.

Your collaborators can then commit directly into that repo with the usual::

     git commit -am 'ENH - much better code'
     git push origin master # pushes directly into your repo

Explore your repository
-----------------------

To see a graphical representation of the repository branches and commits::

   gitk --all

To see a linear list of commits for this branch::

   git log

You can also look at the `network graph visualizer`_ for your github repo.

Finally the :ref:`fancy-log` ``lg`` alias will give you a reasonable
text-based graph of the repository.

.. _rebase-on-trunk:

Rebasing on trunk
-----------------

Let's say you thought of some work you'd like to do. You
:ref:`update-mirror-trunk` and :ref:`make-feature-branch` called
``cool-feature``. At this stage trunk is at some commit, let's call it E.
Now you make some new commits on your ``cool-feature`` branch, let's call
them A, B, C.  Maybe your changes take a while, or you come back to them
after a while.  In the meantime, trunk has progressed from commit E to commit
(say) G::

          A---B---C cool-feature
         /
    D---E---F---G trunk

At this stage you consider merging trunk into your feature branch, and you
remember that this here page sternly advises you not to do that, because the
history will get messy.  Most of the time you can just ask for a review, and
not worry that trunk has got a little ahead.  But sometimes, the changes in
trunk might affect your changes, and you need to harmonize them.  In this
situation you may prefer to do a rebase.

rebase takes your changes (A, B, C) and replays them as if they had been made
to the current state of ``trunk``.  In other words, in this case, it takes
the changes represented by A, B, C and replays them on top of G.  After the
rebase, your history will look like this::

                  A'--B'--C' cool-feature
                 /
    D---E---F---G trunk

See `rebase without tears`_ for more detail.

To do a rebase on trunk::

    # Update the mirror of trunk
    git fetch upstream
    # go to the feature branch
    git checkout cool-feature
    # make a backup in case you mess up
    git branch tmp cool-feature
    # rebase cool-feature onto trunk
    git rebase --onto upstream/master upstream/master cool-feature

In this situation, where you are already on branch ``cool-feature``, the last
command can be written more succinctly as::

    git rebase upstream/master

When all looks good you can delete your backup branch::

   git branch -D tmp

If it doesn't look good you may need to have a look at
:ref:`recovering-from-mess-up`.

If you have made changes to files that have also changed in trunk, this may
generate merge conflicts that you need to resolve - see the `git rebase`_ man
page for some instructions at the end of the "Description" section. There is
some related help on merging in the git user manual - see `resolving a
merge`_.

.. _recovering-from-mess-up:

Recovering from mess-ups
------------------------

Sometimes, you mess up merges or rebases. Luckily, in git it is relatively
straightforward to recover from such mistakes.

If you mess up during a rebase::

   git rebase --abort

If you notice you messed up after the rebase::

   # reset branch back to the saved point
   git reset --hard tmp

If you forgot to make a backup branch::

   # look at the reflog of the branch
   git reflog show cool-feature

   8630830 cool-feature@{0}: commit: BUG: io: close file handles immediately
   278dd2a cool-feature@{1}: rebase finished: refs/heads/my-feature-branch onto 11ee694744f2552d
   26aa21a cool-feature@{2}: commit: BUG: lib: make seek_gzip_factory not leak gzip obj
   ...

   # reset the branch to where it was before the botched rebase
   git reset --hard cool-feature@{2}

.. _rewriting-commit-history:

Rewriting commit history
------------------------

.. note::

   Do this only for your own feature branches.

There's an embarrassing typo in a commit you made? Or perhaps you made
several false starts you would like posterity not to see.

This can be done via *interactive rebasing*.

Suppose that the commit history looks like this::

    git log --oneline
    eadc391 Fix some remaining bugs
    a815645 Modify it so that it works
    2dec1ac Fix a few bugs + disable
    13d7934 First implementation
    6ad92e5 * masked is now an instance of a new object, MaskedConstant
    29001ed Add pre-nep for a copule of structured_array_extensions.
    ...

and ``6ad92e5`` is the last commit in the ``cool-feature`` branch.  Suppose
we want to make the following changes:

* Rewrite the commit message for ``13d7934`` to something more sensible.
* Combine the commits ``2dec1ac``, ``a815645``, ``eadc391`` into a single one.
We do as follows:: # make a backup of the current state git branch tmp HEAD # interactive rebase git rebase -i 6ad92e5 This will open an editor with the following text in it:: pick 13d7934 First implementation pick 2dec1ac Fix a few bugs + disable pick a815645 Modify it so that it works pick eadc391 Fix some remaining bugs # Rebase 6ad92e5..eadc391 onto 6ad92e5 # # Commands: # p, pick = use commit # r, reword = use commit, but edit the commit message # e, edit = use commit, but stop for amending # s, squash = use commit, but meld into previous commit # f, fixup = like "squash", but discard this commit's log message # # If you remove a line here THAT COMMIT WILL BE LOST. # However, if you remove everything, the rebase will be aborted. # To achieve what we want, we will make the following changes to it:: r 13d7934 First implementation pick 2dec1ac Fix a few bugs + disable f a815645 Modify it so that it works f eadc391 Fix some remaining bugs This means that (i) we want to edit the commit message for ``13d7934``, and (ii) collapse the last three commits into one. Now we save and quit the editor. Git will then immediately bring up an editor for editing the commit message. After revising it, we get the output:: [detached HEAD 721fc64] FOO: First implementation 2 files changed, 199 insertions(+), 66 deletions(-) [detached HEAD 0f22701] Fix a few bugs + disable 1 files changed, 79 insertions(+), 61 deletions(-) Successfully rebased and updated refs/heads/my-feature-branch. and the history looks now like this:: 0f22701 Fix a few bugs + disable 721fc64 ENH: Sophisticated feature 6ad92e5 * masked is now an instance of a new object, MaskedConstant If it went wrong, recovery is again possible as explained :ref:`above `. .. include:: links.inc networkx-1.11/doc/source/developer/gitwash/git_install.rst0000644000175000017500000000110212637544450023742 0ustar aricaric00000000000000.. 
_install-git: ============= Install git ============= Overview ======== ================ ============= Debian / Ubuntu ``sudo apt-get install git`` Fedora ``sudo yum install git-core`` Windows Download and install msysGit_ OS X Use the git-osx-installer_ ================ ============= In detail ========= See the git page for the most recent information. Have a look at the github install help pages available from `github help`_. There are good instructions here: http://book.git-scm.com/2_installing_git.html .. include:: links.inc networkx-1.11/doc/source/developer/gitwash/git_resources.rst0000644000175000017500000000246412637544450024322 0ustar aricaric00000000000000.. _git-resources: ============= git resources ============= Tutorials and summaries ======================= `github help`_ is GitHub's own help and tutorial site. `github more help`_ lists more resources for learning Git and GitHub, including YouTube channels. The list is constantly updated. In case you are used to subversion_, you can directly consult the `git svn crash course`_. To make full use of Git, you need to understand the concepts behind Git. The following pages might help you: * `git parable`_ |emdash| an easy read parable * `git foundation`_ |emdash| more on the git parable * `git magic`_ |emdash| extended introduction with intermediate detail in many languages * `git concepts`_ |emdash| a technical page on the concepts Other than that, many developers list their personal tips and tricks. Among others there are `Fernando Perez`_, `Nick Quaranto`_ and `Linus Torvalds`_. Manual pages online =================== You can get these on your own machine with (e.g.) ``git help push`` or (same thing) ``git push --help``, but, for convenience, here are the online manual pages for some common commands: * `git add`_ * `git branch`_ * `git checkout`_ * `git clone`_ * `git commit`_ * `git config`_ * `git diff`_ * `git log`_ * `git pull`_ * `git push`_ * `git remote`_ * `git status`_ .. 
include:: links.inc networkx-1.11/doc/source/developer/gitwash/git_links.inc0000644000175000017500000000607112637544450023367 0ustar aricaric00000000000000.. This (-*- rst -*-) format file contains commonly used link targets and name substitutions. It may be included in many files, therefore it should only contain link targets and name substitutions. Try grepping for "^\.\. _" to find plausible candidates for this list. .. NOTE: reST targets are __not_case_sensitive__, so only one target definition is needed for nipy, NIPY, Nipy, etc... .. git stuff .. _git: http://git-scm.com/ .. _github: http://github.com .. _github help: http://help.github.com .. _github more help: https://help.github.com/articles/what-are-other-good-resources-for-learning-git-and-github/ .. _msysgit: http://code.google.com/p/msysgit/downloads/list .. _git-osx-installer: http://code.google.com/p/git-osx-installer/downloads/list .. _subversion: http://subversion.tigris.org/ .. _git svn crash course: http://git-scm.com/course/svn.html .. _network graph visualizer: http://github.com/blog/39-say-hello-to-the-network-graph-visualizer .. _git user manual: http://schacon.github.com/git/user-manual.html .. _git tutorial: http://schacon.github.com/git/gittutorial.html .. _Nick Quaranto: http://www.gitready.com/ .. _Fernando Perez: http://www.fperez.org/py4science/git.html .. _git magic: http://www-cs-students.stanford.edu/~blynn/gitmagic/index.html .. _git concepts: http://www.eecs.harvard.edu/~cduan/technical/git/ .. _git clone: http://schacon.github.com/git/git-clone.html .. _git checkout: http://schacon.github.com/git/git-checkout.html .. _git commit: http://schacon.github.com/git/git-commit.html .. _git push: http://schacon.github.com/git/git-push.html .. _git pull: http://schacon.github.com/git/git-pull.html .. _git add: http://schacon.github.com/git/git-add.html .. _git status: http://schacon.github.com/git/git-status.html .. _git diff: http://schacon.github.com/git/git-diff.html .. 
_git log: http://schacon.github.com/git/git-log.html .. _git branch: http://schacon.github.com/git/git-branch.html .. _git remote: http://schacon.github.com/git/git-remote.html .. _git rebase: http://schacon.github.com/git/git-rebase.html .. _git config: http://schacon.github.com/git/git-config.html .. _why the -a flag?: http://www.gitready.com/beginner/2009/01/18/the-staging-area.html .. _git staging area: http://www.gitready.com/beginner/2009/01/18/the-staging-area.html .. _tangled working copy problem: http://tomayko.com/writings/the-thing-about-git .. _Linus Torvalds: http://www.mail-archive.com/dri-devel@lists.sourceforge.net/msg39091.html .. _git parable: http://tom.preston-werner.com/2009/05/19/the-git-parable.html .. _git foundation: http://matthew-brett.github.com/pydagogue/foundation.html .. _deleting master on github: http://matthew-brett.github.com/pydagogue/gh_delete_master.html .. _rebase without tears: http://matthew-brett.github.com/pydagogue/rebase_without_tears.html .. _resolving a merge: http://schacon.github.com/git/user-manual.html#resolving-a-merge .. _ipython git workflow: http://mail.scipy.org/pipermail/ipython-dev/2010-October/006746.html .. other stuff .. _python: http://www.python.org .. |emdash| unicode:: U+02014 .. vim: ft=rst networkx-1.11/doc/source/developer/gitwash/links.inc0000644000175000017500000000016112637544450022516 0ustar aricaric00000000000000.. compiling links file .. include:: known_projects.inc .. include:: this_project.inc .. include:: git_links.inc networkx-1.11/doc/source/developer/gitwash/configure_git.rst0000644000175000017500000001130012637544450024256 0ustar aricaric00000000000000.. _configure-git: =============== Configure git =============== .. _git-config-basic: Overview ======== Your personal git configurations are saved in the ``.gitconfig`` file in your home directory. 
Here is an example ``.gitconfig`` file:: [user] name = Your Name email = you@yourdomain.example.com [alias] ci = commit -a co = checkout st = status stat = status br = branch wdiff = diff --color-words [core] editor = vim [merge] summary = true You can edit this file directly or you can use the ``git config --global`` command:: git config --global user.name "Your Name" git config --global user.email you@yourdomain.example.com git config --global alias.ci "commit -a" git config --global alias.co checkout git config --global alias.st status git config --global alias.stat status git config --global alias.br branch git config --global alias.wdiff "diff --color-words" git config --global core.editor vim git config --global merge.summary true To set up on another computer, you can copy your ``~/.gitconfig`` file, or run the commands above. In detail ========= user.name and user.email ------------------------ It is good practice to tell git_ who you are, for labeling any changes you make to the code. The simplest way to do this is from the command line:: git config --global user.name "Your Name" git config --global user.email you@yourdomain.example.com This will write the settings into your git configuration file, which should now contain a user section with your name and email:: [user] name = Your Name email = you@yourdomain.example.com Of course you'll need to replace ``Your Name`` and ``you@yourdomain.example.com`` with your actual name and email address. Aliases ------- You might well benefit from some aliases to common commands. For example, you might well want to be able to shorten ``git checkout`` to ``git co``. 
Or you may want to alias ``git diff --color-words`` (which gives a nicely formatted output of the diff) to ``git wdiff``. The following ``git config --global`` commands:: git config --global alias.ci "commit -a" git config --global alias.co checkout git config --global alias.st status git config --global alias.stat status git config --global alias.br branch git config --global alias.wdiff "diff --color-words" will create an ``alias`` section in your ``.gitconfig`` file with contents like this:: [alias] ci = commit -a co = checkout st = status stat = status br = branch wdiff = diff --color-words Editor ------ You may also want to make sure that your editor of choice is used:: git config --global core.editor vim Merging ------- To enforce summaries when doing merges (``~/.gitconfig`` file again):: [merge] log = true Or from the command line:: git config --global merge.log true .. _fancy-log: Fancy log output ---------------- This is a very nice alias to get a fancy log output; it should go in the ``alias`` section of your ``.gitconfig`` file:: lg = log --graph --pretty=format:'%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)[%an]%Creset' --abbrev-commit --date=relative You use the alias with:: git lg and it gives graph / text output something like this (but with color!):: * 6d8e1ee - (HEAD, origin/my-fancy-feature, my-fancy-feature) NF - a fancy file (45 minutes ago) [Matthew Brett] * d304a73 - (origin/placeholder, placeholder) Merge pull request #48 from hhuuggoo/master (2 weeks ago) [Jonathan Terhorst] |\ | * 4aff2a8 - fixed bug 35, and added a test in test_bugfixes (2 weeks ago) [Hugo] |/ * a7ff2e5 - Added notes on discussion/proposal made during Data Array Summit. (2 weeks ago) [Corran Webster] * 68f6752 - Initial implimentation of AxisIndexer - uses 'index_by' which needs to be changed to a call on an Axes object - this is all very sketchy right now. 
(2 weeks ago) [Corr * 376adbd - Merge pull request #46 from terhorst/master (2 weeks ago) [Jonathan Terhorst] |\ | * b605216 - updated joshu example to current api (3 weeks ago) [Jonathan Terhorst] | * 2e991e8 - add testing for outer ufunc (3 weeks ago) [Jonathan Terhorst] | * 7beda5a - prevent axis from throwing an exception if testing equality with non-axis object (3 weeks ago) [Jonathan Terhorst] | * 65af65e - convert unit testing code to assertions (3 weeks ago) [Jonathan Terhorst] | * 956fbab - Merge remote-tracking branch 'upstream/master' (3 weeks ago) [Jonathan Terhorst] | |\ | |/ Thanks to Yury V. Zaytsev for posting it. .. include:: links.inc networkx-1.11/doc/source/developer/gitwash/this_project.inc0000644000175000017500000000030012637544450024066 0ustar aricaric00000000000000.. networkx .. _`networkx`: http://networkx.github.io .. _`networkx github`: http://github.com/networkx/networkx .. _`networkx mailing list`: http://groups.google.com/group/networkx-discuss/ networkx-1.11/doc/source/developer/gitwash/git_development.rst0000644000175000017500000000034012637544450024621 0ustar aricaric00000000000000.. _git-development: ===================== Git for development ===================== Contents: .. toctree:: :maxdepth: 2 forking_hell set_up_fork configure_git development_workflow maintainer_workflow networkx-1.11/doc/source/developer/gitwash/pull_button.png0000644000175000017500000003113512637544450023765 0ustar aricaric00000000000000[binary PNG data omitted]
networkx-1.11/doc/source/developer/gitwash/forking_button.png0000644000175000017500000003144412637544450024453 0ustar aricaric00000000000000[binary PNG data omitted]
networkx-1.11/doc/source/developer/gitwash/patching.rst0000644000175000017500000000772512637544450023243 0ustar aricaric00000000000000================ Making a patch ================ You've discovered a bug or something else you want to change in `networkx`_ |emdash| excellent! You've worked out a way to fix it |emdash| even better! You want to tell us about it |emdash| best of all! The easiest way is to make a *patch* or set of patches. Here we explain how.
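One detail worth previewing: ``git format-patch`` (used throughout this page) names each patch file after its commit message. The naming can be approximated in Python (a rough reimplementation for illustration only; git's exact mangling rules differ in corner cases):

```python
import re

def patch_filename(seq, subject):
    # Approximation (assumption): keep alphanumerics, map runs of
    # anything else to a single "-", and prefix a 4-digit sequence
    # number, as `git format-patch` does for its output files.
    slug = re.sub(r"[^A-Za-z0-9]+", "-", subject).strip("-")
    return "{:04d}-{}.patch".format(seq, slug)

print(patch_filename(1, "BF - added tests for Funny bug"))
# 0001-BF-added-tests-for-Funny-bug.patch
```

So the two commits made in the examples below come out as ``0001-BF-added-tests-for-Funny-bug.patch`` and ``0002-BF-added-fix-for-Funny-bug.patch``.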
Making a patch is the simplest and quickest, but if you're going to be doing anything more than simple quick things, please consider following the :ref:`git-development` model instead. .. _making-patches: Making patches ============== Overview -------- :: # tell git who you are git config --global user.email you@yourdomain.example.com git config --global user.name "Your Name Comes Here" # get the repository if you don't have it git clone git://github.com/networkx/networkx.git # make a branch for your patching cd networkx git branch the-fix-im-thinking-of git checkout the-fix-im-thinking-of # hack, hack, hack # Tell git about any new files you've made git add somewhere/tests/test_my_bug.py # commit work in progress as you go git commit -am 'BF - added tests for Funny bug' # hack hack, hack git commit -am 'BF - added fix for Funny bug' # make the patch files git format-patch -M -C master Then, send the generated patch files to the `networkx mailing list`_ |emdash| where we will thank you warmly. In detail --------- #. Tell git who you are so it can label the commits you've made:: git config --global user.email you@yourdomain.example.com git config --global user.name "Your Name Comes Here" #. If you don't already have one, clone a copy of the `networkx`_ repository:: git clone git://github.com/networkx/networkx.git cd networkx #. Make a 'feature branch'. This will be where you work on your bug fix. It's nice and safe and leaves you with access to an unmodified copy of the code in the main branch:: git branch the-fix-im-thinking-of git checkout the-fix-im-thinking-of #. Do some edits, and commit them as you go:: # hack, hack, hack # Tell git about any new files you've made git add somewhere/tests/test_my_bug.py # commit work in progress as you go git commit -am 'BF - added tests for Funny bug' # hack hack, hack git commit -am 'BF - added fix for Funny bug' Note the ``-am`` options to ``commit``. 
The ``m`` flag just signals that you're going to type a message on the command line. The ``a`` flag |emdash| you can just take on faith |emdash| or see `why the -a flag?`_. #. When you have finished, check you have committed all your changes:: git status #. Finally, make your commits into patches. You want all the commits since you branched from the ``master`` branch:: git format-patch -M -C master You will now have several files named for the commits:: 0001-BF-added-tests-for-Funny-bug.patch 0002-BF-added-fix-for-Funny-bug.patch Send these files to the `networkx mailing list`_. When you are done, to switch back to the main copy of the code, just return to the ``master`` branch:: git checkout master Moving from patching to development =================================== If you find you have done some patches, and you have one or more feature branches, you will probably want to switch to development mode. You can do this with the repository you have. Fork the `networkx`_ repository on github |emdash| :ref:`forking`. Then:: # checkout and refresh master branch from main repo git checkout master git pull origin master # rename pointer to main repository to 'upstream' git remote rename origin upstream # point your repo to default read / write to your fork on github git remote add origin git@github.com:your-user-name/networkx.git # push up any branches you've made and want to keep git push origin the-fix-im-thinking-of Then you can, if you want, follow the :ref:`development-workflow`. .. include:: links.inc networkx-1.11/doc/source/bibliography.rst0000644000175000017500000000457612637544450020473 0ustar aricaric00000000000000.. -*- coding: utf-8 -*- Bibliography ============ .. [BA02] R. Albert and A.-L. Barabási, "Statistical mechanics of complex networks", Reviews of Modern Physics, 74, pp. 47-97, 2002. http://arxiv.org/abs/cond-mat/0106096 .. [Bollobas01] B. Bollobás, "Random Graphs", Second Edition, Cambridge University Press, 2001. .. [BE05] U. Brandes and T. 
Erlebach, "Network Analysis: Methodological Foundations", Lecture Notes in Computer Science, Volume 3418, Springer-Verlag, 2005. .. [CL1996] G. Chartrand and L. Lesniak, "Graphs and Digraphs", Chapman and Hall/CRC, 1996. .. [choudum1986] S.A. Choudum. "A simple proof of the Erdős-Gallai theorem on graph sequences." Bulletin of the Australian Mathematical Society, 33, pp 67-70, 1986. http://dx.doi.org/10.1017/S0004972700002872 .. [Diestel97] R. Diestel, "Graph Theory", Springer-Verlag, 1997. http://diestel-graph-theory.com/index.html .. [DM03] S.N. Dorogovtsev and J.F.F. Mendes, "Evolution of Networks", Oxford University Press, 2003. .. [EppsteinPads] David Eppstein. PADS, A library of Python Algorithms and Data Structures. http://www.ics.uci.edu/~eppstein/PADS .. [EG1960] Erdős and Gallai, Mat. Lapok 11 264, 1960. .. [hakimi1962] Hakimi, S. "On the Realizability of a Set of Integers as Degrees of the Vertices of a Graph." SIAM J. Appl. Math. 10, 496-506, 1962. .. [havel1955] Havel, V. "A Remark on the Existence of Finite Graphs" Casopis Pest. Mat. 80, 477-480, 1955. .. [Langtangen04] H.P. Langtangen, "Python Scripting for Computational Science.", Springer Verlag Series in Computational Science and Engineering, 2004. .. [Martelli03] A. Martelli, "Python in a Nutshell", O'Reilly Media Inc, 2003. .. [Newman03] M.E.J. Newman, "The Structure and Function of Complex Networks", SIAM Review, 45, pp. 167-256, 2003. http://epubs.siam.org/doi/abs/10.1137/S003614450342480 .. [Sedgewick02] R. Sedgewick, "Algorithms in C: Parts 1-4: Fundamentals, Data Structure, Sorting, Searching", Addison Wesley Professional, 3rd ed., 2002. .. [Sedgewick01] R. Sedgewick, "Algorithms in C, Part 5: Graph Algorithms", Addison Wesley Professional, 3rd ed., 2001. .. [West01] D. B. West, "Introduction to Graph Theory", Prentice Hall, 2nd ed., 2001. .. [vanRossum98] Guido van Rossum. Python Patterns - Implementing Graphs, 1998. 
http://www.python.org/doc/essays/graphs networkx-1.11/doc/source/conf.py0000644000175000017500000001576212653231353016555 0ustar aricaric00000000000000# -*- coding: utf-8 -*- # # Sphinx documentation build configuration file, created by # sphinx-quickstart.py on Sat Mar 8 21:47:50 2008. # # This file is execfile()d with the current directory set to its containing dir. # # The contents of this file are pickled, so don't put values in the namespace # that aren't pickleable (module imports are okay, they're removed automatically). # # All configuration values have a default value; values that are commented out # serve to show the default value. from __future__ import print_function import sys, os, re import contextlib @contextlib.contextmanager def cd(newpath): """ Change the current working directory to `newpath`, temporarily. If the old current working directory no longer exists, do not return back. """ oldpath = os.getcwd() os.chdir(newpath) try: yield finally: try: os.chdir(oldpath) except OSError: # If oldpath no longer exists, stay where we are. pass # Check Sphinx version import sphinx if sphinx.__version__ < "1.3": raise RuntimeError("Sphinx 1.3 or newer required") # Environment variable to know if the docs are being built on rtd. on_rtd = os.environ.get('READTHEDOCS', None) == 'True' print() print("Building on ReadTheDocs: {}".format(on_rtd)) print() print("Current working directory: {}".format(os.path.abspath(os.curdir))) print("Python: {}".format(sys.executable)) if on_rtd: # Build is not via Makefile (yet). # So we manually build the examples and gallery. import subprocess with cd('..'): # The Makefile is run from networkx/doc, so we need to move there # from networkx/doc/source (which holds conf.py). py = sys.executable subprocess.call([py, 'make_gallery.py']) subprocess.call([py, 'make_examples_rst.py', '../examples', 'source']) # If your extensions are in another directory, add it here. 
# These locations are relative to conf.py sys.path.append(os.path.abspath('../sphinxext')) # General configuration # --------------------- # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = [ 'sphinx.ext.autosummary', 'sphinx.ext.autodoc', 'sphinx.ext.coverage', 'sphinx.ext.doctest', 'sphinx.ext.intersphinx', 'sphinx.ext.mathjax', 'sphinx.ext.napoleon', 'sphinx.ext.pngmath', 'sphinx.ext.todo', 'sphinx.ext.viewcode', #'sphinxcontrib.bibtex', #'IPython.sphinxext.ipython_console_highlighting', #'IPython.sphinxext.ipython_directive', ] # generate autosummary pages autosummary_generate=True # Add any paths that contain templates here, relative to this directory. templates_path = ['templates','../rst_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. source_encoding = 'utf-8' # The master toctree document. master_doc = 'index' # General substitutions. project = 'NetworkX' copyright = '2016, NetworkX Developers' # The default replacements for |version| and |release|, also used in various # other places throughout the built documents. # # The short X.Y version. import networkx version = networkx.__version__ # The full version, including dev info release = networkx.__version__.replace('_','') # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of documents that shouldn't be included in the build. # unused_docs = ['reference/pdf_reference'] # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). 
add_module_names = False # show_authors = True # The name of the Pygments (syntax highlighting) style to use. #pygments_style = 'friendly' pygments_style = 'sphinx' # A list of prefixs that are ignored when creating the module index. (new in Sphinx 0.6) modindex_common_prefix=['networkx.'] doctest_global_setup="import networkx as nx" # Options for HTML output # ----------------------- if not on_rtd: import sphinx_rtd_theme html_theme = 'sphinx_rtd_theme' html_theme_path = [sphinx_rtd_theme.get_html_theme_path()] #html_theme_options = { # "rightsidebar": "true", # "relbarbgcolor: "black" #} # The style sheet to use for HTML and HTML Help pages. A file of that name # must exist either in Sphinx' static/ path, or in one of the custom paths # given in html_static_path. html_style = 'networkx.css' # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['static'] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Content template for the index page. #html_index = 'index.html' html_index = 'contents.html' # Custom sidebar templates, maps page names to templates. #html_sidebars = {'index': 'indexsidebar.html'} # Additional templates that should be rendered to pages, maps page names to # templates. #html_additional_pages = {'index': 'index.html','gallery':'gallery.html'} html_additional_pages = {'gallery':'gallery.html'} # If true, the reST sources are included in the HTML build as _sources/. html_copy_source = False html_use_opensearch = 'http://networkx.github.io' # Output file base name for HTML help builder. 
htmlhelp_basename = 'NetworkX' pngmath_use_preview = True # Options for LaTeX output # ------------------------ # The paper size ('letter' or 'a4'). latex_paper_size = 'letter' # The font size ('10pt', '11pt' or '12pt'). #latex_font_size = '10pt' # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, document class [howto/manual]). latex_documents = [('tutorial/index', 'networkx_tutorial.tex', 'NetworkX Tutorial', 'Aric Hagberg, Dan Schult, Pieter Swart', 'howto', 1), ('reference/pdf_reference', 'networkx_reference.tex', 'NetworkX Reference', 'Aric Hagberg, Dan Schult, Pieter Swart', 'manual', 1)] #latex_appendices = ['installing']#,'legal'],'citing','credits','history'] #latex_appendices = ['credits'] # Intersphinx mapping intersphinx_mapping = {'http://docs.python.org/': None, 'http://docs.scipy.org/doc/numpy/': None, } # For trac custom roles default_role = 'math' trac_url = 'https://networkx.lanl.gov/trac/' mathjax_path = 'https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS_HTML' numpydoc_show_class_members = False networkx-1.11/doc/source/overview.rst0000644000175000017500000000644112637544500017653 0ustar aricaric00000000000000.. -*- coding: utf-8 -*- Overview ======== NetworkX is a Python language software package for the creation, manipulation, and study of the structure, dynamics, and function of complex networks. With NetworkX you can load and store networks in standard and nonstandard data formats, generate many types of random and classic networks, analyze network structure, build network models, design new network algorithms, draw networks, and much more. Who uses NetworkX? ------------------ The potential audience for NetworkX includes mathematicians, physicists, biologists, computer scientists, and social scientists. 
Good reviews of the state-of-the-art in the science of complex networks are presented in Albert and Barabási [BA02]_, Newman [Newman03]_, and Dorogovtsev and Mendes [DM03]_. See also the classic texts [Bollobas01]_, [Diestel97]_ and [West01]_ for graph theoretic results and terminology. For basic graph algorithms, we recommend the texts of Sedgewick, e.g. [Sedgewick01]_ and [Sedgewick02]_ and the survey of Brandes and Erlebach [BE05]_. Goals ----- NetworkX is intended to provide - tools for the study of the structure and dynamics of social, biological, and infrastructure networks, - a standard programming interface and graph implementation that is suitable for many applications, - a rapid development environment for collaborative, multidisciplinary projects, - an interface to existing numerical algorithms and code written in C, C++, and FORTRAN, - the ability to painlessly slurp in large nonstandard data sets. The Python programming language ------------------------------- Python is a powerful programming language that allows simple and flexible representations of networks, and clear and concise expressions of network algorithms (and other algorithms too). Python has a vibrant and growing ecosystem of packages that NetworkX uses to provide more features such as numerical linear algebra and drawing. In addition Python is also an excellent "glue" language for putting together pieces of software from other languages which allows reuse of legacy code and engineering of high-performance algorithms [Langtangen04]_. Equally important, Python is free, well-supported, and a joy to use. In order to make the most out of NetworkX you will want to know how to write basic programs in Python. Among the many guides to Python, we recommend the documentation at http://www.python.org and the text by Alex Martelli [Martelli03]_. Free software ------------- NetworkX is free software; you can redistribute it and/or modify it under the terms of the :doc:`BSD License `. 
We welcome contributions from the community. Information on NetworkX development is found at the NetworkX Developer Zone at Github https://github.com/networkx/networkx History ------- NetworkX was born in May 2002. The original version was designed and written by Aric Hagberg, Dan Schult, and Pieter Swart in 2002 and 2003. The first public release was in April 2005. Many people have contributed to the success of NetworkX. Some of the contributors are listed in the :doc:`credits. ` What Next ^^^^^^^^^ - :doc:`A Brief Tour ` - :doc:`Installing ` - :doc:`Reference ` - :doc:`Examples ` networkx-1.11/doc/source/templates/0000755000175000017500000000000012653231454017243 5ustar aricaric00000000000000networkx-1.11/doc/source/templates/index.html0000644000175000017500000000751712637544450021257 0ustar aricaric00000000000000{% extends "layout.html" %} {% set title = 'Overview' %} {% set css_files = css_files + ["_static/force/force.css"] %} {% block body %}

High productivity software for complex networks

NetworkX is a Python language software package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks.

Quick Example

>>> import networkx as nx

>>> G=nx.Graph()
>>> G.add_node("spam")
>>> G.add_edge(1,2)
>>> print(G.nodes())
[1, 2, 'spam']
>>> print(G.edges())
[(1, 2)]
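The quick example above can be taken one step further; this short sketch (illustrative only — it uses the keyword-argument form for edge data rather than exact 1.x doctest output) shows that nodes may be any hashable object and edges can carry data:

```python
import networkx as nx

# Same spirit as the quick example: a node that is a string,
# plus an edge carrying key/value data.
G = nx.Graph()
G.add_node("spam")
G.add_edge(1, 2, weight=0.5)

assert G.number_of_nodes() == 3      # "spam", 1, and 2
assert G.has_edge(1, 2)
assert G[1][2]["weight"] == 0.5      # edge data via dict-style lookup
assert G.degree(1) == 1
```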

Documentation

Features

  • Python language data structures for graphs, digraphs, and multigraphs.
  • Nodes can be "anything" (e.g. text, images, XML records)
  • Edges can hold arbitrary data (e.g. weights, time-series)
  • Generators for classic graphs, random graphs, and synthetic networks
  • Standard graph algorithms
  • Network structure and analysis measures
  • Basic graph drawing
  • Open source BSD license
  • Well tested: more than 1500 unit tests
  • Additional benefits from Python: fast prototyping, easy to teach, multi-platform

{% endblock %} networkx-1.11/doc/source/templates/indexsidebar.html0000644000175000017500000000220112637544450022572 0ustar aricaric00000000000000

Download

{% if version.split('.')[-1].startswith('dev') %}

This documentation is for version {{ version }}, which is not released yet.

You can get the latest source from http://networkx.lanl.gov/hg/networkx/ or look for released versions in the Python Package Index.

{% else %}

Current version: {{ version }}

Get NetworkX from the Python Package Index, or install it with:

easy_install networkx
{% endif %}

Questions? Suggestions?

Join the Google group:

You can also open a ticket at the NetworkX Developer Zone.

networkx-1.11/doc/source/templates/layout.html0000644000175000017500000000154012637544450021453 0ustar aricaric00000000000000{% extends "!layout.html" %} {% block rootrellink %}
  • NetworkX Home
  • Documentation
  • Download
  • Developer (Github)
  • {% endblock %} {% block relbar1 %}
    NetworkX
    {{ super() }} {% if version.split('.')[-1].startswith('dev') %}

    This documentation is for the development version {{ version }}

    {% endif %} {% endblock %} {# put the sidebar before the body #} {% block sidebar1 %} {{ sidebar() }}{% endblock %} {% block sidebar2 %}{% endblock %} networkx-1.11/doc/source/test.rst0000644000175000017500000000304212637544450016762 0ustar aricaric00000000000000******* Testing ******* Requirements for testing ======================== NetworkX uses the Python nose testing package. If you don't already have that package installed, follow the directions here http://somethingaboutorange.com/mrl/projects/nose Testing a source distribution ============================= You can test the complete package from the unpacked source directory with:: python setup_egg.py nosetests Testing an installed package ============================ If you have a file-based (not a Python egg) installation you can test the installed package with >>> import networkx >>> networkx.test() or:: python -c "import networkx; networkx.test()" Testing for developers ====================== You can test any or all of NetworkX by using the "nosetests" test runner. First make sure the NetworkX version you want to test is in your PYTHONPATH (either installed or pointing to your unpacked source directory). Then you can run individual test files with:: nosetests path/to/file or all tests found in dir and any directories contained in dir:: nosetests path/to/dir By default nosetests doesn't run doctest-style tests in Python modules, but you can turn that on with:: nosetests --with-doctest For doctests in stand-alone files, NetworkX uses the extension txt, so you can add:: nosetests --with-doctest --doctest-extension=txt to also execute those tests. These options are on by default if you run nosetests from the root of the NetworkX distribution since they are specified in the setup.cfg file found there. 
networkx-1.11/doc/source/reference/0000755000175000017500000000000012653231454017203 5ustar aricaric00000000000000networkx-1.11/doc/source/reference/algorithms.assortativity.rst0000644000175000017500000000143112637544450025036 0ustar aricaric00000000000000************* Assortativity ************* .. automodule:: networkx.algorithms.assortativity .. autosummary:: :toctree: generated/ Assortativity ------------- .. autosummary:: :toctree: generated/ degree_assortativity_coefficient attribute_assortativity_coefficient numeric_assortativity_coefficient degree_pearson_correlation_coefficient Average neighbor degree ----------------------- .. autosummary:: :toctree: generated/ average_neighbor_degree Average degree connectivity --------------------------- .. autosummary:: :toctree: generated/ average_degree_connectivity k_nearest_neighbors Mixing ------ .. autosummary:: :toctree: generated/ attribute_mixing_matrix degree_mixing_matrix degree_mixing_dict attribute_mixing_dict networkx-1.11/doc/source/reference/exceptions.rst0000644000175000017500000000062612637544450022127 0ustar aricaric00000000000000********** Exceptions ********** .. automodule:: networkx.exception .. currentmodule:: networkx .. autoclass:: networkx.NetworkXException .. autoclass:: networkx.NetworkXError .. autoclass:: networkx.NetworkXPointlessConcept .. autoclass:: networkx.NetworkXAlgorithmError .. autoclass:: networkx.NetworkXUnfeasible .. autoclass:: networkx.NetworkXNoPath .. autoclass:: networkx.NetworkXUnbounded networkx-1.11/doc/source/reference/readwrite.pajek.rst0000644000175000017500000000021612637544450023020 0ustar aricaric00000000000000Pajek ===== .. automodule:: networkx.readwrite.pajek .. autosummary:: :toctree: generated/ read_pajek write_pajek parse_pajek networkx-1.11/doc/source/reference/algorithms.shortest_paths.rst0000644000175000017500000000245112637544450025166 0ustar aricaric00000000000000Shortest Paths ============== .. 
automodule:: networkx.algorithms.shortest_paths.generic .. autosummary:: :toctree: generated/ shortest_path all_shortest_paths shortest_path_length average_shortest_path_length has_path Advanced Interface ------------------ .. automodule:: networkx.algorithms.shortest_paths.unweighted .. autosummary:: :toctree: generated/ single_source_shortest_path single_source_shortest_path_length all_pairs_shortest_path all_pairs_shortest_path_length predecessor .. automodule:: networkx.algorithms.shortest_paths.weighted .. autosummary:: :toctree: generated/ dijkstra_path dijkstra_path_length single_source_dijkstra_path single_source_dijkstra_path_length all_pairs_dijkstra_path all_pairs_dijkstra_path_length single_source_dijkstra bidirectional_dijkstra dijkstra_predecessor_and_distance bellman_ford negative_edge_cycle johnson Dense Graphs ------------ .. automodule:: networkx.algorithms.shortest_paths.dense .. autosummary:: :toctree: generated/ floyd_warshall floyd_warshall_predecessor_and_distance floyd_warshall_numpy A* Algorithm ------------ .. automodule:: networkx.algorithms.shortest_paths.astar .. autosummary:: :toctree: generated/ astar_path astar_path_length networkx-1.11/doc/source/reference/api_1.9.rst0000644000175000017500000002406112637544450021105 0ustar aricaric00000000000000********************************* Version 1.9 notes and API changes ********************************* This page reflects API changes from NetworkX 1.8 to NetworkX 1.9. Please send comments and questions to the networkx-discuss mailing list: . Flow package ------------ The flow package (:samp:`networkx.algorithms.flow`) is completely rewritten with backward *incompatible* changes. It introduces a new interface to flow algorithms. Existing code that uses the flow package will not work unmodified with NetworkX 1.9. Main changes ============ 1. 
We added two new maximum flow algorithms (:samp:`preflow_push` and :samp:`shortest_augmenting_path`) and rewrote the Edmonds–Karp algorithm in :samp:`ford_fulkerson`, which is now in :samp:`edmonds_karp`. `@ysitu `_ contributed implementations of all new maximum flow algorithms. The legacy Edmonds–Karp algorithm implementation in :samp:`ford_fulkerson` is still available but will be removed in the next release. 2. All maximum flow algorithm implementations (including the legacy :samp:`ford_fulkerson`) now output a residual network (i.e., a :samp:`DiGraph`) after computing the maximum flow. See the :samp:`maximum_flow` documentation for the details on the conventions that NetworkX uses for defining a residual network. 3. We removed the old :samp:`max_flow` and :samp:`min_cut` functions. The main entry points to flow algorithms are now the functions :samp:`maximum_flow`, :samp:`maximum_flow_value`, :samp:`minimum_cut` and :samp:`minimum_cut_value`, which have new parameters that control maximum flow computation: :samp:`flow_func` for specifying the algorithm that will do the actual computation (it accepts a function as argument that implements a maximum flow algorithm), :samp:`cutoff` for suggesting a maximum flow value at which the algorithm stops, :samp:`value_only` for stopping the computation as soon as we have the value of the flow, and :samp:`residual` that accepts as argument a residual network to be reused in repeated maximum flow computation. 4. All flow algorithms are required to accept arguments for these parameters but may selectively ignore the inapplicable ones. For instance, the :samp:`preflow_push` algorithm can stop after the preflow phase without computing a maximum flow if we only need the flow value, but both :samp:`edmonds_karp` and :samp:`shortest_augmenting_path` always compute a maximum flow to obtain the flow value. 5. The new function :samp:`minimum_cut` returns the cut value and a node partition that defines the minimum cut. 
The function :samp:`minimum_cut_value` returns only the value of the cut, which is what the removed :samp:`min_cut` function used to return before 1.9. 6. The functions that implement flow algorithms (i.e., :samp:`preflow_push`, :samp:`edmonds_karp`, :samp:`shortest_augmenting_path` and :samp:`ford_fulkerson`) are not imported into the base NetworkX namespace. You have to explicitly import them from the flow package: >>> from networkx.algorithms.flow import (ford_fulkerson, preflow_push, ... edmonds_karp, shortest_augmenting_path) 7. We also added a capacity-scaling minimum cost flow algorithm: :samp:`capacity_scaling`. It supports :samp:`MultiDiGraph` and disconnected networks. Examples ======== Below are some small examples illustrating how to obtain the same output as in NetworkX 1.8.1 using the new interface to flow algorithms introduced in 1.9: >>> import networkx as nx >>> G = nx.icosahedral_graph() >>> nx.set_edge_attributes(G, 'capacity', 1) With NetworkX 1.8: >>> flow_value = nx.max_flow(G, 0, 6) >>> cut_value = nx.min_cut(G, 0, 6) >>> flow_value == cut_value True >>> flow_value, flow_dict = nx.ford_fulkerson(G, 0, 6) With NetworkX 1.9: >>> from networkx.algorithms.flow import (ford_fulkerson, preflow_push, ... edmonds_karp, shortest_augmenting_path) >>> flow_value = nx.maximum_flow_value(G, 0, 6) >>> cut_value = nx.minimum_cut_value(G, 0, 6) >>> flow_value == cut_value True >>> # Legacy: this returns the exact same output as ford_fulkerson in 1.8.1 >>> flow_value, flow_dict = nx.maximum_flow(G, 0, 6, flow_func=ford_fulkerson) >>> # We strongly recommend using the new algorithms: >>> flow_value, flow_dict = nx.maximum_flow(G, 0, 6) >>> # If no flow_func is passed as argument, the default flow_func >>> # (preflow-push) is used. 
Therefore this is the same as: >>> flow_value, flow_dict = nx.maximum_flow(G, 0, 6, flow_func=preflow_push) >>> # You can also use alternative maximum flow algorithms: >>> flow_value, flow_dict = nx.maximum_flow(G, 0, 6, flow_func=shortest_augmenting_path) >>> flow_value, flow_dict = nx.maximum_flow(G, 0, 6, flow_func=edmonds_karp) Connectivity package -------------------- The flow-based connectivity and cut algorithms from the connectivity package (:samp:`networkx.algorithms.connectivity`) are adapted to take advantage of the new interface to flow algorithms. As a result, flow-based connectivity algorithms are up to 10x faster than in NetworkX 1.8 for some problems, such as sparse networks with highly skewed degree distributions. A few backwards *incompatible* changes were introduced. * The functions for local connectivity and cuts now accept arguments for the new parameters defined for the flow interface: :samp:`flow_func` for defining the algorithm that will perform the underlying maximum flow computations, :samp:`residual` that accepts as argument a residual network to be reused in repeated maximum flow computations, and :samp:`cutoff` for defining a maximum flow value at which the underlying maximum flow algorithm stops. The big speed improvement with respect to 1.8 comes mainly from the reuse of the residual network and the use of :samp:`cutoff`. * We removed the flow-based local connectivity and cut functions from the base namespace. Now they have to be explicitly imported from the connectivity package. The main entry points to flow-based connectivity and cut functions are the functions :samp:`edge_connectivity`, :samp:`node_connectivity`, :samp:`minimum_edge_cut`, and :samp:`minimum_node_cut`. All these functions accept a pair of nodes as optional arguments for computing local connectivity and cuts. 
* We improved the auxiliary network for connectivity functions: The node mapping dict needed for node connectivity and minimum node cuts is now a graph attribute of the auxiliary network. Thus we removed the :samp:`mapping` parameter from the local versions of connectivity and cut functions. We also changed the parameter name for the auxiliary digraph from :samp:`aux_digraph` to :samp:`auxiliary`. * We changed the name of the function :samp:`all_pairs_node_connectivity_matrix` to :samp:`all_pairs_node_connectivity`. This function now returns a dictionary instead of a NumPy 2D array. We added a new parameter :samp:`nbunch` for computing node connectivity only among pairs of nodes in :samp:`nbunch`. * A :samp:`stoer_wagner` function is added to the connectivity package for computing the weighted minimum cuts of undirected graphs using the Stoer–Wagner algorithm. This algorithm is not based on maximum flows. Several heap implementations are also added in the utility package (:samp:`networkx.utils`) for use in this function. :class:`BinaryHeap` is recommended over :class:`PairingHeap` for Python implementations without optimized attribute accesses (e.g., CPython) despite a slower asymptotic running time. For Python implementations with optimized attribute accesses (e.g., PyPy), :class:`PairingHeap` provides better performance. Other new functionalities ------------------------- * A :samp:`dispersion` function is added in the centrality package (:samp:`networkx.algorithms.centrality`) for computing the dispersion of graphs. * A community package (:samp:`networkx.generators.community`) is added for generating community graphs. * An :samp:`is_semiconnected` function is added in the connectivity package (:samp:`networkx.algorithms.connectivity`) for recognizing semiconnected graphs. * The :samp:`eulerian_circuit` function in the Euler package (:samp:`networkx.algorithms.euler`) is changed to use a linear-time algorithm. 
* A :samp:`non_edges` function is added in the function package (:samp:`networkx.functions`) for enumerating nonexistent edges between existing nodes of graphs. * The linear algebra package (:samp:`networkx.linalg`) is changed to use SciPy sparse matrices. * Functions :samp:`algebraic_connectivity`, :samp:`fiedler_vector` and :samp:`spectral_ordering` are added in the linear algebra package (:samp:`networkx.linalg`) for computing the algebraic connectivity, Fiedler vectors and spectral orderings of undirected graphs. * A link prediction package (:samp:`networkx.algorithms.link_prediction`) is added to provide link prediction-related functionalities. * Write support for the graph6 and sparse6 formats is added in the read/write package (:samp:`networkx.readwrite`). * A :samp:`goldberg_radzik` function is added in the shortest path package (:samp:`networkx.algorithms.shortest_paths`) for computing shortest paths using the Goldberg–Radzik algorithm. * A tree package (:samp:`networkx.tree`) is added to provide tree recognition functionalities. * A context manager :samp:`reversed` is added in the utility package (:samp:`networkx.utils`) for temporary in-place reversal of graphs. Miscellaneous changes --------------------- * The functions in the components package (:samp:`networkx.algorithms.components`) such as :samp:`connected_components` and :samp:`connected_component_subgraphs` now return generators instead of lists. To recover the earlier behavior, use :samp:`list(connected_components(G))`. * JSON helpers in the JSON graph package (:samp:`networkx.readwrite.json_graph`) are removed. Use functions from the standard library (e.g., :samp:`json.dumps`) instead. * Support for Python 3.1 is dropped. Basic support is added for Jython 2.7 and IronPython 2.7, although they are not officially supported. * Numerous reported issues are fixed. 
networkx-1.11/doc/source/reference/algorithms.flow.rst0000644000175000017500000000157312637544450023067 0ustar aricaric00000000000000***** Flows ***** .. automodule:: networkx.algorithms.flow Maximum Flow ------------ .. autosummary:: :toctree: generated/ maximum_flow maximum_flow_value minimum_cut minimum_cut_value Edmonds-Karp ------------ .. autosummary:: :toctree: generated/ edmonds_karp Shortest Augmenting Path ------------------------ .. autosummary:: :toctree: generated/ shortest_augmenting_path Preflow-Push ------------ .. autosummary:: :toctree: generated/ preflow_push Utils ----- .. autosummary:: :toctree: generated/ build_residual_network Network Simplex --------------- .. autosummary:: :toctree: generated/ network_simplex min_cost_flow_cost min_cost_flow cost_of_flow max_flow_min_cost Capacity Scaling Minimum Cost Flow ---------------------------------- .. autosummary:: :toctree: generated/ capacity_scaling networkx-1.11/doc/source/reference/convert.rst0000644000175000017500000000160312637544450021422 0ustar aricaric00000000000000***************************************** Converting to and from other data formats ***************************************** .. currentmodule:: networkx To NetworkX Graph ----------------- .. automodule:: networkx.convert .. autosummary:: :toctree: generated/ to_networkx_graph Dictionaries ------------ .. autosummary:: :toctree: generated/ to_dict_of_dicts from_dict_of_dicts Lists ----- .. autosummary:: :toctree: generated/ to_dict_of_lists from_dict_of_lists to_edgelist from_edgelist Numpy ----- .. automodule:: networkx.convert_matrix .. autosummary:: :toctree: generated/ to_numpy_matrix to_numpy_recarray from_numpy_matrix Scipy ----- .. autosummary:: :toctree: generated/ to_scipy_sparse_matrix from_scipy_sparse_matrix Pandas ------ .. 
autosummary:: :toctree: generated/ to_pandas_dataframe from_pandas_dataframe networkx-1.11/doc/source/reference/algorithms.community.rst0000644000175000017500000000037212637544500024134 0ustar aricaric00000000000000*********** Communities *********** .. automodule:: networkx.algorithms.community .. currentmodule:: networkx K-Clique ^^^^^^^^ .. automodule:: networkx.algorithms.community.kclique .. autosummary:: :toctree: generated/ k_clique_communities networkx-1.11/doc/source/reference/api_0.99.rst0000644000175000017500000002032512637544450021174 0ustar aricaric00000000000000************************ Version 0.99 API changes ************************ The version networkx-0.99 is the penultimate release before networkx-1.0. We have bumped the version from 0.37 to 0.99 to indicate (in our unusual version number scheme) that this is a major change to NetworkX. We have made some significant changes, detailed below, to NetworkX to improve performance, functionality, and clarity. Version 0.99 requires Python 2.4 or greater. Please send comments and questions to the networkx-discuss mailing list. http://groups.google.com/group/networkx-discuss Changes in base classes ======================= The most significant changes are in the graph classes. We have redesigned the Graph() and DiGraph() classes to optionally allow edge data. This change allows Graph and DiGraph to naturally represent weighted graphs and to hold arbitrary information on edges. - Both Graph and DiGraph take an optional argument weighted=True|False. When weighted=True the graph is assumed to have numeric edge data (with default 1). The Graph and DiGraph classes in earlier versions used the Python None as data (which is still allowed as edge data). - The Graph and DiGraph classes now allow self loops. - The XGraph and XDiGraph classes are removed and replaced with MultiGraph and MultiDiGraph. MultiGraph and MultiDiGraph optionally allow parallel (multiple) edges between two nodes. 
The mapping from old to new classes is as follows:: - Graph -> Graph (self loops allowed now, default edge data is 1) - DiGraph -> DiGraph (self loops allowed now, default edge data is 1) - XGraph(multiedges=False) -> Graph - XGraph(multiedges=True) -> MultiGraph - XDiGraph(multiedges=False) -> DiGraph - XDiGraph(multiedges=True) -> MultiDiGraph Methods changed --------------- edges() ^^^^^^^ The new keyword data=True|False determines whether to return two-tuples (u,v) (False) or three-tuples (u,v,d) (True) delete_node() ^^^^^^^^^^^^^ The preferred name is now remove_node(). delete_nodes_from() ^^^^^^^^^^^^^^^^^^^ No longer raises an exception on an attempt to delete a node not in the graph. The preferred name is now remove_nodes_from(). delete_edge() ^^^^^^^^^^^^^^ Now raises an exception on an attempt to delete an edge not in the graph. The preferred name is now remove_edge(). delete_edges_from() ^^^^^^^^^^^^^^^^^^^ The preferred name is now remove_edges_from(). add_edge() ^^^^^^^^^^ The add_edge() method no longer accepts an edge tuple (u,v) directly. The tuple must be unpacked into individual nodes. >>> import networkx as nx >>> u='a' >>> v='b' >>> e=(u,v) >>> G=nx.Graph() Old >>> # G.add_edge((u,v)) # or G.add_edge(e) New >>> G.add_edge(*e) # or G.add_edge(*(u,v)) The * operator unpacks the edge tuple in the argument list. add_edge() now has a data keyword parameter for setting the default (data=1) edge data. >>> # G.add_edge('a','b','foo') # add edge with string "foo" as data >>> # G.add_edge(1,2,5.0) # add edge with float 5 as data add_edges_from() ^^^^^^^^^^^^^^^^ Can now take a list or iterator of either 2-tuples (u,v), 3-tuples (u,v,data) or a mix of both. It now has a data keyword parameter (default 1) for setting the edge data for any edge in the edge list that is a 2-tuple. has_edge() ^^^^^^^^^^ The has_edge() method no longer accepts an edge tuple (u,v) directly. The tuple must be unpacked into individual nodes. 
Old: >>> # G.has_edge((u,v)) # or has_edge(e) New: >>> G.has_edge(*e) # or has_edge(*(u,v)) True The * operator unpacks the edge tuple in the argument list. get_edge() ^^^^^^^^^^ Now has the keyword argument "default" to specify what value to return if no edge is found. If not specified, an exception is raised if no edge is found. The fastest way to get edge data for edge (u,v) is to use G[u][v] instead of G.get_edge(u,v). degree_iter() ^^^^^^^^^^^^^ The degree_iter method now returns an iterator over pairs of (node, degree). This was the previous behavior of degree_iter(with_labels=True). Also there is a new keyword weighted=False|True for weighted degree. subgraph() ^^^^^^^^^^ The argument inplace=False|True has been replaced with copy=True|False. Subgraph no longer takes the create_using keyword. To change the graph type, either make a copy of the graph first and then change type, or change type and make a subgraph. E.g. >>> G=nx.path_graph(5) >>> H=nx.DiGraph(G.subgraph([0,1])) # digraph of copy of induced subgraph __getitem__() ^^^^^^^^^^^^^ Getting node neighbors from the graph with G[v] now returns a dictionary. >>> G=nx.path_graph(5) >>> # G[0] # {1: 1} To get a list of neighbors you can either use the keys of that dictionary or use >>> G.neighbors(0) [1] This change allows algorithms to use the underlying dict-of-dict representation through G[v] for substantial performance gains. Warning: The returned dictionary should not be modified as it may corrupt the graph data structure. Make a copy G[v].copy() if you wish to modify the dict. 
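The dict-of-dict idea behind G[v] can be sketched in plain Python (a toy structure for illustration only, not the actual NetworkX internals): looking up edge data is just two constant-time dictionary accesses.

```python
# Toy dict-of-dict adjacency: outer keys are nodes, and each inner
# dict maps a neighbor to that edge's data (default data 1, as in 0.99).
adj = {
    0: {1: 1},
    1: {0: 1, 2: 1},
    2: {1: 1},
}

neighbors = list(adj[1])   # neighbors of node 1: the inner dict's keys
edge_data = adj[1][2]      # edge (1, 2) data: two O(1) dict lookups

assert neighbors == [0, 2]
assert edge_data == 1
```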
Methods removed --------------- info() ^^^^^^ now a function >>> G=nx.Graph(name='test me') >>> nx.info(G) Name: test me Type: Graph Number of nodes: 0 Number of edges: 0 node_boundary() ^^^^^^^^^^^^^^^ now a function edge_boundary() ^^^^^^^^^^^^^^^ now a function is_directed() ^^^^^^^^^^^^^ use the directed attribute >>> G=nx.DiGraph() >>> # G.directed # True G.out_edges() ^^^^^^^^^^^^^ use G.edges() G.in_edges() ^^^^^^^^^^^^ use >>> G=nx.DiGraph() >>> R=G.reverse() >>> R.edges() [] or >>> [(v,u) for (u,v) in G.edges()] [] Methods added ------------- adjacency_list() ^^^^^^^^^^^^^^^^ Returns a list-of-lists adjacency list representation of the graph. adjacency_iter() ^^^^^^^^^^^^^^^^ Returns an iterator of (node, adjacency_dict[node]) over all nodes in the graph. Intended for fast access to the internal data structure for use in internal algorithms. Other possible incompatibilities with existing code =================================================== Imports ------- Some of the code modules were moved into subdirectories. Import statements such as:: import networkx.centrality from networkx.centrality import * may no longer work (including that example). Use either >>> import networkx # e.g. centrality functions available as networkx.fcn() or >>> from networkx import * # e.g. centrality functions available as fcn() Self-loops ---------- For Graph and DiGraph self loops are now allowed. This might affect code or algorithms that add self loops which were intended to be ignored. Use the methods - nodes_with_selfloops() - selfloop_edges() - number_of_selfloops() to discover any self loops. Copy ---- Copies of NetworkX graphs including using the copy() method now return complete copies of the graph. This means that all connection information is copied--subsequent changes in the copy do not change the old graph. But node keys and edge data in the original and copy graphs are pointers to the same data. 
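The copy semantics described above can be illustrated with a plain dict-of-dict sketch (illustrative only, not the actual Graph internals): copying duplicates the connection structure, while the data objects stay shared.

```python
# A mutable list stands in for edge data so the sharing is observable.
orig = {"a": {"b": [1.0]}, "b": {"a": [1.0]}}

# Copy the structure: a new outer dict and new inner dicts.
copied = {node: dict(nbrs) for node, nbrs in orig.items()}

copied["a"]["c"] = [2.0]                    # change the copy's structure
assert "c" not in orig["a"]                 # the original is untouched
assert copied["a"]["b"] is orig["a"]["b"]   # but edge data is shared
```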
prepare_nbunch -------------- Used internally - now called nbunch_iter and returns an iterator. Converting your old code to Version 0.99 ======================================== Mostly you can just run the code and python will raise an exception for features that changed. Common places for changes are - Converting XGraph() to either Graph or MultiGraph - Converting XGraph.edges() to Graph.edges(data=True) - Switching some rarely used methods to attributes (e.g. directed) or to functions (e.g. node_boundary) - If you relied on the old default edge data being None, you will have to account for it now being 1. You may also want to look through your code for places which could improve speed or readability. The iterators are helpful with large graphs and getting edge data via G[u][v] is quite fast. You may also want to change G.neighbors(n) to G[n] which returns the dict keyed by neighbor nodes to the edge data. It is faster for many purposes but does not work well when you are changing the graph. networkx-1.11/doc/source/reference/functions.rst0000644000175000017500000000136512637544500021753 0ustar aricaric00000000000000********* Functions ********* .. automodule:: networkx.classes.function Graph ----- .. autosummary:: :toctree: generated/ degree degree_histogram density info create_empty_copy is_directed Nodes ----- .. autosummary:: :toctree: generated/ nodes number_of_nodes nodes_iter all_neighbors non_neighbors common_neighbors Edges ----- .. autosummary:: :toctree: generated/ edges number_of_edges edges_iter non_edges Attributes ---------- .. autosummary:: :toctree: generated/ set_node_attributes get_node_attributes set_edge_attributes get_edge_attributes Freezing graph structure ------------------------ .. 
autosummary:: :toctree: generated/ freeze is_frozen networkx-1.11/doc/source/reference/api_1.4.rst0000644000175000017500000000202412637544450021073 0ustar aricaric00000000000000********************************* Version 1.4 notes and API changes ********************************* We have made some API changes, detailed below, to add clarity. This page reflects changes from networkx-1.3 to networkx-1.4. For changes from earlier versions to networkx-1.0 see :doc:`Version 1.0 API changes `. Please send comments and questions to the networkx-discuss mailing list: http://groups.google.com/group/networkx-discuss . Algorithms changed ================== Shortest path ------------- astar_path(), astar_path_length(), shortest_path(), shortest_path_length(), ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ bidirectional_shortest_path(), dijkstra_path(), dijkstra_path_length(), ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ bidirectional_dijkstra() ^^^^^^^^^^^^^^^^^^^^^^^^ These algorithms now raise an exception when a source and a target are specified and no path exist between these two nodes. The exception is a NetworkXNoPath exception. networkx-1.11/doc/source/reference/api_1.11.rst0000644000175000017500000000306712653165361021157 0ustar aricaric00000000000000********************************** Version 1.11 notes and API changes ********************************** This page includes more detailed release information and API changes from NetworkX 1.10 to NetworkX 1.11. Please send comments and questions to the networkx-discuss mailing list: . API changes ----------- * [`#1930 `_] No longer import nx_agraph and nx_pydot into the top-level namespace. They can be accessed within networkx as e.g. ``nx.nx_agraph.write_dot`` or imported as ``from networkx.drawing.nx_agraph import write_dot``. * [`#1750 `_] Arguments center and scale are now available for all layout functions. 
The default values revert to the v1.9 values (center is the origin for circular layouts and domain is [0, scale) for others. * [`#1924 `_] Replace pydot with pydotplus for drawing with the pydot interface. * [`#1888 `_] Replace support for Python 3.2 with support for Python 3.5. Miscellaneous changes --------------------- * [`#1763 `_] Set up appveyor to automatically test installation on Windows machines. Remove symbolic links in examples to help such installation. Fix many docstring typos so that Sphinx can build the docs without errors or warnings. Enable the docs to be automatically built on readthedocs.org by changing requirements.txt networkx-1.11/doc/source/reference/algorithms.hierarchy.rst0000644000175000017500000000021112637544450024062 0ustar aricaric00000000000000********* Hierarchy ********* .. automodule:: networkx.algorithms.hierarchy .. autosummary:: :toctree: generated/ flow_hierarchy networkx-1.11/doc/source/reference/classes.rst0000644000175000017500000000155212637544450021402 0ustar aricaric00000000000000.. _classes: *********** Graph types *********** NetworkX provides data structures and methods for storing graphs. All NetworkX graph classes allow (hashable) Python objects as nodes, and any Python object can be assigned as an edge attribute. The choice of graph class depends on the structure of the graph you want to represent. Which graph class should I use? ===============================

=================== ========================
Graph Type          NetworkX Class
=================== ========================
Undirected Simple   Graph
Directed Simple     DiGraph
With Self-loops     Graph, DiGraph
With Parallel edges MultiGraph, MultiDiGraph
=================== ========================

Basic graph types ================= ..
toctree:: :maxdepth: 2 classes.graph classes.digraph classes.multigraph classes.multidigraph networkx-1.11/doc/source/reference/algorithms.vitality.rst0000644000175000017500000000021112637544450023751 0ustar aricaric00000000000000******** Vitality ******** .. automodule:: networkx.algorithms.vitality .. autosummary:: :toctree: generated/ closeness_vitality networkx-1.11/doc/source/reference/algorithms.distance_measures.rst0000644000175000017500000000032612637544450025611 0ustar aricaric00000000000000***************** Distance Measures ***************** .. automodule:: networkx.algorithms.distance_measures .. autosummary:: :toctree: generated/ center diameter eccentricity periphery radius networkx-1.11/doc/source/reference/credits.rst0000644000175000017500000001216212637544450021401 0ustar aricaric00000000000000Credits ======= NetworkX was originally written by Aric Hagberg, Dan Schult, and Pieter Swart, and has been developed with the help of many others. Thanks to everyone who has improved NetworkX by contributing code, bug reports (and fixes), documentation, and input on design, features, and the future of NetworkX. Contributions ------------- This section aims to provide a list of people and projects that have contributed to ``networkx``. It is intended to be an *inclusive* list, and anyone who has contributed and wishes to make that contribution known is welcome to add an entry into this file. Generally, no name should be added to this list without the approval of the person associated with that name. Creating a comprehensive list of contributors can be difficult, and the list within this file is almost certainly incomplete. Contributors include testers, bug reporters, contributors who wish to remain anonymous, funding sources, academic advisors, end users, and even build/integration systems (such as `TravisCI `_, `coveralls `_, and `readthedocs `_). Do you want to make your contribution known? If you have commit access, edit this file and add your name. 
If you do not have commit access, feel free to open an `issue `_, submit a `pull request `_, or get in contact with one of the official team `members `_. A supplementary (but still incomplete) list of contributors is given by the list of names that have commits in ``networkx``'s `git `_ repository. This can be obtained via:: git log --raw | grep "^Author: " | sort | uniq A historical, partial listing of contributors and their contributions to some of the earlier versions of NetworkX can be found `here `_. Original Authors ^^^^^^^^^^^^^^^^ | Aric Hagberg | Dan Schult | Pieter Swart | Contributors ^^^^^^^^^^^^ Optionally, add your desired name and include a few relevant links. The order is partially historical, and now, mostly arbitrary. - Aric Hagberg, GitHub: `hagberg `_ - Dan Schult, GitHub: `dschult `_ - Pieter Swart - Katy Bold - Hernan Rozenfeld - Brendt Wohlberg - Jim Bagrow - Holly Johnsen - Arnar Flatberg - Chris Myers - Joel Miller - Keith Briggs - Ignacio Rozada - Phillipp Pagel - Sverre Sundsdal - Ross M. Richardson - Eben Kenah - Sasha Gutfriend - Udi Weinsberg - Matteo Dell'Amico - Andrew Conway - Raf Guns - Salim Fadhley - Matteo Dell'Amico - Fabrice Desclaux - Arpad Horvath - Minh Van Nguyen - Willem Ligtenberg - Loïc Séguin-C. 
- Paul McGuire - Jesus Cerquides - Ben Edwards - Jon Olav Vik - Hugh Brown - Ben Reilly - Leo Lopes - Jordi Torrents, GitHub: `jtorrents `_ - Dheeraj M R - Franck Kalala - Simon Knight - Conrad Lee - Sérgio Nery Simões - Robert King - Nick Mancuso - Brian Cloteaux - Alejandro Weinstein - Dustin Smith - Mathieu Larose - Vincent Gauthier - Sérgio Nery Simões - chebee7i, GitHub: `chebee7i `_ - Jeffrey Finkelstein - Jean-Gabriel Young, Github: `jg-you `_ - Andrey Paramonov, http://aparamon.msk.ru - Mridul Seth, GitHub: `MridulS `_ - Thodoris Sotiropoulos, GitHub: `theosotr `_ - Konstantinos Karakatsanis, GitHub: `k-karakatsanis `_ - Ryan Nelson, GitHub: `rnelsonchem `_ Support ------- ``networkx`` and those who have contributed to ``networkx`` have received support throughout the years from a variety of sources. We list them below. If you have provided support to ``networkx`` and a support acknowledgment does not appear below, please help us remedy the situation, and similarly, please let us know if you'd like something modified or corrected. Research Groups ^^^^^^^^^^^^^^^ ``networkx`` acknowledges support from the following: - `Center for Nonlinear Studies `_, Los Alamos National Laboratory, PI: Aric Hagberg - `Open Source Programs Office `_, Google - `Complexity Sciences Center `_, Department of Physics, University of California-Davis, PI: James P. Crutchfield - `Center for Complexity and Collective Computation `_, Wisconsin Institute for Discovery, University of Wisconsin-Madison, PIs: Jessica C. Flack and David C. Krakauer Funding ^^^^^^^ ``networkx`` acknowledges support from the following: - Google Summer of Code via Python Software Foundation - U.S. Army Research Office grant W911NF-12-1-0288 - DARPA Physical Intelligence Subcontract No. 9060-000709 - NSF Grant No. PHY-0748828 - John Templeton Foundation through a grant to the Santa Fe Institute to study complexity - U.S. Army Research Laboratory and the U.S. 
Army Research Office under contract number W911NF-13-1-0340 networkx-1.11/doc/source/reference/api_1.5.rst0000644000175000017500000000340112637544450021074 0ustar aricaric00000000000000********************************* Version 1.5 notes and API changes ********************************* This page reflects API changes from networkx-1.4 to networkx-1.5. Please send comments and questions to the networkx-discuss mailing list: http://groups.google.com/group/networkx-discuss . Weighted graph algorithms ------------------------- Many 'weighted' graph algorithms now take an optional parameter to specify which edge attribute should be used for the weight (default='weight') (ticket https://networkx.lanl.gov/trac/ticket/509) In some cases the parameter name was changed from weighted_edges, or weighted, to weight. Here is how to specify which edge attribute will be used in the algorithms: - Use weight=None to consider all weights equally (unweighted case) - Use weight=True or weight='weight' to use the 'weight' edge attribute - Use weight='other' to use the 'other' edge attribute Algorithms affected are: betweenness_centrality, closeness_centrality, edge_betweenness_centrality, betweenness_centrality_subset, edge_betweenness_centrality_subset, betweenness_centrality_source, load, closeness_vitality, wiener_index, spectral_bipartivity, current_flow_betweenness_centrality, edge_current_flow_betweenness_centrality, current_flow_betweenness_centrality_subset, edge_current_flow_betweenness_centrality_subset, laplacian, normalized_laplacian, adj_matrix, adjacency_spectrum, shortest_path, shortest_path_length, average_shortest_path_length, single_source_dijkstra_path_basic, astar_path, astar_path_length Random geometric graph ---------------------- The random geometric graph generator has been simplified. It no longer supports the create_using, repel, or verbose parameters. An optional pos keyword was added to allow specification of node positions.
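The weight conventions described above can be sketched with a small example. This is a minimal sketch using ``shortest_path_length``, one of the affected algorithms; the ``other`` attribute name is just the illustrative name from the list above, and the numbers are invented for the demonstration:

```python
import networkx as nx

G = nx.Graph()
G.add_edge('a', 'b', weight=2.0, other=10.0)
G.add_edge('b', 'c', weight=3.0, other=1.0)

# weight=None treats every edge equally: path length = number of hops
print(nx.shortest_path_length(G, 'a', 'c', weight=None))      # 2
# weight='weight' sums the 'weight' edge attribute along the path
print(nx.shortest_path_length(G, 'a', 'c', weight='weight'))  # 5.0
# weight='other' sums the 'other' edge attribute instead
print(nx.shortest_path_length(G, 'a', 'c', weight='other'))   # 11.0
```

The same keyword convention applies uniformly across the affected algorithms, so code that switches between weighted and unweighted analyses only changes the ``weight`` argument.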
networkx-1.11/doc/source/reference/readwrite.rst0000644000175000017500000000062412637544450021732 0ustar aricaric00000000000000.. _readwrite: ************************** Reading and writing graphs ************************** .. toctree:: :maxdepth: 2 readwrite.adjlist readwrite.multiline_adjlist readwrite.edgelist readwrite.gexf readwrite.gml readwrite.gpickle readwrite.graphml readwrite.json_graph readwrite.leda readwrite.yaml readwrite.sparsegraph6 readwrite.pajek readwrite.nx_shp networkx-1.11/doc/source/reference/algorithms.graphical.rst0000644000175000017500000000051012637544450024040 0ustar aricaric00000000000000************************* Graphical degree sequence ************************* .. automodule:: networkx.algorithms.graphical .. autosummary:: :toctree: generated/ is_graphical is_digraphical is_multigraphical is_pseudographical is_valid_degree_sequence_havel_hakimi is_valid_degree_sequence_erdos_gallai networkx-1.11/doc/source/reference/relabel.rst0000644000175000017500000000035012637544450021346 0ustar aricaric00000000000000**************** Relabeling nodes **************** .. currentmodule:: networkx Relabeling ---------- .. automodule:: networkx.relabel .. autosummary:: :toctree: generated/ convert_node_labels_to_integers relabel_nodes networkx-1.11/doc/source/reference/readwrite.edgelist.rst0000644000175000017500000000035212637544450023527 0ustar aricaric00000000000000 Edge List ========= .. automodule:: networkx.readwrite.edgelist .. autosummary:: :toctree: generated/ read_edgelist write_edgelist read_weighted_edgelist write_weighted_edgelist generate_edgelist parse_edgelist networkx-1.11/doc/source/reference/algorithms.minors.rst0000644000175000017500000000027012637544450023420 0ustar aricaric00000000000000****** Minors ****** .. automodule:: networkx.algorithms.minors .. 
autosummary:: :toctree: generated/ contracted_edge contracted_nodes identified_nodes quotient_graph networkx-1.11/doc/source/reference/readwrite.nx_shp.rst0000644000175000017500000000021312637544450023222 0ustar aricaric00000000000000GIS Shapefile ============= .. automodule:: networkx.readwrite.nx_shp .. autosummary:: :toctree: generated/ read_shp write_shp networkx-1.11/doc/source/reference/algorithms.core.rst0000644000175000017500000000024112637544450023037 0ustar aricaric00000000000000***** Cores ***** .. automodule:: networkx.algorithms.core .. autosummary:: :toctree: generated/ core_number k_core k_shell k_crust k_corona networkx-1.11/doc/source/reference/algorithms.link_analysis.rst0000644000175000017500000000064012637544450024752 0ustar aricaric00000000000000************* Link Analysis ************* PageRank -------- .. automodule:: networkx.algorithms.link_analysis.pagerank_alg .. autosummary:: :toctree: generated/ pagerank pagerank_numpy pagerank_scipy google_matrix Hits ---- .. automodule:: networkx.algorithms.link_analysis.hits_alg .. autosummary:: :toctree: generated/ hits hits_numpy hits_scipy hub_matrix authority_matrix networkx-1.11/doc/source/reference/news.rst0000644000175000017500000010165612653231353020720 0ustar aricaric00000000000000.. -*- coding: utf-8 -*- .. currentmodule:: networkx Release Log =========== NetworkX 1.11 ------------- Release date: 30 January 2016 Support for Python 3.5 added, drop support for Python 3.2. Highlights ~~~~~~~~~~ Pydot features now use pydotplus. Fixes installation on some machines and test with appveyor, Restores default center and scale of layout routines, Fixes various docs including no symbolic links in examples. Docs can now build using autosummary on readthedocs.org. NetworkX 1.10 ------------- Release date: 2 August 2015 Support for Python 2.6 is dropped in this release. 
Highlights ~~~~~~~~~~ - Connected components now return generators - new functions including + enumerate_all_cliques, greedy_coloring, edge_dfs, find_cycle, immediate_dominators, harmonic_centrality + Hopcroft–Karp algorithm for maximum matchings + optimum branchings and arborescences. + all_simple_paths - pyparsing dependence removed from GML reader/parser - improved flow algorithms - new generators related to expander graphs. - new generators for multipartite graphs, nonisomorphic trees, circulant graphs - allow graph subclasses to use dict-like objects in place of dicts - added ordered graph subclasses - pandas DataFrame read/write added. - data keyword in G.edges() allows requesting an edge attribute directly - expanded layout flexibility for node subsets - Kanevsky's algorithm for cut sets and k_components - power function for graphs - approximation of node connectivity - transitive closure, triadic census and antichains - quotient graphs and minors - longest_path for DAGs - modularity matrix routines API changes ~~~~~~~~~~~ See :doc:`api_1.10`. NetworkX 1.9.1 -------------- Release date: 13 September 2014 Bugfix release for minor installation and documentation issues. NetworkX 1.9 ------------ Release date: 21 June 2014 Support for Python 3.1 is dropped in this release. Highlights ~~~~~~~~~~ - Completely rewritten maximum flow and flow-based connectivity algorithms with backwards incompatible interfaces - Community graph generators - Stoer–Wagner minimum cut algorithm - Linear-time Eulerian circuit algorithm - Linear algebra package changed to use SciPy sparse matrices - Algebraic connectivity, Fiedler vector, spectral ordering algorithms - Link prediction algorithms - Goldberg–Radzik shortest path algorithm - Semiconnected graph and tree recognition algorithms API changes ~~~~~~~~~~~ See :doc:`api_1.9`. NetworkX 1.8.1 -------------- Release date: 4 August 2013 Bugfix release for missing files in source packaging.
NetworkX 1.8 ------------ Release date: 28 July 2013 Highlights ~~~~~~~~~~ - Faster (linear-time) graphicality tests and Havel-Hakimi graph generators - Directed Laplacian matrix generator - Katz centrality algorithm - Functions to generate all simple paths - Improved shapefile reader - More flexible weighted projection of bipartite graphs - Faster topological sort, descendants and ancestors of DAGs - Scaling parameter for force-directed layout Bug fixes ~~~~~~~~~ - Error with average weighted connectivity for digraphs, correct normalized laplacian with self-loops, load betweenness for single node graphs, isolated nodes missing from dfs/bfs trees, normalize HITS using l1, handle density of graphs with self loops - Cleaner handling of current figure status with Matplotlib, Pajek files now don't write troublesome header line, default alpha value for GEXF files, read curved edges from yEd GraphML For full details of the issues closed for this release (added features and bug fixes) see: https://github.com/networkx/networkx/issues?milestone=1&page=1&state=closed API changes ~~~~~~~~~~~ See :doc:`api_1.8` NetworkX 1.7 ------------ Release date: 4 July 2012 Highlights ~~~~~~~~~~ - New functions for k-clique community finding, flow hierarchy, union, disjoint union, compose, and intersection operators that work on lists of graphs, and creating the biadjacency matrix of a bipartite graph. - New approximation algorithms for dominating set, edge dominating set, independent set, max clique, and min-weighted vertex cover. - Many bug fixes and other improvements.
For full details of the tickets closed for this release (added features and bug fixes) see: https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.7 API changes ~~~~~~~~~~~ See :doc:`api_1.7` NetworkX 1.6 ------------ Release date: 20 November 2011 Highlights ~~~~~~~~~~ New functions for finding articulation points, generating random bipartite graphs, constructing adjacency matrix representations, forming graph products, computing assortativity coefficients, measuring subgraph centrality and communicability, finding k-clique communities, and writing JSON format output. New examples for drawing with D3 Javascript library, and ordering matrices with the Cuthill-McKee algorithm. More memory efficient implementation of current-flow betweenness and new approximation algorithms for current-flow betweenness and shortest-path betweenness. Simplified handling of "weight" attributes for algorithms that use weights/costs/values. See :doc:`api_1.6`. Updated all code to work with the PyPy Python implementation http://pypy.org which produces faster performance on many algorithms. 
For full details of the tickets closed for this release (added features and bug fixes) see: https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.6 API changes ~~~~~~~~~~~ See :doc:`api_1.6` NetworkX 1.5 ------------ Release date: 4 June 2011 For full details of the tickets closed for this release see: https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.5 Highlights ~~~~~~~~~~ New features ~~~~~~~~~~~~ - Algorithms for :mod:`generating ` and :mod:`analyzing ` bipartite graphs - :mod:`Maximal independent set ` algorithm - :mod:`Erdős-Gallai graphical degree sequence test ` - :mod:`Negative edge cycle test ` - More memory efficient :mod:`Dijkstra path length ` with cutoff parameter - :mod:`Weighted clustering coefficient ` - Read and write version 1.2 of :mod:`GEXF reader ` format - :mod:`Neighbor degree correlation ` that handle subsets of nodes - :mod:`In-place node relabeling ` - Many 'weighted' graph algorithms now take optional parameter to use specified edge attribute (default='weight') (ticket https://networkx.lanl.gov/trac/ticket/509) - Test for :mod:`distance regular ` graphs - Fast :mod:`directed Erdős-Renyi graph ` generator - Fast :mod:`expected degree graph ` generator - :mod:`Navigable small world ` generator - :mod:`Waxman model ` generator - :mod:`Geographical threshold graph ` generator - :mod:`Karate Club, Florentine Families, and Davis' Women's Club ` graphs API changes ~~~~~~~~~~~ See :doc:`api_1.5` Bug fixes ~~~~~~~~~ - Fix edge handling for multigraphs in networkx/graphviz interface (ticket https://networkx.lanl.gov/trac/ticket/507) - Update networkx/pydot interface for new versions of pydot (ticket https://networkx.lanl.gov/trac/ticket/506) (ticket https://networkx.lanl.gov/trac/ticket/535) - Fix negative cycle handling in Bellman-Ford (ticket https://networkx.lanl.gov/trac/ticket/502) - Write more attributes with GraphML and GML formats (ticket 
https://networkx.lanl.gov/trac/ticket/480) - Handle white space better in read_edgelist (ticket https://networkx.lanl.gov/trac/ticket/513) - Better parsing of Pajek format files (ticket https://networkx.lanl.gov/trac/ticket/524) (ticket https://networkx.lanl.gov/trac/ticket/542) - Isolates functions work with directed graphs (ticket https://networkx.lanl.gov/trac/ticket/526) - Faster conversion to numpy matrices (ticket https://networkx.lanl.gov/trac/ticket/529) - Add graph['name'] and use properties to access Graph.name (ticket https://networkx.lanl.gov/trac/ticket/544) - Topological sort confused None and 0 (ticket https://networkx.lanl.gov/trac/ticket/546) - GEXF writer mishandled weight=0 (ticket https://networkx.lanl.gov/trac/ticket/550) - Speedup in SciPy version of PageRank (ticket https://networkx.lanl.gov/trac/ticket/554) - Numpy PageRank node order incorrect + speedups (ticket https://networkx.lanl.gov/trac/ticket/555) NetworkX 1.4 ------------ Release date: 23 January 2011 New features ~~~~~~~~~~~~ - :mod:`k-shell,k-crust,k-corona ` - :mod:`read GraphML files from yEd ` - :mod:`read/write GEXF format files ` - :mod:`find cycles in a directed graph ` - :mod:`DFS ` and :mod:`BFS ` algorithms - :mod:`chordal graph functions ` - :mod:`Prim's algorithm for minimum spanning tree ` - :mod:`r-ary tree generator ` - :mod:`rich club coefficient ` - NumPy matrix version of :mod:`Floyd's algorithm for all-pairs shortest path ` - :mod:`read GIS shapefiles ` - :mod:`functions to get and set node and edge attributes ` - and more, see https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.4 API changes ~~~~~~~~~~~ - :mod:`gnp_random_graph() ` now takes a directed=True|False keyword instead of create_using - :mod:`gnm_random_graph() ` now takes a directed=True|False keyword instead of create_using Bug fixes ~~~~~~~~~ - see https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.4 NetworkX 1.3 ------------ 
Release date: 28 August 2010 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - Works with Python versions 2.6, 2.7, 3.1, and 3.2 (but not 2.4 and 2.5). - :mod:`Minimum cost flow algorithms ` - :mod:`Bellman-Ford shortest paths ` - :mod:`GraphML reader and writer ` - :mod:`More exception/error types ` - Updated many tests to unittest style. Run with: "import networkx; networkx.test()" (requires nose testing package) - and more, see https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.3 API changes ~~~~~~~~~~~ - :mod:`minimum_spanning_tree() now returns a NetworkX Graph (a tree or forest) ` Bug fixes ~~~~~~~~~ - see https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.3 NetworkX 1.2 ------------ Release date: 28 July 2010 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - :mod:`Ford-Fulkerson max flow and min cut ` - :mod:`Closeness vitality ` - :mod:`Eulerian circuits ` - :mod:`Functions for isolates ` - :mod:`Simpler s_max generator ` - Compatible with IronPython-2.6 - Improved testing functionality: import networkx; networkx.test() tests entire package and skips tests with missing optional packages - All tests work with Python-2.4 - and more, see https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.2 NetworkX 1.1 ------------ Release date: 21 April 2010 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - :mod:`Algorithm for finding a basis for graph cycles ` - :mod:`Blockmodeling ` - :mod:`Assortativity and mixing matrices ` - :mod:`in-degree and out-degree centrality ` - :mod:`Attracting components ` and :mod:`condensation `. 
- :mod:`Weakly connected components ` - :mod:`Simpler interface to shortest path algorithms ` - :mod:`Edgelist format to read and write data with attributes ` - :mod:`Attribute matrices ` - :mod:`GML reader for nested attributes ` - Current-flow (random walk) :mod:`betweenness ` and :mod:`closeness `. - :mod:`Directed configuration model `, and :mod:`directed random graph model `. - Improved documentation of drawing, shortest paths, and other algorithms - Many more tests, can be run with "import networkx; networkx.test()" - and much more, see https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.1 API changes ~~~~~~~~~~~ Returning dictionaries ********************** Several of the algorithms and the degree() method now return dictionaries keyed by node instead of lists. In some cases there was a with_labels keyword which is no longer necessary. For example, >>> G=nx.Graph() >>> G.add_edge('a','b') >>> G.degree() # returns dictionary of degree keyed by node {'a': 1, 'b': 1} Asking for the degree of a single node still returns a single number >>> G.degree('a') 1 The following now return dictionaries by default (instead of lists) and the with_labels keyword has been removed: - :meth:`Graph.degree`, :meth:`MultiGraph.degree`, :meth:`DiGraph.degree`, :meth:`DiGraph.in_degree`, :meth:`DiGraph.out_degree`, :meth:`MultiDiGraph.degree`, :meth:`MultiDiGraph.in_degree`, :meth:`MultiDiGraph.out_degree`. 
- :func:`clustering`, :func:`triangles` - :func:`node_clique_number`, :func:`number_of_cliques`, :func:`cliques_containing_node` - :func:`eccentricity` The following now return dictionaries by default (instead of lists) - :func:`pagerank` - :func:`hits` Adding nodes ************ add_nodes_from now accepts (node,attrdict) two-tuples >>> G=nx.Graph() >>> G.add_nodes_from([(1,{'color':'red'})]) Examples ~~~~~~~~ - `Mayvi2 drawing `_ - `Blockmodel `_ - `Sampson's monastery `_ - `Ego graph `_ Bug fixes ~~~~~~~~~ - Support graph attributes with union, intersection, and other graph operations - Improve subgraph speed (and related algorithms such as connected_components_subgraphs()) - Handle multigraphs in more operators (e.g. union) - Handle double-quoted labels with pydot - Normalize betweenness_centrality for undirected graphs correctly - Normalize eigenvector_centrality by l2 norm - :func:`read_gml` now returns multigraphs NetworkX 1.0.1 -------------- Release date: 11 Jan 2010 See: https://networkx.lanl.gov/trac/timeline Bug fix release for missing setup.py in manifest. NetworkX 1.0 ------------ Release date: 8 Jan 2010 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ This release has significant changes to parts of the graph API to allow graph, node, and edge attributes. See http://networkx.lanl.gov//reference/api_changes.html - Update Graph, DiGraph, and MultiGraph classes to allow attributes. 
- Default edge data is now an empty dictionary (was the integer 1) - Difference and intersection operators - Average shortest path - A* (A-Star) algorithm - PageRank, HITS, and eigenvector centrality - Read Pajek files - Line graphs - Minimum spanning tree (Kruskal's algorithm) - Dense and sparse Fruchterman-Reingold layout - Random clustered graph generator - Directed scale-free graph generator - Faster random regular graph generator - Improved edge color and label drawing with Matplotlib - and much more, see https://networkx.lanl.gov/trac/query?status=closed&group=milestone&milestone=networkx-1.0 Examples ~~~~~~~~ - Update to work with networkx-1.0 API - Graph subclass example NetworkX 0.99 ------------- Release date: 18 November 2008 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ This release has significant changes to parts of the graph API. See http://networkx.lanl.gov//reference/api_changes.html - Update Graph and DiGraph classes to use weighted graphs as default Change in API for performance and code simplicity. - New MultiGraph and MultiDiGraph classes (replace XGraph and XDiGraph) - Update to use Sphinx documentation system http://networkx.lanl.gov/ - Developer site at https://networkx.lanl.gov/trac/ - Experimental LabeledGraph and LabeledDiGraph - Moved package and file layout to subdirectories. Bug fixes ~~~~~~~~~ - handle root= option to draw_graphviz correctly Examples ~~~~~~~~ - Update to work with networkx-0.99 API - Drawing examples now use matplotlib.pyplot interface - Improved drawings in many examples - New examples - see http://networkx.lanl.gov/examples/ NetworkX 0.37 --------------- Release date: 17 August 2008 See: https://networkx.lanl.gov/trac/timeline NetworkX now requires Python 2.4 or later for full functionality. 
New features ~~~~~~~~~~~~ - Edge coloring and node line widths with Matplotlib drawings - Update pydot functions to work with pydot-1.0.2 - Maximum-weight matching algorithm - Ubigraph interface for 3D OpenGL layout and drawing - Pajek graph file format reader and writer - p2g graph file format reader and writer - Secondary sort in topological sort Bug fixes ~~~~~~~~~ - Better edge data handling with GML writer - Edge betweenness fix for XGraph with default data of None - Handle Matplotlib version strings (allow "pre") - Interface to PyGraphviz (to_agraph()) now handles parallel edges - Fix bug in copy from XGraph to XGraph with multiedges - Use SciPy sparse lil matrix format instead of coo format - Clear up ambiguous cases for Barabasi-Albert model - Better care of color maps with Matplotlib when drawing colored nodes and edges - Fix error handling in layout.py Examples ~~~~~~~~ - Ubigraph examples showing 3D drawing NetworkX 0.36 --------------- Release date: 13 January 2008 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - GML format graph reader, tests, and example (football.py) - edge_betweenness() and load_betweenness() Bug fixes ~~~~~~~~~ - remove obsolete parts of pygraphviz interface - improve handling of Matplotlib version strings - write_dot() now writes parallel edges and self loops - is_bipartite() and bipartite_color() fixes - configuration model speedup using random.shuffle() - convert with specified nodelist now works correctly - vf2 isomorphism checker updates NetworkX 0.35.1 --------------- Release date: 27 July 2007 See: https://networkx.lanl.gov/trac/timeline Small update to fix import readwrite problem and maintain Python2.3 compatibility. NetworkX 0.35 ------------- Release date: 22 July 2007 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - algorithms for strongly connected components. 
- Brandes betweenness centrality algorithm (weighted and unweighted versions) - closeness centrality for weighted graphs - dfs_preorder, dfs_postorder, dfs_tree, dfs_successor, dfs_predecessor - readers for GraphML, LEDA, sparse6, and graph6 formats. - allow arguments in graphviz_layout to be passed directly to graphviz Bug fixes ~~~~~~~~~ - more detailed installation instructions - replaced dfs_preorder,dfs_postorder (see search.py) - allow initial node positions in spectral_layout - report no error on attempting to draw empty graph - report errors correctly when using tuples as nodes #114 - handle conversions from incomplete dict-of-dict data NetworkX 0.34 ------------- Release date: 12 April 2007 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - benchmarks for graph classes - Brandes betweenness centrality algorithm - Dijkstra predecessor and distance algorithm - xslt to convert DIA graphs to NetworkX - number_of_edges(u,v) counts edges between nodes u and v - run tests with python setup_egg.py test (needs setuptools) else use python -c "import networkx; networkx.test()" - is_isomorphic() that uses vf2 algorithm Bug fixes ~~~~~~~~~ - speedups of neighbors() - simplified Dijkstra's algorithm code - better exception handling for shortest paths - get_edge(u,v) returns None (instead of exception) if no edge u-v - floyd_warshall_array fixes for negative weights - bad G467, docs, and unittest fixes for graph atlas - don't put nans in numpy or scipy sparse adjacency matrix - handle get_edge() exception (return None if no edge) - remove extra kwds arguments in many places - no multi counting edges in conversion to dict of lists for multigraphs - allow passing tuple to get_edge() - bad parameter order in node/edge betweenness - edge betweenness doesn't fail with XGraph - don't throw exceptions for nodes not in graph (silently ignore instead) in edges_* and degree_* NetworkX 0.33 ------------- Release date: 27 November 2006 See: 
https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - draw edges with specified colormap - more efficient version of Floyd's algorithm for all pairs shortest path - use numpy only, Numeric is deprecated - include tests in source package (networkx/tests) - include documentation in source package (doc) - tests can now be run with >>> import networkx >>> networkx.test() Bug fixes ~~~~~~~~~ - read_gpickle now works correctly with Windows - refactored large modules into smaller code files - degree(nbunch) now returns degrees in same order as nbunch - degree() now works for multiedges=True - update node_boundary and edge_boundary for efficiency - edited documentation for graph classes, now mostly in info.py Examples ~~~~~~~~ - Draw edges with colormap NetworkX 0.32 ------------- Release date: 29 September 2006 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - Update to work with numpy-1.0x - Make egg usage optional: use python setup_egg.py bdist_egg to build egg - Generators and functions for bipartite graphs - Experimental classes for trees and forests - Support for new pygraphviz update (in nx_agraph.py), see http://networkx.lanl.gov/pygraphviz/ for pygraphviz details Bug fixes ~~~~~~~~~ - Handle special cases correctly in triangles function - Typos in documentation - Handle special cases in shortest_path and shortest_path_length, allow cutoff parameter for maximum depth to search - Update examples: erdos_renyi.py, miles.py, roget.py, eigenvalues.py Examples ~~~~~~~~ - Expected degree sequence - New pygraphviz interface NetworkX 0.31 ------------- Release date: 20 July 2006 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - arbitrary node relabeling (use relabel_nodes) - conversion of NetworkX graphs to/from Python dict/list types, numpy matrix or array types, and scipy_sparse_matrix types - generator for random graphs with given expected degree sequence Bug fixes ~~~~~~~~~ - Allow drawing graphs with no edges
using pylab - Use faster heapq in dijkstra - Don't complain if X windows is not available Examples ~~~~~~~~ - update drawing examples NetworkX 0.30 ------------- Release date: 23 June 2006 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - update to work with Python 2.5 - bidirectional version of shortest_path and Dijkstra - single_source_shortest_path and all_pairs_shortest_path - s-metric and experimental code to generate maximal s-metric graph - double_edge_swap and connected_double_edge_swap - Floyd's algorithm for all pairs shortest path - read and write unicode graph data to text files - read and write YAML format text files, http://yaml.org Bug fixes ~~~~~~~~~ - speed improvements (faster version of subgraph, is_connected) - added cumulative distribution and modified discrete distribution utilities - report error if DiGraphs are sent to connected_components routines - removed with_labels keywords for many functions where it was causing confusion - function name changes in shortest_path routines - saner internal handling of nbunch (node bunches), raise an exception if an nbunch isn't a node or iterable - better keyword handling in io.py allows reading multiple graphs - don't mix Numeric and numpy arrays in graph layouts and drawing - avoid automatically rescaling matplotlib axes when redrawing graph layout Examples ~~~~~~~~ - unicode node labels NetworkX 0.29 ------------- Release date: 28 April 2006 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - Algorithms for betweenness, eigenvalues, eigenvectors, and spectral projection for threshold graphs - Use numpy when available - dense_gnm_random_graph generator - Generators for some directed graphs: GN, GNR, and GNC by Krapivsky and Redner - Grid graph generators now label by index tuples. Helper functions for manipulating labels. 
- relabel_nodes_with_function Bug fixes ~~~~~~~~~ - Betweenness centrality now correctly uses Brandes definition and has normalization option outside main loop - Empty graph now labeled as empty_graph(n) - shortest_path_length used python2.4 generator feature - degree_sequence_tree off by one error caused nonconsecutive labeling - periodic_grid_2d_graph removed in favor of grid_2d_graph with periodic=True NetworkX 0.28 ------------- Release date: 13 March 2006 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - Option to construct Laplacian with rows and columns in specified order - Option in convert_node_labels_to_integers to use sorted order - predecessor(G,n) function that returns dictionary of nodes with predecessors from breadth-first search of G starting at node n. https://networkx.lanl.gov/trac/ticket/26 Examples ~~~~~~~~ - Formation of giant component in binomial_graph: - Chess masters matches: - Gallery https://networkx.lanl.gov/gallery.html Bug fixes ~~~~~~~~~ - Adjusted names for random graphs. 
+ erdos_renyi_graph=binomial_graph=gnp_graph: n nodes with edge probability p + gnm_graph: n nodes and m edges + fast_gnp_random_graph: gnp for sparse graphs (small p) - Documentation contains correct spelling of Barabási, Bollobás, Erdős, and Rényi in UTF-8 encoding - Increased speed of connected_components and related functions by using faster BFS algorithm in networkx.paths https://networkx.lanl.gov/trac/ticket/27 - XGraph and XDiGraph with multiedges=True produced error on delete_edge - Cleaned up docstring errors - Normalize names of some graphs to produce strings that represent calling sequence NetworkX 0.27 ------------- Release date: 5 February 2006 See: https://networkx.lanl.gov/trac/timeline New features ~~~~~~~~~~~~ - sparse_binomial_graph: faster graph generator for sparse random graphs - read/write routines in io.py now handle XGraph() type and gzip and bzip2 files - optional mapping of type for read/write routine to allow on-the-fly conversion of node and edge datatype on read - Substantial changes related to digraphs and definitions of neighbors() and edges(). For digraphs edges=out_edges. Neighbors now returns a list of neighboring nodes with possible duplicates for graphs with parallel edges See https://networkx.lanl.gov/trac/ticket/24 - Addition of out_edges, in_edges and corresponding out_neighbors and in_neighbors for digraphs. For digraphs edges=out_edges. Examples ~~~~~~~~ - Minard's data for Napoleon's Russian campaign Bug fixes ~~~~~~~~~ - XGraph(multiedges=True) returns a copy of the list of edges for get_edge() NetworkX 0.26 ------------- Release date: 6 January 2006 New features ~~~~~~~~~~~~ - Simpler interface to drawing with pylab - G.info(node=None) function returns short information about graph or node - adj_matrix now takes optional nodelist to force ordering of rows/columns in matrix - optional pygraphviz and pydot interface to graphviz is now callable as "graphviz" with pygraphviz preferred. Use draw_graphviz(G). 
Examples ~~~~~~~~ - Several new examples showing how draw to graphs with various properties of nodes, edges, and labels Bug fixes ~~~~~~~~~ - Default data type for all graphs is now None (was the integer 1) - add_nodes_from now won't delete edges if nodes added already exist - Added missing names to generated graphs - Indexes for nodes in graphs start at zero by default (was 1) NetworkX 0.25 ------------- Release date: 5 December 2005 New features ~~~~~~~~~~~~ - Uses setuptools for installation http://peak.telecommunity.com/DevCenter/setuptools - Improved testing infrastructure, can now run python setup.py test - Added interface to draw graphs with pygraphviz https://networkx.lanl.gov/pygraphviz/ - is_directed() function call Examples ~~~~~~~~ - Email example shows how to use XDiGraph with Python objects as edge data Documentation ~~~~~~~~~~~~~ - Reformat menu, minor changes to Readme, better stylesheet Bug fixes ~~~~~~~~~ - use create_using= instead of result= keywords for graph types in all cases - missing weights for degree 0 and 1 nodes in clustering - configuration model now uses XGraph, returns graph with identical degree sequence as input sequence - fixed Dijkstra priority queue - fixed non-recursive toposort and is_directed_acyclic graph NetworkX 0.24 ------------- Release date: 20 August 2005 Bug fixes ~~~~~~~~~ - Update of Dijkstra algorithm code - dfs_successor now calls proper search method - Changed to list comprehension in DiGraph.reverse() for python2.3 compatibility - Barabasi-Albert graph generator fixed - Attempt to add self loop should add node even if parallel edges not allowed NetworkX 0.23 ------------- Release date: 14 July 2005 The NetworkX web locations have changed: http://networkx.lanl.gov/ - main documentation site http://networkx.lanl.gov/svn/ - subversion source code repository https://networkx.lanl.gov/trac/ - bug tracking and info Important Change ~~~~~~~~~~~~~~~~ The naming conventions in NetworkX have changed. 
The package name "NX" is now "networkx". The suggested ways to import the NetworkX package are - import networkx - import networkx as NX - from networkx import * New features ~~~~~~~~~~~~ - DiGraph reverse - Graph generators + watts_strogatz_graph now does rewiring method + old watts_strogatz_graph->newman_watts_strogatz_graph Examples ~~~~~~~~ Documentation ~~~~~~~~~~~~~ - Changed to reflect NX-networkx change - main site is now https://networkx.lanl.gov/ Bug fixes ~~~~~~~~~ - Fixed logic in io.py for reading DiGraphs. - Path based centrality measures (betweenness, closeness) modified so they work on graphs that are not connected and produce the same result as if each connected component were considered separately. NetworkX 0.22 ------------- Release date: 17 June 2005 New features ~~~~~~~~~~~~ - Topological sort, testing for directed acyclic graphs (DAGs) - Dijkstra's algorithm for shortest paths in weighted graphs - Multidimensional layout with dim=n for drawing - 3d rendering demonstration with vtk - Graph generators + random_powerlaw_tree + dorogovtsev_goltsev_mendes_graph Examples ~~~~~~~~ - Kevin Bacon movie actor graph: Examples/kevin_bacon.py - Compute eigenvalues of graph Laplacian: Examples/eigenvalues.py - Atlas of small graphs: Examples/atlas.py Documentation ~~~~~~~~~~~~~ - Rewrite of setup scripts to install documentation and tests in documentation directory specified Bug fixes ~~~~~~~~~ - Handle calls to edges() with non-node, non-iterable items. 
- truncated_tetrahedral_graph was just plain wrong - Speedup of betweenness_centrality code - bfs_path_length now returns correct lengths - Catch error if target of search not in connected component of source - Code cleanup to label internal functions with _name - Changed import statement lines to always use "import NX" to protect name-spaces - Other minor bug-fixes and testing added networkx-1.11/doc/source/reference/api_1.0.rst0000644000175000017500000002025412637544450021074 0ustar aricaric00000000000000********************************* Version 1.0 notes and API changes ********************************* We have made some significant API changes, detailed below, to add functionality and clarity. This page reflects changes from networkx-0.99 to networkx-1.0. For changes from earlier versions to networkx-0.99 see :doc:`Version 0.99 API changes `. Version 1.0 requires Python 2.4 or greater. Please send comments and questions to the networkx-discuss mailing list: http://groups.google.com/group/networkx-discuss . Version numbering ================= In the future we will use a more standard release numbering system with major.minor[build] labels where major and minor are numbers and [build] is a label such as "dev1379" to indicate a development version or "rc1" to indicate a release candidate. We plan on sticking closer to a time-based release schedule with smaller incremental changes released on a roughly quarterly basis. The graph classes API will remain fixed, unless we determine there are serious bugs or other defects in the existing classes, until networkx-2.0 is released at some time in the future. Changes in base classes ======================= The most significant changes in are in the graph classes. All of the graph classes now allow optional graph, node, and edge attributes. Those attributes are stored internally in the graph classes as dictionaries and can be accessed simply like Python dictionaries in most cases. 
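Since the attributes described above are stored as ordinary dictionaries, the model can be sketched without NetworkX at all. The names below (``graph_attrs``, ``node_attrs``, ``adj``) are illustrative only and are not part of the NetworkX API:

```python
# Illustrative sketch of the networkx-1.0 attribute model using plain
# Python dictionaries -- not the actual NetworkX implementation.

graph_attrs = {'region': 'Africa'}        # graph-level key=value attributes
node_attrs = {1: {'time': '5pm'}, 3: {}}  # one attribute dict per node
adj = {1: {2: {'weight': 3.14159}},       # adjacency: node -> neighbor -> edge dict
       2: {1: {'weight': 3.14159}}}       # undirected, so both directions appear

# Access works exactly like ordinary dictionary access:
graph_attrs['color'] = 'green'
node_attrs[1]['room'] = 714
adj[1][2]['weight'] = 4.7

print(adj[1][2]['weight'])
```

In NetworkX itself these three dictionaries correspond to ``G.graph``, ``G.node`` and the internal adjacency structure, as the sections below describe.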
Graph attributes ---------------- Each graph keeps a dictionary of key=value attributes in the member G.graph. These attributes can be accessed directly using G.graph or added at instantiation using keyword arguments. >>> G=nx.Graph(region='Africa') >>> G.graph['color']='green' >>> G.graph {'color': 'green', 'region': 'Africa'} Node attributes --------------- Each node has a corresponding dictionary of attributes. Adding attributes to nodes is optional. Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> G.nodes(data=True) [(1, {'room': 714, 'time': '5pm'}), (3, {'time': '2pm'})] Edge attributes --------------- Each edge has a corresponding dictionary of attributes. The default edge data is now an empty dictionary of attributes and adding attributes to edges is optional. A common use case is to add a weight attribute to an edge: >>> G.add_edge(1,2,weight=3.14159) Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2]['weight'] = 4.7 >>> G.edge[1][2]['weight'] = 4 Methods changed --------------- Graph(), DiGraph(), MultiGraph(), MultiDiGraph() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Now takes optional keyword=value attributes on initialization. >>> G=nx.Graph(year='2009',city='New York') add_node() ^^^^^^^^^^ Now takes optional keyword=value attributes or a dictionary of attributes. >>> G.add_node(1,room=714) add_nodes_from() ^^^^^^^^^^^^^^^^ Now takes optional keyword=value attributes or a dictionary of attributes applied to all affected nodes. >>> G.add_nodes_from([1,2],time='2pm') # all nodes have same attribute add_edge() ^^^^^^^^^^ Now takes optional keyword=value attributes or a dictionary of attributes. 
>>> G.add_edge(1, 2, weight=4.7 ) add_edges_from() ^^^^^^^^^^^^^^^^ Now takes optional keyword=value attributes or a dictionary of attributes applied to all affected edges. >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) nodes() and nodes_iter() ^^^^^^^^^^^^^^^^^^^^^^^^ The new keyword data=True|False determines whether to return two-tuples (n, dict) of node and node attribute dictionary (when True). >>> G=nx.Graph([(1,2),(3,4)]) >>> G.nodes(data=True) [(1, {}), (2, {}), (3, {}), (4, {})] copy() ^^^^^^ Now returns a deep copy of the graph (copies all underlying data and attributes for nodes and edges). Use the class initializer to make a shallow copy: >>> G=nx.Graph() >>> G_shallow=nx.Graph(G) # shallow copy >>> G_deep=G.copy() # deep copy to_directed(), to_undirected() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Now return a deep copy of the graph (copies all underlying data and attributes for nodes and edges). Use the class initializer to make a shallow copy: >>> G=nx.Graph() >>> D_shallow=nx.DiGraph(G) # shallow copy >>> D_deep=G.to_directed() # deep copy subgraph() ^^^^^^^^^^ With copy=True, now returns a deep copy of the graph (copies all underlying data and attributes for nodes and edges). >>> G=nx.Graph() >>> # note: copy keyword deprecated in networkx>1.0 >>> # H=G.subgraph([],copy=True) # deep copy of all data add_cycle(), add_path(), add_star() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Now take optional keyword=value attributes or a dictionary of attributes which are applied to all edges affected by the method. >>> G=nx.Graph() >>> G.add_path([0,1,2,3],width=3.2) Methods removed --------------- delete_node() ^^^^^^^^^^^^^ The preferred name is now remove_node(). delete_nodes_from() ^^^^^^^^^^^^^^^^^^^ No longer raises an exception on an attempt to delete a node not in the graph. The preferred name is now remove_nodes_from().
delete_edge() ^^^^^^^^^^^^^ Now raises an exception on an attempt to delete an edge not in the graph. The preferred name is now remove_edge(). delete_edges_from() ^^^^^^^^^^^^^^^^^^^ The preferred name is now remove_edges_from(). has_neighbor() ^^^^^^^^^^^^^^ Use has_edge(). get_edge() ^^^^^^^^^^ Renamed to get_edge_data(). Returns the edge attribute dictionary. The fastest way to get edge data for edge (u,v) is to use G[u][v] instead of G.get_edge_data(u,v). Members removed --------------- directed, multigraph, weighted ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Use the methods G.is_directed() and G.is_multigraph(). All graphs are now weighted graphs if they have numeric values in the 'weight' edge attribute. Methods added ------------- add_weighted_edges_from() ^^^^^^^^^^^^^^^^^^^^^^^^^ Convenience method to add weighted edges to a graph using a list of 3-tuples (u,v,weight). get_edge_data() ^^^^^^^^^^^^^^^ Renamed from get_edge(). The fastest way to get edge data for edge (u,v) is to use G[u][v] instead of G.get_edge_data(u,v). is_directed() ^^^^^^^^^^^^^ Replaces the member G.directed. is_multigraph() ^^^^^^^^^^^^^^^ Replaces the member G.multigraph. Classes Removed --------------- LabeledGraph, LabeledDiGraph ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ These classes have been folded into the regular classes. UbiGraph ^^^^^^^^ Removed as the ubigraph platform is no longer being supported. Additional functions/generators =============================== ego_graph, stochastic_graph, PageRank algorithm, HITS algorithm, GraphML writer, freeze, is_frozen, A* algorithm, directed scale-free generator, random clustered graph. Converting your existing code to networkx-1.0 ============================================= Weighted edges -------------- Edge information is now stored in an attribute dictionary, so all edge data must be given a key to identify it. There is currently only one standard/reserved key, 'weight', which is used by algorithms and functions that use weighted edges. The associated value should be numeric.
All other keys are available for users to assign as needed. >>> G=nx.Graph() >>> G.add_edge(1,2,weight=3.1415) # add the edge 1-2 with a weight >>> G[1][2]['weight']=2.3 # set the weight to 2.3 Similarly, for direct access the edge data, use the key of the edge data to retrieve it. >>> w = G[1][2]['weight'] All NetworkX algorithms that require/use weighted edges now use the 'weight' edge attribute. If you have existing algorithms that assumed the edge data was numeric, you should replace G[u][v] and G.get_edge(u,v) with G[u][v]['weight']. An idiom for getting a weight for graphs with or without an assigned weight key is >>> w= G[1][2].get('weight',1) # set w to 1 if there is no 'weight' key networkx-1.11/doc/source/reference/introduction.rst0000644000175000017500000003060512637544450022467 0ustar aricaric00000000000000Introduction ~~~~~~~~~~~~ .. currentmodule:: networkx .. only:: html NetworkX provides data structures for graphs (or networks) along with graph algorithms, generators, and drawing tools. The structure of NetworkX can be seen by the organization of its source code. The package provides classes for graph objects, generators to create standard graphs, IO routines for reading in existing datasets, algorithms to analyse the resulting networks and some basic drawing tools. Most of the NetworkX API is provided by functions which take a graph object as an argument. Methods of the graph object are limited to basic manipulation and reporting. This provides modularity of code and documentation. It also makes it easier for newcomers to learn about the package in stages. The source code for each module is meant to be easy to read and reading this Python code is actually a good way to learn more about network algorithms, but we have put a lot of effort into making the documentation sufficient and friendly. If you have suggestions or questions please contact us by joining the `NetworkX Google group `_. 
Classes are named using CamelCase (capital letters at the start of each word). Functions, methods and variable names are lower_case_underscore (lowercase with an underscore representing a space between words). NetworkX Basics --------------- After starting Python, import the networkx module with (the recommended way) >>> import networkx as nx To save repetition, in the documentation we assume that NetworkX has been imported this way. If importing networkx fails, it means that Python cannot find the installed module. Check your installation and your PYTHONPATH. The following basic graph types are provided as Python classes: :class:`Graph` This class implements an undirected graph. It ignores multiple edges between two nodes. It does allow self-loop edges between a node and itself. :class:`DiGraph` Directed graphs, that is, graphs with directed edges. Provides operations common to directed graphs (a subclass of Graph). :class:`MultiGraph` A flexible graph class that allows multiple undirected edges between pairs of nodes. The additional flexibility leads to some degradation in performance, though usually not significant. :class:`MultiDiGraph` A directed version of a MultiGraph. Empty graph-like objects are created with >>> G=nx.Graph() >>> G=nx.DiGraph() >>> G=nx.MultiGraph() >>> G=nx.MultiDiGraph() All graph classes allow any :term:`hashable` object as a node. Hashable objects include strings, tuples, integers, and more. Arbitrary edge attributes such as weights and labels can be associated with an edge. The graph internal data structures are based on an adjacency list representation and implemented using Python :term:`dictionary` data structures. The graph adjacency structure is implemented as a Python dictionary of dictionaries; the outer dictionary is keyed by nodes to values that are themselves dictionaries keyed by neighboring node to the edge attributes associated with that edge.
This "dict-of-dicts" structure allows fast addition, deletion, and lookup of nodes and neighbors in large graphs. The underlying datastructure is accessed directly by methods (the programming interface "API") in the class definitions. All functions, on the other hand, manipulate graph-like objects solely via those API methods and not by acting directly on the datastructure. This design allows for possible replacement of the 'dicts-of-dicts'-based datastructure with an alternative datastructure that implements the same methods. Graphs ======= The first choice to be made when using NetworkX is what type of graph object to use. A graph (network) is a collection of nodes together with a collection of edges that are pairs of nodes. Attributes are often associated with nodes and/or edges. NetworkX graph objects come in different flavors depending on two main properties of the network: - Directed: Are the edges **directed**? Does the order of the edge pairs (u,v) matter? A directed graph is specified by the "Di" prefix in the class name, e.g. DiGraph(). We make this distinction because many classical graph properties are defined differently for directed graphs. - Multi-edges: Are multiple edges allowed between each pair of nodes? As you might imagine, multiple edges requires a different data structure, though tricky users could design edge data objects to support this functionality. We provide a standard data structure and interface for this type of graph using the prefix "Multi", e.g. MultiGraph(). The basic graph classes are named: :doc:`Graph `, :doc:`DiGraph`, :doc:`MultiGraph `, and :doc:`MultiDiGraph ` Nodes and Edges --------------- The next choice you have to make when specifying a graph is what kinds of nodes and edges to use. If the topology of the network is all you care about then using integers or strings as the nodes makes sense and you need not worry about edge data. 
If you have a data structure already in place to describe nodes you can simply use that structure as your nodes, provided it is :term:`hashable`. If it is not hashable you can use a unique identifier to represent the node and assign the data as a :term:`node attribute`. Edges often have data associated with them. Arbitrary data can be associated with edges as an :term:`edge attribute`. If the data is numeric and the intent is to represent a *weighted* graph then use the 'weight' keyword for the attribute. Some of the graph algorithms, such as Dijkstra's shortest path algorithm, use this attribute name to get the weight for each edge. Other attributes can be assigned to an edge by using keyword/value pairs when adding edges. You can use any keyword except 'weight' to name your attribute and can then easily query the edge data by that attribute keyword. Once you've decided how to encode the nodes and edges, and whether you have an undirected/directed graph with or without multiedges, you are ready to build your network. Graph Creation ============== NetworkX graph objects can be created in one of three ways: - Graph generators -- standard algorithms to create network topologies. - Importing data from pre-existing (usually file) sources. - Adding edges and nodes explicitly. Explicit addition and removal of nodes/edges is the easiest to describe. Each graph object supplies methods to manipulate the graph. For example, >>> import networkx as nx >>> G=nx.Graph() >>> G.add_edge(1,2) # adds edge with empty attribute dict >>> G.add_edge(2,3,weight=0.9) # specify edge data Edge attributes can be anything: >>> import math >>> G.add_edge('y','x',function=math.cos) >>> G.add_node(math.cos) # any hashable can be a node You can add many edges at one time: >>> elist=[('a','b',5.0),('b','c',3.0),('a','c',1.0),('c','d',7.3)] >>> G.add_weighted_edges_from(elist) See the :doc:`/tutorial/index` for more examples.
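The add_weighted_edges_from() call above consumes (u, v, weight) 3-tuples. As a rough sketch of the idea using plain dictionaries (the helper name ``add_weighted_edges`` is illustrative, not a NetworkX API):

```python
# Sketch: how a list of (u, v, weight) 3-tuples maps onto the
# dict-of-dicts-of-dicts adjacency structure (illustrative only).

def add_weighted_edges(adj, elist):
    """Store each (u, v, w) in both directions of an undirected adjacency dict."""
    for u, v, w in elist:
        adj.setdefault(u, {})[v] = {'weight': w}
        adj.setdefault(v, {})[u] = {'weight': w}

adj = {}
elist = [('a', 'b', 5.0), ('b', 'c', 3.0), ('a', 'c', 1.0), ('c', 'd', 7.3)]
add_weighted_edges(adj, elist)

print(adj['a']['b']['weight'])  # 5.0
print(sorted(adj['c']))         # ['a', 'b', 'd']
```

Each edge appears under both endpoints, which is why looking up either direction of an undirected edge works.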
Some basic graph operations such as union and intersection are described in the :ref:`Operators module` documentation. Graph generators such as binomial_graph and powerlaw_graph are provided in the :doc:`generators` subpackage. For importing network data from formats such as GML, GraphML, edge list text files see the :doc:`readwrite` subpackage. Graph Reporting =============== Class methods are used for the basic reporting functions neighbors, edges and degree. Reporting of lists is often needed only to iterate through that list so we supply iterator versions of many property reporting methods. For example edges() and nodes() have corresponding methods edges_iter() and nodes_iter(). Using these methods when you can will save memory and often time as well. The basic graph relationship of an edge can be obtained in two basic ways. One can look for neighbors of a node or one can look for edges incident to a node. We jokingly refer to people who focus on nodes/neighbors as node-centric and people who focus on edges as edge-centric. The designers of NetworkX tend to be node-centric and view edges as a relationship between nodes. You can see this by our avoidance of notation like G[u,v] in favor of G[u][v]. Most data structures for sparse graphs are essentially adjacency lists and so fit this perspective. In the end, of course, it doesn't really matter which way you examine the graph. G.edges() removes duplicate representations of each edge while G.neighbors(n) or G[n] is slightly faster but doesn't remove duplicates. Any properties that are more complicated than edges, neighbors and degree are provided by functions. For example nx.triangles(G,n) gives the number of triangles which include node n as a vertex. These functions are grouped in the code and documentation under the term :ref:`algorithms`. Algorithms ========== A number of graph algorithms are provided with NetworkX. 
These include shortest path, and breadth first search (see :ref:`traversal`), clustering and isomorphism algorithms and others. There are many that we have not developed yet too. If you implement a graph algorithm that might be useful for others please let us know through the `NetworkX Google group `_ or the Github `Developer Zone `_. As an example here is code to use Dijkstra's algorithm to find the shortest weighted path: >>> G=nx.Graph() >>> e=[('a','b',0.3),('b','c',0.9),('a','c',0.5),('c','d',1.2)] >>> G.add_weighted_edges_from(e) >>> print(nx.dijkstra_path(G,'a','d')) ['a', 'c', 'd'] Drawing ======= While NetworkX is not designed as a network layout tool, we provide a simple interface to drawing packages and some simple layout algorithms. We interface to the excellent Graphviz layout tools like dot and neato with the (suggested) pygraphviz package or the pydot interface. Drawing can be done using external programs or the Matplotlib Python package. Interactive GUI interfaces are possible though not provided. The drawing tools are provided in the module :ref:`drawing`. The basic drawing functions essentially place the nodes on a scatterplot using the positions in a dictionary or computed with a layout function. The edges are then lines between those dots. >>> G=nx.cubical_graph() >>> nx.draw(G) # default spring_layout >>> nx.draw(G,pos=nx.spectral_layout(G), nodecolor='r',edge_color='b') See the :doc:`examples` for more ideas. Data Structure ============== NetworkX uses a "dictionary of dictionaries of dictionaries" as the basic network data structure. This allows fast lookup with reasonable storage for large sparse networks. The keys are nodes so G[u] returns an adjacency dictionary keyed by neighbor to the edge attribute dictionary. The expression G[u][v] returns the edge attribute dictionary itself. A dictionary of lists would have also been possible, but not allowed fast edge detection nor convenient storage of edge data. 
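To see the difference concretely, here is a minimal side-by-side sketch (plain Python, illustrative only) of edge lookup in the two representations:

```python
# Sketch: edge lookup in dict-of-dicts vs. dict-of-lists (illustration only).

adj_dicts = {'A': {'B': {}}, 'B': {'A': {}, 'C': {}}, 'C': {'B': {}}}
adj_lists = {'A': ['B'], 'B': ['A', 'C'], 'C': ['B']}

# dict-of-dicts: two O(1) hash lookups, and the inner value is an
# attribute dictionary ready to hold edge data
has_edge = 'C' in adj_dicts['B']

# dict-of-lists: the second step is a linear scan of the neighbor list,
# and there is no natural place to attach edge data
has_edge_slow = 'C' in adj_lists['B']

print(has_edge, has_edge_slow)  # True True
```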
Advantages of dict-of-dicts-of-dicts data structure: - Find edges and remove edges with two dictionary look-ups. - Prefer to "lists" because of fast lookup with sparse storage. - Prefer to "sets" since data can be attached to edge. - G[u][v] returns the edge attribute dictionary. - ``n in G`` tests if node ``n`` is in graph G. - ``for n in G:`` iterates through the graph. - ``for nbr in G[n]:`` iterates through neighbors. As an example, here is a representation of an undirected graph with the edges ('A','B'), ('B','C') >>> G=nx.Graph() >>> G.add_edge('A','B') >>> G.add_edge('B','C') >>> print(G.adj) {'A': {'B': {}}, 'C': {'B': {}}, 'B': {'A': {}, 'C': {}}} The data structure gets morphed slightly for each base graph class. For DiGraph two dict-of-dicts-of-dicts structures are provided, one for successors and one for predecessors. For MultiGraph/MultiDiGraph we use a dict-of-dicts-of-dicts-of-dicts [#turtles]_ where the third dictionary is keyed by an edge key identifier to the fourth dictionary which contains the edge attributes for that edge between the two nodes. Graphs use a dictionary of attributes for each edge. We use a dict-of-dicts-of-dicts data structure with the inner dictionary storing "name-value" relationships for that edge. >>> G=nx.Graph() >>> G.add_edge(1,2,color='red',weight=0.84,size=300) >>> print(G[1][2]['size']) 300 .. rubric:: Footnotes .. [#turtles] "It's dictionaries all the way down." networkx-1.11/doc/source/reference/algorithms.clustering.rst0000644000175000017500000000032112637544450024265 0ustar aricaric00000000000000********** Clustering ********** .. automodule:: networkx.algorithms.cluster .. autosummary:: :toctree: generated/ triangles transitivity clustering average_clustering square_clustering networkx-1.11/doc/source/reference/algorithms.traversal.rst0000644000175000017500000000132312637544450024114 0ustar aricaric00000000000000.. _traversal: Traversal ========= .. toctree:: :maxdepth: 2 Depth First Search ------------------ .. 
automodule:: networkx.algorithms.traversal.depth_first_search .. autosummary:: :toctree: generated/ dfs_edges dfs_tree dfs_predecessors dfs_successors dfs_preorder_nodes dfs_postorder_nodes dfs_labeled_edges Breadth First Search -------------------- .. automodule:: networkx.algorithms.traversal.breadth_first_search .. autosummary:: :toctree: generated/ bfs_edges bfs_tree bfs_predecessors bfs_successors Depth First Search on Edges --------------------------- .. automodule:: networkx.algorithms.traversal.edgedfs .. autosummary:: :toctree: generated/ edge_dfs networkx-1.11/doc/source/reference/algorithms.euler.rst0000644000175000017500000000022312637544450023223 0ustar aricaric00000000000000******** Eulerian ******** .. automodule:: networkx.algorithms.euler .. autosummary:: :toctree: generated/ is_eulerian eulerian_circuit networkx-1.11/doc/source/reference/classes.multidigraph.rst0000644000175000017500000000443212637544500024066 0ustar aricaric00000000000000.. _multidigraph: ================================================================= MultiDiGraph - Directed graphs with self loops and parallel edges ================================================================= Overview ======== .. currentmodule:: networkx .. autofunction:: MultiDiGraph ======= Methods ======= Adding and Removing Nodes and Edges =================================== .. autosummary:: :toctree: generated/ MultiDiGraph.__init__ MultiDiGraph.add_node MultiDiGraph.add_nodes_from MultiDiGraph.remove_node MultiDiGraph.remove_nodes_from MultiDiGraph.add_edge MultiDiGraph.add_edges_from MultiDiGraph.add_weighted_edges_from MultiDiGraph.remove_edge MultiDiGraph.remove_edges_from MultiDiGraph.add_star MultiDiGraph.add_path MultiDiGraph.add_cycle MultiDiGraph.clear Iterating over nodes and edges ============================== .. 
autosummary:: :toctree: generated/ MultiDiGraph.nodes MultiDiGraph.nodes_iter MultiDiGraph.__iter__ MultiDiGraph.edges MultiDiGraph.edges_iter MultiDiGraph.out_edges MultiDiGraph.out_edges_iter MultiDiGraph.in_edges MultiDiGraph.in_edges_iter MultiDiGraph.get_edge_data MultiDiGraph.neighbors MultiDiGraph.neighbors_iter MultiDiGraph.__getitem__ MultiDiGraph.successors MultiDiGraph.successors_iter MultiDiGraph.predecessors MultiDiGraph.predecessors_iter MultiDiGraph.adjacency_list MultiDiGraph.adjacency_iter MultiDiGraph.nbunch_iter Information about graph structure ================================= .. autosummary:: :toctree: generated/ MultiDiGraph.has_node MultiDiGraph.__contains__ MultiDiGraph.has_edge MultiDiGraph.order MultiDiGraph.number_of_nodes MultiDiGraph.__len__ MultiDiGraph.degree MultiDiGraph.degree_iter MultiDiGraph.in_degree MultiDiGraph.in_degree_iter MultiDiGraph.out_degree MultiDiGraph.out_degree_iter MultiDiGraph.size MultiDiGraph.number_of_edges MultiDiGraph.nodes_with_selfloops MultiDiGraph.selfloop_edges MultiDiGraph.number_of_selfloops Making copies and subgraphs =========================== .. autosummary:: :toctree: generated/ MultiDiGraph.copy MultiDiGraph.to_undirected MultiDiGraph.to_directed MultiDiGraph.subgraph MultiDiGraph.reverse networkx-1.11/doc/source/reference/drawing.rst0000644000175000017500000000367512637544500021404 0ustar aricaric00000000000000.. _drawing: ******* Drawing ******* NetworkX provides basic functionality for visualizing graphs, but its main goal is to enable graph analysis rather than perform graph visualization. In the future, graph visualization functionality may be removed from NetworkX or only available as an add-on package. Proper graph visualization is hard, and we highly recommend that people visualize their graphs with tools dedicated to that task. 
Notable examples of dedicated and fully-featured graph visualization tools
are Cytoscape, Gephi, Graphviz and, for LaTeX typesetting, PGF/TikZ. To use
these and other such tools, you should export your NetworkX graph into a
format that can be read by those tools. For example, Cytoscape can read the
GraphML format, so ``networkx.write_graphml(G, path)`` might be an
appropriate choice.

Matplotlib
==========
.. automodule:: networkx.drawing.nx_pylab
.. autosummary::
   :toctree: generated/

   draw
   draw_networkx
   draw_networkx_nodes
   draw_networkx_edges
   draw_networkx_labels
   draw_networkx_edge_labels
   draw_circular
   draw_random
   draw_spectral
   draw_spring
   draw_shell
   draw_graphviz

Graphviz AGraph (dot)
=====================
.. automodule:: networkx.drawing.nx_agraph
.. autosummary::
   :toctree: generated/

   from_agraph
   to_agraph
   write_dot
   read_dot
   graphviz_layout
   pygraphviz_layout

Graphviz with pydot
===================
.. automodule:: networkx.drawing.nx_pydot
.. autosummary::
   :toctree: generated/

   from_pydot
   to_pydot
   write_dot
   read_dot
   graphviz_layout
   pydot_layout

Graph Layout
============
.. automodule:: networkx.drawing.layout
.. autosummary::
   :toctree: generated/

   circular_layout
   fruchterman_reingold_layout
   random_layout
   shell_layout
   spring_layout
   spectral_layout

networkx-1.11/doc/source/reference/algorithms.coloring.rst

********
Coloring
********
.. automodule:: networkx.algorithms.coloring
.. autosummary::
   :toctree: generated/

   greedy_color

networkx-1.11/doc/source/reference/algorithms.matching.rst

********
Matching
********
.. automodule:: networkx.algorithms.matching
.. autosummary::
   :toctree: generated/

   maximal_matching
   max_weight_matching

networkx-1.11/doc/source/reference/algorithms.isomorphism.vf2.rst
.. _vf2:

*************
VF2 Algorithm
*************
.. automodule:: networkx.algorithms.isomorphism.isomorphvf2

Graph Matcher
-------------
.. currentmodule:: networkx.algorithms.isomorphism
.. autosummary::
   :toctree: generated/

   GraphMatcher.__init__
   GraphMatcher.initialize
   GraphMatcher.is_isomorphic
   GraphMatcher.subgraph_is_isomorphic
   GraphMatcher.isomorphisms_iter
   GraphMatcher.subgraph_isomorphisms_iter
   GraphMatcher.candidate_pairs_iter
   GraphMatcher.match
   GraphMatcher.semantic_feasibility
   GraphMatcher.syntactic_feasibility

DiGraph Matcher
---------------
.. currentmodule:: networkx.algorithms.isomorphism
.. autosummary::
   :toctree: generated/

   DiGraphMatcher.__init__
   DiGraphMatcher.initialize
   DiGraphMatcher.is_isomorphic
   DiGraphMatcher.subgraph_is_isomorphic
   DiGraphMatcher.isomorphisms_iter
   DiGraphMatcher.subgraph_isomorphisms_iter
   DiGraphMatcher.candidate_pairs_iter
   DiGraphMatcher.match
   DiGraphMatcher.semantic_feasibility
   DiGraphMatcher.syntactic_feasibility

Match helpers
-------------
.. currentmodule:: networkx.algorithms.isomorphism
.. autosummary::
   :toctree: generated/

   categorical_node_match
   categorical_edge_match
   categorical_multiedge_match
   numerical_node_match
   numerical_edge_match
   numerical_multiedge_match
   generic_node_match
   generic_edge_match
   generic_multiedge_match

networkx-1.11/doc/source/reference/index.rst

.. _reference:

Reference
*********

:Release: |release|
:Date: |today|

.. toctree::
   :maxdepth: 2

   introduction
   classes
   algorithms
   functions
   generators
   linalg
   convert
   relabel
   readwrite
   drawing
   exceptions
   utils
   legal
   citing
   credits
   glossary

.. toctree::
   :hidden:

   pdf_reference

networkx-1.11/doc/source/reference/classes.graph.rst

.. _graph:

==========================================
Graph -- Undirected graphs with self loops
==========================================

Overview
========
.. currentmodule:: networkx
.. autofunction:: Graph

=======
Methods
=======

Adding and removing nodes and edges
===================================
.. autosummary::
   :toctree: generated/

   Graph.__init__
   Graph.add_node
   Graph.add_nodes_from
   Graph.remove_node
   Graph.remove_nodes_from
   Graph.add_edge
   Graph.add_edges_from
   Graph.add_weighted_edges_from
   Graph.remove_edge
   Graph.remove_edges_from
   Graph.add_star
   Graph.add_path
   Graph.add_cycle
   Graph.clear

Iterating over nodes and edges
==============================
.. autosummary::
   :toctree: generated/

   Graph.nodes
   Graph.nodes_iter
   Graph.__iter__
   Graph.edges
   Graph.edges_iter
   Graph.get_edge_data
   Graph.neighbors
   Graph.neighbors_iter
   Graph.__getitem__
   Graph.adjacency_list
   Graph.adjacency_iter
   Graph.nbunch_iter

Information about graph structure
=================================
.. autosummary::
   :toctree: generated/

   Graph.has_node
   Graph.__contains__
   Graph.has_edge
   Graph.order
   Graph.number_of_nodes
   Graph.__len__
   Graph.degree
   Graph.degree_iter
   Graph.size
   Graph.number_of_edges
   Graph.nodes_with_selfloops
   Graph.selfloop_edges
   Graph.number_of_selfloops

Making copies and subgraphs
===========================
.. autosummary::
   :toctree: generated/

   Graph.copy
   Graph.to_undirected
   Graph.to_directed
   Graph.subgraph

networkx-1.11/doc/source/reference/algorithms.swap.rst

****
Swap
****
.. automodule:: networkx.algorithms.swap
.. autosummary::
   :toctree: generated/

   double_edge_swap
   connected_double_edge_swap

networkx-1.11/doc/source/reference/algorithms.dominance.rst

*********
Dominance
*********
.. automodule:: networkx.algorithms.dominance
.. autosummary::
   :toctree: generated/

   immediate_dominators
   dominance_frontiers

networkx-1.11/doc/source/reference/pdf_reference.rst
.. _pdf_reference:

Reference
*********

:Release: |release|
:Date: |today|

.. toctree::
   :maxdepth: 2

   ../overview
   introduction
   classes
   algorithms
   functions
   generators
   linalg
   convert
   readwrite
   drawing
   exceptions
   utils
   legal
   citing
   credits
   glossary

networkx-1.11/doc/source/reference/api_1.10.rst

**********************************
Version 1.10 notes and API changes
**********************************

This page includes more detailed release information and API changes from
NetworkX 1.9 to NetworkX 1.10. Please send comments and questions to the
networkx-discuss mailing list:
http://groups.google.com/group/networkx-discuss .

API changes
-----------

* [#1501] ``connected_components``, ``weakly_connected_components``, and
  ``strongly_connected_components`` now return a generator of sets of nodes.
  Previously the generator was of lists of nodes. This PR also refactored the
  ``connected_components`` and ``weakly_connected_components``
  implementations, making them faster, especially for large graphs.

* [#1547] The ``func_iter`` functions in the Di/Multi/Graph classes are
  slated for removal in the NetworkX 2.0 release. ``func`` will behave like
  ``func_iter`` and return an iterator instead of a list. These functions are
  deprecated in the NetworkX 1.10 release.

New functionalities
-------------------

* [#823] An ``enumerate_all_cliques`` function is added in the clique
  package (``networkx.algorithms.clique``) for enumerating all cliques
  (including nonmaximal ones) of undirected graphs.

* [#1105] A coloring package (``networkx.algorithms.coloring``) is created
  for graph coloring algorithms. Initially, a ``greedy_color`` function is
  provided for coloring graphs using various greedy heuristics.

* [#1193] A new generator ``edge_dfs``, added to
  ``networkx.algorithms.traversal``, implements a depth-first traversal of
  the edges in a graph. This complements functionality provided by a
  depth-first traversal of the nodes in a graph.
  For multigraphs, it allows the user to know precisely which edges were
  followed in a traversal. All NetworkX graph types are supported. A
  traversal can also reverse edge orientations or ignore them.

* [#1194] A ``find_cycle`` function is added to the
  ``networkx.algorithms.cycles`` package to find a cycle in a graph. Edge
  orientations can be optionally reversed or ignored.

* [#1210] Add a random generator for the duplication-divergence model.

* [#1241] A new ``networkx.algorithms.dominance`` package is added for
  dominance/dominator algorithms on directed graphs. It contains an
  ``immediate_dominators`` function for computing immediate
  dominators/dominator trees and a ``dominance_frontiers`` function for
  computing dominance frontiers.

* [#1269] The GML reader/parser and writer/generator are rewritten to
  remove the dependence on pyparsing and enable handling of arbitrary graph
  data.

* [#1280] The network simplex method in the ``networkx.algorithms.flow``
  package is rewritten to improve its performance and support multi- and
  disconnected networks. For some cases, the new implementation is two or
  three orders of magnitude faster than the old implementation.

* [#1286] Added the Margulis--Gabber--Galil graph to
  ``networkx.generators``.

* [#1306] Added the chordal p-cycle graph, a mildly explicit algebraic
  construction of a family of 3-regular expander graphs. Also moves both the
  existing expander graph generator function (for the Margulis-Gabber-Galil
  expander) and the new chordal cycle graph function to a new module,
  ``networkx.generators.expanders``.

* [#1314] Allow overwriting of the base class dict with a dict-like:
  OrderedGraph, ThinGraph, LogGraph, etc.

* [#1321] Added ``to_pandas_dataframe`` and ``from_pandas_dataframe``.

* [#1322] Added the Hopcroft--Karp algorithm for finding a maximum
  cardinality matching in bipartite graphs.

* [#1336] Expanded the data keyword in ``G.edges`` and added the default
  keyword.
* [#1338] Added support for finding optimum branchings and arborescences.

* [#1340] Added a ``from_pandas_dataframe`` function that accepts Pandas
  DataFrames and returns a new graph object. At a minimum, the DataFrame
  must have two columns, which define the nodes that make up an edge.
  However, the function can also process an arbitrary number of additional
  columns as edge attributes, such as 'weight'.

* [#1354] Expanded layout functions to add flexibility for drawing subsets
  of nodes with distinct layouts and for centering each layout around given
  coordinates.

* [#1356] Added ordered variants of the default graph classes.

* [#1360] Added harmonic centrality to ``networkx.algorithms.centrality``.

* [#1390] The ``generators.bipartite`` functions have been moved to
  ``algorithms.bipartite.generators``. The functions are not imported in the
  main namespace, so to use them, the bipartite package has to be imported.

* [#1391] Added Kanevsky's algorithm for finding all minimum-size
  separating node sets in an undirected graph. It is implemented as a
  generator of node cut sets.

* [#1399] Added a power function for simple graphs.

* [#1405] Added a fast approximation for node connectivity based on White
  and Newman's approximation algorithm for finding node independent paths
  between two nodes.

* [#1413] Added transitive closure and antichains functions for directed
  acyclic graphs in ``algorithms.dag``. The antichains function was
  contributed by Peter Jipsen and Franco Saliola and originally developed
  for the SAGE project.

* [#1425] Added a generator function for the complete multipartite graph.

* [#1427] Added a nonisomorphic trees generator.

* [#1436] Added a generator function for circulant graphs to the
  ``networkx.generators.classic`` module.

* [#1437] Added a function for computing quotient graphs; also created a
  new module, ``networkx.algorithms.minors``.

* [#1438] Added ``longest_path`` and ``longest_path_length`` for DAGs.
* [#1439] Added node and edge contraction functions to
  ``networkx.algorithms.minors``.

* [#1445] Added a new modularity matrix module to ``networkx.linalg``, and
  associated spectrum functions to the ``networkx.linalg.spectrum`` module.

* [#1447] Added a function to generate all simple paths starting with the
  shortest ones, based on Yen's algorithm for finding k shortest paths, at
  ``algorithms.simple_paths``.

* [#1455] Added the directed modularity matrix to the
  ``networkx.linalg.modularity_matrix`` module.

* [#1474] Adds a ``triadic_census`` function; also creates a new module,
  ``networkx.algorithms.triads``.

* [#1476] Adds functions for testing if a graph has weighted or negatively
  weighted edges. Also adds a function for testing if a graph is empty.
  These are ``is_weighted``, ``is_negatively_weighted``, and ``is_empty``.

* [#1481] Added Johnson's algorithm, one more algorithm for shortest paths.
  It solves the all-pairs shortest path problem. This is ``johnson`` at
  ``algorithms.shortest_paths``.

* [#1414] Added the Moody and White algorithm for identifying
  ``k_components`` in a graph, which is based on Kanevsky's algorithm for
  finding all minimum-size node cut-sets (implemented in ``all_node_cuts``,
  #1391).

* [#1415] Added a fast approximation for ``k_components`` to the
  ``networkx.approximation`` package. This is based on White and Newman's
  approximation algorithm for finding node independent paths between two
  nodes (see #1405).

Removed functionalities
-----------------------

* [#1236] The legacy ``ford_fulkerson`` maximum flow function is removed.
  Use ``edmonds_karp`` instead.

Miscellaneous changes
---------------------

* [#1192] Support for Python 2.6 is dropped.

networkx-1.11/doc/source/reference/generators.rst

.. _generators:

Graph generators
****************

.. currentmodule:: networkx

Atlas
-----
.. automodule:: networkx.generators.atlas
.. autosummary::
   :toctree: generated/

   graph_atlas_g

Classic
-------
.. automodule:: networkx.generators.classic
.. autosummary::
   :toctree: generated/

   balanced_tree
   barbell_graph
   complete_graph
   complete_multipartite_graph
   circular_ladder_graph
   cycle_graph
   dorogovtsev_goltsev_mendes_graph
   empty_graph
   grid_2d_graph
   grid_graph
   hypercube_graph
   ladder_graph
   lollipop_graph
   null_graph
   path_graph
   star_graph
   trivial_graph
   wheel_graph

Expanders
---------
.. automodule:: networkx.generators.expanders
.. autosummary::
   :toctree: generated/

   margulis_gabber_galil_graph
   chordal_cycle_graph

Small
-----
.. automodule:: networkx.generators.small
.. autosummary::
   :toctree: generated/

   make_small_graph
   LCF_graph
   bull_graph
   chvatal_graph
   cubical_graph
   desargues_graph
   diamond_graph
   dodecahedral_graph
   frucht_graph
   heawood_graph
   house_graph
   house_x_graph
   icosahedral_graph
   krackhardt_kite_graph
   moebius_kantor_graph
   octahedral_graph
   pappus_graph
   petersen_graph
   sedgewick_maze_graph
   tetrahedral_graph
   truncated_cube_graph
   truncated_tetrahedron_graph
   tutte_graph

Random Graphs
-------------
.. automodule:: networkx.generators.random_graphs
.. autosummary::
   :toctree: generated/

   fast_gnp_random_graph
   gnp_random_graph
   dense_gnm_random_graph
   gnm_random_graph
   erdos_renyi_graph
   binomial_graph
   newman_watts_strogatz_graph
   watts_strogatz_graph
   connected_watts_strogatz_graph
   random_regular_graph
   barabasi_albert_graph
   powerlaw_cluster_graph
   duplication_divergence_graph
   random_lobster
   random_shell_graph
   random_powerlaw_tree
   random_powerlaw_tree_sequence

Degree Sequence
---------------
.. automodule:: networkx.generators.degree_seq
.. autosummary::
   :toctree: generated/

   configuration_model
   directed_configuration_model
   expected_degree_graph
   havel_hakimi_graph
   directed_havel_hakimi_graph
   degree_sequence_tree
   random_degree_sequence_graph

Random Clustered
----------------
.. automodule:: networkx.generators.random_clustered
.. autosummary::
   :toctree: generated/

   random_clustered_graph

Directed
--------
.. automodule:: networkx.generators.directed
.. autosummary::
   :toctree: generated/

   gn_graph
   gnr_graph
   gnc_graph
   scale_free_graph

Geometric
---------
.. automodule:: networkx.generators.geometric
.. autosummary::
   :toctree: generated/

   random_geometric_graph
   geographical_threshold_graph
   waxman_graph
   navigable_small_world_graph

Line Graph
----------
.. automodule:: networkx.generators.line
.. autosummary::
   :toctree: generated/

   line_graph

Ego Graph
---------
.. automodule:: networkx.generators.ego
.. autosummary::
   :toctree: generated/

   ego_graph

Stochastic
----------
.. automodule:: networkx.generators.stochastic
.. autosummary::
   :toctree: generated/

   stochastic_graph

Intersection
------------
.. automodule:: networkx.generators.intersection
.. autosummary::
   :toctree: generated/

   uniform_random_intersection_graph
   k_random_intersection_graph
   general_random_intersection_graph

Social Networks
---------------
.. automodule:: networkx.generators.social
.. autosummary::
   :toctree: generated/

   karate_club_graph
   davis_southern_women_graph
   florentine_families_graph

Community
---------
.. automodule:: networkx.generators.community
.. autosummary::
   :toctree: generated/

   caveman_graph
   connected_caveman_graph
   relaxed_caveman_graph
   random_partition_graph
   planted_partition_graph
   gaussian_random_partition_graph

Non Isomorphic Trees
--------------------
.. automodule:: networkx.generators.nonisomorphic_trees
.. autosummary::
   :toctree: generated/

   nonisomorphic_trees
   number_of_nonisomorphic_trees

networkx-1.11/doc/source/reference/api_1.6.rst

*********************************
Version 1.6 notes and API changes
*********************************

This page reflects API changes from networkx-1.5 to networkx-1.6. Please
send comments and questions to the networkx-discuss mailing list:
http://groups.google.com/group/networkx-discuss .
Graph Classes
-------------

The degree* methods in the graph classes (Graph, DiGraph, MultiGraph,
MultiDiGraph) now take an optional weight= keyword that allows computing
weighted degree with arbitrary (numerical) edge attributes. Setting
weight=None is equivalent to the previous weighted=False.

Weighted graph algorithms
-------------------------

Many 'weighted' graph algorithms now take an optional parameter to specify
which edge attribute should be used for the weight (default='weight')
(ticket https://networkx.lanl.gov/trac/ticket/573). In some cases the
parameter name was changed from weighted to weight. Here is how to specify
which edge attribute will be used in the algorithms:

- Use weight=None to consider all weights equally (unweighted case)
- Use weight='weight' to use the 'weight' edge attribute
- Use weight='other' to use the 'other' edge attribute

Algorithms affected are::

    to_scipy_sparse_matrix, clustering, average_clustering,
    bipartite.degree, spectral_layout, neighbor_degree, is_isomorphic,
    betweenness_centrality, betweenness_centrality_subset, vitality,
    load_centrality, mincost, shortest_path, shortest_path_length,
    average_shortest_path_length

Isomorphisms
------------

Node and edge attributes are now more easily incorporated into isomorphism
checks via the 'node_match' and 'edge_match' parameters. As part of this
change, the following classes were removed::

    WeightedGraphMatcher
    WeightedDiGraphMatcher
    WeightedMultiGraphMatcher
    WeightedMultiDiGraphMatcher

The function signature for 'is_isomorphic' is now simply::

    is_isomorphic(g1, g2, node_match=None, edge_match=None)

See its docstring for more details.
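A short sketch of the new signature in use (toy graphs; the 'color'
attribute and node labels are illustrative, and the match helper used here
is one of the factories discussed next):

```python
import networkx as nx
import networkx.algorithms.isomorphism as iso

G1 = nx.Graph()
G1.add_node(1, color="red")
G1.add_node(2, color="blue")
G1.add_edge(1, 2)

G2 = nx.Graph()
G2.add_node("a", color="blue")
G2.add_node("b", color="red")
G2.add_edge("a", "b")

# Purely structural check: both graphs are a single edge, so this is True.
structural = nx.is_isomorphic(G1, G2)

# Attribute-aware check: nodes must also agree on 'color'.  The mapping
# 1 -> 'b', 2 -> 'a' preserves colors, so this is still True.
nm = iso.categorical_node_match("color", None)
semantic = nx.is_isomorphic(G1, G2, node_match=nm)
print(structural, semantic)
```

Passing node_match=None keeps the old attribute-blind behaviour, so existing
code continues to work unchanged.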
To aid in the creation of 'node_match' and 'edge_match' functions, users
are encouraged to work with::

    categorical_node_match
    categorical_edge_match
    categorical_multiedge_match
    numerical_node_match
    numerical_edge_match
    numerical_multiedge_match
    generic_node_match
    generic_edge_match
    generic_multiedge_match

These functions construct functions which can be passed to 'is_isomorphic'.
Finally, note that the above functions are not imported into the top-level
namespace and should be accessed from 'networkx.algorithms.isomorphism'. A
useful import statement that will be repeated throughout documentation is::

    import networkx.algorithms.isomorphism as iso

Other
-----

* attracting_components

  A list of lists is returned instead of a list of tuples.

* condensation

  The condensation algorithm now takes a second argument (scc) and returns
  a graph with nodes labeled as integers instead of node tuples.

* degree connectivity

  average_in_degree_connectivity and average_out_degree_connectivity have
  been replaced with average_degree_connectivity(G, source='in',
  target='in') and average_degree_connectivity(G, source='out',
  target='out')

* neighbor degree

  average_neighbor_in_degree and average_neighbor_out_degree have been
  replaced with average_neighbor_degree(G, source='in', target='in') and
  average_neighbor_degree(G, source='out', target='out')

networkx-1.11/doc/source/reference/algorithms.dominating.rst

***************
Dominating Sets
***************
.. automodule:: networkx.algorithms.dominating
.. autosummary::
   :toctree: generated/

   dominating_set
   is_dominating_set

networkx-1.11/doc/source/reference/readwrite.yaml.rst

YAML
====
.. automodule:: networkx.readwrite.nx_yaml
.. autosummary::
   :toctree: generated/

   read_yaml
   write_yaml

networkx-1.11/doc/source/reference/algorithms.simple_paths.rst

************
Simple Paths
************
.. automodule:: networkx.algorithms.simple_paths
.. autosummary::
   :toctree: generated/

   all_simple_paths
   shortest_simple_paths

networkx-1.11/doc/source/reference/algorithms.chordal.rst

.. _chordal:

Chordal
=======
.. toctree::
   :maxdepth: 2

.. automodule:: networkx.algorithms.chordal.chordal_alg
.. autosummary::
   :toctree: generated/

   is_chordal
   chordal_graph_cliques
   chordal_graph_treewidth
   find_induced_nodes

networkx-1.11/doc/source/reference/algorithms.triads.rst

******
Triads
******
.. automodule:: networkx.algorithms.triads
.. autosummary::
   :toctree: generated/

   triadic_census

networkx-1.11/doc/source/reference/algorithms.rst

.. _algorithms:

**********
Algorithms
**********

.. currentmodule:: networkx
.. toctree::
   :maxdepth: 2

   algorithms.approximation
   algorithms.assortativity
   algorithms.bipartite
   algorithms.block
   algorithms.boundary
   algorithms.centrality
   algorithms.chordal
   algorithms.clique
   algorithms.clustering
   algorithms.coloring
   algorithms.community
   algorithms.component
   algorithms.connectivity
   algorithms.core
   algorithms.cycles
   algorithms.dag
   algorithms.distance_measures
   algorithms.distance_regular
   algorithms.dominance
   algorithms.dominating
   algorithms.euler
   algorithms.flow
   algorithms.graphical
   algorithms.hierarchy
   algorithms.hybrid
   algorithms.isolates
   algorithms.isomorphism
   algorithms.link_analysis
   algorithms.link_prediction
   algorithms.matching
   algorithms.minors
   algorithms.mis
   algorithms.mst
   algorithms.operators
   algorithms.rich_club
   algorithms.shortest_paths
   algorithms.simple_paths
   algorithms.swap
   algorithms.traversal
   algorithms.tree
   algorithms.triads
   algorithms.vitality

networkx-1.11/doc/source/reference/api_changes.rst

***********
API changes
***********

.. toctree::
   :maxdepth: 2

   api_1.11
   api_1.10
   api_1.9
   api_1.8
   api_1.7
   api_1.6
   api_1.5
   api_1.4
   api_1.0
   api_0.99

networkx-1.11/doc/source/reference/algorithms.bipartite.rst

*********
Bipartite
*********
.. automodule:: networkx.algorithms.bipartite

Basic functions
---------------
.. automodule:: networkx.algorithms.bipartite.basic
.. autosummary::
   :toctree: generated/

   is_bipartite
   is_bipartite_node_set
   sets
   color
   density
   degrees

Matching
--------
.. automodule:: networkx.algorithms.bipartite.matching
.. autosummary::
   :toctree: generated/

   eppstein_matching
   hopcroft_karp_matching
   to_vertex_cover

Matrix
------
.. automodule:: networkx.algorithms.bipartite.matrix
.. autosummary::
   :toctree: generated/

   biadjacency_matrix
   from_biadjacency_matrix

Projections
-----------
.. automodule:: networkx.algorithms.bipartite.projection
.. autosummary::
   :toctree: generated/

   projected_graph
   weighted_projected_graph
   collaboration_weighted_projected_graph
   overlap_weighted_projected_graph
   generic_weighted_projected_graph

Spectral
--------
.. automodule:: networkx.algorithms.bipartite.spectral
.. autosummary::
   :toctree: generated/

   spectral_bipartivity

Clustering
----------
.. automodule:: networkx.algorithms.bipartite.cluster
.. autosummary::
   :toctree: generated/

   clustering
   average_clustering
   latapy_clustering
   robins_alexander_clustering

Redundancy
----------
.. automodule:: networkx.algorithms.bipartite.redundancy
.. autosummary::
   :toctree: generated/

   node_redundancy

Centrality
----------
.. automodule:: networkx.algorithms.bipartite.centrality
.. autosummary::
   :toctree: generated/

   closeness_centrality
   degree_centrality
   betweenness_centrality

Generators
----------
.. automodule:: networkx.algorithms.bipartite.generators
.. autosummary::
   :toctree: generated/

   complete_bipartite_graph
   configuration_model
   havel_hakimi_graph
   reverse_havel_hakimi_graph
   alternating_havel_hakimi_graph
   preferential_attachment_graph
   random_graph
   gnmk_random_graph

networkx-1.11/doc/source/reference/api_1.8.rst

*********************************
Version 1.8 notes and API changes
*********************************

This page reflects API changes from networkx-1.7 to networkx-1.8. Please
send comments and questions to the networkx-discuss mailing list:
http://groups.google.com/group/networkx-discuss .

* Laplacian functions now all return matrices. To get a numpy array from a
  matrix, use ``L = nx.laplacian_matrix(G).A``.

* is_directed_acyclic_graph() now returns False on undirected graphs
  (instead of raising an exception).

* Cycles returned from simple_cycles() do not include the repeated last
  node.

networkx-1.11/doc/source/reference/glossary.rst

.. _glossary:

Glossary
========
.. glossary::
   :sorted:

   edge
      Edges are either two-tuples of nodes (u, v) or three-tuples of nodes
      with an edge attribute dictionary (u, v, dict).

   ebunch
      An iterable container of edge tuples like a list, iterator, or file.

   edge attribute
      Edges can have arbitrary Python objects assigned as attributes by
      using keyword/value pairs when adding an edge or by assigning to the
      G.edge[u][v] attribute dictionary for the specified edge u-v.

   hashable
      An object is hashable if it has a hash value which never changes
      during its lifetime (it needs a __hash__() method), and can be
      compared to other objects (it needs an __eq__() or __cmp__() method).
      Hashable objects which compare equal must have the same hash value.

      Hashability makes an object usable as a dictionary key and a set
      member, because these data structures use the hash value internally.

      All of Python's immutable built-in objects are hashable, while no
      mutable containers (such as lists or dictionaries) are. Objects which
      are instances of user-defined classes are hashable by default; they
      all compare unequal, and their hash value is their id().

      Definition from http://docs.python.org/glossary.html

   nbunch
      An nbunch is any iterable container of nodes that is not itself a
      node in the graph. It can be an iterable or an iterator, e.g. a list,
      set, graph, file, etc.

   node attribute
      Nodes can have arbitrary Python objects assigned as attributes by
      using keyword/value pairs when adding a node or by assigning to the
      G.node[n] attribute dictionary for the specified node n.

   node
      A node can be any hashable Python object except None.

   dictionary
      A Python dictionary maps keys to values. Also known as "hashes" or
      "associative arrays". See
      http://docs.python.org/tutorial/datastructures.html#dictionaries

networkx-1.11/doc/source/reference/readwrite.adjlist.rst

Adjacency List
==============
.. automodule:: networkx.readwrite.adjlist
.. autosummary::
   :toctree: generated/

   read_adjlist
   write_adjlist
   parse_adjlist
   generate_adjlist

networkx-1.11/doc/source/reference/readwrite.gexf.rst

GEXF
====
.. automodule:: networkx.readwrite.gexf
.. autosummary::
   :toctree: generated/

   read_gexf
   write_gexf
   relabel_gexf_graph

networkx-1.11/doc/source/reference/algorithms.centrality.rst

**********
Centrality
**********
.. automodule:: networkx.algorithms.centrality

Degree
------
.. autosummary::
   :toctree: generated/

   degree_centrality
   in_degree_centrality
   out_degree_centrality

Closeness
---------
.. autosummary::
   :toctree: generated/

   closeness_centrality

Betweenness
-----------
.. autosummary::
   :toctree: generated/

   betweenness_centrality
   edge_betweenness_centrality

Current Flow Closeness
----------------------
.. autosummary::
   :toctree: generated/

   current_flow_closeness_centrality

Current-Flow Betweenness
------------------------
.. autosummary::
   :toctree: generated/

   current_flow_betweenness_centrality
   edge_current_flow_betweenness_centrality
   approximate_current_flow_betweenness_centrality

Eigenvector
-----------
.. autosummary::
   :toctree: generated/

   eigenvector_centrality
   eigenvector_centrality_numpy
   katz_centrality
   katz_centrality_numpy

Communicability
---------------
.. autosummary::
   :toctree: generated/

   communicability
   communicability_exp
   communicability_centrality
   communicability_centrality_exp
   communicability_betweenness_centrality
   estrada_index

Load
----
.. autosummary::
   :toctree: generated/

   load_centrality
   edge_load

Dispersion
----------
.. autosummary::
   :toctree: generated/

   dispersion

networkx-1.11/doc/source/reference/algorithms.component.rst

**********
Components
**********
.. automodule:: networkx.algorithms.components
.. currentmodule:: networkx

Connectivity
^^^^^^^^^^^^
.. automodule:: networkx.algorithms.components.connected
.. autosummary::
   :toctree: generated/

   is_connected
   number_connected_components
   connected_components
   connected_component_subgraphs
   node_connected_component

Strong connectivity
^^^^^^^^^^^^^^^^^^^
.. automodule:: networkx.algorithms.components.strongly_connected
.. autosummary::
   :toctree: generated/

   is_strongly_connected
   number_strongly_connected_components
   strongly_connected_components
   strongly_connected_component_subgraphs
   strongly_connected_components_recursive
   kosaraju_strongly_connected_components
   condensation

Weak connectivity
^^^^^^^^^^^^^^^^^
.. automodule:: networkx.algorithms.components.weakly_connected
.. autosummary::
   :toctree: generated/

   is_weakly_connected
   number_weakly_connected_components
   weakly_connected_components
   weakly_connected_component_subgraphs

Attracting components
^^^^^^^^^^^^^^^^^^^^^
.. automodule:: networkx.algorithms.components.attracting
.. autosummary::
   :toctree: generated/

   is_attracting_component
   number_attracting_components
   attracting_components
   attracting_component_subgraphs

Biconnected components
^^^^^^^^^^^^^^^^^^^^^^
.. automodule:: networkx.algorithms.components.biconnected
.. autosummary::
   :toctree: generated/

   is_biconnected
   biconnected_components
   biconnected_component_edges
   biconnected_component_subgraphs
   articulation_points

Semiconnectedness
^^^^^^^^^^^^^^^^^
.. automodule:: networkx.algorithms.components.semiconnected
.. autosummary::
   :toctree: generated/

   is_semiconnected

networkx-1.11/doc/source/reference/algorithms.distance_regular.rst

***********************
Distance-Regular Graphs
***********************
.. automodule:: networkx.algorithms.distance_regular
autosummary:: :toctree: generated/ is_distance_regular intersection_array global_parameters networkx-1.11/doc/source/reference/legal.rst0000644000175000017500000000335412653165361021031 0ustar aricaric00000000000000License ======= NetworkX is distributed with the BSD license. :: Copyright (C) 2004-2016, NetworkX Developers Aric Hagberg Dan Schult Pieter Swart All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of the NetworkX Developers nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. networkx-1.11/doc/source/reference/readwrite.leda.rst0000644000175000017500000000017212637544450022634 0ustar aricaric00000000000000LEDA ==== .. automodule:: networkx.readwrite.leda .. 
autosummary:: :toctree: generated/ read_leda parse_leda networkx-1.11/doc/source/reference/readwrite.gml.rst0000644000175000017500000000027512637544450022512 0ustar aricaric00000000000000GML === .. automodule:: networkx.readwrite.gml .. autosummary:: :toctree: generated/ read_gml write_gml parse_gml generate_gml literal_destringizer literal_stringizer networkx-1.11/doc/source/reference/readwrite.graphml.rst0000644000175000017500000000021012637544450023352 0ustar aricaric00000000000000GraphML ======= .. automodule:: networkx.readwrite.graphml .. autosummary:: :toctree: generated/ read_graphml write_graphml networkx-1.11/doc/source/reference/linalg.rst0000644000175000017500000000166512637544450021220 0ustar aricaric00000000000000.. _linalg: Linear algebra ************** .. currentmodule:: networkx Graph Matrix ------------ .. automodule:: networkx.linalg.graphmatrix .. autosummary:: :toctree: generated/ adjacency_matrix incidence_matrix Laplacian Matrix ---------------- .. automodule:: networkx.linalg.laplacianmatrix .. autosummary:: :toctree: generated/ laplacian_matrix normalized_laplacian_matrix directed_laplacian_matrix Spectrum --------- .. automodule:: networkx.linalg.spectrum .. autosummary:: :toctree: generated/ laplacian_spectrum adjacency_spectrum Algebraic Connectivity ---------------------- .. automodule:: networkx.linalg.algebraicconnectivity .. autosummary:: :toctree: generated/ algebraic_connectivity fiedler_vector spectral_ordering Attribute Matrices ------------------ .. automodule:: networkx.linalg.attrmatrix .. autosummary:: :toctree: generated/ attr_matrix attr_sparse_matrix networkx-1.11/doc/source/reference/algorithms.hybrid.rst0000644000175000017500000000022712637544450023374 0ustar aricaric00000000000000****** Hybrid ****** .. automodule:: networkx.algorithms.hybrid .. 
autosummary:: :toctree: generated/ kl_connected_subgraph is_kl_connected networkx-1.11/doc/source/reference/citing.rst0000644000175000017500000000130312637544450021214 0ustar aricaric00000000000000.. -*- coding: utf-8 -*- Citing ====== To cite NetworkX please use the following publication: Aric A. Hagberg, Daniel A. Schult and Pieter J. Swart, `"Exploring network structure, dynamics, and function using NetworkX" `_, in `Proceedings of the 7th Python in Science Conference (SciPy2008) `_, Gäel Varoquaux, Travis Vaught, and Jarrod Millman (Eds), (Pasadena, CA USA), pp. 11--15, Aug 2008 .. only:: html `PDF `_ `BibTeX `_ networkx-1.11/doc/source/reference/algorithms.link_prediction.rst0000644000175000017500000000050712637544450025271 0ustar aricaric00000000000000*************** Link Prediction *************** .. automodule:: networkx.algorithms.link_prediction .. autosummary:: :toctree: generated/ resource_allocation_index jaccard_coefficient adamic_adar_index preferential_attachment cn_soundarajan_hopcroft ra_index_soundarajan_hopcroft within_inter_cluster networkx-1.11/doc/source/reference/algorithms.dag.rst0000644000175000017500000000055112637544500022642 0ustar aricaric00000000000000*********************** Directed Acyclic Graphs *********************** .. automodule:: networkx.algorithms.dag .. autosummary:: :toctree: generated/ ancestors descendants topological_sort topological_sort_recursive is_directed_acyclic_graph is_aperiodic transitive_closure antichains dag_longest_path dag_longest_path_length networkx-1.11/doc/source/reference/readwrite.sparsegraph6.rst0000644000175000017500000000055512637544450024341 0ustar aricaric00000000000000SparseGraph6 ============ Graph6 ------ .. automodule:: networkx.readwrite.graph6 .. autosummary:: :toctree: generated/ parse_graph6 read_graph6 generate_graph6 write_graph6 Sparse6 ------- .. automodule:: networkx.readwrite.sparse6 .. 
autosummary:: :toctree: generated/ parse_sparse6 read_sparse6 generate_sparse6 write_sparse6 networkx-1.11/doc/source/reference/algorithms.boundary.rst0000644000175000017500000000022612637544450023735 0ustar aricaric00000000000000******** Boundary ******** .. automodule:: networkx.algorithms.boundary .. autosummary:: :toctree: generated/ edge_boundary node_boundary networkx-1.11/doc/source/reference/algorithms.isolates.rst0000644000175000017500000000021412637544450023732 0ustar aricaric00000000000000******** Isolates ******** .. automodule:: networkx.algorithms.isolate .. autosummary:: :toctree: generated/ is_isolate isolates networkx-1.11/doc/source/reference/classes.digraph.rst0000644000175000017500000000365512637544500023021 0ustar aricaric00000000000000.. _digraph: ========================================= DiGraph - Directed graphs with self loops ========================================= Overview ======== .. currentmodule:: networkx .. autofunction:: DiGraph ======= Methods ======= Adding and removing nodes and edges =================================== .. autosummary:: :toctree: generated/ DiGraph.__init__ DiGraph.add_node DiGraph.add_nodes_from DiGraph.remove_node DiGraph.remove_nodes_from DiGraph.add_edge DiGraph.add_edges_from DiGraph.add_weighted_edges_from DiGraph.remove_edge DiGraph.remove_edges_from DiGraph.add_star DiGraph.add_path DiGraph.add_cycle DiGraph.clear Iterating over nodes and edges ============================== .. autosummary:: :toctree: generated/ DiGraph.nodes DiGraph.nodes_iter DiGraph.__iter__ DiGraph.edges DiGraph.edges_iter DiGraph.out_edges DiGraph.out_edges_iter DiGraph.in_edges DiGraph.in_edges_iter DiGraph.get_edge_data DiGraph.neighbors DiGraph.neighbors_iter DiGraph.__getitem__ DiGraph.successors DiGraph.successors_iter DiGraph.predecessors DiGraph.predecessors_iter DiGraph.adjacency_list DiGraph.adjacency_iter DiGraph.nbunch_iter Information about graph structure ================================= .. 
autosummary:: :toctree: generated/ DiGraph.has_node DiGraph.__contains__ DiGraph.has_edge DiGraph.order DiGraph.number_of_nodes DiGraph.__len__ DiGraph.degree DiGraph.degree_iter DiGraph.in_degree DiGraph.in_degree_iter DiGraph.out_degree DiGraph.out_degree_iter DiGraph.size DiGraph.number_of_edges DiGraph.nodes_with_selfloops DiGraph.selfloop_edges DiGraph.number_of_selfloops Making copies and subgraphs =========================== .. autosummary:: :toctree: generated/ DiGraph.copy DiGraph.to_undirected DiGraph.to_directed DiGraph.subgraph DiGraph.reverse networkx-1.11/doc/source/reference/algorithms.block.rst0000644000175000017500000000021512637544450023202 0ustar aricaric00000000000000************* Blockmodeling ************* .. automodule:: networkx.algorithms.block .. autosummary:: :toctree: generated/ blockmodel networkx-1.11/doc/source/reference/algorithms.rich_club.rst0000644000175000017500000000021712637544450024044 0ustar aricaric00000000000000********* Rich Club ********* .. automodule:: networkx.algorithms.richclub .. autosummary:: :toctree: generated/ rich_club_coefficient networkx-1.11/doc/source/reference/readwrite.multiline_adjlist.rst0000644000175000017500000000040112637544450025436 0ustar aricaric00000000000000 Multiline Adjacency List ======================== .. automodule:: networkx.readwrite.multiline_adjlist .. autosummary:: :toctree: generated/ read_multiline_adjlist write_multiline_adjlist parse_multiline_adjlist generate_multiline_adjlist networkx-1.11/doc/source/reference/algorithms.approximation.rst0000644000175000017500000000303112637544450025001 0ustar aricaric00000000000000************* Approximation ************* .. automodule:: networkx.algorithms.approximation Connectivity ------------ .. automodule:: networkx.algorithms.approximation.connectivity .. autosummary:: :toctree: generated/ all_pairs_node_connectivity local_node_connectivity node_connectivity K-components ------------ .. 
automodule:: networkx.algorithms.approximation.kcomponents .. autosummary:: :toctree: generated/ k_components Clique ------ .. automodule:: networkx.algorithms.approximation.clique .. autosummary:: :toctree: generated/ max_clique clique_removal Clustering ---------- .. automodule:: networkx.algorithms.approximation.clustering_coefficient .. autosummary:: :toctree: generated/ average_clustering Dominating Set --------------- .. automodule:: networkx.algorithms.approximation.dominating_set .. autosummary:: :toctree: generated/ min_weighted_dominating_set min_edge_dominating_set Independent Set --------------- .. automodule:: networkx.algorithms.approximation.independent_set .. autosummary:: :toctree: generated/ maximum_independent_set Matching -------- .. automodule:: networkx.algorithms.approximation.matching .. autosummary:: :toctree: generated/ min_maximal_matching Ramsey ------ .. automodule:: networkx.algorithms.approximation.ramsey .. autosummary:: :toctree: generated/ ramsey_R2 Vertex Cover ------------ .. automodule:: networkx.algorithms.approximation.vertex_cover .. autosummary:: :toctree: generated/ min_weighted_vertex_cover networkx-1.11/doc/source/reference/history.rst0000644000175000017500000000030612637544450021442 0ustar aricaric00000000000000History ******* Original Creators:: Aric Hagberg, hagberg@lanl.gov Pieter Swart, swart@lanl.gov Dan Schult, dschult@colgate.edu .. toctree:: :maxdepth: 2 api_changes news networkx-1.11/doc/source/reference/api_1.7.rst0000644000175000017500000000055112637544450021101 0ustar aricaric00000000000000********************************* Version 1.7 notes and API changes ********************************* This page reflects API changes from networkx-1.6 to networkx-1.7. Please send comments and questions to the networkx-discuss mailing list: http://groups.google.com/group/networkx-discuss . Other ----- * Untested bipartite_random_regular_graph() removed. 
networkx-1.11/doc/source/reference/algorithms.operators.rst0000644000175000017500000000127112637544450024131 0ustar aricaric00000000000000.. _operators: Operators ********* .. automodule:: networkx.algorithms.operators.unary .. autosummary:: :toctree: generated/ complement reverse .. automodule:: networkx.algorithms.operators.binary .. autosummary:: :toctree: generated/ compose union disjoint_union intersection difference symmetric_difference .. automodule:: networkx.algorithms.operators.all .. autosummary:: :toctree: generated/ compose_all union_all disjoint_union_all intersection_all .. automodule:: networkx.algorithms.operators.product .. autosummary:: :toctree: generated/ cartesian_product lexicographic_product strong_product tensor_product power networkx-1.11/doc/source/reference/readwrite.json_graph.rst0000644000175000017500000000031512637544450024060 0ustar aricaric00000000000000JSON ==== .. automodule:: networkx.readwrite.json_graph .. autosummary:: :toctree: generated/ node_link_data node_link_graph adjacency_data adjacency_graph tree_data tree_graph networkx-1.11/doc/source/reference/algorithms.clique.rst0000644000175000017500000000051412637544450023374 0ustar aricaric00000000000000****** Clique ****** .. automodule:: networkx.algorithms.clique .. autosummary:: :toctree: generated/ enumerate_all_cliques find_cliques make_max_clique_graph make_clique_bipartite graph_clique_number graph_number_of_cliques node_clique_number number_of_cliques cliques_containing_node networkx-1.11/doc/source/reference/readwrite.gpickle.rst0000644000175000017500000000020412637544450023341 0ustar aricaric00000000000000Pickle ====== .. automodule:: networkx.readwrite.gpickle .. autosummary:: :toctree: generated/ read_gpickle write_gpickle networkx-1.11/doc/source/reference/algorithms.mst.rst0000644000175000017500000000031012637544500022703 0ustar aricaric00000000000000********************* Minimum Spanning Tree ********************* .. automodule:: networkx.algorithms.mst .. 
autosummary:: :toctree: generated/ minimum_spanning_tree minimum_spanning_edges networkx-1.11/doc/source/reference/classes.multigraph.rst0000644000175000017500000000351212637544500023547 0ustar aricaric00000000000000.. _multigraph: ================================================================= MultiGraph - Undirected graphs with self loops and parallel edges ================================================================= Overview ======== .. currentmodule:: networkx .. autofunction:: MultiGraph ======= Methods ======= Adding and removing nodes and edges =================================== .. autosummary:: :toctree: generated/ MultiGraph.__init__ MultiGraph.add_node MultiGraph.add_nodes_from MultiGraph.remove_node MultiGraph.remove_nodes_from MultiGraph.add_edge MultiGraph.add_edges_from MultiGraph.add_weighted_edges_from MultiGraph.remove_edge MultiGraph.remove_edges_from MultiGraph.add_star MultiGraph.add_path MultiGraph.add_cycle MultiGraph.clear Iterating over nodes and edges ============================== .. autosummary:: :toctree: generated/ MultiGraph.nodes MultiGraph.nodes_iter MultiGraph.__iter__ MultiGraph.edges MultiGraph.edges_iter MultiGraph.get_edge_data MultiGraph.neighbors MultiGraph.neighbors_iter MultiGraph.__getitem__ MultiGraph.adjacency_list MultiGraph.adjacency_iter MultiGraph.nbunch_iter Information about graph structure ================================= .. autosummary:: :toctree: generated/ MultiGraph.has_node MultiGraph.__contains__ MultiGraph.has_edge MultiGraph.order MultiGraph.number_of_nodes MultiGraph.__len__ MultiGraph.degree MultiGraph.degree_iter MultiGraph.size MultiGraph.number_of_edges MultiGraph.nodes_with_selfloops MultiGraph.selfloop_edges MultiGraph.number_of_selfloops Making copies and subgraphs =========================== .. 
autosummary:: :toctree: generated/ MultiGraph.copy MultiGraph.to_undirected MultiGraph.to_directed MultiGraph.subgraph networkx-1.11/doc/source/reference/algorithms.tree.rst0000644000175000017500000000106012637544500023042 0ustar aricaric00000000000000.. _tree: Tree ==== .. toctree:: :maxdepth: 2 Recognition ----------- .. automodule:: networkx.algorithms.tree.recognition .. autosummary:: :toctree: generated/ is_tree is_forest is_arborescence is_branching Branchings and Spanning Arborescences ------------------------------------- .. automodule:: networkx.algorithms.tree.branchings .. autosummary:: :toctree: generated/ branching_weight greedy_branching maximum_branching minimum_branching maximum_spanning_arborescence minimum_spanning_arborescence Edmonds networkx-1.11/doc/source/reference/utils.rst0000644000175000017500000000240012637544450021076 0ustar aricaric00000000000000********* Utilities ********* .. automodule:: networkx.utils .. currentmodule:: networkx.utils Helper Functions ---------------- .. automodule:: networkx.utils.misc .. autosummary:: :toctree: generated/ is_string_like flatten iterable is_list_of_ints make_str generate_unique_node default_opener Data Structures and Algorithms ------------------------------ .. automodule:: networkx.utils.union_find .. autosummary:: :toctree: generated/ UnionFind.union Random Sequence Generators -------------------------- .. automodule:: networkx.utils.random_sequence .. autosummary:: :toctree: generated/ create_degree_sequence pareto_sequence powerlaw_sequence uniform_sequence cumulative_distribution discrete_sequence zipf_sequence zipf_rv random_weighted_sample weighted_choice Decorators ---------- .. automodule:: networkx.utils.decorators .. autosummary:: :toctree: generated/ open_file Cuthill-Mckee Ordering ---------------------- .. automodule:: networkx.utils.rcm .. autosummary:: :toctree: generated/ cuthill_mckee_ordering reverse_cuthill_mckee_ordering Context Managers ---------------- .. 
automodule:: networkx.utils.contextmanagers .. autosummary:: :toctree: generated/ reversed networkx-1.11/doc/source/reference/algorithms.isomorphism.rst0000644000175000017500000000063012637544450024462 0ustar aricaric00000000000000.. _isomorphism: *********** Isomorphism *********** .. toctree:: :maxdepth: 2 .. automodule:: networkx.algorithms.isomorphism .. autosummary:: :toctree: generated/ is_isomorphic could_be_isomorphic fast_could_be_isomorphic faster_could_be_isomorphic Advanced Interface to VF2 Algorithm ----------------------------------- .. toctree:: :maxdepth: 2 algorithms.isomorphism.vf2 networkx-1.11/doc/source/reference/algorithms.connectivity.rst0000644000175000017500000000250212637544450024627 0ustar aricaric00000000000000************ Connectivity ************ .. automodule:: networkx.algorithms.connectivity K-node-components ----------------- .. automodule:: networkx.algorithms.connectivity.kcomponents .. autosummary:: :toctree: generated/ k_components K-node-cutsets -------------- .. automodule:: networkx.algorithms.connectivity.kcutsets .. autosummary:: :toctree: generated/ all_node_cuts Flow-based Connectivity ----------------------- .. automodule:: networkx.algorithms.connectivity.connectivity .. autosummary:: :toctree: generated/ average_node_connectivity all_pairs_node_connectivity edge_connectivity local_edge_connectivity local_node_connectivity node_connectivity Flow-based Minimum Cuts ----------------------- .. automodule:: networkx.algorithms.connectivity.cuts .. autosummary:: :toctree: generated/ minimum_edge_cut minimum_node_cut minimum_st_edge_cut minimum_st_node_cut Stoer-Wagner minimum cut ------------------------ .. automodule:: networkx.algorithms.connectivity.stoerwagner .. autosummary:: :toctree: generated/ stoer_wagner Utils for flow-based connectivity --------------------------------- .. automodule:: networkx.algorithms.connectivity.utils .. 
autosummary:: :toctree: generated/ build_auxiliary_edge_connectivity build_auxiliary_node_connectivity networkx-1.11/doc/source/reference/algorithms.mis.rst0000644000175000017500000000026712637544450022707 0ustar aricaric00000000000000*********************** Maximal independent set *********************** .. automodule:: networkx.algorithms.mis .. autosummary:: :toctree: generated/ maximal_independent_set networkx-1.11/doc/source/reference/algorithms.cycles.rst0000644000175000017500000000023112637544450023370 0ustar aricaric00000000000000****** Cycles ****** .. automodule:: networkx.algorithms.cycles .. autosummary:: :toctree: generated/ cycle_basis simple_cycles find_cycle networkx-1.11/doc/source/index.rst0000644000175000017500000000122212637544450017110 0ustar aricaric00000000000000.. _contents: NetworkX documentation ====================== .. only:: html :Release: |version| :Date: |today| Tutorial :download:`[PDF] ` Reference :download:`[PDF] ` Tutorial+Reference :download:`[HTML zip] ` .. toctree:: :maxdepth: 2 overview download install tutorial/index reference/index test developer/index reference/history bibliography examples/index .. only:: html - `Gallery `_ Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` * :ref:`glossary` networkx-1.11/doc/source/tutorial/0000755000175000017500000000000012653231454017110 5ustar aricaric00000000000000networkx-1.11/doc/source/tutorial/tutorial.rst0000644000175000017500000003277012653231353021514 0ustar aricaric00000000000000.. -*- coding: utf-8 -*- .. currentmodule:: networkx Start here to begin working with NetworkX. Creating a graph ---------------- Create an empty graph with no nodes and no edges. >>> import networkx as nx >>> G=nx.Graph() By definition, a :class:`Graph` is a collection of nodes (vertices) along with identified pairs of nodes (called edges, links, etc). In NetworkX, nodes can be any hashable object e.g. 
a text string, an image, an XML object, another Graph, a customized node object, etc. (Note: Python's None object should not be used as a node as it determines whether optional function arguments have been assigned in many functions.) Nodes ----- The graph G can be grown in several ways. NetworkX includes many graph generator functions and facilities to read and write graphs in many formats. To get started though we'll look at simple manipulations. You can add one node at a time, >>> G.add_node(1) add a list of nodes, >>> G.add_nodes_from([2,3]) or add any :term:`nbunch` of nodes. An *nbunch* is any iterable container of nodes that is not itself a node in the graph. (e.g. a list, set, graph, file, etc..) >>> H=nx.path_graph(10) >>> G.add_nodes_from(H) Note that G now contains the nodes of H as nodes of G. In contrast, you could use the graph H as a node in G. >>> G.add_node(H) The graph G now contains H as a node. This flexibility is very powerful as it allows graphs of graphs, graphs of files, graphs of functions and much more. It is worth thinking about how to structure your application so that the nodes are useful entities. Of course you can always use a unique identifier in G and have a separate dictionary keyed by identifier to the node information if you prefer. (Note: You should not change the node object if the hash depends on its contents.) Edges ----- G can also be grown by adding one edge at a time, >>> G.add_edge(1,2) >>> e=(2,3) >>> G.add_edge(*e) # unpack edge tuple* by adding a list of edges, >>> G.add_edges_from([(1,2),(1,3)]) or by adding any :term:`ebunch` of edges. An *ebunch* is any iterable container of edge-tuples. An edge-tuple can be a 2-tuple of nodes or a 3-tuple with 2 nodes followed by an edge attribute dictionary, e.g. (2,3,{'weight':3.1415}). 
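To make the ebunch convention concrete, here is a minimal sketch in plain Python (no NetworkX import needed) of how 2-tuples and 3-tuples in an ebunch can be normalized into a uniform ``(u, v, data)`` form. The helper name ``normalize_ebunch`` is ours for illustration, not part of the NetworkX API; ``Graph.add_edges_from`` performs equivalent normalization internally.

```python
def normalize_ebunch(ebunch):
    """Yield (u, v, data) triples from an ebunch of 2- or 3-tuples.

    Illustrative sketch only: NetworkX's add_edges_from does this
    normalization internally; this helper is not part of its API.
    """
    for e in ebunch:
        if len(e) == 2:            # (u, v) -> attach an empty attribute dict
            u, v = e
            data = {}
        elif len(e) == 3:          # (u, v, {...}) -> keep the given dict
            u, v, data = e
        else:
            raise ValueError("edge tuple must have 2 or 3 elements: %r" % (e,))
        yield u, v, data

# Both tuple shapes from the text above come out in one uniform form.
edges = list(normalize_ebunch([(1, 2), (2, 3, {'weight': 3.1415})]))
assert edges == [(1, 2, {}), (2, 3, {'weight': 3.1415})]
```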
Edge attributes are discussed further below >>> G.add_edges_from(H.edges()) One can demolish the graph in a similar fashion; using :meth:`Graph.remove_node`, :meth:`Graph.remove_nodes_from`, :meth:`Graph.remove_edge` and :meth:`Graph.remove_edges_from`, e.g. >>> G.remove_node(H) There are no complaints when adding existing nodes or edges. For example, after removing all nodes and edges, >>> G.clear() we add new nodes/edges and NetworkX quietly ignores any that are already present. >>> G.add_edges_from([(1,2),(1,3)]) >>> G.add_node(1) >>> G.add_edge(1,2) >>> G.add_node("spam") # adds node "spam" >>> G.add_nodes_from("spam") # adds 4 nodes: 's', 'p', 'a', 'm' At this stage the graph G consists of 8 nodes and 2 edges, as can be seen by: >>> G.number_of_nodes() 8 >>> G.number_of_edges() 2 We can examine them with >>> G.nodes() ['a', 1, 2, 3, 'spam', 'm', 'p', 's'] >>> G.edges() [(1, 2), (1, 3)] >>> G.neighbors(1) [2, 3] Removing nodes or edges has similar syntax to adding: >>> G.remove_nodes_from("spam") >>> G.nodes() [1, 2, 3, 'spam'] >>> G.remove_edge(1,3) When creating a graph structure by instantiating one of the graph classes you can specify data in several formats. >>> H=nx.DiGraph(G) # create a DiGraph using the connections from G >>> H.edges() [(1, 2), (2, 1)] >>> edgelist=[(0,1),(1,2),(2,3)] >>> H=nx.Graph(edgelist) What to use as nodes and edges ------------------------------ You might notice that nodes and edges are not specified as NetworkX objects. This leaves you free to use meaningful items as nodes and edges. The most common choices are numbers or strings, but a node can be any hashable object (except None), and an edge can be associated with any object x using G.add_edge(n1,n2,object=x). As an example, n1 and n2 could be protein objects from the RCSB Protein Data Bank, and x could refer to an XML record of publications detailing experimental observations of their interaction. 
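The hashability requirement can be illustrated in plain Python, independent of NetworkX. The toy ``Protein`` class below is hypothetical (the PDB identifiers are made-up examples): any object with a stable ``__hash__`` and consistent ``__eq__`` can serve as a dictionary key, and hence as a node, while mutable containers such as lists cannot.

```python
class Protein(object):
    """Toy stand-in for a rich node object; hypothetical, for illustration."""
    def __init__(self, pdb_id):
        self.pdb_id = pdb_id
    def __hash__(self):
        return hash(self.pdb_id)            # hash depends only on the identifier
    def __eq__(self, other):
        return isinstance(other, Protein) and self.pdb_id == other.pdb_id

adjacency = {}                              # a plain dict keyed by node objects
n1, n2 = Protein("1MBN"), Protein("4HHB")
adjacency[n1] = {n2: {"record": "pubs.xml"}}  # edge n1 -- n2 with attached data

assert Protein("1MBN") in adjacency         # equal objects hash alike
try:
    adjacency[[1, 2]] = {}                  # lists are mutable and unhashable
except TypeError:
    pass                                    # TypeError, as expected
```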
We have found this power quite useful, but its abuse can lead to unexpected surprises unless one is familiar with Python. If in doubt, consider using :func:`convert_node_labels_to_integers` to obtain a more traditional graph with integer labels. Accessing edges --------------- In addition to the methods :meth:`Graph.nodes`, :meth:`Graph.edges`, and :meth:`Graph.neighbors`, iterator versions (e.g. :meth:`Graph.edges_iter`) can save you from creating large lists when you are just going to iterate through them anyway. Fast direct access to the graph data structure is also possible using subscript notation. .. Warning:: Do not change the returned dict--it is part of the graph data structure and direct manipulation may leave the graph in an inconsistent state. >>> G[1] # Warning: do not change the resulting dict {2: {}} >>> G[1][2] {} You can safely set the attributes of an edge using subscript notation if the edge already exists. >>> G.add_edge(1,3) >>> G[1][3]['color']='blue' Fast examination of all edges is achieved using adjacency iterators. Note that for undirected graphs this actually looks at each edge twice. >>> FG=nx.Graph() >>> FG.add_weighted_edges_from([(1,2,0.125),(1,3,0.75),(2,4,1.2),(3,4,0.375)]) >>> for n,nbrs in FG.adjacency_iter(): ... for nbr,eattr in nbrs.items(): ... data=eattr['weight'] ... if data<0.5: print('(%d, %d, %.3f)' % (n,nbr,data)) (1, 2, 0.125) (2, 1, 0.125) (3, 4, 0.375) (4, 3, 0.375) Convenient access to all edges is achieved with the edges method. >>> for (u,v,d) in FG.edges(data='weight'): ... if d<0.5: print('(%d, %d, %.3f)'%(n,nbr,d)) (1, 2, 0.125) (3, 4, 0.375) Adding attributes to graphs, nodes, and edges --------------------------------------------- Attributes such as weights, labels, colors, or whatever Python object you like, can be attached to graphs, nodes, or edges. Each graph, node, and edge can hold key/value attribute pairs in an associated attribute dictionary (the keys must be hashable). 
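Before looking at each kind of attribute in turn, the storage can be pictured as nested dictionaries. The sketch below is plain Python that mirrors, but does not import, NetworkX's "dict of dicts of dicts" layout; the variable names are ours, standing in for ``G.graph``, ``G.node``, and ``G.edge``. Sharing one inner dict between both directions is the design choice that makes an undirected edge's data consistent however it is reached.

```python
graph_attrs = {"day": "Friday"}     # plays the role of G.graph
node = {1: {"time": "5pm"}, 2: {}}  # plays the role of G.node

# For an undirected edge (1, 2), both directions reference the SAME
# attribute dict, so updating one view updates the other.
inner = {"color": "blue"}
adj = {1: {2: inner}, 2: {1: inner}}  # plays the role of G.edge / G.adj

adj[1][2]["weight"] = 4.7             # subscript assignment, as in G[1][2][...]
assert adj[2][1]["weight"] == 4.7     # visible from the other endpoint too
```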
By default these are empty, but attributes can be added or changed using add_edge, add_node or direct manipulation of the attribute dictionaries named G.graph, G.node and G.edge for a graph G. Graph attributes ~~~~~~~~~~~~~~~~ Assign graph attributes when creating a new graph >>> G = nx.Graph(day="Friday") >>> G.graph {'day': 'Friday'} Or you can modify attributes later >>> G.graph['day']='Monday' >>> G.graph {'day': 'Monday'} Node attributes ~~~~~~~~~~~~~~~ Add node attributes using add_node(), add_nodes_from() or G.node >>> G.add_node(1, time='5pm') >>> G.add_nodes_from([3], time='2pm') >>> G.node[1] {'time': '5pm'} >>> G.node[1]['room'] = 714 >>> G.nodes(data=True) [(1, {'room': 714, 'time': '5pm'}), (3, {'time': '2pm'})] Note that adding a node to G.node does not add it to the graph, use G.add_node() to add new nodes. Edge Attributes ~~~~~~~~~~~~~~~ Add edge attributes using add_edge(), add_edges_from(), subscript notation, or G.edge. >>> G.add_edge(1, 2, weight=4.7 ) >>> G.add_edges_from([(3,4),(4,5)], color='red') >>> G.add_edges_from([(1,2,{'color':'blue'}), (2,3,{'weight':8})]) >>> G[1][2]['weight'] = 4.7 >>> G.edge[1][2]['weight'] = 4 The special attribute 'weight' should be numeric and holds values used by algorithms requiring weighted edges. Directed graphs --------------- The :class:`DiGraph` class provides additional methods specific to directed edges, e.g. :meth:`DiGraph.out_edges`, :meth:`DiGraph.in_degree`, :meth:`DiGraph.predecessors`, :meth:`DiGraph.successors` etc. To allow algorithms to work with both classes easily, the directed versions of neighbors() and degree() are equivalent to successors() and the sum of in_degree() and out_degree() respectively even though that may feel inconsistent at times. 
>>> DG=nx.DiGraph() >>> DG.add_weighted_edges_from([(1,2,0.5), (3,1,0.75)]) >>> DG.out_degree(1,weight='weight') 0.5 >>> DG.degree(1,weight='weight') 1.25 >>> DG.successors(1) [2] >>> DG.neighbors(1) [2] Some algorithms work only for directed graphs and others are not well defined for directed graphs. Indeed the tendency to lump directed and undirected graphs together is dangerous. If you want to treat a directed graph as undirected for some measurement you should probably convert it using :meth:`Graph.to_undirected` or with >>> H = nx.Graph(G) # convert G to undirected graph Multigraphs ----------- NetworkX provides classes for graphs which allow multiple edges between any pair of nodes. The :class:`MultiGraph` and :class:`MultiDiGraph` classes allow you to add the same edge twice, possibly with different edge data. This can be powerful for some applications, but many algorithms are not well defined on such graphs. Shortest path is one example. Where results are well defined, e.g. :meth:`MultiGraph.degree` we provide the function. Otherwise you should convert to a standard graph in a way that makes the measurement well defined. >>> MG=nx.MultiGraph() >>> MG.add_weighted_edges_from([(1,2,.5), (1,2,.75), (2,3,.5)]) >>> MG.degree(weight='weight') {1: 1.25, 2: 1.75, 3: 0.5} >>> GG=nx.Graph() >>> for n,nbrs in MG.adjacency_iter(): ... for nbr,edict in nbrs.items(): ... minvalue=min([d['weight'] for d in edict.values()]) ... GG.add_edge(n,nbr, weight = minvalue) ... >>> nx.shortest_path(GG,1,3) [1, 2, 3] Graph generators and graph operations ------------------------------------- In addition to constructing graphs node-by-node or edge-by-edge, they can also be generated by 1. 
Applying classic graph operations, such as:: subgraph(G, nbunch) - induce subgraph of G on nodes in nbunch union(G1,G2) - graph union disjoint_union(G1,G2) - graph union assuming all nodes are different cartesian_product(G1,G2) - return Cartesian product graph compose(G1,G2) - combine graphs identifying nodes common to both complement(G) - graph complement create_empty_copy(G) - return an empty copy of the same graph class convert_to_undirected(G) - return an undirected representation of G convert_to_directed(G) - return a directed representation of G 2. Using a call to one of the classic small graphs, e.g. >>> petersen=nx.petersen_graph() >>> tutte=nx.tutte_graph() >>> maze=nx.sedgewick_maze_graph() >>> tet=nx.tetrahedral_graph() 3. Using a (constructive) generator for a classic graph, e.g. >>> K_5=nx.complete_graph(5) >>> K_3_5=nx.complete_bipartite_graph(3,5) >>> barbell=nx.barbell_graph(10,10) >>> lollipop=nx.lollipop_graph(10,20) 4. Using a stochastic graph generator, e.g. >>> er=nx.erdos_renyi_graph(100,0.15) >>> ws=nx.watts_strogatz_graph(30,3,0.1) >>> ba=nx.barabasi_albert_graph(100,5) >>> red=nx.random_lobster(100,0.9,0.9) 5. Reading a graph stored in a file using common graph formats, such as edge lists, adjacency lists, GML, GraphML, pickle, LEDA and others. >>> nx.write_gml(red,"path.to.file") >>> mygraph=nx.read_gml("path.to.file") Details on graph formats: :doc:`/reference/readwrite` Details on graph generator functions: :doc:`/reference/generators` Analyzing graphs ---------------- The structure of G can be analyzed using various graph-theoretic functions such as: >>> G=nx.Graph() >>> G.add_edges_from([(1,2),(1,3)]) >>> G.add_node("spam") # adds node "spam" >>> nx.connected_components(G) [[1, 2, 3], ['spam']] >>> sorted(nx.degree(G).values()) [0, 1, 1, 2] >>> nx.clustering(G) {1: 0.0, 2: 0.0, 3: 0.0, 'spam': 0.0} Functions that return node properties return dictionaries keyed by node label. 
>>> nx.degree(G)
{1: 2, 2: 1, 3: 1, 'spam': 0}

For values of specific nodes, you can provide a single node or an
nbunch of nodes as argument.  If a single node is specified, then a
single value is returned.  If an nbunch is specified, then the
function will return a dictionary.

>>> nx.degree(G,1)
2
>>> G.degree(1)
2
>>> G.degree([1,2])
{1: 2, 2: 1}
>>> sorted(G.degree([1,2]).values())
[1, 2]
>>> sorted(G.degree().values())
[0, 1, 1, 2]

Details on graph algorithms supported: :doc:`/reference/algorithms`

Drawing graphs
--------------

NetworkX is not primarily a graph drawing package, but basic drawing
with Matplotlib as well as an interface to use the open source
Graphviz software package are included.  These are part of the
networkx.drawing package and will be imported if possible.  See
:doc:`/reference/drawing` for details.

Note that the drawing package in NetworkX is not yet compatible with
Python versions 3.0 and above.

First import Matplotlib's plot interface (pylab works too)

>>> import matplotlib.pyplot as plt

You may find it useful to interactively test code using "ipython
-pylab", which combines the power of ipython and matplotlib and
provides a convenient interactive mode.

To test if the import of networkx.drawing was successful, draw G
using one of

>>> nx.draw(G)
>>> nx.draw_random(G)
>>> nx.draw_circular(G)
>>> nx.draw_spectral(G)

when drawing to an interactive display.  Note that you may need to
issue a Matplotlib

>>> plt.show()

command if you are not using matplotlib in interactive mode (see
`Matplotlib FAQ `_).

To save drawings to a file, use, for example

>>> nx.draw(G)
>>> plt.savefig("path.png")

which writes to the file "path.png" in the local directory.
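The drawing calls above can be combined into one complete
non-interactive script.  This is a sketch, not part of the original
tutorial: the ``Agg`` backend choice and the Petersen example graph
are assumptions, and no ``plt.show()`` is needed because the figure
goes straight to a file.

```python
import matplotlib
matplotlib.use("Agg")     # non-interactive backend; no display required

import matplotlib.pyplot as plt
import networkx as nx

G = nx.petersen_graph()   # any graph works here
nx.draw_circular(G)       # draw with a circular layout
plt.savefig("path.png")   # write the drawing to "path.png"
```

Run it as a plain Python script; the PNG appears in the current
directory without opening a window.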
If Graphviz and PyGraphviz, or pydotplus, are available on your
system, you can also use

>>> from networkx.drawing.nx_pydot import write_dot
>>> nx.draw_graphviz(G)
>>> write_dot(G,'file.dot')

Details on drawing graphs: :doc:`/reference/drawing`

networkx-1.11/doc/source/tutorial/index.rst0000644000175000017500000000062312637544450020757 0ustar aricaric00000000000000.. -*- coding: utf-8 -*-

********
Tutorial
********

.. toctree::
   :maxdepth: 2

   tutorial

**What Next**

Now that you have an idea of what the NetworkX package provides, you
should investigate the parts of the package most useful for you.

:doc:`Reference Section` provides details on NetworkX.

:doc:`/examples/index` provides some example programs written using
NetworkX.

networkx-1.11/doc/source/install.rst0000644000175000017500000000755612653165361017455 0ustar aricaric00000000000000**********
Installing
**********

Before installing NetworkX, you need to have `setuptools `_
installed.

Quick install
=============

Get NetworkX from the Python Package Index at
http://pypi.python.org/pypi/networkx

or install it with ::

   pip install networkx

and an attempt will be made to find and install an appropriate
version that matches your operating system and Python version.

You can install the development version (at github.com) with ::

   pip install git://github.com/networkx/networkx.git#egg=networkx

More download file options are at
http://networkx.github.io/download.html.

Installing from source
======================

You can install from source by downloading a source archive file
(tar.gz or zip) or by checking out the source files from the Git
source code repository.

NetworkX is a pure Python package; you don't need a compiler to build
or install it.

Source archive file
-------------------

1. Download the source (tar.gz or zip file) from
   https://pypi.python.org/pypi/networkx/ or get the latest
   development version from https://github.com/networkx/networkx/

2.
Unpack and change directory to the source directory (it should
   have the files README.txt and setup.py).

3. Run :samp:`python setup.py install` to build and install.

4. (Optional) Run :samp:`nosetests` to execute the tests if you have
   `nose `_ installed.

GitHub
------

1. Clone the networkx repository (see
   https://github.com/networkx/networkx/ for options) ::

      git clone https://github.com/networkx/networkx.git

2. Change directory to :samp:`networkx`

3. Run :samp:`python setup.py install` to build and install.

4. (Optional) Run :samp:`nosetests` to execute the tests if you have
   `nose `_ installed.

If you don't have permission to install software on your system, you
can install into another directory using the :samp:`--user`,
:samp:`--prefix`, or :samp:`--home` flags to setup.py.  For example ::

   python setup.py install --prefix=/home/username/python

or ::

   python setup.py install --home=~

or ::

   python setup.py install --user

If you didn't install in the standard Python site-packages directory
you will need to set your PYTHONPATH variable to the alternate
location.  See http://docs.python.org/2/install/index.html#search-path
for further details.

Requirements
============

Python
------

To use NetworkX you need Python 2.7, or 3.3 and later.

The easiest way to get Python and most optional packages is to
install the Enthought Python distribution "`Canopy `_".  There are
several other distributions that contain the key packages you need
for scientific computing.  See http://scipy.org/install.html for a
list.

Optional packages
=================

The following are optional packages that NetworkX can use to provide
additional functions.

NumPy
-----

Provides matrix representation of graphs and is used in some graph
algorithms for high-performance matrix computations.

- Download: http://scipy.org/Download

SciPy
-----

Provides sparse matrix representation of graphs and many numerical
scientific tools.
- Download: http://scipy.org/Download

Matplotlib
----------

Provides flexible drawing of graphs.

- Download: http://matplotlib.sourceforge.net/

GraphViz
--------

In conjunction with either

- PyGraphviz: http://pygraphviz.github.io/ or
- pydotplus: https://github.com/carlos-jenkins/pydotplus

provides graph drawing and graph layout algorithms.

- Download: http://graphviz.org/

PyYAML
------

http://pyyaml.org/

Required for YAML format reading and writing.

Other packages
--------------

These are extra packages you may consider using with NetworkX

- IPython, interactive Python shell, http://ipython.scipy.org/

networkx-1.11/doc/source/static/0000755000175000017500000000000012653231454016534 5ustar aricaric00000000000000networkx-1.11/doc/source/static/art1.png0000644000175000017500000020253312637544450020123 0ustar aricaric00000000000000
networkx-1.11/doc/source/static/networkx.css0000644000175000017500000000061612637544450021137 0ustar aricaric00000000000000@import url('css/theme.css');

.wy-table-responsive table td, .wy-table-responsive table th {
    white-space: normal;
}

.wy-table-responsive table td:first-child {
    white-space: nowrap;
}

code, .rst-content tt, .linenodiv pre, div[class^="highlight"] pre {
    font-family: 'Inconsolata', 'Monaco', 'Consolas', monospace;
    font-size: 85%;
}

p.rubric {
    font-weight: bold !important;
}

networkx-1.11/doc/source/static/force/0000755000175000017500000000000012653231454017632 5ustar aricaric00000000000000networkx-1.11/doc/source/static/force/force.css0000644000175000017500000000015512637544450021450 0ustar aricaric00000000000000circle.node {
  stroke: #fff;
  stroke-width: 1.5px;
}

line.link {
  stroke: #999;
  stroke-opacity: .6;
}

networkx-1.11/doc/source/static/trac.css0000644000175000017500000000522512637544450020210 0ustar aricaric00000000000000/** Link styles */
:link, :visited { text-decoration: none; color: #CA7900; border-bottom: 1px dotted #bbb; } :link:hover, :visited:hover { background-color: white; color: #2491CF; } h1 :link, h1 :visited ,h2 :link, h2 :visited, h3 :link, h3 :visited, h4 :link, h4 :visited, h5 :link, h5 :visited, h6 :link, h6 :visited { color: inherit; } .trac-rawlink { background-color: #*BFD1D4; border-bottom: none } h1 { margin: 0; /* padding: 0.7em 0 0.3em 0; */ font-size: 1.5em; color: #11557C; } /* Header */ #header hr { display: none } #header h1 { margin: 1.5em 0 -1.5em; } #header img { border: none; margin: 0 0 -3em } #header :link, #header :visited, #header :link:hover, #header :visited:hover { background: transparent; color: #11557C; margin-bottom: 2px; border: none; font-size: 1.5em; } #header h1 :link:hover, #header h1 :visited:hover { color: #000 } dt :link:hover, dt :visited:hover { background-color: #BFD1D4; color: #000 } dt em { border-bottom: 1px dotted #bbb; color: #CA7900; font-style: normal; text-decoration: none; } .milestone .info h2 em { color: #CA7900; font-style: normal } table.progress td { background: #fff; padding: 0 } table.progress td.new { background: #f5f5b5 } table.progress td.closed { background: #11557C} table.progress td :hover { background: none } /* Main navigation bar */ #mainnav { border: 1px solid #000; font: normal 10px verdana,'Bitstream Vera Sans',helvetica,arial,sans-serif; margin: .66em 0 .33em; padding: .2em 0; } #mainnav li { border-right: none; padding: .25em 0 } #mainnav :link, #mainnav :visited { border-right: 1px solid #fff; border-bottom: none; border-left: 1px solid #555; color: #CA7900; /* #11557C; */ padding: .2em 20px; } * html #mainnav :link, * html #mainnav :visited { background-position: 1px 0 } #mainnav :link:hover, #mainnav :visited:hover { background-color: #ccc; border-right: 1px solid #ddd; } #mainnav .active :link, #mainnav .active :visited { border-top: none; border-right: 1px solid #000; border-left: 1px solid #000; color: #CA7900; 
font-weight: normal; } #mainnav .active :link:hover, #mainnav .active :visited:hover { border-right: 1px solid #000; background-color: white; } #metanav { background-color: white; } #main { background-color: white; } /*#content { padding-bottom: 2em; position: relative; background-color: white; } */ #altlinks { clear: both; text-align: center; background-color: white; } #footer { background-color: white; } #sitefooter { background-color: white; } #siteheader { background-color: white; } #header { background-color: white; } #banner { background-color: white; } #main { background-color: white; } networkx-1.11/doc/sphinxext/0000755000175000017500000000000012653231454015777 5ustar aricaric00000000000000networkx-1.11/doc/sphinxext/LICENSE.txt0000644000175000017500000000302712637544450017631 0ustar aricaric00000000000000------------------------------------------------------------------------------- The files - numpydoc.py - autosummary.py - autosummary_generate.py - docscrape.py - docscrape_sphinx.py - phantom_import.py have the following license: Copyright (C) 2008 Stefan van der Walt , Pauli Virtanen Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

networkx-1.11/doc/sphinxext/customroles.py0000644000175000017500000000755112637544450020745 0ustar aricaric00000000000000"""
Custom Roles
"""

from docutils import nodes, utils
from docutils.parsers.rst import roles
from sphinx import addnodes
from sphinx.util import ws_re, caption_ref_re

# http://www.doughellmann.com/articles/how-tos/sphinx-custom-roles/index.html
def sample_role(name, rawtext, text, lineno, inliner, options={}, content=[]):
    """Custom role.

    Parameters
    ----------
    name : str
        The name of the role, as used in the document.
    rawtext : str
        The markup, including the role declaration.
    text : str
        The text to be marked up by the role.
    lineno : int
        The line number where `rawtext` appears.
    inliner : Inliner
        The instance that called the role.
    options : dict
        Directive options for customization.
    content : list
        The directive content for customization.

    Returns
    -------
    nodes : list
        The list of nodes to insert into the document.
    msgs : list
        The list of system messages, perhaps an error message.
""" pass ################## prefixed_roles = { # name: (prefix, baseuri) 'arxiv': ('arXiv:', 'http://arxiv.org/abs/'), 'doi': ('doi:', 'http://dx.doi.org/'), } no_text_roles = [ 'url', 'pdf', ] def prefixed_role(name, rawtext, text, lineno, inliner, options={}, content=[]): prefix, baseuri = prefixed_roles[name] uri = baseuri + text display = utils.unescape(text) node = nodes.literal(prefix, prefix) ref = nodes.reference(rawtext, display, refuri=uri, **options) node += ref # keep it in the 'literal' background return [node], [] def url_role(name, rawtext, text, lineno, inliner, options={}, content=[]): uri = text display = 'url' node = nodes.literal('', '') node += nodes.reference(rawtext, name, refuri=uri, **options) return [node], [] def trac_ticket_role(name, rawtext, text, lineno, inliner, options={}, content=[]): app = inliner.document.settings.env.app try: base = app.config.trac_url if not base: raise AttributeError except AttributeError as err: msg = 'trac_url configuration value is not set (%s)' raise ValueError(msg % str(err)) slash = '/' if base[-1] != '/' else '' prefix = 'ticket ' node = nodes.literal(prefix, prefix) display = utils.unescape(text) uri = base + slash + 'ticket/' + text node += nodes.reference(rawtext, display, refuri=uri, **options) return [node], [] def trac_changeset_role(name, rawtext, text, lineno, inliner, options={}, content=[]): app = inliner.document.settings.env.app try: base = app.config.trac_url if not base: raise AttributeError except AttributeError as err: msg = 'trac_url configuration value is not set (%s)' raise ValueError(msg % str(err)) slash = '/' if base[-1] != '/' else '' unescaped = utils.unescape(text) prefix = 'changeset ' node = nodes.literal(prefix, prefix) # Hard-coded for NetworkX if unescaped.endswith('networkx-svn-archive'): # Use the integer display = unescaped.split('/')[0] else: # hg: use the first 12 hash characters display = unescaped[:12] uri = base + slash + 'changeset/' + text node += 
nodes.reference(rawtext, display, refuri=uri, **options) return [node], [] active_roles = { 'arxiv': prefixed_role, 'doi': prefixed_role, 'pdf': url_role, 'url': url_role, 'ticket': trac_ticket_role, 'changeset': trac_changeset_role, } # Add a generic docstring. for role in active_roles.values(): role.__doc__ = sample_role.__doc__ def setup(app): for role, func in active_roles.items(): roles.register_local_role(role, func) app.add_config_value('trac_url', None, 'env') networkx-1.11/doc/gh-pages.py0000755000175000017500000001037312637544500016022 0ustar aricaric00000000000000#!/usr/bin/env python """Script to commit the doc build outputs into the github-pages repo. Use: gh-pages.py [tag] If no tag is given, the current output of 'git describe' is used. If given, that is how the resulting directory will be named. In practice, you should use either actual clean tags from a current build or something like 'current' as a stable URL for the most current version""" # Borrowed from IPython. #----------------------------------------------------------------------------- # Imports #----------------------------------------------------------------------------- from __future__ import print_function import os import re import shutil import sys from os import chdir as cd from os.path import join as pjoin from subprocess import Popen, PIPE, CalledProcessError, check_call #----------------------------------------------------------------------------- # Globals #----------------------------------------------------------------------------- pages_dir = 'gh-pages' html_dir = 'build/dist' pdf_dir = 'build/latex' pages_repo = 'https://github.com/networkx/documentation.git' #----------------------------------------------------------------------------- # Functions #----------------------------------------------------------------------------- def sh(cmd): """Execute command in a subshell, return status code.""" return check_call(cmd, shell=True) def sh2(cmd): """Execute command in a subshell, 
return stdout. Stderr is unbuffered from the subshell.""" p = Popen(cmd, stdout=PIPE, shell=True) out = p.communicate()[0] retcode = p.returncode if retcode: raise CalledProcessError(retcode, cmd) else: return out.rstrip() def sh3(cmd): """Execute command in a subshell, return stdout, stderr If anything appears in stderr, print it out to sys.stderr""" p = Popen(cmd, stdout=PIPE, stderr=PIPE, shell=True) out, err = p.communicate() retcode = p.returncode if retcode: raise CalledProcessError(retcode, cmd) else: return out.rstrip(), err.rstrip() def init_repo(path): """clone the gh-pages repo if we haven't already.""" sh("git clone %s %s"%(pages_repo, path)) here = os.getcwd() cd(path) sh('git checkout gh-pages') cd(here) #----------------------------------------------------------------------------- # Script starts #----------------------------------------------------------------------------- if __name__ == '__main__': # The tag can be given as a positional argument try: tag = sys.argv[1] except IndexError: try: tag = sh2('git describe --exact-match') except CalledProcessError: print("using development as label") tag = "development" # Fallback startdir = os.getcwd() if not os.path.exists(pages_dir): # init the repo init_repo(pages_dir) else: # ensure up-to-date before operating cd(pages_dir) sh('git checkout gh-pages') sh('git pull') cd(startdir) dest = pjoin(pages_dir, tag) # don't `make html` here, because gh-pages already depends on html in Makefile # sh('make html') if tag != 'dev': # only build pdf for non-dev targets #sh2('make pdf') pass # This is pretty unforgiving: we unconditionally nuke the destination # directory, and then copy the html tree in there shutil.rmtree(dest, ignore_errors=True) shutil.copytree(html_dir, dest) if tag != 'dev': #shutil.copy(pjoin(pdf_dir, 'ipython.pdf'), pjoin(dest, 'ipython.pdf')) pass try: cd(pages_dir) status = sh2('git status | head -1') branch = re.match('On branch (.*)$', status).group(1) if branch != 'gh-pages': e = 'On
%r, git branch is %r, MUST be "gh-pages"' % (pages_dir, branch) raise RuntimeError(e) sh('git add -A %s' % tag) sh('git commit -m"Updated doc release: %s"' % tag) print('\nMost recent 3 commits:') sys.stdout.flush() sh('git --no-pager log --oneline HEAD~3..') finally: cd(startdir) print('\nNow verify the build in: %r' % dest) print("If everything looks good, 'git push'") networkx-1.11/doc/Makefile0000644000175000017500000001032112637544450015407 0ustar aricaric00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d build/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source .PHONY: help clean html dirhtml pickle json htmlhelp qthelp latex changes linkcheck doctest epub help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " pickle to make pickle files" @echo " epub to make an epub" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: -rm -rf build/* source/reference/generated/* source/examples/* source/static/examples doc/source/*.pdf doc/source/*.zip -rm -rf ../examples/*/*.png generate: build/generate-stamp build/generate-stamp: $(wildcard source/reference/*.rst) mkdir -p build python make_gallery.py python make_examples_rst.py ../examples source touch build/generate-stamp dist: html test -d build/latex || make latex make -C build/latex all-pdf -rm -rf 
build/dist (cd build/html; cp -r . ../../build/dist) -rm -f build/dist/_downloads/* (cd build/html && zip -9r ../dist/_downloads/networkx-documentation.zip .) cp build/latex/*.pdf build/dist/_downloads # (cd build/dist && ln -s _downloads/* .) (cd build/dist && tar czf ../dist.tar.gz .) html: generate touch source/networkx_tutorial.pdf touch source/networkx_reference.pdf touch source/networkx-documentation.zip $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) build/html @echo @echo "Build finished. The HTML pages are in build/html." dirhtml: generate $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) build/dirhtml @echo @echo "Build finished. The HTML pages are in build/dirhtml." pickle: generate $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) build/pickle @echo @echo "Build finished; now you can process the pickle files." json: generate $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) build/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: generate $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) build/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in build/htmlhelp." qthelp: generate $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) build/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in build/qthelp, like this:" @echo "# qcollectiongenerator build/qthelp/test.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile build/qthelp/test.qhc" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) build/epub @echo @echo "Build finished. The epub file is in build/epub." latex: generate $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) build/latex @echo @echo "Build finished; the LaTeX files are in build/latex." @echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \ "run these through (pdf)latex." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) build/changes @echo @echo "The overview file is in build/changes."
linkcheck: generate $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) build/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in build/linkcheck/output.txt." doctest: generate $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) build/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in build/doctest/output.txt." gh-pages: clean dist python gh-pages.py $(tag) gitwash-update: python gitwash_dumper.py source/developer networkx \ --project-url=http://networkx.github.io \ --project-ml-url=http://groups.google.com/group/networkx-discuss/ networkx-1.11/doc/make_gallery.py0000755000175000017500000000462512637544450016772 0ustar aricaric00000000000000""" Generate a thumbnail gallery of examples. """ from __future__ import print_function import os, glob, re, shutil, sys import matplotlib matplotlib.use("Agg") import matplotlib.pyplot import matplotlib.image from matplotlib.figure import Figure from matplotlib.backends.backend_agg import FigureCanvasAgg as FigureCanvas examples_source_dir = '../examples/drawing' examples_dir = 'examples/drawing' template_dir = 'source/templates' static_dir = 'source/static/examples' pwd=os.getcwd() rows = [] template = """ {%% extends "layout.html" %%} {%% set title = "Gallery" %%} {%% block body %%}

    Click on any image to see source code


    %s {%% endblock %%} """ link_template = """
    %s """ if not os.path.exists(static_dir): os.makedirs(static_dir) os.chdir(examples_source_dir) all_examples=sorted(glob.glob("*.py")) # check for out of date examples stale_examples=[] for example in all_examples: png=example.replace('py','png') png_static=os.path.join(pwd,static_dir,png) if (not os.path.exists(png_static) or os.stat(png_static).st_mtime < os.stat(example).st_mtime): stale_examples.append(example) for example in stale_examples: print(example, end=" ") png=example.replace('py','png') matplotlib.pyplot.figure(figsize=(6,6)) stdout=sys.stdout sys.stdout=open('/dev/null','w') try: execfile(example) sys.stdout=stdout print(" OK") except ImportError as strerr: sys.stdout=stdout sys.stdout.write(" FAIL: %s\n" % strerr) continue matplotlib.pyplot.clf() im=matplotlib.image.imread(png) fig = Figure(figsize=(2.5, 2.5)) canvas = FigureCanvas(fig) ax = fig.add_axes([0,0,1,1], aspect='auto', frameon=False, xticks=[], yticks =[]) # basename, ext = os.path.splitext(basename) ax.imshow(im, aspect='auto', resample=True, interpolation='bilinear') thumbfile=png.replace(".png","_thumb.png") fig.savefig(thumbfile) shutil.copy(thumbfile,os.path.join(pwd,static_dir,thumbfile)) shutil.copy(png,os.path.join(pwd,static_dir,png)) basename, ext = os.path.splitext(example) link = '%s/%s.html'%(examples_dir, basename) rows.append(link_template%(link, os.path.join('_static/examples',thumbfile), basename)) os.chdir(pwd) fh = open(os.path.join(template_dir,'gallery.html'), 'w') fh.write(template%'\n'.join(rows)) fh.close() networkx-1.11/INSTALL.txt0000644000175000017500000000013212637544450015050 0ustar aricaric00000000000000See doc/source/install.rst or http://networkx.github.io/documentation/latest/install.html networkx-1.11/setup.py0000644000175000017500000001136012637544450014720 0ustar aricaric00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """ Setup script for networkx You can install networkx with python setup.py install """ from glob import glob import os 
import sys if os.path.exists('MANIFEST'): os.remove('MANIFEST') from setuptools import setup if sys.argv[-1] == 'setup.py': print("To install, run 'python setup.py install'") print() if sys.version_info[:2] < (2, 7): print("NetworkX requires Python 2.7 or later (%d.%d detected)." % sys.version_info[:2]) sys.exit(-1) # Write the version information. sys.path.insert(0, 'networkx') import release version = release.write_versionfile() sys.path.pop(0) packages=["networkx", "networkx.algorithms", "networkx.algorithms.assortativity", "networkx.algorithms.bipartite", "networkx.algorithms.centrality", "networkx.algorithms.chordal", "networkx.algorithms.community", "networkx.algorithms.components", "networkx.algorithms.connectivity", "networkx.algorithms.coloring", "networkx.algorithms.flow", "networkx.algorithms.traversal", "networkx.algorithms.isomorphism", "networkx.algorithms.shortest_paths", "networkx.algorithms.link_analysis", "networkx.algorithms.operators", "networkx.algorithms.approximation", "networkx.algorithms.tree", "networkx.classes", "networkx.external", "networkx.generators", "networkx.drawing", "networkx.linalg", "networkx.readwrite", "networkx.readwrite.json_graph", "networkx.tests", "networkx.testing", "networkx.utils"] docdirbase = 'share/doc/networkx-%s' % version # add basic documentation data = [(docdirbase, glob("*.txt"))] # add examples for d in ['advanced', 'algorithms', 'basic', '3d_drawing', 'drawing', 'graph', 'multigraph', 'pygraphviz', 'readwrite']: dd = os.path.join(docdirbase,'examples', d) pp = os.path.join('examples', d) data.append((dd, glob(os.path.join(pp ,"*.py")))) data.append((dd, glob(os.path.join(pp ,"*.bz2")))) data.append((dd, glob(os.path.join(pp ,"*.gz")))) data.append((dd, glob(os.path.join(pp ,"*.mbox")))) data.append((dd, glob(os.path.join(pp ,"*.edgelist")))) # add the tests package_data = { 'networkx': ['tests/*.py'], 'networkx.algorithms': ['tests/*.py'], 'networkx.algorithms.assortativity': ['tests/*.py'], 
'networkx.algorithms.bipartite': ['tests/*.py'], 'networkx.algorithms.centrality': ['tests/*.py'], 'networkx.algorithms.chordal': ['tests/*.py'], 'networkx.algorithms.community': ['tests/*.py'], 'networkx.algorithms.components': ['tests/*.py'], 'networkx.algorithms.connectivity': ['tests/*.py'], 'networkx.algorithms.coloring': ['tests/*.py'], 'networkx.algorithms.flow': ['tests/*.py', 'tests/*.bz2'], 'networkx.algorithms.traversal': ['tests/*.py'], 'networkx.algorithms.isomorphism': ['tests/*.py','tests/*.*99'], 'networkx.algorithms.link_analysis': ['tests/*.py'], 'networkx.algorithms.approximation': ['tests/*.py'], 'networkx.algorithms.operators': ['tests/*.py'], 'networkx.algorithms.shortest_paths': ['tests/*.py'], 'networkx.algorithms.tree': ['tests/*.py'], 'networkx.classes': ['tests/*.py'], 'networkx.generators': ['tests/*.py'], 'networkx.drawing': ['tests/*.py'], 'networkx.linalg': ['tests/*.py'], 'networkx.readwrite': ['tests/*.py'], 'networkx.readwrite.json_graph': ['tests/*.py'], 'networkx.testing': ['tests/*.py'], 'networkx.utils': ['tests/*.py'] } install_requires = ['decorator>=3.4.0'] if __name__ == "__main__": setup( name = release.name.lower(), version = version, maintainer = release.maintainer, maintainer_email = release.maintainer_email, author = release.authors['Hagberg'][0], author_email = release.authors['Hagberg'][1], description = release.description, keywords = release.keywords, long_description = release.long_description, license = release.license, platforms = release.platforms, url = release.url, download_url = release.download_url, classifiers = release.classifiers, packages = packages, data_files = data, package_data = package_data, install_requires = install_requires, test_suite = 'nose.collector', tests_require = ['nose>=0.10.1'], zip_safe = False ) networkx-1.11/examples/0000755000175000017500000000000012653231454015016 5ustar
aricaric00000000000000networkx-1.11/examples/javascript/0000755000175000017500000000000012653231454017164 5ustar aricaric00000000000000networkx-1.11/examples/javascript/force.py0000644000175000017500000000171412653165361020642 0ustar aricaric00000000000000"""Example of writing JSON format graph data and using the D3 Javascript library to produce an HTML/Javascript drawing. """ # Author: Aric Hagberg # Copyright (C) 2011-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import json import networkx as nx from networkx.readwrite import json_graph import http_server G = nx.barbell_graph(6,3) # this d3 example uses the name attribute for the mouse-hover value, # so add a name to each node for n in G: G.node[n]['name'] = n # write json formatted data d = json_graph.node_link_data(G) # node-link format to serialize # write json json.dump(d, open('force/force.json','w')) print('Wrote node-link JSON data to force/force.json') # open URL in running web browser http_server.load_url('force/force.html') print('Or copy all files in force/ to webserver and load force/force.html') networkx-1.11/examples/javascript/http_server.py0000644000175000017500000000415412637544450022114 0ustar aricaric00000000000000# helper to load url # runs webserver and loads url with webbrowser module import sys def load_url(path): PORT = 8000 httpd = StoppableHTTPServer(("127.0.0.1",PORT), handler) thread.start_new_thread(httpd.serve, ()) webbrowser.open_new('http://localhost:%s/%s'%(PORT,path)) input("Press to stop server\n") httpd.stop() print("To restart server run: \n%s"%server) if sys.version_info[0] == 2: import SimpleHTTPServer, BaseHTTPServer import socket import thread import webbrowser handler = SimpleHTTPServer.SimpleHTTPRequestHandler input = raw_input server = "python -m SimpleHTTPServer 8000" class StoppableHTTPServer(BaseHTTPServer.HTTPServer): def server_bind(self): BaseHTTPServer.HTTPServer.server_bind(self) self.socket.settimeout(1) self.run =
True def get_request(self): while self.run: try: sock, addr = self.socket.accept() sock.settimeout(None) return (sock, addr) except socket.timeout: pass def stop(self): self.run = False def serve(self): while self.run: self.handle_request() else: import http.server, http.server import socket import _thread as thread import webbrowser handler = http.server.SimpleHTTPRequestHandler server = "python -m http.server 8000" class StoppableHTTPServer(http.server.HTTPServer): def server_bind(self): http.server.HTTPServer.server_bind(self) self.socket.settimeout(1) self.run = True def get_request(self): while self.run: try: sock, addr = self.socket.accept() sock.settimeout(None) return (sock, addr) except socket.timeout: pass def stop(self): self.run = False def serve(self): while self.run: self.handle_request() networkx-1.11/examples/algorithms/0000755000175000017500000000000012653231454017167 5ustar aricaric00000000000000networkx-1.11/examples/algorithms/hartford_drug.edgelist0000644000175000017500000000443712637544450023560 0ustar aricaric00000000000000# source target 1 2 1 10 2 1 2 10 3 7 4 7 4 209 5 132 6 150 7 3 7 4 7 9 8 106 8 115 9 1 9 2 9 7 10 1 10 2 11 133 11 218 12 88 13 214 14 24 14 52 16 10 16 19 17 64 17 78 18 55 18 103 18 163 19 18 20 64 20 180 21 16 21 22 22 21 22 64 22 106 23 20 23 22 23 64 24 14 24 31 24 122 27 115 28 29 29 28 30 19 31 24 31 32 31 122 31 147 31 233 32 31 32 86 34 35 34 37 35 34 35 43 36 132 36 187 37 38 37 90 37 282 38 42 38 43 38 210 40 20 42 15 42 38 43 34 43 35 43 38 45 107 46 61 46 72 48 23 49 30 49 64 49 108 49 115 49 243 50 30 50 47 50 55 50 125 50 163 52 218 52 224 54 111 54 210 55 65 55 67 55 105 55 108 55 222 56 18 56 64 57 65 57 125 58 20 58 30 58 50 58 103 58 180 59 164 63 125 64 8 64 50 64 70 64 256 66 20 66 84 66 106 66 125 67 22 67 50 67 113 68 50 70 50 70 64 71 72 74 29 74 75 74 215 75 74 75 215 76 58 76 104 77 103 78 64 78 68 80 207 80 210 82 8 82 77 82 83 82 97 82 163 83 82 83 226 83 243 84 29 84 154 87 101 87 189 89 90 90 
89 90 94 91 86 92 19 92 30 92 106 94 72 94 89 94 90 95 30 96 75 96 256 97 80 97 128 98 86 100 86 101 87 103 77 103 104 104 58 104 77 104 103 106 22 107 38 107 114 107 122 108 49 108 55 111 121 111 128 111 210 113 253 114 107 116 30 116 140 118 129 118 138 120 88 121 128 122 31 123 32 124 244 125 132 126 163 126 180 128 38 128 111 129 118 132 29 132 30 133 30 134 135 134 150 135 134 137 144 138 118 138 129 139 142 141 157 141 163 142 139 143 2 144 137 145 151 146 137 146 165 146 169 146 171 147 31 147 128 148 146 148 169 148 171 148 282 149 128 149 148 149 172 150 86 151 145 152 4 153 134 154 155 156 161 157 141 161 156 165 144 165 148 167 149 169 15 169 148 169 171 170 115 170 173 170 183 170 202 171 72 171 148 171 169 173 170 175 100 176 10 178 181 181 178 182 38 182 171 183 96 185 50 186 127 187 50 187 65 188 30 188 50 189 87 189 89 190 35 190 38 190 122 190 182 191 54 191 118 191 129 191 172 192 149 192 167 195 75 197 50 197 188 198 218 198 221 198 222 200 65 200 220 201 113 202 156 203 232 204 194 207 38 207 122 207 124 208 30 208 50 210 38 210 207 211 37 213 35 213 38 214 13 214 14 214 171 214 213 215 75 217 39 218 68 218 222 221 198 222 198 222 218 223 39 225 3 226 22 229 65 230 68 231 43 232 95 232 203 233 99 234 68 234 230 237 244 238 145 242 3 242 113 244 237 249 96 250 156 252 65 254 65 258 113 268 4 270 183 272 6 275 96 280 183 280 206 282 37 285 75 290 285 293 290networkx-1.11/examples/algorithms/krackhardt_centrality.py0000644000175000017500000000135512653165361024124 0ustar aricaric00000000000000#!/usr/bin/env python """ Centrality measures of Krackhardt social network. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Date: 2005-05-12 14:33:11 -0600 (Thu, 12 May 2005) # Revision: 998 # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
from networkx import * G=krackhardt_kite_graph() print("Betweenness") b=betweenness_centrality(G) for v in G.nodes(): print("%0.2d %5.3f"%(v,b[v])) print("Degree centrality") d=degree_centrality(G) for v in G.nodes(): print("%0.2d %5.3f"%(v,d[v])) print("Closeness centrality") c=closeness_centrality(G) for v in G.nodes(): print("%0.2d %5.3f"%(v,c[v])) networkx-1.11/examples/algorithms/blockmodel.py0000644000175000017500000000556412653165361021671 0ustar aricaric00000000000000#!/usr/bin/env python # encoding: utf-8 """ Example of creating a block model using the blockmodel function in NX. Data used is the Hartford, CT drug users network: @article{, title = {Social Networks of Drug Users in {High-Risk} Sites: Finding the Connections}, volume = {6}, shorttitle = {Social Networks of Drug Users in {High-Risk} Sites}, url = {http://dx.doi.org/10.1023/A:1015457400897}, doi = {10.1023/A:1015457400897}, number = {2}, journal = {{AIDS} and Behavior}, author = {Margaret R. Weeks and Scott Clair and Stephen P. Borgatti and Kim Radda and Jean J. 
Schensul}, month = jun, year = {2002}, pages = {193--206} } """ # Authors: Drew Conway , Aric Hagberg from collections import defaultdict import networkx as nx import numpy from scipy.cluster import hierarchy from scipy.spatial import distance import matplotlib.pyplot as plt def create_hc(G): """Creates hierarchical cluster of graph G from distance matrix""" path_length=nx.all_pairs_shortest_path_length(G) distances=numpy.zeros((len(G),len(G))) for u,p in path_length.items(): for v,d in p.items(): distances[u][v]=d # Create hierarchical cluster Y=distance.squareform(distances) Z=hierarchy.complete(Y) # Creates HC using farthest point linkage # This partition selection is arbitrary, for illustrative purposes membership=list(hierarchy.fcluster(Z,t=1.15)) # Create collection of lists for blockmodel partition=defaultdict(list) for n,p in zip(list(range(len(G))),membership): partition[p].append(n) return list(partition.values()) if __name__ == '__main__': G=nx.read_edgelist("hartford_drug.edgelist") # Extract largest connected component into graph H H=nx.connected_component_subgraphs(G)[0] # Makes life easier to have consecutively labeled integer nodes H=nx.convert_node_labels_to_integers(H) # Create partitions with hierarchical clustering partitions=create_hc(H) # Build blockmodel graph BM=nx.blockmodel(H,partitions) # Draw original graph pos=nx.spring_layout(H,iterations=100) fig=plt.figure(1,figsize=(6,10)) ax=fig.add_subplot(211) nx.draw(H,pos,with_labels=False,node_size=10) plt.xlim(0,1) plt.ylim(0,1) # Draw block model with weighted edges and nodes sized by number of internal nodes node_size=[BM.node[x]['nnodes']*10 for x in BM.nodes()] edge_width=[(2*d['weight']) for (u,v,d) in BM.edges(data=True)] # Set positions to mean of positions of internal nodes from original graph posBM={} for n in BM: xy=numpy.array([pos[u] for u in BM.node[n]['graph']]) posBM[n]=xy.mean(axis=0) ax=fig.add_subplot(212)
nx.draw(BM,posBM,node_size=node_size,width=edge_width,with_labels=False) plt.xlim(0,1) plt.ylim(0,1) plt.axis('off') plt.savefig('hartford_drug_block_model.png') networkx-1.11/examples/algorithms/rcm.py0000644000175000017500000000165712653165361020336 0ustar aricaric00000000000000# Cuthill-McKee ordering of matrices # The reverse Cuthill-McKee algorithm gives a sparse matrix ordering that # reduces the matrix bandwidth. # Requires NumPy # Copyright (C) 2011-2016 by # Author: Aric Hagberg # BSD License import networkx as nx from networkx.utils import reverse_cuthill_mckee_ordering import numpy as np # build low-bandwidth numpy matrix G=nx.grid_2d_graph(3,3) rcm = list(reverse_cuthill_mckee_ordering(G)) print("ordering",rcm) print("unordered Laplacian matrix") A = nx.laplacian_matrix(G) x,y = np.nonzero(A) #print("lower bandwidth:",(y-x).max()) #print("upper bandwidth:",(x-y).max()) print("bandwidth: %d"%((y-x).max()+(x-y).max()+1)) print(A) B = nx.laplacian_matrix(G,nodelist=rcm) print("low-bandwidth Laplacian matrix") x,y = np.nonzero(B) #print("lower bandwidth:",(y-x).max()) #print("upper bandwidth:",(x-y).max()) print("bandwidth: %d"%((y-x).max()+(x-y).max()+1)) print(B) networkx-1.11/examples/algorithms/davis_club.py0000644000175000017500000000205012637544450021656 0ustar aricaric00000000000000#!/usr/bin/env python """ Davis Southern Club Women Shows how to make unipartite projections of the graph and compute the properties of those graphs. These data were collected by Davis et al. in the 1930s. They represent observed attendance at 14 social events by 18 Southern women. The graph is bipartite (clubs, women). 
""" import networkx as nx import networkx.algorithms.bipartite as bipartite G = nx.davis_southern_women_graph() women = G.graph['top'] clubs = G.graph['bottom'] print("Biadjacency matrix") print(bipartite.biadjacency_matrix(G,women,clubs)) # project bipartite graph onto women nodes W = bipartite.projected_graph(G, women) print('') print("#Friends, Member") for w in women: print('%d %s' % (W.degree(w),w)) # project bipartite graph onto women nodes keeping number of co-occurence # the degree computed is weighted and counts the total number of shared contacts W = bipartite.weighted_projected_graph(G, women) print('') print("#Friend meetings, Member") for w in women: print('%d %s' % (W.degree(w,weight='weight'),w)) networkx-1.11/examples/basic/0000755000175000017500000000000012653231454016077 5ustar aricaric00000000000000networkx-1.11/examples/basic/read_write.py0000644000175000017500000000131012653165361020574 0ustar aricaric00000000000000#!/usr/bin/env python """ Read and write graphs. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from networkx import * import sys G=grid_2d_graph(5,5) # 5x5 grid try: # Python 2.6+ write_adjlist(G,sys.stdout) # write adjacency list to screen except TypeError: # Python 3.x write_adjlist(G,sys.stdout.buffer) # write adjacency list to screen # write edgelist to grid.edgelist write_edgelist(G,path="grid.edgelist",delimiter=":") # read edgelist from grid.edgelist H=read_edgelist(path="grid.edgelist",delimiter=":") networkx-1.11/examples/basic/properties.py0000644000175000017500000000206212653165361020650 0ustar aricaric00000000000000#!/usr/bin/env python """ Compute some network properties for the lollipop graph. """ # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
from networkx import * G = lollipop_graph(4,6) pathlengths=[] print("source vertex {target:length, }") for v in G.nodes(): spl=single_source_shortest_path_length(G,v) print('%s %s' % (v,spl)) for p in spl.values(): pathlengths.append(p) print('') print("average shortest path length %s" % (sum(pathlengths)/len(pathlengths))) # histogram of path lengths dist={} for p in pathlengths: if p in dist: dist[p]+=1 else: dist[p]=1 print('') print("length #paths") verts=dist.keys() for d in sorted(verts): print('%s %d' % (d,dist[d])) print("radius: %d" % radius(G)) print("diameter: %d" % diameter(G)) print("eccentricity: %s" % eccentricity(G)) print("center: %s" % center(G)) print("periphery: %s" % periphery(G)) print("density: %s" % density(G)) networkx-1.11/examples/multigraph/0000755000175000017500000000000012653231454017172 5ustar aricaric00000000000000networkx-1.11/examples/multigraph/chess_masters_WCC.pgn.bz20000644000175000017500000030360012637544450023742 0ustar aricaric00000000000000
(LhҚh3EЕFo๙^S(ts)-{["dOobN;w>zO؟~leML~`?o%\){Δ=ld{XBEEۻ(0AYX94ʲe!%L,V.gnaZ0qd 9@-Q&9jN%0˸cmUt+'?wnJ*0Q$H%Qp`-LpS d5Õ󿡡;[Ja;l+9P뵡J tu9wIݞ4r\C_֩לˌon$$.ԟSaS*|VL 8LksLaF ƄE^]] pWZmrE6Uu=:0M)ƮȻ.r5r|vӜv(5)WWy`]Ld""7ƈ1 ^46qT\?;[+dAcIV`F6VW![ƀM?5Q4*ƘؗRh%nB G$~袃jJ6 t* m )d*܍yAPD ,3TĊ>N=p t&BA2ૼx[4B r"D1}[DX~lY{C0ŋ$,>!}`g㪏UV'519_ g:1掌5&&a9a$1Shjfqek 88ʓY&Ll4٦W;y)'r~1(bUBc/mJ kkNҜ4#G nP‡Wcyd1Ʋ{DN+?ï;@;o,|uްweCyE[Ū0AhQw[Q&E;lʍ|zh{\v7Cm8 M֦8Ssfy'4 ^ -F/U̜1 4_z@^hUWp(hH'b) Zoۦ"mx tRc/I4Tx#@~l|DU7vwr(WLL8^^%d֖$6v $B4CS6񥅶9I[gSI.Wzw1j`%NvPAL :d5PxhGܖ]ȥO4ge2Pҷ06ya8YtJ")e-bfA@4yr-* cϘke"zQ6EŌ<1RDw˭HnJŊHq.)dVL!Rd fj V3jw51CuB&&S#6Թ42Ya :)i<%lpuTǏ1ͩkDQTO(:fd Bj3 ~Ej(舟p:pX Jv;@"pAu>= &( -DȚMH(K4%~"șC}'DTO΀ =)P}~ zA@'?b(DaPDN  H"(¢B0)"HBr'AT_byt@O2A2ħA&Xej-`"pZi)B?"z@)P{*/=('p zAT ~`)2@(r(P0(@PJy>?=ʩ>*E;DOCQ("TO NTC&DCα?HƢU*nH8LT&`1.uT+ba5K4ҽ,4*@!)A37_;`6`nٱU[l<;9KFֺ6VeFGcdؤov7$ʐ2YB)/n R],EQXaq2cUXDѳG-e)!G2)úPVVܶ2L%a3CLe]&VJ@2V~݊Hs"~'/a7q5 ڞKg;\j'Lp # ň$gGJ b54{ qkbhDDoszwKe/Ik &us:z^<2P.$͛7t[M7YYvիXS@Ld'^A@`Kj<+-잲p 'vLBdkA/' X,S#oЍ .S}R1EcJL eS ]v -*gD0!rj)ZKKיN&%dg 1!m+ISM8 ֵE Q!QeeN]Cb/'0ǁ1giҩPbc-F.MF; 7vvLCBoIțUS33]#LrKeݴ i39d.fuKz0wNc{F޳tr&@|KM&SķIc6G]̟u3 C'<"Ȇl \VYC!mqt椥[;b­X]-ƙ\7neFkahƚ̐5ӘnS!twp$ՓLeBrc0ytd0\5} g$b (gsi] 28ݎ8=dg&6ԓX)DB2":b՝([O!!&-pW`U:,\\mx BV!+&G0iD\8v-:$YMZ8ݤ: j뛳K5Xj5WEbztž۝ w^]c @!*ivnDl4L5[wudsAZ,JLBmmejfAe0AeQ $JG}ti9V& 6mj ǫdVQMruRqܪpRκἉTgov۫t UImwjBPCs;=BDfRbf*R-"D %vڥ*:@WeVTfTb̘#h !0f.jN84)̆N{R&  (fcfH0m^ @]ҒnLHI IКVVWnR#%6z-^W rzYbogk|t[Żҩ[`Z2=C 6ƱeZh֨ ho.7._4l^- ^[mѰ'1n͔!0"R uMܞ#+-7 PB@{-%:/3‡Bĝ4D6u;(iLb )F"&(bN.;r16'hj\َ-ȭA~@bAR=b؛%bu:=-k~r5z5lD^l#ekVBr(Ckv!O5۲u!e @J*P F[m͞:vEd[VU7M{ګY(6ȓc4<4`iƓNeMyFnlhxgK!F-) Ylڐuw8%=ZBzh-UC7f86mP^p+P)Qpcq!Ȣh 1̶yRpNimuد7 # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
import networkx as nx

# tag names specifying what game info should be
# stored in the dict on each digraph edge
game_details = ["Event", "Date", "Result", "ECO", "Site"]

def chess_pgn_graph(pgn_file="chess_masters_WCC.pgn.bz2"):
    """Read chess games in pgn format in pgn_file.

    Filenames ending in .gz or .bz2 will be uncompressed.

    Return the MultiDiGraph of players connected by a chess game.
    Edges contain game data in a dict.
    """
    import bz2
    G = nx.MultiDiGraph()
    game = {}
    datafile = bz2.BZ2File(pgn_file)
    lines = (line.decode().rstrip('\r\n') for line in datafile)
    for line in lines:
        if line.startswith('['):
            tag, value = line[1:-1].split(' ', 1)
            game[str(tag)] = value.strip('"')
        else:
            # empty line after tag set indicates
            # we finished reading game info
            if game:
                white = game.pop('White')
                black = game.pop('Black')
                G.add_edge(white, black, **game)
                game = {}
    return G

if __name__ == '__main__':
    G = chess_pgn_graph()

    ngames = G.number_of_edges()
    nplayers = G.number_of_nodes()

    print("Loaded %d chess games between %d players\n"
          % (ngames, nplayers))

    # identify connected components
    # of the undirected version
    Gcc = list(nx.connected_component_subgraphs(G.to_undirected()))
    if len(Gcc) > 1:
        print("Note the disconnected component consisting of:")
        print(Gcc[1].nodes())

    # find all games with B97 opening (as described in ECO)
    openings = set([game_info['ECO']
                    for (white, black, game_info) in G.edges(data=True)])
    print("\nFrom a total of %d different openings," % len(openings))
    print('the following games used the Sicilian opening')
    print('with the Najdorff 7...Qb6 "Poisoned Pawn" variation.\n')

    for (white, black, game_info) in G.edges(data=True):
        if game_info['ECO'] == 'B97':
            print(white, "vs", black)
            for k, v in game_info.items():
                print("   ", k, ": ", v)
            print("\n")

    try:
        import matplotlib.pyplot as plt
    except ImportError:
        import sys
        print("Matplotlib needed for drawing. Skipping")
        sys.exit(0)

    # make new undirected graph H without multi-edges
    H = nx.Graph(G)

    # edge width is proportional to number of games played
    edgewidth = []
    for (u, v, d) in H.edges(data=True):
        edgewidth.append(len(G.get_edge_data(u, v)))

    # node size is proportional to number of games won
    wins = dict.fromkeys(G.nodes(), 0.0)
    for (u, v, d) in G.edges(data=True):
        r = d['Result'].split('-')
        if r[0] == '1':
            wins[u] += 1.0
        elif r[0] == '1/2':
            wins[u] += 0.5
            wins[v] += 0.5
        else:
            wins[v] += 1.0
    try:
        pos = nx.nx_agraph.graphviz_layout(H)
    except:
        pos = nx.spring_layout(H, iterations=20)

    plt.rcParams['text.usetex'] = False
    plt.figure(figsize=(8, 8))
    nx.draw_networkx_edges(H, pos, alpha=0.3, width=edgewidth,
                           edge_color='m')
    nodesize = [wins[v] * 50 for v in H]
    nx.draw_networkx_nodes(H, pos, node_size=nodesize, node_color='w',
                           alpha=0.4)
    nx.draw_networkx_edges(H, pos, alpha=0.4, node_size=0, width=1,
                           edge_color='k')
    nx.draw_networkx_labels(H, pos, fontsize=14)
    font = {'fontname': 'Helvetica',
            'color': 'k',
            'fontweight': 'bold',
            'fontsize': 14}
    plt.title("World Chess Championship Games: 1886 - 1985", font)

    # change font and write text (using data coordinates)
    font = {'fontname': 'Helvetica',
            'color': 'r',
            'fontweight': 'bold',
            'fontsize': 14}
    plt.text(0.5, 0.97, "edge width = # games played",
             horizontalalignment='center',
             transform=plt.gca().transAxes)
    plt.text(0.5, 0.94, "node size = # games won",
             horizontalalignment='center',
             transform=plt.gca().transAxes)
    plt.axis('off')
    plt.savefig("chess_masters.png", dpi=75)
    print("Wrote chess_masters.png")
    plt.show()  # display
networkx-1.11/examples/subclass/0000755000175000017500000000000012653231454016635 5ustar aricaric00000000000000
networkx-1.11/examples/subclass/antigraph.py0000644000175000017500000001537612653165361021173 0ustar aricaric00000000000000
"""
Complement graph class for small footprint when working on dense graphs.

This class allows you to add the edges that *do not exist* in the dense
graph.
However, when applying algorithms to this complement graph data structure, it behaves as if it were the dense version. So it can be used directly in several NetworkX algorithms. This subclass has only been tested for the k-core, connected_components, and biconnected_components algorithms but might also work for other algorithms. """ # Author: Jordi Torrents # Copyright (C) 2015-2016 by # Jordi Torrents # All rights reserved. # BSD license. import networkx as nx from networkx.exception import NetworkXError __all__ = ['AntiGraph'] class AntiGraph(nx.Graph): """ Class for complement graphs. The main goal is to be able to work with big and dense graphs with a low memory footprint. In this class you add the edges that *do not exist* in the dense graph; the report methods of the class return the neighbors, the edges and the degree as if it were the dense graph. Thus it is possible to use an instance of this class with some NetworkX functions. """ all_edge_dict = {'weight': 1} def single_edge_dict(self): return self.all_edge_dict edge_attr_dict_factory = single_edge_dict def __getitem__(self, n): """Return a dict of neighbors of node n in the dense graph. Parameters ---------- n : node A node in the graph. Returns ------- adj_dict : dictionary The adjacency dictionary for nodes connected to n. """ return dict((node, self.all_edge_dict) for node in set(self.adj) - set(self.adj[n]) - set([n])) def neighbors(self, n): """Return a list of the nodes connected to the node n in the dense graph. Parameters ---------- n : node A node in the graph Returns ------- nlist : list A list of nodes that are adjacent to n. Raises ------ NetworkXError If the node n is not in the graph. """ try: return list(set(self.adj) - set(self.adj[n]) - set([n])) except KeyError: raise NetworkXError("The node %s is not in the graph."%(n,)) def neighbors_iter(self, n): """Return an iterator over all neighbors of node n in the dense graph.
""" try: return iter(set(self.adj) - set(self.adj[n]) - set([n])) except KeyError: raise NetworkXError("The node %s is not in the graph."%(n,)) def degree(self, nbunch=None, weight=None): """Return the degree of a node or nodes in the dense graph. """ if nbunch in self: # return a single node return next(self.degree_iter(nbunch,weight))[1] else: # return a dict return dict(self.degree_iter(nbunch,weight)) def degree_iter(self, nbunch=None, weight=None): """Return an iterator for (node, degree) in the dense graph. The node degree is the number of edges adjacent to the node. Parameters ---------- nbunch : iterable container, optional (default=all nodes) A container of nodes. The container will be iterated through once. weight : string or None, optional (default=None) The edge attribute that holds the numerical value used as a weight. If None, then each edge has weight 1. The degree is the sum of the edge weights adjacent to the node. Returns ------- nd_iter : an iterator The iterator returns two-tuples of (node, degree). See Also -------- degree Examples -------- >>> G = nx.Graph() # or DiGraph, MultiGraph, MultiDiGraph, etc >>> G.add_path([0,1,2,3]) >>> list(G.degree_iter(0)) # node 0 with degree 1 [(0, 1)] >>> list(G.degree_iter([0,1])) [(0, 1), (1, 2)] """ if nbunch is None: nodes_nbrs = ((n, {v: self.all_edge_dict for v in set(self.adj) - set(self.adj[n]) - set([n])}) for n in self.nodes_iter()) else: nodes_nbrs= ((n, {v: self.all_edge_dict for v in set(self.nodes()) - set(self.adj[n]) - set([n])}) for n in self.nbunch_iter(nbunch)) if weight is None: for n,nbrs in nodes_nbrs: yield (n,len(nbrs)+(n in nbrs)) # return tuple (n,degree) else: # AntiGraph is a ThinGraph so all edges have weight 1 for n,nbrs in nodes_nbrs: yield (n, sum((nbrs[nbr].get(weight, 1) for nbr in nbrs)) + (n in nbrs and nbrs[n].get(weight, 1))) def adjacency_iter(self): """Return an iterator of (node, adjacency set) tuples for all nodes in the dense graph. 
This is the fastest way to look at every edge. For directed graphs, only outgoing adjacencies are included. Returns ------- adj_iter : iterator An iterator of (node, adjacency set) for all nodes in the graph. """ for n in self.adj: yield (n, set(self.adj) - set(self.adj[n]) - set([n])) if __name__ == '__main__': # Build several pairs of graphs, a regular graph # and the AntiGraph of it's complement, which behaves # as if it were the original graph. Gnp = nx.gnp_random_graph(20,0.8) Anp = AntiGraph(nx.complement(Gnp)) Gd = nx.davis_southern_women_graph() Ad = AntiGraph(nx.complement(Gd)) Gk = nx.karate_club_graph() Ak = AntiGraph(nx.complement(Gk)) pairs = [(Gnp, Anp), (Gd, Ad), (Gk, Ak)] # test connected components for G, A in pairs: gc = [set(c) for c in nx.connected_components(G)] ac = [set(c) for c in nx.connected_components(A)] for comp in ac: assert comp in gc # test biconnected components for G, A in pairs: gc = [set(c) for c in nx.biconnected_components(G)] ac = [set(c) for c in nx.biconnected_components(A)] for comp in ac: assert comp in gc # test degree for G, A in pairs: node = list(G.nodes())[0] nodes = list(G.nodes())[1:4] assert G.degree(node) == A.degree(node) assert sum(G.degree().values()) == sum(A.degree().values()) # AntiGraph is a ThinGraph, so all the weights are 1 assert sum(A.degree().values()) == sum(A.degree(weight='weight').values()) assert sum(G.degree(nodes).values()) == sum(A.degree(nodes).values()) networkx-1.11/examples/subclass/printgraph.py0000644000175000017500000001007012653165361021366 0ustar aricaric00000000000000""" Example subclass of the Graph class. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
# __docformat__ = "restructuredtext en" from networkx import Graph from networkx.exception import NetworkXException, NetworkXError import networkx.convert as convert from copy import deepcopy class PrintGraph(Graph): """ Example subclass of the Graph class. Prints activity log to file or standard output. """ def __init__(self, data=None, name='', file=None, **attr): Graph.__init__(self, data=data,name=name,**attr) if file is None: import sys self.fh=sys.stdout else: self.fh=open(file,'w') def add_node(self, n, attr_dict=None, **attr): Graph.add_node(self,n,attr_dict=attr_dict,**attr) self.fh.write("Add node: %s\n"%n) def add_nodes_from(self, nodes, **attr): for n in nodes: self.add_node(n, **attr) def remove_node(self,n): Graph.remove_node(self,n) self.fh.write("Remove node: %s\n"%n) def remove_nodes_from(self, nodes): adj = self.adj for n in nodes: self.remove_node(n) def add_edge(self, u, v, attr_dict=None, **attr): Graph.add_edge(self,u,v,attr_dict=attr_dict,**attr) self.fh.write("Add edge: %s-%s\n"%(u,v)) def add_edges_from(self, ebunch, attr_dict=None, **attr): for e in ebunch: u,v=e[0:2] self.add_edge(u,v,attr_dict=attr_dict,**attr) def remove_edge(self, u, v): Graph.remove_edge(self,u,v) self.fh.write("Remove edge: %s-%s\n"%(u,v)) def remove_edges_from(self, ebunch): for e in ebunch: u,v=e[0:2] self.remove_edge(u,v) def clear(self): self.name = '' self.adj.clear() self.node.clear() self.graph.clear() self.fh.write("Clear graph\n") def subgraph(self, nbunch, copy=True): # subgraph is needed here since it can destroy edges in the # graph (copy=False) and we want to keep track of all changes. 
# # Also for copy=True Graph() uses dictionary assignment for speed # Here we use H.add_edge() bunch =set(self.nbunch_iter(nbunch)) if not copy: # remove all nodes (and attached edges) not in nbunch self.remove_nodes_from([n for n in self if n not in bunch]) self.name = "Subgraph of (%s)"%(self.name) return self else: # create new graph and copy subgraph into it H = self.__class__() H.name = "Subgraph of (%s)"%(self.name) # add nodes H.add_nodes_from(bunch) # add edges seen=set() for u,nbrs in self.adjacency_iter(): if u in bunch: for v,datadict in nbrs.items(): if v in bunch and v not in seen: dd=deepcopy(datadict) H.add_edge(u,v,dd) seen.add(u) # copy node and graph attr dicts H.node=dict( (n,deepcopy(d)) for (n,d) in self.node.items() if n in H) H.graph=deepcopy(self.graph) return H if __name__=='__main__': G=PrintGraph() G.add_node('foo') G.add_nodes_from('bar',weight=8) G.remove_node('b') G.remove_nodes_from('ar') print(G.nodes(data=True)) G.add_edge(0,1,weight=10) print(G.edges(data=True)) G.remove_edge(0,1) G.add_edges_from(list(zip(list(range(0o3)),list(range(1,4)))),weight=10) print(G.edges(data=True)) G.remove_edges_from(list(zip(list(range(0o3)),list(range(1,4))))) print(G.edges(data=True)) G=PrintGraph() G.add_path(list(range(10))) print("subgraph") H1=G.subgraph(list(range(4)),copy=False) H2=G.subgraph(list(range(4)),copy=False) print(H1.edges()) print(H2.edges()) networkx-1.11/examples/pygraphviz/0000755000175000017500000000000012653231454017221 5ustar aricaric00000000000000networkx-1.11/examples/pygraphviz/write_dotfile.py0000644000175000017500000000235212653165361022440 0ustar aricaric00000000000000#!/usr/bin/env python """ Write a dot file from a networkx graph for further processing with graphviz. You need to have either pygraphviz or pydotplus for this example. See http://networkx.github.io/documentation/latest/reference/drawing.html for more info. 
""" # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx # and the following code block is not needed # but we want to see which module is used and # if and why it fails try: import pygraphviz from networkx.drawing.nx_agraph import write_dot print("using package pygraphviz") except ImportError: try: import pydotplus from networkx.drawing.nx_pydot import write_dot print("using package pydotplus") except ImportError: print() print("Both pygraphviz and pydotplus were not found ") print("see http://networkx.github.io/documentation" "/latest/reference/drawing.html for info") print() raise G=nx.grid_2d_graph(5,5) # 5x5 grid write_dot(G,"grid.dot") print("Now run: neato -Tps grid.dot >grid.ps") networkx-1.11/examples/pygraphviz/pygraphviz_simple.py0000644000175000017500000000160212653165361023351 0ustar aricaric00000000000000#!/usr/bin/env python """ An example showing how to use the interface to the pygraphviz AGraph class to convert to and from graphviz. Also see the pygraphviz documentation and examples at http://pygraphviz.github.io/ """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2006-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx # plain graph G = nx.complete_graph(5) # start with K5 in networkx A = nx.nx_agraph.to_agraph(G) # convert to a graphviz graph X1 = nx.nx_agraph.from_agraph(A) # convert back to networkx (but as Graph) X2 = nx.Graph(A) # fancy way to do conversion G1 = nx.Graph(X1) # now make it a Graph A.write('k5.dot') # write to dot file X3 = nx.nx_agraph.read_dot('k5.dot') # read from dotfile networkx-1.11/examples/pygraphviz/pygraphviz_attributes.py0000644000175000017500000000177712653165361024263 0ustar aricaric00000000000000#!/usr/bin/env python """ An example showing how to use the interface to the pygraphviz AGraph class to convert to and from graphviz. 
Also see the pygraphviz documentation and examples at http://pygraphviz.github.io/ """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2006-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx # networkx graph G = nx.Graph() # ad edges with red color G.add_edge(1, 2, color='red') G.add_edge(2, 3, color='red') # add nodes 3 and 4 G.add_node(3) G.add_node(4) # convert to a graphviz agraph A = nx.nx_agraph.to_agraph(G) # write to dot file A.write('k5_attributes.dot') # convert back to networkx Graph with attributes on edges and # default attributes as dictionary data X = nx.nx_agraph.from_agraph(A) print("edges") print(X.edges(data=True)) print("default graph attributes") print(X.graph) print("node node attributes") print(X.node) networkx-1.11/examples/pygraphviz/pygraphviz_draw.py0000644000175000017500000000130012653165361023010 0ustar aricaric00000000000000#!/usr/bin/env python """ An example showing how to use the interface to the pygraphviz AGraph class to draw a graph. Also see the pygraphviz documentation and examples at http://pygraphviz.github.io/ """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2006-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx # plain graph G = nx.complete_graph(5) # start with K5 in networkx A = nx.nx_agraph.to_agraph(G) # convert to a graphviz graph A.layout() # neato layout A.draw("k5.ps") # write postscript in k5.ps with neato layout networkx-1.11/examples/graph/0000755000175000017500000000000012653231454016117 5ustar aricaric00000000000000networkx-1.11/examples/graph/expected_degree_sequence.py0000644000175000017500000000136012653165361023500 0ustar aricaric00000000000000#!/usr/bin/env python """ Random graph from given degree sequence. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2006-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
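The expected_degree_sequence example that follows prints an ASCII degree histogram by hand. As a hedged stand-alone illustration of that printing loop (pure standard library; the degree values here are invented, not drawn from a real graph), the same "one '*' per node" histogram can be built with `collections.Counter`:

```python
# Count how many nodes have each degree and draw one '*' per node,
# mirroring the "degree (#nodes) ****" output of the example below.
from collections import Counter

degrees = [3, 1, 2, 2, 3, 3, 1]    # an example degree sequence
hist = Counter(degrees)
for d in range(min(hist), max(hist) + 1):
    count = hist.get(d, 0)
    print("%2d (%2d) %s" % (d, count, '*' * count))
```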
from networkx import * from networkx.generators.degree_seq import * # make a random graph of 500 nodes with expected degrees of 50 n=500 # n nodes p=0.1 w=[p*n for i in range(n)] # w = p*n for all nodes G=expected_degree_graph(w) # Chung-Lu expected-degree model print("Degree histogram") print("degree (#nodes) ****") dh=degree_histogram(G) low=min(degree(G).values()) for i in range(low,len(dh)): bar=''.join(dh[i]*['*']) print("%2s (%2s) %s"%(i,dh[i],bar)) networkx-1.11/examples/graph/atlas2.py0000644000175000017500000000162712653165361017670 0ustar aricaric00000000000000#!/usr/bin/env python """ Write the first 20 graphs from the graph atlas as graphviz dot files Gn.dot where n=0,...,19. Requires pygraphviz and graphviz. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Date: 2005-05-19 14:23:02 -0600 (Thu, 19 May 2005) # Copyright (C) 2006-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.generators.atlas import * from pygraphviz import * atlas = graph_atlas_g()[0:20] for G in atlas: print("graph %s has %d nodes with %d edges" %(G.name,nx.number_of_nodes(G),nx.number_of_edges(G))) A = nx.nx_agraph.to_agraph(G) A.graph_attr['label'] = G.name # set default node attributes A.node_attr['color'] = 'red' A.node_attr['style'] = 'filled' A.node_attr['shape'] = 'circle' A.write(G.name + '.dot') networkx-1.11/examples/graph/unix_email.py0000644000175000017500000000515612653165361020635 0ustar aricaric00000000000000#!/usr/bin/env python """ Create a directed graph, allowing multiple edges and self loops, from a unix mailbox. The nodes are email addresses with links that point from the sender to the receivers. The edge data is a Python email.Message object which contains all of the email message data. This example shows the power of a MultiDiGraph to hold edge data of arbitrary Python objects (in this case a list of email messages). By default, load the sample unix email mailbox called "unix_email.mbox".
You can load your own mailbox by naming it on the command line, eg python unixemail.py /var/spool/mail/username """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2005-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import email from email.utils import getaddresses,parseaddr import mailbox import sys # unix mailbox recipe # see http://www.python.org/doc/current/lib/module-mailbox.html def msgfactory(fp): try: return email.message_from_file(fp) except email.Errors.MessageParseError: # Don't return None since that will stop the mailbox iterator return '' if __name__ == '__main__': import networkx as nx try: import matplotlib.pyplot as plt except: pass if len(sys.argv)==1: filePath = "unix_email.mbox" else: filePath = sys.argv[1] mbox = mailbox.mbox(filePath, msgfactory) # parse unix mailbox G=nx.MultiDiGraph() # create empty graph # parse each messages and build graph for msg in mbox: # msg is python email.Message.Message object (source_name,source_addr) = parseaddr(msg['From']) # sender # get all recipients # see http://www.python.org/doc/current/lib/module-email.Utils.html tos = msg.get_all('to', []) ccs = msg.get_all('cc', []) resent_tos = msg.get_all('resent-to', []) resent_ccs = msg.get_all('resent-cc', []) all_recipients = getaddresses(tos + ccs + resent_tos + resent_ccs) # now add the edges for this mail message for (target_name,target_addr) in all_recipients: G.add_edge(source_addr,target_addr,message=msg) # print edges with message subject for (u,v,d) in G.edges_iter(data=True): print("From: %s To: %s Subject: %s"%(u,v,d['message']["Subject"])) try: # draw pos=nx.spring_layout(G,iterations=10) nx.draw(G,pos,node_size=0,alpha=0.4,edge_color='r',font_size=16) plt.savefig("unix_email.png") plt.show() except: # matplotlib not available pass networkx-1.11/examples/graph/karate_club.py0000644000175000017500000000062512637544450020755 0ustar aricaric00000000000000#!/usr/bin/env python """ Zachary's Karate Club 
graph Data file from: http://vlado.fmf.uni-lj.si/pub/networks/data/Ucinet/UciData.htm Reference: Zachary W. (1977). An information flow model for conflict and fission in small groups. Journal of Anthropological Research, 33, 452-473. """ import networkx as nx G=nx.karate_club_graph() print("Node Degree") for v in G: print('%s %s' % (v,G.degree(v))) networkx-1.11/examples/graph/unix_email.mbox0000644000175000017500000000325512637544450021152 0ustar aricaric00000000000000From alice@edu Thu Jun 16 16:12:12 2005 From: Alice Subject: NetworkX Date: Thu, 16 Jun 2005 16:12:13 -0700 To: Bob Status: RO Content-Length: 86 Lines: 5 Bob, check out the new networkx release - you and Carol might really like it. Alice From bob@gov Thu Jun 16 18:13:12 2005 Return-Path: Subject: Re: NetworkX From: Bob To: Alice Content-Type: text/plain Date: Thu, 16 Jun 2005 18:13:12 -0700 Status: RO Content-Length: 26 Lines: 4 Thanks for the tip. Bob From ted@com Thu Jul 28 09:53:31 2005 Return-Path: Subject: Graph package in Python? From: Ted To: Bob Content-Type: text/plain Date: Thu, 28 Jul 2005 09:47:03 -0700 Status: RO Content-Length: 90 Lines: 3 Hey Ted - I'm looking for a Python package for graphs and networks. Do you know of any? From bob@gov Thu Jul 28 09:59:31 2005 Return-Path: Subject: Re: Graph package in Python? From: Bob To: Ted Content-Type: text/plain Date: Thu, 28 Jul 2005 09:59:03 -0700 Status: RO Content-Length: 180 Lines: 9 Check out the NetworkX package - Alice sent me the tip! Bob >> bob@gov scrawled: >> Hey Ted - I'm looking for a Python package for >> graphs and networks. Do you know of any? From ted@com Thu Jul 28 15:53:31 2005 Return-Path: Subject: get together for lunch to discuss Networks? From: Ted To: Bob , Carol , Alice Content-Type: text/plain Date: Thu, 28 Jul 2005 15:47:03 -0700 Status: RO Content-Length: 139 Lines: 5 Hey everyrone! Want to meet at that restaurant on the island in Konigsburg tonight? Bring your laptops and we can install NetworkX. 
Ted networkx-1.11/examples/graph/atlas.py0000644000175000017500000000532112653165361017601 0ustar aricaric00000000000000#!/usr/bin/env python """ Atlas of all graphs of 6 nodes or less. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.generators.atlas import * from networkx.algorithms.isomorphism.isomorph import graph_could_be_isomorphic as isomorphic import random def atlas6(): """ Return the atlas of all connected graphs of 6 nodes or less. Attempt to check for isomorphisms and remove. """ Atlas = graph_atlas_g()[0:208] # 208 # remove isolated nodes, only connected graphs are left U = nx.Graph() # graph for union of all graphs in atlas for G in Atlas: zerodegree = [n for n in G if G.degree(n)==0] for n in zerodegree: G.remove_node(n) U = nx.disjoint_union(U, G) # list of graphs of all connected components C = nx.connected_component_subgraphs(U) UU = nx.Graph() # do quick isomorphic-like check, not a true isomorphism checker nlist = [] # list of nonisomorphic graphs for G in C: # check against all nonisomorphic graphs so far if not iso(G, nlist): nlist.append(G) UU = nx.disjoint_union(UU, G) # union the nonisomorphic graphs return UU def iso(G1, glist): """Quick and dirty nonisomorphism checker used to check isomorphisms.""" for G2 in glist: if isomorphic(G1, G2): return True return False if __name__ == '__main__': G=atlas6() print("graph has %d nodes with %d edges"\ %(nx.number_of_nodes(G), nx.number_of_edges(G))) print(nx.number_connected_components(G), "connected components") try: import pygraphviz from networkx.drawing.nx_agraph import graphviz_layout except ImportError: try: import pydotplus from networkx.drawing.nx_pydot import graphviz_layout except ImportError: raise ImportError("This example needs Graphviz and either " "PyGraphviz or PyDotPlus") import matplotlib.pyplot as plt plt.figure(1, figsize=(8, 8)) # layout 
graphs with positions using graphviz neato pos = graphviz_layout(G, prog="neato") # color nodes the same in each connected subgraph C = nx.connected_component_subgraphs(G) for g in C: c = [random.random()] * nx.number_of_nodes(g) # random color... nx.draw(g, pos, node_size=40, node_color=c, vmin=0.0, vmax=1.0, with_labels=False ) plt.savefig("atlas.png", dpi=75) networkx-1.11/examples/graph/roget.py0000644000175000017500000000451612653165361017622 0ustar aricaric00000000000000#!/usr/bin/env python """ Build a directed graph of 1022 categories and 5075 cross-references as defined in the 1879 version of Roget's Thesaurus contained in the datafile roget_dat.txt. This example is described in Section 1.2 in Knuth's book [1,2]. Note that one of the 5075 cross references is a self loop yet it is included in the graph built here because the standard networkx DiGraph class allows self loops. (cf. 400pungency:400 401 403 405). References. ---------- [1] Donald E. Knuth, "The Stanford GraphBase: A Platform for Combinatorial Computing", ACM Press, New York, 1993. [2] http://www-cs-faculty.stanford.edu/~knuth/sgb.html """ from __future__ import print_function # Authors: Brendt Wohlberg, Aric Hagberg (hagberg@lanl.gov) # Date: 2005-04-01 07:56:22 -0700 (Fri, 01 Apr 2005) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from networkx import * import re import sys def roget_graph(): """ Return the thesaurus graph from the roget.dat example in the Stanford Graph Base. 
""" # open file roget_dat.txt.gz (or roget_dat.txt) import gzip fh=gzip.open('roget_dat.txt.gz','r') G=DiGraph() for line in fh.readlines(): line = line.decode() if line.startswith("*"): # skip comments continue if line.startswith(" "): # this is a continuation line, append line=oldline+line if line.endswith("\\\n"): # continuation line, buffer, goto next oldline=line.strip("\\\n") continue (headname,tails)=line.split(":") # head numfind=re.compile("^\d+") # re to find the number of this word head=numfind.findall(headname)[0] # get the number G.add_node(head) for tail in tails.split(): if head==tail: print("skipping self loop",head,tail, file=sys.stderr) G.add_edge(head,tail) return G if __name__ == '__main__': from networkx import * G=roget_graph() print("Loaded roget_dat.txt containing 1022 categories.") print("digraph has %d nodes with %d edges"\ %(number_of_nodes(G),number_of_edges(G))) UG=G.to_undirected() print(number_connected_components(UG),"connected components") networkx-1.11/examples/graph/knuth_miles.py0000644000175000017500000000561012653165361021020 0ustar aricaric00000000000000#!/usr/bin/env python """ An example using networkx.Graph(). miles_graph() returns an undirected graph over the 128 US cities from the datafile miles_dat.txt. The cities each have location and population data. The edges are labeled with the distance betwen the two cities. This example is described in Section 1.1 in Knuth's book [1,2]. References. ----------- [1] Donald E. Knuth, "The Stanford GraphBase: A Platform for Combinatorial Computing", ACM Press, New York, 1993. [2] http://www-cs-faculty.stanford.edu/~knuth/sgb.html """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx def miles_graph(): """ Return the cites example graph in miles_dat.txt from the Stanford GraphBase. 
""" # open file miles_dat.txt.gz (or miles_dat.txt) import gzip fh = gzip.open('knuth_miles.txt.gz','r') G=nx.Graph() G.position={} G.population={} cities=[] for line in fh.readlines(): line = line.decode() if line.startswith("*"): # skip comments continue numfind=re.compile("^\d+") if numfind.match(line): # this line is distances dist=line.split() for d in dist: G.add_edge(city,cities[i],weight=int(d)) i=i+1 else: # this line is a city, position, population i=1 (city,coordpop)=line.split("[") cities.insert(0,city) (coord,pop)=coordpop.split("]") (y,x)=coord.split(",") G.add_node(city) # assign position - flip x axis for matplotlib, shift origin G.position[city]=(-int(x)+7500,int(y)-3000) G.population[city]=float(pop)/1000.0 return G if __name__ == '__main__': import networkx as nx import re import sys G=miles_graph() print("Loaded miles_dat.txt containing 128 cities.") print("digraph has %d nodes with %d edges"\ %(nx.number_of_nodes(G),nx.number_of_edges(G))) # make new graph of cites, edge if less then 300 miles between them H=nx.Graph() for v in G: H.add_node(v) for (u,v,d) in G.edges(data=True): if d['weight'] < 300: H.add_edge(u,v) # draw with matplotlib/pylab try: import matplotlib.pyplot as plt plt.figure(figsize=(8,8)) # with nodes colored by degree sized by population node_color=[float(H.degree(v)) for v in H] nx.draw(H,G.position, node_size=[G.population[v] for v in H], node_color=node_color, with_labels=False) # scale the axes equally plt.xlim(-5000,500) plt.ylim(-2000,3500) plt.savefig("knuth_miles.png") except: pass networkx-1.11/examples/graph/knuth_miles.txt.gz0000644000175000017500000004753512637544450021644 0ustar aricaric00000000000000IFmreǑybO]uaT+"!Jե5mcP&Ȅh X|ةR\({C/_}x||p>y߿?.?]^Ǘ{~_>=,x㻇ׇ,>O>߿r _˯^_oO]~׌|x{??^>~O^~ᛷz?S|om՗o߼||׏?|yyxxpm.q~zzxx| O.bvg/?>?Oq?<5p_c~㳗oqxdvwqmݢ}S,B<c|1C# ty|,N /><,{,jl]7_o/?>]~?JYzR˿?_oqq-u-ex?8ֻ^X^ڥne]k`Yz-G9X8Kz9mkk/^ڿ:q׺ײKl\Ǘ-/xVzߗQYǥ[eo~u<7oX~lٖxr97kR>|a՟~ueܝ׺r륎Kqc+Ry{,ߺ]_qu-&' 
networkx-1.11/examples/graph/degree_sequence.py0000644000175000017500000000150212653165361021615 0ustar aricaric00000000000000#!/usr/bin/env python """ Random graph from given degree sequence. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Date: 2004-11-03 08:11:09 -0700 (Wed, 03 Nov 2004) # Revision: 503 # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. from networkx import * z=[5,3,3,3,3,2,2,2,1,1,1] print(is_valid_degree_sequence(z)) print("Configuration model") G=configuration_model(z) # configuration model degree_sequence=list(degree(G).values()) # degree sequence print("Degree sequence %s" % degree_sequence) print("Degree histogram") hist={} for d in degree_sequence: if d in hist: hist[d]+=1 else: hist[d]=1 print("degree #nodes") for d in hist: print('%d %d' % (d,hist[d])) networkx-1.11/examples/graph/napoleon_russian_campaign.py0000644000175000017500000000616012653165361023715 0ustar aricaric00000000000000#!/usr/bin/env python """ Minard's data from Napoleon's 1812-1813 Russian Campaign. http://www.math.yorku.ca/SCS/Gallery/minard/minard.txt """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2006-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license.
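The napoleon_russian_campaign example that follows embeds Minard's data as small comma-separated records of the form longitude,latitude,survivors,direction,group and splits them field by field inside minard_graph(). A stand-alone sketch of just that parsing step (sample rows taken from the embedded data; no NetworkX needed):

```python
# Each record is "longitude,latitude,survivors,direction,group", where the
# direction is 'A' (advance) or 'R' (retreat).
sample = """24.0,54.9,340000,A,1
24.5,55.0,340000,A,1
37.7,55.7,100000,R,1"""

rows = []
for line in sample.split('\n'):
    x, y, p, r, n = line.split(',')
    rows.append((float(x), float(y), int(p), r, int(n)))

print(rows[0])   # (24.0, 54.9, 340000, 'A', 1)
```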
import networkx as nx

def minard_graph():
    data1="""\
24.0,54.9,340000,A,1
24.5,55.0,340000,A,1
25.5,54.5,340000,A,1
26.0,54.7,320000,A,1
27.0,54.8,300000,A,1
28.0,54.9,280000,A,1
28.5,55.0,240000,A,1
29.0,55.1,210000,A,1
30.0,55.2,180000,A,1
30.3,55.3,175000,A,1
32.0,54.8,145000,A,1
33.2,54.9,140000,A,1
34.4,55.5,127100,A,1
35.5,55.4,100000,A,1
36.0,55.5,100000,A,1
37.6,55.8,100000,A,1
37.7,55.7,100000,R,1
37.5,55.7,98000,R,1
37.0,55.0,97000,R,1
36.8,55.0,96000,R,1
35.4,55.3,87000,R,1
34.3,55.2,55000,R,1
33.3,54.8,37000,R,1
32.0,54.6,24000,R,1
30.4,54.4,20000,R,1
29.2,54.3,20000,R,1
28.5,54.2,20000,R,1
28.3,54.3,20000,R,1
27.5,54.5,20000,R,1
26.8,54.3,12000,R,1
26.4,54.4,14000,R,1
25.0,54.4,8000,R,1
24.4,54.4,4000,R,1
24.2,54.4,4000,R,1
24.1,54.4,4000,R,1"""

    data2="""\
24.0,55.1,60000,A,2
24.5,55.2,60000,A,2
25.5,54.7,60000,A,2
26.6,55.7,40000,A,2
27.4,55.6,33000,A,2
28.7,55.5,33000,R,2
29.2,54.2,30000,R,2
28.5,54.1,30000,R,2
28.3,54.2,28000,R,2"""

    data3="""\
24.0,55.2,22000,A,3
24.5,55.3,22000,A,3
24.6,55.8,6000,A,3
24.6,55.8,6000,R,3
24.2,54.4,6000,R,3
24.1,54.4,6000,R,3"""

    cities="""\
24.0,55.0,Kowno
25.3,54.7,Wilna
26.4,54.4,Smorgoni
26.8,54.3,Moiodexno
27.7,55.2,Gloubokoe
27.6,53.9,Minsk
28.5,54.3,Studienska
28.7,55.5,Polotzk
29.2,54.4,Bobr
30.2,55.3,Witebsk
30.4,54.5,Orscha
30.4,53.9,Mohilow
32.0,54.8,Smolensk
33.2,54.9,Dorogobouge
34.3,55.2,Wixma
34.4,55.5,Chjat
36.0,55.5,Mojaisk
37.6,55.8,Moscou
36.6,55.3,Tarantino
36.5,55.0,Malo-Jarosewii"""

    c={}
    for line in cities.split('\n'):
        x,y,name=line.split(',')
        c[name]=(float(x),float(y))

    g=[]
    for data in [data1,data2,data3]:
        G=nx.Graph()
        i=0
        G.pos={}  # location
        G.pop={}  # size
        last=None
        for line in data.split('\n'):
            x,y,p,r,n=line.split(',')
            G.pos[i]=(float(x),float(y))
            G.pop[i]=int(p)
            if last is None:
                last=i
            else:
                G.add_edge(i,last,{r:int(n)})
            last=i
            i=i+1
        g.append(G)
    return g,c

if __name__ == "__main__":
    (g,city)=minard_graph()
    try:
        import matplotlib.pyplot as plt
        plt.figure(1,figsize=(11,5))
        plt.clf()
        colors=['b','g','r']
        for G in g:
            c=colors.pop(0)
            node_size=[int(G.pop[n]/300.0) for n in G]
            nx.draw_networkx_edges(G,G.pos,edge_color=c,width=4,alpha=0.5)
            nx.draw_networkx_nodes(G,G.pos,node_size=node_size,node_color=c,alpha=0.5)
            nx.draw_networkx_nodes(G,G.pos,node_size=5,node_color='k')
        for c in city:
            x,y=city[c]
            plt.text(x,y+0.1,c)
        plt.savefig("napoleon_russian_campaign.png")
    except ImportError:
        pass
networkx-1.11/examples/graph/football.py
#!/usr/bin/env python
"""
Load football network in GML format and compute some network statistics.

Shows how to download GML graph in a zipped file, unpack it, and load
into a NetworkX graph.

Requires Internet connection to download the URL
http://www-personal.umich.edu/~mejn/netdata/football.zip
"""
# Author: Aric Hagberg (hagberg@lanl.gov)
# Copyright (C) 2007-2016 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
# All rights reserved.
# BSD license.
from networkx import *

url="http://www-personal.umich.edu/~mejn/netdata/football.zip"

try:  # Python 3.x
    import urllib.request as urllib
except ImportError:  # Python 2.x
    import urllib
import io
import zipfile

sock = urllib.urlopen(url)  # open URL
s=io.BytesIO(sock.read())  # read into BytesIO "file"
sock.close()

zf = zipfile.ZipFile(s)  # zipfile object
txt=zf.read('football.txt').decode()  # read info file
gml=zf.read('football.gml').decode()  # read gml data
# throw away bogus first line with # from mejn files
gml=gml.split('\n')[1:]
G=parse_gml(gml)  # parse gml data

print(txt)
# print degree for each team - number of games
for n,d in G.degree_iter():
    print('%s %d' % (n, d))
networkx-1.11/examples/graph/words.py
"""
Words/Ladder Graph
------------------
Generate an undirected graph over the 5757 5-letter words in the
datafile words_dat.txt.gz.
Two words are connected by an edge if they differ in one letter,
resulting in 14,135 edges. This example is described in
Section 1.1 in Knuth's book [1]_,[2]_.

References
----------
.. [1] Donald E. Knuth,
   "The Stanford GraphBase: A Platform for Combinatorial Computing",
   ACM Press, New York, 1993.
.. [2] http://www-cs-faculty.stanford.edu/~knuth/sgb.html
"""
# Authors: Aric Hagberg (hagberg@lanl.gov),
#          Brendt Wohlberg,
#          hughdbrown@yahoo.com
# Copyright (C) 2004-2016 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
# All rights reserved.
# BSD license.
import networkx as nx

#-------------------------------------------------------------------
#   The Words/Ladder graph of Section 1.1
#-------------------------------------------------------------------
def generate_graph(words):
    from string import ascii_lowercase as lowercase
    G = nx.Graph(name="words")
    lookup = dict((c,lowercase.index(c)) for c in lowercase)
    def edit_distance_one(word):
        for i in range(len(word)):
            left, c, right = word[0:i], word[i], word[i+1:]
            j = lookup[c]  # lowercase.index(c)
            for cc in lowercase[j+1:]:
                yield left + cc + right
    candgen = ((word, cand) for word in sorted(words)
               for cand in edit_distance_one(word) if cand in words)
    G.add_nodes_from(words)
    for word, cand in candgen:
        G.add_edge(word, cand)
    return G

def words_graph():
    """Return the words example graph from the Stanford GraphBase"""
    import gzip
    fh=gzip.open('words_dat.txt.gz','r')
    words=set()
    for line in fh.readlines():
        line = line.decode()
        if line.startswith('*'):
            continue
        w=str(line[0:5])
        words.add(w)
    return generate_graph(words)

if __name__ == '__main__':
    from networkx import *
    G=words_graph()
    print("Loaded words_dat.txt containing 5757 five-letter English words.")
    print("Two words are connected if they differ in one letter.")
    print("Graph has %d nodes with %d edges"
          %(number_of_nodes(G),number_of_edges(G)))
    print("%d connected components" % number_connected_components(G))
    for (source,target) in [('chaos','order'),
                            ('nodes','graph'),
                            ('pound','marks')]:
        print("Shortest path between %s and %s is"%(source,target))
        try:
            sp=shortest_path(G, source, target)
            for n in sp:
                print(n)
        except nx.NetworkXNoPath:
            print("None")
networkx-1.11/examples/graph/roget_dat.txt.gz
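The edge rule used by words.py — two five-letter words are adjacent when they differ in exactly one letter — can be tested for any single pair without building the whole graph. The following standalone sketch is an editor's illustration, not part of the NetworkX distribution:

```python
def differ_by_one(a, b):
    """Return True if equal-length words a and b differ in exactly one position."""
    if len(a) != len(b):
        return False
    # Count mismatched positions; exactly one mismatch means a ladder step.
    return sum(1 for x, y in zip(a, b) if x != y) == 1

if __name__ == '__main__':
    print(differ_by_one('words', 'cords'))  # True: only the first letter differs
    print(differ_by_one('chaos', 'order'))  # False: every position differs
```

This is the same predicate that `edit_distance_one` enumerates generatively; the example script generates candidates rather than testing pairs so each edge is found only once.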
networkx-1.11/examples/graph/erdos_renyi.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Create a G{n,m} random graph with n nodes and m edges
and report some properties.

This graph is sometimes called the Erdős-Rényi graph
but is different from G{n,p} or binomial_graph which is also
sometimes called the Erdős-Rényi graph.
"""
# Author: Aric Hagberg (hagberg@lanl.gov)
# Copyright (C) 2004-2016 by
#    Aric Hagberg
#    Dan Schult
#    Pieter Swart
# All rights reserved.
# BSD license.
from networkx import *
import sys

n=10  # 10 nodes
m=20  # 20 edges

G=gnm_random_graph(n,m)

# some properties
print("node degree clustering")
for v in nodes(G):
    print('%s %d %f' % (v,degree(G,v),clustering(G,v)))

# print the adjacency list to terminal
try:
    write_adjlist(G,sys.stdout)
except TypeError:  # Python 3.x
    write_adjlist(G,sys.stdout.buffer)
networkx-1.11/examples/graph/words_dat.txt.gz
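The docstring above stresses that G{n,m} (a fixed number of edges) differs from G{n,p} (each possible edge present independently with probability p). As a quick numeric aside — an editor's sketch, not part of the distribution — the p that makes a G{n,p} graph match a G{n,m} graph in expected edge count is m divided by the number of possible edges:

```python
from math import comb  # Python 3.8+

n, m = 10, 20                 # same sizes as the example above
possible = comb(n, 2)         # possible edges in a simple undirected graph: 45
p = m / possible              # G(n,p) with this p has expected edge count m
print(possible, round(p, 4))  # 45 0.4444
```

For these sizes, `gnp_random_graph(n, p)` would produce 20 edges only on average, while `gnm_random_graph(n, m)` produces exactly 20 every time.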
networkx-1.11/examples/drawing/
networkx-1.11/examples/drawing/house_with_colors.py
#!/usr/bin/env python
"""
Draw a graph with matplotlib.
You must have matplotlib for this to work.
""" # Author: Aric Hagberg (hagberg@lanl.gov) try: import matplotlib.pyplot as plt except: raise import networkx as nx G=nx.house_graph() # explicitly set positions pos={0:(0,0), 1:(1,0), 2:(0,1), 3:(1,1), 4:(0.5,2.0)} nx.draw_networkx_nodes(G,pos,node_size=2000,nodelist=[4]) nx.draw_networkx_nodes(G,pos,node_size=3000,nodelist=[0,1,2,3],node_color='b') nx.draw_networkx_edges(G,pos,alpha=0.5,width=6) plt.axis('off') plt.savefig("house_with_colors.png") # save as png plt.show() # display networkx-1.11/examples/drawing/chess_masters_WCC.pgn.bz20000644000175000017500000030360012637544450023221 0ustar aricaric00000000000000BZh91AY&SY[64߀X?b>:@!L#2RETUV4F KAUԃJi[}+l*M[e6ZѪ)m[wM1EQ E "U"JPDHTDU ހ 1nGt{ zd;vaq#]S]2xmIU )Bz˝]֚}Ϟ>7ͪ-BKQuvY@"SM[PXVSQg]g8tK6mt<ާOvh6T:5ؾa:5:]4I{6<T*^cÔhPhP$Z€϶7|} TѢM} (UkTm:׹x=6os` e[@P*}h)nd6e*P4}.[om*6"|iDU@6iOc fJ`BBlj(4{zPbm3OA1RmlbmE)[}4tBQ4=/nLɩ Mޅ3^>}/U {aJ !(4-d(M!44׻-XBl|%Kq!J% *cdPW"Em>eN*+6X-bDD*-unrh[޺Ew۪NfZk3M[>5Rm`RT'Np}P*e6ն_}NaGN.]jɧ P &Wx> ՚Rճ{ۭk_vl9ֶ:p{|mmUgnm1E(mo;);i-7]:r9Մz}ekMRˬnrihqc@Uw(tݺ puӾdm}[]m*frűs$ef,)e4VQ,uꊭMotmݔDog{kTiL}e  =( IJeЕ0Kmeu&kR3 @m*AIG~&DQj4Љ%$Ih$I0F2hM"QR( BR`M iS24J$) 7 SH0bEA@j([kUlbN]l``5э!B1QXEV+F l TZbDѭ51:4mlADFIQ&ƙ:mPmѢ([&,kjKKƝ&6IEjiFڐ((]cX֊˭-V؍`jA:w.IDrnAG{ljj~V(y/( uǬ8ÏԓYV)Hk8$OĊ4u.+ deHI8O$@׮!ΤKE8&3Z@+SZyaGSwNC)'yCަ0غ"%d9yďhL@jnLEdʹ CY#KGLJi4 /2:'ĜqHB/Īu5:׌ a|Hb\Lu 0g1P:Mb";af3,>?; 15/,s0j}O+1JaJ'&mdl>Xy Fzk!Ѯ|=@q;i :mUbab,̑Sl K2{p XPb䘘 `ϩ? ~|Y>$n@IcP}@R2c8h e*YJ/ʣ sz#rD$I'ߓ?'T@sD|7w%F@R0}d$I.ILDAl2I}Ladu2>JFufd-PHې9S|̪̚/mY;adcfYa>0: |9/l <#%TI3d:I*hF$q)<;qȉ/ ȧ)&Q@3?:ƿdO3yu8eBd:\@YTS0{%5g33nT%EШP ]BOOWL`?6#֬66t|+͆X &YRqV &kUCc׉3"gnA,b1 (Z{-l,Eg+f(F J3Eku2]ؤ.1AX*#ssܼ~n1)R%XiH%Tf7~TDQQĔ!%gw[zb"*+3``K&; :%)'R08AC@wd)8gYSK iBgl)U/CfGS?=,Rnwt3U? Q+*]EUrkyx_ DWb 58{^%E:8N8#5&CN!j\x*X*o{"C띹F|`6ˏG UCōu,e1-:pv4? 
NGmUd$"C'ίYZb_ۈ>aTx¼3*o=eS ⏨"Axuw ayUR#7Voצ5mL6֫ ]:9+GQ#&Um'Č$/%o]>H {nq+KbVr;OU/&2.pny˥嵡'ȏ[%i&֘vw;2M7憱^ {ԭ^WmkoOĶX8rj ^<` v<Ծx45ك>}hW Գ$o)֊K6VvВ+AUIbͺfjx㞉l!њ+Mѿz{kBYHV.((V"tb6x{Kmj(o(oEW̑iܣ3^o)ݷڳ1WWXa%oeJm[.!@m'6)6r6fW0{iIL݀SҜ]؎`Cdgy&Rykbb&ͻxW'j˼/_`ݤfpTQ%]At$c26[U {5imO aHdA=z 8ш0Wc q2 ^ GQA{ҪYi 6(2j*f&dyY'J*S%tz!e9x|A#G'4Z/v:DԳu3 FmH5Jצ=8  uQh%_M%\:݋l-agQ8Hv$h ߇P+ƮȆk=&fۛtf2dFh}y#Y15)0{yel4δ;dj7 bPko}&~6w6M]^.]sDxD:Ķ) YaHFGI\H8ug"sk/UQ 7 vg=}װ}= |ķ"wOu뷳w쾛D`#x\ÙqtC \vybTU0=oAb bbTowgc!<ѿs(}4Jdv<ܥrDa$Dxb 4H&bQu%}vmVo]f<\?/kNzh1z˝Z̃-󢉭ڙ\ڧ>a=J{MnNm1؈ cL!F"CϳTC0@l9K {iW9v&"rZz E#@AE@JЭ*₏1bαLDʹdA(fj Pfj ) +}lQNC $QAuڢ+,p%E10I3(U ?ÝnjYq3k 69 R3k.!?恜ESkT LO~)9?9&Cyv3mfU]WȄs~ PE T &XBHH~2[*ya{swh7sjBlmO+}it#ի|OLZ\}$tOL\6'6V`.O9^άPBݢkH6bIR l3)gSuHkh\w NK2&$ e8ŝv-h3!,fdH9<>ػSn[B*cwY +-eID§kfZF PmTəSAhv˒Ui<$c,ŝe7nkIR+EsP*g[)"/ Dlsj4DkR*D"iX7jȶxK&͹ImYc65D[ҟIa"26 ֻ@hg[ݎI#>e= q^EN$!&Eu8+ x7&xUXF.u.rz{i$֍6Adl)O™ 쌄[m.E~R8!0 XָksTuT< {}G?_QDwL\4HP?$*0dƕ{"9,r򃺻t[G3Y Q"gXb"}1mq`qvp ^_n;ykz/^fNM{eߢ?f`nR'ؤm-kz;7U^hwTz'Q=zmol5U{Clwn<]֫`C-q%ZlpLԌcW,7{z\xLly p=Їr s=*r.{`,c$-Kd^Ȏ02% Ol&1%PIq|pl3^ѹ񐪮f<\9n'Fr3|.^"E_r/y+k޷jPOH.=Q,e]/uzTXmG=+2S!(SQZ4ApoV>nhue":d73XD ٛdw6ijŻ"p|@٭#^ylϩ(oc-6 t{C4Bc"y榘YY2r ZknƶI$3@ {6'ȓMmRpezn5nT #` [Euz9TAłU%ɘ EQ5sf*ZYzʷf"o}e7+3L0Vޔf4i|X^Wj`@!>q85w]X LJMU?m"#|l5^ۺ>V4W%;"7mW[@i\?lڣ1LVTծy{b;#dkRpCZ/'OkQcj;o1 (uL)y#4~ވi@,TSQSW x1mͻraoc5JUq ).lUҞ ",{IaUi&HjsHxkw\`&nR:Ѹid).jl Wnl8y2 1"eݹvYXP'.ZT*7~z5I.͋b ]Y\*0 f{斜46`'Q3zӡ[dSF֣E -Ǝ<\5&{]N%'/*~pCy9Qe =Fz%NA@릓zSmN¢-2$ ǔL<׈]fx2ۛyj-02.kyݲКÒ.0+0e(;^fݾa\ VFZ4+a w'R(vҤ*K?MYnik1%2Eq2~QRGf]F\żQV8g g ;);rZ"%Z1xTWDE)Yg>.jvi>zka4hr-emuRI/ODD$H -V7ϾiHcFjUsm@y_((_4 X?^\Y4F?˄8Vf&>G^H̗q (Q<҂hA@~` { AKBMP4BRBE P%)HRPKJHMRб1 @PPLTHρPTIU-AM5M%IHRQCATUSTM#U2DE4RP, QUT4URET)M%D +JDEMP-",U""! I="P4ШR"RH'_L=? ==! 
(>a ZJ)* *xx ٪ RZ)bh ** JHj!((J"Z J* jIJ&j$!fBDPD!C h)(iF(J (iB("B@ @  JP J) ~zc y T}J;}!H3?㵑"LP;!918fI֛ǕĤyU!^t3Wblzv7GK?H"T2L# cXZDZ^ힾpDiolڀS^6gI\Rر,5y B,oؠOmܷYEּ۩T1GDNbU/xyNv"+SokglҊ`6(sۚuskծ$J/>Θ} %Nj{żRTqUj#k`PCԂل nLvstbW_?l5-fvrӇ9!s183R^nћ2B&-+E֢']kChǢ{.0JŸ A dܔG#/" SyAbp56&滫t8%]՘03~_U!ϦN%<Pfh(Bh*}u`)1%5YC( +? ;KFIHTd= &7PulKتtދAm\dˆɑ3$if[ɷa8H0 e>Ҿ =7=DF76"ۋ8PLP+X`gvm-fŸF̽k+dw*4UO!"QS *ZˮmrYtPIXlsXƴ&;Nos;< 2p)(N6(fv-JNR W^o&OCDy/3D \T37vښ46F#wT5I!@k 2LvUUm֝+ԪPPvܟO Zwse*GaPQ Djd`nT;bqQyۢK;Gub^o.BT)@ xwO ԍP` Lb9vj8k=c{pmL8\nۧ TxjlRHS! =ꝲm];cbtYc.}_GJy޼ݙWEMWzrjw*Wh=w.7?: /\q% ! ~AUW%YїIdc4^wBü<;yO oS[/wyfyy{U:(Yl3hHlYk_=0A8ț÷7/Q0HY$`PxVu)#kfT${`'!bK 0$[5`HC7,"lFaC"v`7THMQ(@3EJ4 K>)f퉛PuRsDm萓/ml3ݸ3NDD%rz` V7[wUriaS<9O=ecuDxhdh5vzIOZ+JR-) o56^6#%ʉ2;)U$A Y{Cqjxȕ vч V \N4f!V"EސVoIilqF`YL{|XdГNZE-pfgkx֕W+;]pY`Va3Pbۂ34DɃ⥭ԣK(߷q_oGE^uњw}:S as==_X5ٻ,0Ա܆nꀑFGfPpŎ4*@֬teKt3)yXFK Mp6Dt*'Q=wz\3-Q(ݵ ׆Yf]߮;4Ot{I]{jL'ݔr]~Wѝ'^8=$1פ}UfV TQN.晡#*U$dx5(!JN/NNȸǪx:f}"_P?}/?rV4nzP"M3/hv`7XwY DaQ$fk3y8fK&J!ȇ#hot[G/zubݎ %ŋfCT W*lW4ٽbڒgĞMRFCA J1Qn%Ґ3FMhEKD%Q"^9_sq<РV,;SwHv+tI;hhЊue=a .6V-ګ @1adW[O-;WaE*+TbjZ) to>fZl7qLX5GQPwP!uI:=Mvޠ&=btZG %a#<E1b<(lTL覉W׫h8[3}!cN)/pNJ&:svuڢ-tNձAmo~-<ڤ`#a$!$XBll?t{$մj:5$jW"3iZė)Oro؛UnÙS(x.fOH;%hnR:NCQ-"j+t́Cf{iC`^\=gcQ5D}0, 2gZ%ա S̄=$s#s"2Cƅi?IH#Caz>a9^p 08Ƭ\EHc}$Ѻ}hx)bi\Dl+@alGlL8&|yz;;vɏB/ #U~p\?Ţ%y\auDqQ 6~vxЖkHPN(,=@[ )۝[=A@v@6OM \3NڤE [$l,I &a#r%"7v*z+c:dc*% .8MUFM:sXcG0dp/_!X6 zp-AQ R,%*goyt&0Rv~9`97v 3 (\ lt"(j!m8I5Օ 5"[[RTlCqG5Ī-rBeD` Rnb><];Pܠ_^}!6*x\)mP+{$c1WKPj.˥;vYVYsp am%7cHIҬݎc%2+ETvůSZP4dR$0 Hu6X>-'QRUHP̎Xϊz:vNf0:I1'pDHV 5;;n!"9cη  ͺfB鋊e I3)elL"X,CX s86ZU v䇞=Oô~NΤUQڻdԁHٚII+U5;&]tj*%lDO-NE6ga zxq>Ó$zi@lv@O0FODM+EPERSUAT%-ЁMR5C4!M% TIBP2P4JJ_wG+ii*)`ibF )*")JX i*&"J**d(*E x BAa` RC@EP~!@"toΎMW6UcBU]s^ٲeG1\pSRF:rF5h)J-D4x0b(-}teQS,'BҳE]5@j[MQ'Vo.򷜻HweGh) w4j3wQzjNJn$.̕4nJd{"pW%Qb4 KYgj59DWDixڹܸ&%k*MD⨎#yW=#BF׷@#"s\o̽ƺz̝ӷ׻$6֡{jn[~Cw$BK͒ǕB!O130ciTÎ6YU}ulg:ڵ۷q>wŢ2wBNՓ Xs KYM:L.Lm-jԼ{)΄76f6,7397+2I02PUdIw}GvzvWPϒ VE2nMTR5!dz^3.'^:b[Iè7JF wF7Ezezʹɡķpf 
5BW Q0MJ3 Q:˜2E 5yG{LbzX  0P 5XM@d$"k GՕ:R R?Whv̮ LS n-s׻ǯOz*K훉`ITX50_Ӿ[ο=M氋U7 7.dX`gpLhB0 eٻ7=㝹{Z]Λ7,z<źIBdV43fYw`L9Q@nuY52πPA ΠPR(X{խk^]-Wb20L:SN^ B:+kt]g"HD#RG<vs6+>*E:8IP@StY̹KPszpƢv+c᧣RX+ׯn;f *ŽMVa);҂J>[}H߂wka:ʛfPm O&Mq;J`0LJA1.&]:Oqۧ#dT]4g!V?lCr#neh|ʁjJ#A/+0.LSkhxlfwTi}&T]۱'pd%s4nJLSr[l9^]+sPOXmÛWuq"B{[tb1bv0r=x~"Щw8Jfje;Asd%ֶ IdsSyU< ng|FmS$011 U_z"zJْjPU\Č30vhFۡ[K58o'P{Boj.g@1s["lfaݽ=ZR #%BT[+p ¨ksO!%aǧ=IP׼[^[*qjZyU8Ff$ x|syå d `L0`=˨ 4y!F;M)"0Y_.=ն=ttFOK}:$_9'mČ'۩{%$I"d# b(V!Mh#!I(唶$׶zvs,\[7%7rS4\v˜NFf4XVp^*$bt/)zMstįl}pngR0`?o?k!v|-(6UR'ߖEa*(gnhGZ#uۂ \KAczqldݮ nrJX*)4{nslEɗ}J:}Uv^E3Zn7rݢMɾw8EU;Z5/lsBjNvd;yfU1򟛓+m)L1%%EV1@iG~`0s&4U U ,ykmVqTqdQFEP!/'!r?g d=ԜqAkyH\aPNTC-G Uo02i>]\ےՀPGArXi+YvY,#+ }ҟ/s)I[m{!i5e~kƒP%D~Ǘ^UDy.Ūl7ݓ159v>6X[b@{dAd|f5C'H">v5T]sNW͊w~n͡Zဲq0Fs \-.3Y멅BZauH`gV=cRX5&N3 Y*#hF"MdΖ [H#!# 6fnU7Tޥ!DMfXﹻ'&V[)`ަZXdUȹqHS1{Tlik>,kH9%Q|x5鱞ki. 7.e\UgV@Ϸ(vͧDeu l$z L$7H0ƣE:x0:"%I:hmEVIWIWĵu`FB!\|Ԡ:6TDB[>rWL2$-~.쏈]UؓI󋇯69vi")y GLmeE$g *;կ%{:vg>rl+B(lQQNY -'0˥KLtN!Y, c Vmr;i&&Lecݘ pFHKI&D)Q9!1C萅 rTDl;XfzYZ;K!|+9;*Xrm-øgR:w9ׯ.|S 4%RHB?U H*i&fJ"" J*ihj"(")ZB hh( bh*%=G%  5BPP44%"*P*@nIs!9t  E׈ JZ ihhiRwߝUU5I@UUM4U#@DQE JREIM4T=y Ѕ (G=HO|>#ۭ棵Jgm]W%+55\ʪŇM!Erk{꟦Oed[=Mvl T9vB!gsWY4l=bcFUݫЇWp<.Z[:J#˰N`q&4&`ăo([ul R mG3kJf9٦E^G3ܽy \Y)쑊C-IZv[7qy,Ow%gԶǭmiuSĖ#ZN.[4@4 `YfRii |g^a=Ι gYnF~'|s9{P{::o7% -q7*UfFYEEO~0蓦[}y+RЧF-{Mo5mA⥢I ud{y;Rհ%ɷ^r :uᭊ ]wi8ij9ed-64u+ec̿]A$ZX [cŸ"![]t ؝IZ-+=qhqJ͏2f-[KH[_*oQz.ӦEf%.R!s+ecWќRg"];} vD< bӱbS6-݇eRL$ 3H!BwFc=DyNД# vcV[?]ފB)ڪu׎JnyxSgu<-g#@1Ǔ튍ՁF:_۶֭^N)SiMSz3 )'2`,J&Jގ껧[W20蠖~Ӻruփ-XTN0q%?Lf]";&} Bc̑3 lU $ݎ*27F3â$ڙk"iU*TL9_)t+cK̇nDka3.CEƼY:ݨt %qY1D7sCBXiXDwAjLAdz1 ON(껌$ȆJm,"()f}SZXX%`\Hh d%C'T!N_}Ҿ-vxW~~*AFdh2ʉ7Vb"< N#`≠1jVe9*jkzg#k=J2|âMud390A88kp3FIpby^xOm%~y8ܱp2Bo[bݢQ.K2摅/DqȂb1%)iKW'-]xM 6=\hE`llq{r~;=7ku:J\o +߲#e$TPr@n38&eI5T#I.N7N]vwF)U&WAWm"zuz43uFhx9w%Ʒ )Ky4lfǺ]M(5Co1j[&wQr Wk{Ojexk鷵ARQ06’i6[C`SvH# [ђ6!kDk;wH&߾~Q/5ٹ;wvbWViVj5ӻGo7EpMNXt.돓mŻm&v}-*r4=R|ب̩2uvk~[3f) dfa=+<:;̥9 w'Ys={\’$a o__=;xc[ɆzСLBIbY 
xq6n0@խ(7ycTZ$X6̹BqmATKޭPg^IQgkkuy5cZN^%p2`.;BDHRJ'wENEwq#05,p9eAC QJ0KoOk2 +2XR oܺ0@luP TlONձ*ǨECxEo;鶾 &IGTCS8Z(ڶoӪ5K)_BS4~;SRDDR(Rn4ntƈi*E!ACNښ:Ю%.oja99]ӻBk̘ Fv#k\nbd2uyomm}*fv+?4g;4aG0 Gw>^c#h%3laa'+9i{Y%K>0S#p+I!l͌l︋}7暈fmm ߢVN ~\gmxPǗԷE?ZxF6im lA_突n"2-T_[׳)[n vwbA&Ns'f Ŕqv8&;hMgվ17:g`F2^2;y21Zb.cy=$'zUP"` _[c<Iq1 1$alS;Ci#$t3u:e⃓7(Ш銎&]{RvzbW IK}(狣- mc +$B[.mؤ*-+k>e^tAbYcOx˵9vgQԔa_|+"q4'6eM(c흧NKx^f޲l$$ lSL{{&pZ@)d&1v`6tnv! 46Ck+hl(**N"ROҢb I tP|0-gm'u zcuQz=9z=T-θwg2o]\\v";y=G^u 16X]}ʚT%|ͱx"NtUyCA~ %p g PϷLZytԅ2ѪcjWeHɥ6,m{Ե<d8 }׮9:qE-Nri=~[9\[=D=ZqrJđ !!N a@^MY[qf<]mSey|{-^^~E!'KzɣL:Kty-iQ遮n]PlP . ! tk$}BV=4A mTuG4P2mxr`كj@w4CELd\eGQ#5}5ͩ NthFL6;7f(mNdػ[^v&3@$#w YsNuIbIAnO]р ܞdS'OI$}j46{\F|w.5Y:a/ӆ܍6CeS ^^>4L#PD$Z(X05AU H8+Y:FO5MElSrsY-`q,*co\w^Jx'yI/|TVVE_JTa7dtdZnWf`(l-iǢarm=/ ;Iˬ0Kk \جҤ:Ӡao["|@F߮Cӑ[jAm*9'B<$+H!܉@@QKB$fd(2ukPv Xq/ r :q@S3`˖3!)M" b՟bH(PpF'ňwuA6)V sL(`ɂlZ+m30n̗n8Q0{mM)Ե}vk-袰vN9'^=Bm5v!{ j[Pɀ6ÌȚ3T r m4ݙsZ# DlYp[wN_dkndfFqWnOgd9]&M̗Ki!ۀ$Ǵ]#I9~2,l{Ss'IHVJ .Xw6v[iLh&,-ITiф"ba;i!+EG)Q؞P\P0ۥFRk!?JD:BPsgeVc݉6|So5`.ͨu$B[b]ыsjXUb8[v{V>W53[)tMAKao7C0Er՛ _}Yt/TbD%#[[IWN%xS8h7tq= n\r^g .iL8a-dnb%Z|娂3wܽ[E|{wN1 8R-81 qʉwZ椐dI`,@oM:ևmжrE`_jZ`" .T:˭Bmi(\9ڑ}X3K[PRhx!K**S'Vͤj굫o%W9Y6aЋZvYUSLyd6Wf9;e_!^C7}Y~DܷZz3v0aژ-083F`Azf4lD*p6պl]8Aj TeL`3q&LśLj-d&0lunUF&G^× u݌;: Ad!>RN vESF2(1g0i{vaz֡sH4dP\35DR\kBd>Xpv${Χ۾{R#lRyr,6WYwFB;TE5#iok!y+4͙Ć5+ʵi:ܹ4|8VT^d[<@ f ʼnaD$07c B^vMmAT*^Ly_qͨTUV&>9.z=*[w*=CtYº%rb :by֔L0 l?hf_D 2~?nCCz{_r$l1 ol֖BIڙF c&%SGW|M3 xB& f\1vz:)zPᠢ$"ST4f77!7$1b{7Ktof=đY+[#(iR[H(2maPFsLgzC{gNk6(е݀×guɥ4Jh1W䦠40#'2V̊`4sZ3/v"Z8{_STƹ}, EldI8]hac\pƲ=)L](i;4ke1ZL[oN;4iA.a8U…! ANlR(]aZ"hc!>I!N͍EacmZR(&1 g\ -]X=_g6/tCbV+l@MvVK h pYmBoMx,3JN]əRYD87mBZ[_޼U}DHR"5{;8rzlv;01XKwW=$ 1.\s"4Ҍ! 
nTۛǚ;2A4nRk,$\xv;3toӮG(D-űfVN45]Ah~WEۺ5먶٬v׷NO};'ϚWao^A׽04[{Sg1в62iͷqE$ҙ+%2;7e6M}d{FOjr\CRdQkLvبlhFNdX&V]gaj.$cg;Βs05V$ #-V,,[r [ݪm4Mζ i)(4ʣeF2']0:2z\Z&eXDA餷BH@ ?A53ebv[ \s}Ϯ{gd'´yE0iE56Kg6[c4BVDJb"@(YH )]EƼkTd 2&Af!J$|ƅljEuʦVZm QIsijvGqLR5]Vk6sft]8b`td([eAF2̧WX}=}Uݮzn֦ t%tjr&0DlFUKUGDlR<˥Q7VVL 7=T=}.rn!uM}B-ˆ,V֫HlI`d3rZɕ L=)(.LalJ]7rB3%ת5~hEhN[#VG* vZ,#A hØM sr]`̘ffՄ!WnYZ,( ī{Rǭǝ7hђ:ө@s'u׾4]d&EfƎF{LF"S1c*svS8*UC7ٲF85^:1s.Qetf 8Nܠ+} m)hzAw4iUB;z禫V*dr#֑Q0sB)fdm n2vj;W/jYڝ,b,M1$k K1NFPMcZpLΖ(k6ertכn'[R`BDeChqF"j&wQv{ˢ;۴9AZ>V%Zm ,FEs-9ozt9дkhjq]۳j;۽Pr- =+&e`INSGn0 `ٖJ$`Dm|nvh QEpo0ӣeB^8zS\hFGSrj;cMcW.(gyɻˍ}HYaHe(ripN̐ǽX#m&m8cv7_E}] PͻtK+7+w,ou:A]N$1L\qc0h2Re% PUm{L ˣXiNe|8@֫,]LicA*8ajSFm |k[:o7Wnb6nZuQ:]a}\wbץcIlREUn?PKr.  k?LMrX4[bٻsCb"tCt(və^Yu M%#pb&D lL!Lh$jԒ.bbf'n lCNULiz7g8{=&y:f5zTǺd͑@"-(hAL&+ ``ٙZo5La=ÇOۇ,a*wګ􏸾˫XQ͇3.$smfgM~7 6 B'w,+Xn)"};pOҌA'O|-ğPi>yl El![UQQG\DDDD?#L?fsaz,V GK{3*kRZ#2B3C:U2>{TE*۷wܭ=$}}]}<񔯽Z7[rTѾ^W y*;;FrM}}Ss߃`ŊF=ם!UDPT>8"yUQEQpR*XH|0TS=01cB QHb_68I4].wKyq!&йDQ*4+bq瓄jfn*V9eĮ ]T1+.v(rWDԠd]sazdK[<i; (:a-R,4LsЍ3VƘgoH(qLy3B# ZznVC"NכC,u]]0}*Fgґp"? @~x1mA[);1V-lR F".5캐 Xf)y0o{1}vhpH-]aݸܦ#Y1ծXK;N ybP@,5Mv83&P^܂3 Q` ʎN]aHe D4 ĺBAtxy} 5eo&&fEJEf%(w- BFiy袑Aq$W7mzG:#];]a@$Hiэ܀B;{Mw`=*xyliO&6:VYvr^kw\,a"ˑ]pxpŒ91vME]k*K8rЗGpUMKdiR)ϵ8ёOUrI/=Ui)E*20s0iLL|uǍM&T!oV]\WoYrmjQ4 `;ʨcPENzR(ZBKa p)pFIsbCY o/[o5DDcRYBAffc(͒&=]{ބ[SbV[l8֧!MEup[)XY_2#ދe]z5G%OH8}(=,?0&??Ckss9^+3(,$a Vr^[[V UCSM|l~EHː[a %8lPX2ݲ؆+2$1l=j|aE5q7'W'Nsa>^_>}E3VXiLEgn656hl݈H&Lz??@66F;dh`iv(neh#Õ7S@7TW/[tªf͌{Z)SlkhSi=ֵLg{{bDyw-;u!pF ;WVڼOo ⛥l9G- W e ;`#TYo[]Ѧ0K͞sYXU)E Dn )18ڊj6WX.&zd1mUƴglWrUhh?DUDTis9cޙ;}#yD?9[b`P}UB"dUTBܞ^{ѱZP5)Ar5|PE$D+;`r ]^L &:|:*Nyb"94ufئ65$LADhk*((ƝS~"h(-b݅@" )Ĵ[-|AHADt[fC1MeJwb}ڿ!y/u46!kZ:DS(́<z񀓆ǂR dS"-`22H]\*c]X1AD2ئ/vD@ '!a"4^HqTsMm)_A ijM9+YEve}[ f;mLtҠ5FdByT}~{9srka-$Tp++[7tnl9YHi&b=!4޽5C&.)rYg.d,`kD>ӟ/J活*X}{ow-ql %z'[;) CF%Q,@nsK y+ q\DWŘ Q1WkJ7}I`E.,ш]X5{Lݜ}6<Xd3t ]^CPpBr~n".@i{&y{kUWc? 
4qikQTLJ.m?,t4峈H1 $56c;n!G.<0]j`W,QenNQL#"BNp+*7n[57bR0R_s6uk5ۅ5x_tUGnJ.VQʯ#sc,N/; 8sՁXd?,utˣBQkC! tUP[Av<9*$bek|>_/ ^ ~2t;<?:@ H w憐I$%<`aM5Q@3˽~} )gE€tU}'= E R!CK]Pa@M"ᒔuCtYuC@ICT:$dУJB@b 9:r:T"\Di#!AaBM 4΄F1  BtE cƕ( h\l (94KU1(RSBR"TĄHі1t@ A,eMd - H % EDLsZMD:l9iA$J-"IisKZTJbbV " 2lR4M)FP*d эPJ R&B%DM*U BRR !AAT4@RRá#n)rH%4f %%* h4Dd@ @P MDH"gHbeGBċjhs(@M"TJ-!@eш  QЈPPe)*6ңRqJC]"4B+t &\J%$RH@,C$HdEV@:!87=Pp H"‘4$h(hrIR !GLI#N],P * P!.ha40KcGLT%ĖLH.GILB8BTt!#``M4H"jBjCJW4I JPlA$% @gZNADK)JX,.-(B6D laEҙD132B@D ;({QDN <eE$*@O(J(J ))  0# ² 00  B"H$ B *J$!K0SDOP~H` IL(QDdI_`2+2"DOTi`R4Ch!pAΓ QR(LJU eRb ĆҴ!Q#B$((TrhVJqhT(G$. (( ))C(e!NR"4JEtP%4i#HE"h 0h 2"B"%, @hBД%e  iPR ЉB @!4(PAP:F@( Qj @ZL!iJ4Ԛd*JUZPbTR02P(J B((0i&))*F$bZ &$4 HҔ ڄ:21H )&Rh2 ]E3MAT T@S(]#Q]4H!Jh@ҴҵH94Z tBTB"1-P]MJ% D)O&9`lAdfI*-A4 2J: h0ԛ.% P``5""|$B UdOལ&bh=U(p"R;"@9(Ĕ!a`ta`$]!@4-(]ZC2Ȓ)$H)VhRҔ 4- 1 L6c0A*(P("UQ Q(JSP CiZ)2`H4LB"* 3FJh)\Aj( *!hɦ"tS#@ G(TBphr0Іt P%"Rd J R6RZQ)h1D[`$JQ. Y29cQ5lSдb %444HP4 IJ)Bm&F+@"IIB %-"Q(HšQBpH%RP&X*Z2Ѫ]A(B)i 9CJR]& " @j$SiHCB-BP"PP"ж Ci3!U P5d4 d]R&J1 PXetMC hKl䊪HP$MP iSS AĐ..( a1rhSE ++8f[N%% `Bm$BYCY՗T$!1D4: ZX QHK%t4`)҆0&P C)eM0 9$atq"@!h KA@TJИ ] $JPJ1Fj(Lca2' *"΄(A(Jc T0%kDA]%%)] Ht6i[.Bvihh]\1$F!* t)1RR P %P)@QKE-QD@WIbbMUDTE1B"5-DAAДPP!(SIU@ЅF؅R T4X2R6 H mJD+@i4 Dmb NM)HeұEP@:(HQxwR- CLҒJCS++0P;4:% S0KHEE-(b4j&J)F"D[ЪX1Dp@M! W̄@:R@at H(P]9tQICH.A`Aj(evLӥ,%) * ̐%J& PR"P$ .SB%( t!) DBV4fU5JI96)@J(H!ĊL`tF`дM- (pilP]+HЉBgA lHP"MD(RH`M%$HPm&$$)(D*S$ҁX2:5,G!@d* С@ aDҹ HR ҩ@etRJ50lSTa+0k5\& :p&( LRi$"B4`dH4&E"B M#M0ˤ+2i &сHdLe2JBPh2 J+BF B&JFs4 Be I@S \X 26MӂV]$Ba3JXM+eL BT0:ZtPF!Qi0UQ0t0CDhi1pRCi&F``6ii-#4J,IL3), )Ɣ;&Ӷ@@bH$B#c&JȞ*!B )S+˥)2i1#jځ4!NtPGHf!X@҆)Bi R҅*QJJc) iZ˥() P " ()T@â0ZDδ 0P(W ID+ii(i2 WhDh\2D!L:hhsPfU%5B pTJ*иJPiPAY#lFI ΔF@ZC#5Ɣ"ZQr%P4@H*CJQJIAIHM&UIH4TT% a")E(UE AB+@M.)f$mE $m-EAE X!H*U I 5CMJ4 ѕ`B()5 G"iBh0ĮƀEĤEKiZTi11A HN .i"AA*"DB! 
`%䀤+H&b*GYЁBa 2!L1DV0@ 1bV.[8SjM ,)b% Q#9ӴP鴹 Л1e Zɱ& H'x.F"y%\H#AYm&!PU#VirQ#"hiU:D:qZLE+r&ͫF% (`ФRЭ@f 1Ҕ8@t +@-4%-!1Ѝ`44AJ%fm$DaC@Q  $`-1@顡TGCRZ !@h@Ƅ̺1D1lh&iX4J@` MatVb!++@M"bH0 )DF:"@ *Ëd .$2iCA. 5I a$ ]i 15 +Д 6h 4QpĤ!+LC@h6E,]K4 !@dj -2`1!c@i\+&SDR$ԣIII YHYd4DAJ h% E!lhԆ.#0΂C jHcP%Ή tT& I&'FRHM 'TiL;ed  Q:w {E0E Q@` E>0De ( G>?~͎Oߣ5 0e11};q*qlOq6vw6ˏU4?t4A0tK]@(e&EG1]Zp05SmP^>wvK^Λt18<,jPdeR2GC*U>!I~ߛ/*%O P-{} \ s qPL#Z+q=1J$Ѐ738x \%[!1 pK~TqD"D?B8a"U6y۫mmؚEZ[{nKcC[H uӶwsG@abF 롢/CVR|}_Fq b~j5Xw4J޲j-iC&X+hCe1 nzrFБG!gtJlndFi+(ߵinO 랿1ADm(O%㶢׆AmȬ+5:&ۛAj8s"4.:wCz"o3 JZmNWxdd24k:PbA1duFQ2ijbq M$2 &Vg0x'oB\Zƒ SN)УFsA+kD \a<kI]Y3-ڇh5qEtČG:xXjmČ +hTxg ҭ |4^@̤`W 11ә+'P}1K %-rl6N2v,AcѢ8*K )']x\2=g_Mi9akVt[J|Sɉ{F#:;zg!-J[fff:K>K2Dj*Pp E? Ji =$Q Bt H Y?]rLfqn521Kh-5qi?# L='ć)vb"H5e3;^zkE;_*W^ &qNKOӗ-:lziy?VL$Zi&/^ϗLX(U1< bļ,O/۪dGYop{{\ J47 R_WQo2q5ٴZ6NC 0,#ZA{TrJbضN6N,dˮxB'籆*%(,is-‡/n_([&SOgkl.$ՈgXBm( 5;ZHj7MX\+(vNLM\qD,bJ^_O; DY { \fmS70-KmU>sE+H))dkbzoK۷Zfݯ=aXܒns7h9=p:UȢ5DdfJqӕCm_Me+I$-qSMJ"l^oElޏj'}Of+'FmjjVr'y';B sI15j2U$R@ x$s;țhʒ]tikZUNt/m(C2EE)kUAhE zc{-Bh,Pv%LTAŽ}JA[1STZ & 0` ♮kv350X'?}=T\K$2,0L j6jd ׎v>uҕ]X$ӥU-4z5Ǧ5e@v #d͡n!QGZ?%Xؚfڲkpq'gzq8kElJhIC&kkMOhøS/4$qe>(9$^$?osђhbĚ@%A[bqysiddyE{VOp[XZG<mb> t$0;S[5+LR͵&q+ݰ,w7m{\؝X.ጓK>0@%H`W݂Q $bmk$ bJ 斟9tkhQ z䣋xMZlQeGˮ"ѫsdFWgvɋº1*кZ˟S*2N֯/NHH}Yل95./2` eYnμƤIrNgSi Gp8ur"TP?D,Psn-T养!2Fua.XS}YWh͝ ̙ryg$I\)eee d 14FNeŭK,\IBB]ݤN4hg.Fŕ旨.#X#9MUΫRlu-plvy?XN ٪1ڶ\r-nFU}Һ>jԼ%ȢEZe.DWi!H kᄑˉ<1-.ުtU%f {wE+!%ڻjсE ^fbR-tMVw{D0hxjB(L38 7)S `xQ'yҬVC"61TK10! 
8^k} 6.k6gN*45iEHrKtض]KtBPX/#t7k;G{xyLUnqK6Y`ET_qH>jPWWEFwIMgZڔƷIt6@x/cž^oP7y p@-eIB&8`Q!PĈ֐֠mB 40R  H@v'Zmi-lXrf4nk߶ עn[qvWXwk-ٮbJDnfJ5Z#)ěi_UO):q'{ڍ%WffAF3Nnٚq)b24<'y,8 %w̖ݿMUXd- Y n(1C9nX7EpA'iFR1C53Y`k(`_YT]fșap`UL+D1:}Jc30ԑ\HuX9ns4RHa9`IUtIR &`00vKV?FƧs'{Tt"U[j~GM#[E Mǻك܆OŇ#̄#qouRI'v4}w'sㆥ9œ.+/sƑRXmžtZ`~'k=ln pAS x]`Ci[W%" ^O"YZA 2"SQI5@LP$?XA W9B9r2Si @/iB:3(UQ)x"Dh@Pw{|kO}ÉjbXnjTI֯߿48p6s q>#ZDg39|Mkoɟͽ3s"b dtx TtBr_M富vʨc}svk4B0%TfϿr^K_ ~/4Y[`.3I _'H?}"RoHg3H'~OD2|oLw>@hlMP$pE$bO}K,7 ?P*Pz?4e'Ip:%gEkj3xy[7!ΛЋl[j_ WE~]o2$:v-#YE9HbC0I甮zlU[HbK-H3uX#,p_oź_q g>sv.̕se38P7Pb*k{uy 1jk7v 6Ǜ*<4 Y-Y4t_}dr&SS/Z|v'vQ!؍Ni'ER:)SlU?撾8s$2 ngl҂h_y>fEc*QQ*z-owƘ b† Nx7 sٽ8N51-YrH)6M?}:wdu3,ߪ! ;'2ckq4h%9DVųГ]6[-̲mHed"D_4ZtI۽V-lVV [)gj%+4n-6WʠKfF#߶MHti vZSɰi'MyV=75<$9͙/Kl$PᙚB%랕>M˦[[~NNDC9LՔȤ!]BvNLJOCy})$xh]2RWLE-a1yqfXp}qk*Ԝa{ Bg+ b铍4\]dM̪$RI<@-{gXԳdXF m1 <]\uQr n*#d7;NiBTbG. UG0ӒNno?dl~r/Zq~|(" o z!kWű dcD2DnYU:n9f$1&X S1?O|/-/šǍ;Nu}0Y?b\ wdihFINrEvvgo'mOn 55Rdv:[(XA1ut8׼dD4 pV:|'{HtG&O;pkUft䋏lMd[xl+^j&g}[Ϯ,V >&$zkx c11w L%!B0G8$g.tyoh"5:ҋ?Ds@bB˯\ WJFjB9 RmGE!v@I/8l(MWJp3'0itm3M?xP**!W"yo=v}d"J㻽MA A'lhRBQ-%1DRKkVgL,SUz)ɯ z)hٝ_{˭l!6((EGXFV)F"wRnɎ'WM%MH"Q,]{\ͺWm/GM=7)=Bj -} Nqp2,4Ҡm(p,G&fuTK,6S6+.t܁ Svou9 =sVė=d\ h'Hɽ39CsD`!6bL/е_}<;N%7nhEX_i}}`ǡ{ O]9L8R|mk\+&'-'">&lE #/Z1IdoOl_fM{U2t1J?`zȶd͟|z G"1]_z=8N`vga/ a@FO}Np9C4~ӨwvwS?nVϏל̎e׉9;M&w}ӿJ "hd҅Ŋ~a_h[CPX?9_2W=weELjVrQ>;Pbt`Ԕ .?A>Uf!H~6~i2LjXW,mz}x'bمw^E&L#5hdMcBOC`wV'KO[X1ՑmFD"FlΈ'sN/H?k7JnPoXRlvT:S@iwĻyUjݤO*+=s,n;Z⿹y%[V<@d JڈI{ j?Gb55ta@/#4\$'kk+m9/==2"bPi&M(Q+;k,jjj9mgݙ`pH=9x<-3D*D6ƃd*c)IOy4'95OVQwWja[H$g<ׄsgP~o/y7Y.{wa!p֖7: wZ,TV`0I'z +'q4CE &r#Ɂ9L -qX,aEZ+n[9xI UJW]r[w ymQ6F h/ ԅ1PeQ?_핵1 D$tEjR5Ukm[^W"yϩg]@:F4HԕHDeS M Aߺ # v7R[q8fc\5*gl-D[$XMu<3 =8U("$&vձ*cD]bZъڃ8a_¦20|ib+Z^@qkIZ;hIX]TKZn)M|DuQ` %yAqzntSGh3%CWHX[:u{uJ.ǯsKU(3wItU,4SMWg㻕t~t~;@O]vk7UBV>.t#*20ȴ6mv$h,V~;().[H*lX1 g `QWg*Rv@BI"%X+&9saMUs1n0 j,R&ӵ&$nbB[Pǿޣȳ9ÚS~W*`I*?]@̓Şs*t9v[hRBȍZ63kAyk@o/}lED{33Vs?1OtO%kݷ;$C6Uh+!JejR3 qU'@ 
hNӊ'&RLX^~0+!wMMꚧwi1ppō‚'U^yɭJba$4faG]}ёi:ثRmR C^Z z:{kCZ5Mb-m(_r1r$J۳`q(8ds%ѓvwaW `mmH[!Z4)|=ڞYn,k چțZ{Nϖ/XI [h.d4mg2ODtᄼztnE}@OSêBůte}c|a݌W¢["Pnš/0L5ߣPo}O\|p;V0eAbđnaژt!O_ W *h +A|?h"z@'UNC֡P)"A?JSx'C C}Ol0ȓ(!N0H qJ2% H^IɎ ɮIChi()L#]At)'킟-$e*[*p*ke#^nn{РXo;>;[m9@`Izע弧s^mXˆLVLW19q+njo>Sؤzcܤmhs>58lˆpOH8 Beؔ7οUC̉yaXU:}k)E5=/׎O^Y:9m5YF-ht-j[+yY56Tc,yMM=ܷ4[6H'X[y$yM>ާ֒7xaXFíp@83]?o-;]._yYkR6#ΙyMH"g^_S$L e#P­.u {O 3PlX;F #c8'LgCcE ,b9x\ \BM1ud#x!@J|N۱@'5KgNarg8pn麸1%Ŝ)u\toɴhӽ1:>C],Sͱ1ȂsbL%tn=ۤqYH79sԻ$( 5 ߆Ӣ Ad|,P#%? 'dYqsVa+'wEU[dܑ= uj0zxJe[lu~ |"bM&qM1$dƱS,ǸӎSߛuuv1'CݑxӿN>jB ̱9kZjS34RDR EikDF.֒bs9rQbV*k.dU>C mEqqDx[VJC*YcP}ΪΩX|yFו^Y'> yȀP+2{!)Eimϒ\Os-m.cYn@sݷoG(AN湶D+&n[(SQ:vY?jz@&I1t~S.˗Xh[?4,`(egF?A/Z{tqj"  sihJEKņuNon9@Ӓ#aRa"'/! Ƀ+6u?ֽRqliikltj71,[K7 睺5zyJ<Э:uL^Խ=lЗĒ3M*)=wu $aN![s ֕N&fCb /RC3"(>x>0E}MPf1`b^O9b鴷RxoE(!IbT 9q1.NJ^59 8?Z{[Hf1%?fm}q-OlzNt1٬zOY:K"Oeh%zMU0;Rwl4=Ƹ$R݋4iٳDđW&R%d:ܳ2Y \3HWyo폀Z,c"yY*&iʽSh{UYVڒyphBج<)݇ ~yցJmKrI(#<&poNsTa鸤nߋUs^aE;H3s4{|(,@iU J"눭6+0:ٌ_lxA-!@µhCh3=.SL=k71ZbԻ1"p = F)I)U("D * %II@ åDQ2! CI‘! 
(LhҚh3EЕFo๙^S(ts)-{["dOobN;w>zO؟~leML~`?o%\){Δ=ld{XBEEۻ(0AYX94ʲe!%L,V.gnaZ0qd 9@-Q&9jN%0˸cmUt+'?wnJ*0Q$H%Qp`-LpS d5Õ󿡡;[Ja;l+9P뵡J tu9wIݞ4r\C_֩לˌon$$.ԟSaS*|VL 8LksLaF ƄE^]] pWZmrE6Uu=:0M)ƮȻ.r5r|vӜv(5)WWy`]Ld""7ƈ1 ^46qT\?;[+dAcIV`F6VW![ƀM?5Q4*ƘؗRh%nB G$~袃jJ6 t* m )d*܍yAPD ,3TĊ>N=p t&BA2ૼx[4B r"D1}[DX~lY{C0ŋ$,>!}`g㪏UV'519_ g:1掌5&&a9a$1Shjfqek 88ʓY&Ll4٦W;y)'r~1(bUBc/mJ kkNҜ4#G nP‡Wcyd1Ʋ{DN+?ï;@;o,|uްweCyE[Ū0AhQw[Q&E;lʍ|zh{\v7Cm8 M֦8Ssfy'4 ^ -F/U̜1 4_z@^hUWp(hH'b) Zoۦ"mx tRc/I4Tx#@~l|DU7vwr(WLL8^^%d֖$6v $B4CS6񥅶9I[gSI.Wzw1j`%NvPAL :d5PxhGܖ]ȥO4ge2Pҷ06ya8YtJ")e-bfA@4yr-* cϘke"zQ6EŌ<1RDw˭HnJŊHq.)dVL!Rd fj V3jw51CuB&&S#6Թ42Ya :)i<%lpuTǏ1ͩkDQTO(:fd Bj3 ~Ej(舟p:pX Jv;@"pAu>= &( -DȚMH(K4%~"șC}'DTO΀ =)P}~ zA@'?b(DaPDN  H"(¢B0)"HBr'AT_byt@O2A2ħA&Xej-`"pZi)B?"z@)P{*/=('p zAT ~`)2@(r(P0(@PJy>?=ʩ>*E;DOCQ("TO NTC&DCα?HƢU*nH8LT&`1.uT+ba5K4ҽ,4*@!)A37_;`6`nٱU[l<;9KFֺ6VeFGcdؤov7$ʐ2YB)/n R],EQXaq2cUXDѳG-e)!G2)úPVVܶ2L%a3CLe]&VJ@2V~݊Hs"~'/a7q5 ڞKg;\j'Lp # ň$gGJ b54{ qkbhDDoszwKe/Ik &us:z^<2P.$͛7t[M7YYvիXS@Ld'^A@`Kj<+-잲p 'vLBdkA/' X,S#oЍ .S}R1EcJL eS ]v -*gD0!rj)ZKKיN&%dg 1!m+ISM8 ֵE Q!QeeN]Cb/'0ǁ1giҩPbc-F.MF; 7vvLCBoIțUS33]#LrKeݴ i39d.fuKz0wNc{F޳tr&@|KM&SķIc6G]̟u3 C'<"Ȇl \VYC!mqt椥[;b­X]-ƙ\7neFkahƚ̐5ӘnS!twp$ՓLeBrc0ytd0\5} g$b (gsi] 28ݎ8=dg&6ԓX)DB2":b՝([O!!&-pW`U:,\\mx BV!+&G0iD\8v-:$YMZ8ݤ: j뛳K5Xj5WEbztž۝ w^]c @!*ivnDl4L5[wudsAZ,JLBmmejfAe0AeQ $JG}ti9V& 6mj ǫdVQMruRqܪpRκἉTgov۫t UImwjBPCs;=BDfRbf*R-"D %vڥ*:@WeVTfTb̘#h !0f.jN84)̆N{R&  (fcfH0m^ @]ҒnLHI IКVVWnR#%6z-^W rzYbogk|t[Żҩ[`Z2=C 6ƱeZh֨ ho.7._4l^- ^[mѰ'1n͔!0"R uMܞ#+-7 PB@{-%:/3‡Bĝ4D6u;(iLb )F"&(bN.;r16'hj\َ-ȭA~@bAR=b؛%bu:=-k~r5z5lD^l#ekVBr(Ckv!O5۲u!e @J*P F[m͞:vEd[VU7M{ګY(6ȓc4<4`iƓNeMyFnlhxgK!F-) Ylڐuw8%=ZBzh-UC7f86mP^p+P)Qpcq!Ȣh 1̶yRpNimuد7 # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
def lanl_graph():
    """
    Return the lanl internet view graph from lanl_routes.edgelist
    """
    import networkx as nx
    try:
        fh = open('lanl_routes.edgelist', 'r')
    except IOError:
        print("lanl_routes.edgelist not found")
        raise
    G = nx.Graph()
    time = {}
    time[0] = 0  # assign 0 to center node
    for line in fh.readlines():
        (head, tail, rtt) = line.split()
        G.add_edge(int(head), int(tail))
        time[int(head)] = float(rtt)
    # get largest component and assign ping times to G0.rtt dictionary
    G0 = sorted(nx.connected_component_subgraphs(G), key=len, reverse=True)[0]
    G0.rtt = {}
    for n in G0:
        G0.rtt[n] = time[n]
    return G0

if __name__ == '__main__':
    import networkx as nx
    import math
    try:
        import pygraphviz
        from networkx.drawing.nx_agraph import graphviz_layout
    except ImportError:
        try:
            import pydotplus
            from networkx.drawing.nx_pydot import graphviz_layout
        except ImportError:
            raise ImportError("This example needs Graphviz and either "
                              "PyGraphviz or PyDotPlus")

    G = lanl_graph()
    print("graph has %d nodes with %d edges"
          % (nx.number_of_nodes(G), nx.number_of_edges(G)))
    print(nx.number_connected_components(G), "connected components")

    import matplotlib.pyplot as plt
    plt.figure(figsize=(8, 8))
    # use graphviz to find radial layout
    pos = graphviz_layout(G, prog="twopi", root=0)
    # draw nodes, coloring by rtt ping time
    nx.draw(G, pos,
            node_color=[G.rtt[v] for v in G],
            with_labels=False,
            alpha=0.5,
            node_size=15)
    # adjust the plot limits
    xmax = 1.02 * max(xx for xx, yy in pos.values())
    ymax = 1.02 * max(yy for xx, yy in pos.values())
    plt.xlim(0, xmax)
    plt.ylim(0, ymax)
    plt.savefig("lanl_routes.png")

networkx-1.11/examples/drawing/unix_email.py

#!/usr/bin/env python
"""
Create a directed graph, allowing multiple edges and self loops, from
a unix mailbox.  The nodes are email addresses with links
that point from the sender to the receivers.  The edge data
is a Python email.Message object which contains all of
the email message data.
This example shows the power of a MultiDiGraph to hold edge data
of arbitrary Python objects (in this case a list of email messages).

By default, load the sample unix email mailbox called "unix_email.mbox".
You can load your own mailbox by naming it on the command line, e.g.

python unix_email.py /var/spool/mail/username

"""
# Author: Aric Hagberg (hagberg@lanl.gov)
# Copyright (C) 2005-2016 by
# Aric Hagberg
# Dan Schult
# Pieter Swart
# All rights reserved.
# BSD license.

import email
from email.utils import getaddresses, parseaddr
import mailbox
import sys

# unix mailbox recipe
# see http://www.python.org/doc/current/lib/module-mailbox.html
def msgfactory(fp):
    try:
        return email.message_from_file(fp)
    except email.errors.MessageParseError:
        # Don't return None since that will stop the mailbox iterator
        return ''

if __name__ == '__main__':
    import networkx as nx
    try:
        import matplotlib.pyplot as plt
    except ImportError:
        pass

    if len(sys.argv) == 1:
        filePath = "unix_email.mbox"
    else:
        filePath = sys.argv[1]

    mbox = mailbox.mbox(filePath, msgfactory)  # parse unix mailbox

    G = nx.MultiDiGraph()  # create empty graph

    # parse each message and build graph
    for msg in mbox:  # msg is a python email.Message.Message object
        (source_name, source_addr) = parseaddr(msg['From'])  # sender
        # get all recipients
        # see http://www.python.org/doc/current/lib/module-email.Utils.html
        tos = msg.get_all('to', [])
        ccs = msg.get_all('cc', [])
        resent_tos = msg.get_all('resent-to', [])
        resent_ccs = msg.get_all('resent-cc', [])
        all_recipients = getaddresses(tos + ccs + resent_tos + resent_ccs)
        # now add the edges for this mail message
        for (target_name, target_addr) in all_recipients:
            G.add_edge(source_addr, target_addr, message=msg)

    # print edges with message subject
    for (u, v, d) in G.edges_iter(data=True):
        print("From: %s To: %s Subject: %s" % (u, v, d['message']["Subject"]))

    try:  # draw
        pos = nx.spring_layout(G, iterations=10)
        nx.draw(G, pos, node_size=0, alpha=0.4, edge_color='r', font_size=16)
        plt.savefig("unix_email.png")
        plt.show()
    except:  # matplotlib not available
        pass

networkx-1.11/examples/drawing/ego_graph.py

#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Example using the NetworkX ego_graph() function to return the main
egonet of the largest hub in a Barabási-Albert network.
"""
# Author: Drew Conway (drew.conway@nyu.edu)

from operator import itemgetter

import networkx as nx
import matplotlib.pyplot as plt

if __name__ == '__main__':
    # Create a BA model graph
    n = 1000
    m = 2
    G = nx.generators.barabasi_albert_graph(n, m)
    # find node with largest degree
    node_and_degree = G.degree()
    (largest_hub, degree) = sorted(node_and_degree.items(),
                                   key=itemgetter(1))[-1]
    # Create ego graph of main hub
    hub_ego = nx.ego_graph(G, largest_hub)
    # Draw graph
    pos = nx.spring_layout(hub_ego)
    nx.draw(hub_ego, pos, node_color='b', node_size=50, with_labels=False)
    # Draw ego as large and red
    nx.draw_networkx_nodes(hub_ego, pos, nodelist=[largest_hub],
                           node_size=300, node_color='r')
    plt.savefig('ego_graph.png')
    plt.show()

networkx-1.11/examples/drawing/giant_component.py

#!/usr/bin/env python
"""
This example illustrates the sudden appearance of a
giant connected component in a binomial random graph.

Requires pygraphviz and matplotlib to draw.
"""
# Copyright (C) 2006-2016
# Aric Hagberg
# Dan Schult
# Pieter Swart
# All rights reserved.
# BSD license.
try:
    import matplotlib.pyplot as plt
except ImportError:
    raise

import networkx as nx
import math

try:
    import pygraphviz
    from networkx.drawing.nx_agraph import graphviz_layout
    layout = graphviz_layout
except ImportError:
    try:
        import pydotplus
        from networkx.drawing.nx_pydot import graphviz_layout
        layout = graphviz_layout
    except ImportError:
        print("PyGraphviz and PyDotPlus not found;\n"
              "drawing with spring layout;\n"
              "will be slow.")
        layout = nx.spring_layout

n = 150  # 150 nodes
# p value at which giant component (of size log(n) nodes) is expected
p_giant = 1.0 / (n - 1)
# p value at which graph is expected to become completely connected
p_conn = math.log(n) / float(n)

# the following range of p values should be close to the threshold
pvals = [0.003, 0.006, 0.008, 0.015]

region = 220  # for pylab 2x2 subplot layout
plt.subplots_adjust(left=0, right=1, bottom=0, top=0.95,
                    wspace=0.01, hspace=0.01)
for p in pvals:
    G = nx.binomial_graph(n, p)
    pos = layout(G)
    region += 1
    plt.subplot(region)
    plt.title("p = %6.3f" % (p))
    nx.draw(G, pos, with_labels=False, node_size=10)
    # identify largest connected component
    Gcc = sorted(nx.connected_component_subgraphs(G), key=len, reverse=True)
    G0 = Gcc[0]
    nx.draw_networkx_edges(G0, pos,
                           with_labels=False,
                           edge_color='r',
                           width=6.0)
    # show other connected components
    for Gi in Gcc[1:]:
        if len(Gi) > 1:
            nx.draw_networkx_edges(Gi, pos,
                                   with_labels=False,
                                   edge_color='r',
                                   alpha=0.3,
                                   width=5.0)
plt.savefig("giant_component.png")
plt.show()  # display

networkx-1.11/examples/drawing/circular_tree.py

import networkx as nx
import matplotlib.pyplot as plt

try:
    import pygraphviz
    from networkx.drawing.nx_agraph import graphviz_layout
except ImportError:
    try:
        import pydotplus
        from networkx.drawing.nx_pydot import graphviz_layout
    except ImportError:
        raise ImportError("This example needs Graphviz and either "
                          "PyGraphviz or PyDotPlus")

G = nx.balanced_tree(3, 5)
pos = graphviz_layout(G, prog='twopi', args='')
plt.figure(figsize=(8, 8))
nx.draw(G, pos, node_size=20, alpha=0.5, node_color="blue", with_labels=False)
plt.axis('equal')
plt.savefig('circular_tree.png')
plt.show()

networkx-1.11/examples/drawing/unix_email.mbox

From alice@edu Thu Jun 16 16:12:12 2005
From: Alice
Subject: NetworkX
Date: Thu, 16 Jun 2005 16:12:13 -0700
To: Bob
Status: RO
Content-Length: 86
Lines: 5

Bob, check out the new networkx release - you and
Carol might really like it.

Alice

From bob@gov Thu Jun 16 18:13:12 2005
Return-Path:
Subject: Re: NetworkX
From: Bob
To: Alice
Content-Type: text/plain
Date: Thu, 16 Jun 2005 18:13:12 -0700
Status: RO
Content-Length: 26
Lines: 4

Thanks for the tip.

Bob

From ted@com Thu Jul 28 09:53:31 2005
Return-Path:
Subject: Graph package in Python?
From: Ted
To: Bob
Content-Type: text/plain
Date: Thu, 28 Jul 2005 09:47:03 -0700
Status: RO
Content-Length: 90
Lines: 3

Hey Ted - I'm looking for a Python package for
graphs and networks. Do you know of any?

From bob@gov Thu Jul 28 09:59:31 2005
Return-Path:
Subject: Re: Graph package in Python?
From: Bob
To: Ted
Content-Type: text/plain
Date: Thu, 28 Jul 2005 09:59:03 -0700
Status: RO
Content-Length: 180
Lines: 9

Check out the NetworkX package - Alice sent me the tip!

Bob

>> bob@gov scrawled:
>> Hey Ted - I'm looking for a Python package for
>> graphs and networks. Do you know of any?

From ted@com Thu Jul 28 15:53:31 2005
Return-Path:
Subject: get together for lunch to discuss Networks?
From: Ted
To: Bob , Carol , Alice
Content-Type: text/plain
Date: Thu, 28 Jul 2005 15:47:03 -0700
Status: RO
Content-Length: 139
Lines: 5

Hey everyrone! Want to meet at that restaurant on the
island in Konigsburg tonight? Bring your laptops
and we can install NetworkX.

Ted

networkx-1.11/examples/drawing/atlas.py

#!/usr/bin/env python
"""
Atlas of all graphs of 6 nodes or less.
""" # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx from networkx.generators.atlas import * from networkx.algorithms.isomorphism.isomorph import graph_could_be_isomorphic as isomorphic import random def atlas6(): """ Return the atlas of all connected graphs of 6 nodes or less. Attempt to check for isomorphisms and remove. """ Atlas = graph_atlas_g()[0:208] # 208 # remove isolated nodes, only connected graphs are left U = nx.Graph() # graph for union of all graphs in atlas for G in Atlas: zerodegree = [n for n in G if G.degree(n)==0] for n in zerodegree: G.remove_node(n) U = nx.disjoint_union(U, G) # list of graphs of all connected components C = nx.connected_component_subgraphs(U) UU = nx.Graph() # do quick isomorphic-like check, not a true isomorphism checker nlist = [] # list of nonisomorphic graphs for G in C: # check against all nonisomorphic graphs so far if not iso(G, nlist): nlist.append(G) UU = nx.disjoint_union(UU, G) # union the nonisomorphic graphs return UU def iso(G1, glist): """Quick and dirty nonisomorphism checker used to check isomorphisms.""" for G2 in glist: if isomorphic(G1, G2): return True return False if __name__ == '__main__': G=atlas6() print("graph has %d nodes with %d edges"\ %(nx.number_of_nodes(G), nx.number_of_edges(G))) print(nx.number_connected_components(G), "connected components") try: import pygraphviz from networkx.drawing.nx_agraph import graphviz_layout except ImportError: try: import pydotplus from networkx.drawing.nx_pydot import graphviz_layout except ImportError: raise ImportError("This example needs Graphviz and either " "PyGraphviz or PyDotPlus") import matplotlib.pyplot as plt plt.figure(1, figsize=(8, 8)) # layout graphs with positions using graphviz neato pos = graphviz_layout(G, prog="neato") # color nodes the same in each connected subgraph C = nx.connected_component_subgraphs(G) for g in C: c 
= [random.random()] * nx.number_of_nodes(g) # random color... nx.draw(g, pos, node_size=40, node_color=c, vmin=0.0, vmax=1.0, with_labels=False ) plt.savefig("atlas.png", dpi=75) networkx-1.11/examples/drawing/lanl_routes.edgelist0000644000175000017500000004510012637544450022527 0ustar aricaric000000000000001 0 9 2 3 173 3 4 167 4 5 165 5 102 96 5 6 96.43 6 7 96.62 7 8 86 8 9 87 9 10 82 10 11 79 11 12 74 12 13 41 13 1 3 14 15 187.46 15 16 187 16 17 187.04 17 18 186.62 18 19 185.88 19 20 185.4 20 21 184.02 21 22 184.38 22 23 183.97 23 24 93.01 24 25 88.59 25 26 86.26 26 27 88.04 27 28 77.12 28 29 74 29 119 70.1 29 30 72 30 31 73 31 32 89.41 32 13 6.63 33 34 212.25 34 35 211.46 35 36 209.04 36 37 211.57 37 38 206.57 38 39 105.04 39 40 77.21 40 41 86.5 41 42 0 42 1 16 43 44 112.67 44 45 111.94 45 46 111.77 46 47 111.83 47 48 111.27 48 49 111.76 49 50 109.32 50 51 110.67 51 52 80.63 52 53 80.78 53 41 30 54 55 164.62 55 56 164.13 56 57 164.24 57 58 156.43 58 59 88.92 59 60 88.6 60 61 86.53 61 62 85.96 62 10 86.8 63 64 221.31 64 65 220.88 65 66 220.84 66 67 220.27 67 68 215.19 68 69 209 69 70 213.81 70 71 208.04 71 72 196.45 72 73 197.05 73 74 163.85 74 75 186.4 75 76 180.35 76 77 103.92 77 78 98.6 78 79 98.83 79 80 96.09 80 10 309 81 82 174.3 82 83 173.57 83 84 173.9 84 85 173.66 85 86 173.75 86 87 92.62 87 88 73.93 88 32 91.44 89 90 158 90 91 155 91 92 158 91 97 157.76 92 93 86 93 94 325.32 94 32 5.53000000000003 95 96 158.17 96 91 157.6 97 98 84.74 98 94 84.87 99 100 282 100 101 279 101 5 167 102 7 87 103 104 201.16 104 105 200.5 105 106 200.5 106 39 121.02 107 108 148.27 108 109 147.59 109 110 147.87 110 111 147.88 111 112 147.82 112 113 140.19 113 114 147.54 114 115 64.02 115 116 67.75 116 117 70.55 117 118 98.3 118 119 80.4 119 31 81.28 120 121 161.26 121 122 160.84 122 123 160.86 123 58 160.84 124 125 238.34 125 126 237.79 126 127 237.08 127 128 234.34 128 129 236.72 129 130 237.18 130 131 119.25 131 132 111.15 132 133 30.93 133 134 50 134 135 49 135 42 43 136 137 
148.68 137 138 147.78 138 139 147.74 139 140 148.29 140 141 136.56 141 142 145.37 142 143 144.93 143 144 141.78 143 160 144.92 144 145 141.65 145 146 72.32 146 147 68.81 147 148 73.15 148 149 93.13 149 150 82.38 150 151 86.03 151 152 73.3 152 153 80.2 153 154 83.08 154 32 0 155 156 152.24 156 157 151.47 157 158 150.17 158 159 151.59 159 140 145.38 160 145 143.86 161 162 197.3 162 163 196.83 163 164 196.73 164 165 196.8 165 166 131.22 165 383 128.8 166 167 36 167 168 44.08 168 42 11.49 169 170 178 170 171 174 171 172 166 172 5 166 173 174 213.7 174 175 213.01 175 176 213.03 176 177 213.38 177 178 202.81 178 179 204.73 179 180 203 180 38 202.46 181 182 44 182 183 41 183 184 43 184 185 44 185 186 42 186 187 33 187 188 32 188 189 39.44 189 42 36.28 190 191 258.85 191 192 257.61 192 193 258.1 193 194 87.26 194 195 88.26 195 196 85.82 196 197 18.93 197 198 32 198 188 33.68 199 200 114 200 201 110 201 202 114 202 203 112 203 204 29 204 205 36 205 206 37 206 42 27 207 208 562.86 208 209 561.89 209 210 561.94 210 211 9.50999999999999 211 212 14.12 212 213 6.81000000000006 213 214 1.72000000000003 214 215 11.83 215 117 85.94 216 217 114.12 217 218 112.57 218 219 112.04 219 220 112.01 220 188 88.74 221 222 211 222 223 208 223 224 211 224 225 103 225 226 104.75 226 227 103 226 687 105 227 98 73 228 229 67.83 229 230 66.68 230 231 67.39 231 232 67.36 232 233 66.39 233 234 32.11 234 188 32.19 235 236 194 236 237 191 237 172 166 238 239 191.94 239 240 190.85 240 241 191.48 241 242 190.73 242 243 183.14 243 244 182.32 244 245 179.92 245 246 182.17 246 97 173.79 247 248 195.23 248 249 194.78 249 250 179.06 250 251 179.01 251 252 168.31 252 57 165.26 253 254 92.14 254 255 91.07 255 256 90.41 256 257 90.85 257 258 46.52 258 259 57.31 259 260 4.04000000000001 260 261 19.22 261 262 8.82000000000001 262 263 79 263 41 78 264 265 102.69 265 266 102.24 266 267 101.92 267 1355 116 267 268 101.64 268 269 0 269 270 80.77 270 271 89.27 271 272 115 272 273 116 273 274 115 274 133 50 275 276 
58.9 276 277 58.19 277 278 0 278 279 58.13 279 280 58.21 280 281 58.04 281 282 58.12 282 283 31.7 283 284 26.07 284 285 38.42 284 574 32.12 285 286 196 286 42 127 287 288 186.74 288 289 185.88 289 290 183.24 290 291 184.06 291 292 181.22 292 293 0 293 294 171.29 294 295 102.04 295 296 68.99 296 30 68.93 297 298 103.31 298 299 102.8 299 300 102.75 300 301 102.77 301 302 102.55 302 270 101.64 303 304 555.45 305 306 76.2 306 307 75.65 307 308 75.66 308 309 75.9 309 310 75.7 310 311 69.12 311 312 68.99 312 32 0 313 314 170 314 315 167 315 316 169 316 317 169 317 318 169 318 319 167 319 320 167 320 321 27 321 198 16 322 323 178 323 324 175 324 325 177 325 326 125 326 327 117 327 328 34 328 329 34 329 330 34 330 331 0 331 198 33 332 333 85.6 333 334 84.88 334 335 84.69 335 336 77.26 336 337 80.75 337 53 79.65 338 339 104.12 339 340 103.35 340 341 103.55 341 342 61.24 342 343 90.09 343 344 92.58 344 345 92.92 345 346 86 346 9 86 347 348 106.96 348 349 106.37 349 350 75.33 350 351 90.59 351 61 66 352 353 76.11 353 354 75.34 354 355 75.16 355 356 74.84 356 357 61.66 357 358 103.06 358 359 34 359 168 34 360 361 122.16 361 362 121.69 362 166 121.31 363 364 63.84 364 365 63.35 365 366 63.5 366 367 48.79 367 368 47.81 368 369 38.89 369 370 38.77 370 371 37.31 371 358 36.43 372 373 210 373 374 206 374 375 187 375 376 343 376 377 193 377 378 33 378 379 86 378 866 33.57 378 958 33 379 285 196 380 381 51.17 381 382 50.66 382 383 27.75 383 359 32 384 385 44.03 385 386 43.56 386 387 43.19 387 358 40.03 388 389 79.01 389 390 0 390 358 78.04 391 392 186.71 392 393 185.92 393 394 184.8 394 395 171.14 395 396 176.31 396 53 82 397 398 133.54 398 399 133.05 399 400 133.09 400 383 132.6 401 402 101.49 402 403 101 403 358 100.55 404 405 199.34 405 406 198.85 406 407 198.94 407 408 199.01 408 409 198.79 409 410 53.72 410 411 33.45 411 412 38.86 412 413 36.46 413 414 38.68 414 205 31.92 415 416 35.64 416 417 34.93 417 418 34.91 418 414 34.93 419 420 119.9 420 421 119.37 421 422 119.02 422 423 
118.89 423 215 111.28 424 425 57.09 425 426 56.3 426 427 54.33 427 428 55.68 428 429 55.89 429 430 52.57 430 283 39.54 431 432 91.65 432 433 90.74 433 434 91.05 434 435 90.27 435 436 87.89 436 437 83.17 437 438 76.52 438 439 67.26 440 441 32.05 441 42 32.57 442 443 103.73 443 444 100.75 444 357 103.22 445 446 226.87 446 447 226.41 447 448 226.52 448 166 225.66 449 450 107.54 450 451 107.08 451 383 106.74 452 453 190.21 453 454 186.79 454 455 188.77 455 456 188.36 456 457 41.12 457 458 39.38 458 459 39.48 459 460 37.96 460 286 32 461 462 32.81 462 359 32.31 463 464 106.75 464 465 106.27 465 358 105.93 466 467 34.12 467 468 15.78 468 469 33.79 469 470 33.05 470 471 53.35 471 188 17.55 472 473 90.41 473 474 89.07 474 475 90.06 475 476 89.2 476 477 88.42 477 478 85.87 478 10 0 479 480 142.27 480 481 8.90000000000001 481 482 13.62 482 483 112.18 483 29 138.4 484 485 101.64 485 486 100.96 486 487 101.12 487 488 100.77 488 489 49.98 489 117 48.89 490 491 42.63 491 492 42.02 492 493 41.89 493 494 41.72 494 460 40.09 495 496 184.89 496 497 184.2 497 498 184.58 498 499 184.51 499 500 43.65 500 501 43.27 501 502 39.97 502 286 39.64 503 504 87.59 504 505 86.98 505 506 68.34 506 507 86.09 507 79 85.85 508 509 35.46 509 510 34.98 510 511 35.09 511 512 35.06 512 203 32.71 513 514 237.49 514 515 236.63 515 516 235.99 516 517 235.45 517 119 232.59 518 519 108.9 519 520 108.33 520 521 108.34 521 522 108.32 522 523 89.23 523 524 76.97 524 525 76.55 525 32 74.26 526 527 94.58 527 528 93.93 528 529 93.74 529 530 92.33 530 531 91.65 531 532 92.78 532 533 52.01 533 119 76 534 535 278.82 535 536 277.82 536 537 277.42 537 538 277.54 538 215 269.96 539 540 61.14 540 541 59.85 541 542 60.8 542 543 60.65 543 544 56.01 544 470 33.14 545 546 111.16 546 547 109.47 547 548 110.05 548 549 108.98 549 550 108.86 550 551 47.69 551 552 47.26 552 205 55.57 553 554 85.92 554 555 83.29 555 556 85.58 556 557 85.45 557 558 79.84 558 559 76.12 559 478 67.6 560 561 315.3 561 562 314.68 562 563 314.69 563 564 
314.81 564 565 291.4 565 596 287.51 566 567 281.41 567 568 117.21 567 597 134.64 568 569 121.28 569 570 138.98 570 571 130.29 571 572 85.41 572 573 85.15 573 283 35.07 574 286 31.89 575 576 107.08 576 577 106.44 577 578 96 578 1347 96 578 579 106.29 579 580 106.33 580 204 21 581 582 182.8 582 583 180.62 583 584 181.1 584 585 181 585 586 176.62 586 587 167.92 587 588 33.47 588 589 34.93 589 580 33.76 590 591 293.64 591 592 293.08 592 593 293.16 594 595 292.45 595 564 291.9 596 566 287.81 597 569 129.61 598 599 108.64 599 600 103.72 600 203 108.12 601 602 188.26 602 603 186.83 603 101 150.99 604 605 172.09 605 606 171.68 606 607 171.75 607 57 165.06 608 609 108.26 609 610 107.62 610 611 107.17 611 612 107.3 612 613 94.52 614 615 179.19 615 616 178.63 616 617 178.73 617 618 177.65 618 619 93.43 619 312 87.42 620 621 180.61 621 622 179.95 622 623 180.24 623 624 180.2 624 625 180.07 625 626 182.28 626 627 176.33 627 628 99.93 628 629 96.99 629 630 96.68 630 396 76.07 631 632 156.82 632 633 156.12 633 634 155.87 634 635 156 635 636 155.69 636 637 155.46 637 638 169.15 638 639 168.91 639 640 168.78 640 28 47.56 641 642 199.77 642 643 196.16 643 644 199.4 644 645 197.12 645 646 174.59 646 71 197.29 647 648 234 648 649 230 649 650 233 650 651 102 651 652 98 652 197 101 653 654 187 654 655 184 655 101 167 656 657 203.5 657 658 202.75 658 659 202.65 659 660 202.31 660 661 202.32 661 662 202.19 662 663 181.05 663 664 93.39 664 665 86.23 665 148 93.14 666 667 204.1 667 668 203.08 668 669 203.14 669 617 177.3 670 671 115.11 671 672 114.18 672 673 114.39 673 674 110.33 674 675 107.11 676 677 262 677 678 261.21 678 679 256.73 679 680 258.46 680 681 258.25 681 637 257.7 682 683 234.24 683 684 233.35 684 685 232.7 685 686 29.28 686 225 103.48 687 93 321.3 688 689 89.06 689 690 88.36 691 692 88.72 692 693 88.5 693 285 88.12 694 695 637.37 695 696 636.56 696 697 634.76 697 698 628.88 698 699 615.38 699 700 102.91 700 701 85.77 701 702 78.1799999999999 702 703 96.09 703 165 52.89 704 
705 98.12 705 706 97.48 706 707 95.44 707 708 97.47 708 709 97.42 709 167 43.83 710 711 237.43 711 712 236.85 712 713 236.53 713 714 236.33 714 715 208.15 715 716 207.98 716 38 108.65 716 717 107.34 717 39 111.29 718 719 87 719 720 82 720 721 86 721 722 87 722 723 86 723 724 79 724 725 32 725 396 83 726 727 196.18 727 728 195.25 728 729 194.91 729 730 195.83 730 731 110.77 731 732 111.26 732 733 111.4 733 734 111.31 734 735 111.15 735 736 90.31 736 737 93.77 737 215 96.41 738 739 456.99 739 740 454.87 740 741 455.13 741 742 454.46 742 743 422.76 743 744 327.98 744 687 326.23 745 746 241.98 746 747 241.31 747 748 241.24 748 749 236 749 750 231.22 750 751 91.12 751 752 97.31 752 753 103.3 753 754 31.03 754 198 0 755 756 219.66 756 757 218.95 757 758 219.13 758 759 219.11 759 760 218.51 760 761 218.5 761 762 200.71 762 763 199.49 763 203 105.48 764 765 186.62 765 766 185.87 766 5 127.52 767 768 88 768 102 85 769 770 199.66 770 771 199.14 771 772 199.05 772 773 196.95 773 774 194.12 774 775 194 775 776 190.3 776 777 190.08 777 625 187.75 778 779 203.98 779 780 203.14 780 781 202.48 781 782 182.54 782 783 177.2 783 626 175.98 784 785 230.98 785 786 230.13 786 787 229.17 787 788 228.7 788 789 222.24 789 790 208.82 790 791 96.82 791 628 96.76 792 793 107 793 794 105 794 795 106 795 796 107 796 797 106 797 798 106 798 799 91 799 345 91 800 801 343 801 375 340 802 803 191 803 804 188 804 805 190 805 806 191 806 807 149 807 808 149 808 809 146 809 810 11 810 811 13 811 812 33.77 812 42 16.29 813 814 201.7 814 815 200.68 815 816 199.52 816 817 199.61 817 818 199.6 818 819 45.02 819 820 44.94 820 821 26.66 821 285 43.1 822 823 156.08 823 824 131.23 824 825 149.04 825 826 151.23 826 827 142.62 827 828 111.31 827 856 109.19 828 829 110.5 829 295 110.86 830 831 83.15 831 832 82.49 832 833 80.44 833 834 80.99 834 40 81.66 835 836 126.83 836 837 125.73 837 838 125.54 838 839 118.42 839 840 110.85 840 841 111.09 841 842 81.99 842 358 80.97 843 844 619.19 844 845 618.53 845 846 106.9 
846 847 105.36 847 848 65.2 848 849 82.97 849 850 103.91 850 851 106.54 851 829 105.35 852 853 161.36 853 854 160.48 854 855 153.31 855 826 145.92 856 857 111.26 857 858 111.27 858 533 72 859 860 146.09 860 861 145.5 861 827 144.82 862 863 75.24 863 864 74.61 864 865 34.1 865 378 197 866 285 33.46 867 868 138 868 869 135 869 166 137 870 871 526.33 871 872 524.67 872 873 171.42 873 874 167.51 874 875 174.09 875 876 98.02 876 877 100.53 877 878 104.75 878 879 98.28 879 880 103.86 880 117 0 881 882 406.4 882 883 405.18 883 884 404.32 884 885 395.55 885 886 38.38 886 887 39.79 887 574 36.72 888 889 196.54 889 890 195.66 890 891 195.5 891 892 34.95 892 893 34.39 893 188 35.13 894 895 98 895 896 95 896 897 98 897 383 98 898 899 200.41 899 900 199.7 900 901 199.66 901 902 199.24 902 903 199.17 903 165 194.72 904 905 240.47 905 906 239.63 906 907 240.14 907 908 239.85 908 909 239.93 909 910 239.8 910 911 153.73 911 912 139.02 912 913 25.99 913 914 25.8 914 640 30.59 915 916 57.63 916 917 56.93 917 918 56.9 918 919 56.61 919 920 56.6 920 921 56.31 921 922 52.48 922 923 49.64 923 924 37 925 926 698.1 926 927 696.29 927 928 697.35 928 929 696.84 929 930 104.07 930 931 112.53 931 932 116.56 932 933 113.51 933 42 72.73 934 935 108 935 936 105 936 937 108 937 166 108 938 939 247.1 939 940 246.42 940 941 245.86 941 942 218.76 942 943 218.87 943 944 33.42 944 188 31 945 946 75 946 947 72 947 383 74 948 949 103 949 950 95 950 951 103 951 952 103 952 953 98 953 725 83 954 955 180 955 956 177 956 957 179 957 377 76 957 865 146 958 460 32 959 960 91 960 961 87 961 962 91 962 963 91 963 964 90 964 118 89 965 966 306 966 967 303 967 968 305 968 969 155 969 970 303 970 971 192 971 972 112 972 973 112 973 974 111 974 975 80 975 41 79 976 977 170.09 977 978 56.57 978 979 165.74 979 980 169.19 980 981 168.92 981 982 169.16 982 983 169.1 983 984 161.78 984 985 161.68 985 986 164.96 986 987 164.63 987 988 144.05 988 989 33.96 989 811 33.93 990 991 194.72 991 992 193.9 992 993 194.2 993 994 
165.1 994 995 164.38 995 985 164.86 996 997 144.93 997 998 144.35 998 999 144.48 999 1000 144.54 1000 1001 142.77 1001 987 144.15 1002 1003 92 1003 1004 16 1004 1005 60 1005 62 91 1006 1007 103 1007 1008 100 1008 1009 103 1009 1010 96 1010 1011 60 1011 80 86 1012 1013 101 1013 1014 97 1014 1015 95 1015 1016 77 1016 60 54 1017 1018 98 1018 1019 95 1019 1020 0 1020 351 19 1021 1022 380 1022 1023 254 1023 1024 370 1024 1025 372 1025 1026 379 1026 80 311 1027 1028 47 1028 1029 40 1029 1030 40 1030 1031 46 1031 924 59.42 1032 1033 39 1033 1034 36 1034 1035 38 1035 552 39 1036 1037 56 1037 1038 52 1038 1039 55 1039 923 55 1040 1041 42 1041 1042 39 1042 1043 42 1043 358 41 1044 1045 104 1045 1046 101 1046 1047 103 1047 1048 102 1048 1049 102 1049 858 98 1050 1051 338 1051 1052 335 1052 1053 337 1053 1054 337 1054 1055 64 1055 1056 178 1056 1057 177 1057 1058 177 1058 1059 177 1059 1060 64 1060 1061 153 1061 1062 31 1062 460 153 1063 1064 183 1064 1065 180 1065 377 33 1066 1067 255.41 1067 1068 254.7 1068 1069 254.31 1069 1070 218.96 1070 1071 238.65 1071 1072 238.41 1072 1073 237.99 1073 1074 81.72 1074 1075 102.28 1075 188 92.68 1076 1077 87.02 1077 1078 86.6 1078 1079 86.7 1079 1080 86.71 1080 1081 86.51 1081 1082 86.28 1082 117 72.11 1083 1084 37 1084 865 26 1085 1086 88.34 1086 1087 86.73 1087 1088 87.48 1088 1089 87.6 1089 1090 86.77 1090 571 85.58 1091 1092 82 1092 1093 78 1093 1094 80 1094 1095 79 1095 1096 78 1096 1097 77 1097 1098 79 1098 262 79 1099 1100 80.61 1100 1101 77.63 1101 40 71.88 1102 1103 90.9 1103 1104 90.1 1104 1105 90.42 1106 1107 329 1107 1108 326 1108 957 301 1109 1110 157.39 1110 1111 156.91 1111 1112 156.99 1112 1113 32.57 1113 188 3.44 1114 1115 33 1115 1116 29 1116 1117 33 1117 1118 32 1118 1119 32 1119 135 32 1120 1121 179 1121 1122 175 1122 1123 178 1123 1124 0 1124 1125 0 1126 1127 43.71 1127 1128 43.16 1128 1129 41.4 1129 1130 33.49 1130 812 31.02 1131 1132 87.82 1132 1133 86.88 1133 1134 85.49 1134 9 77.69 1135 1136 98 1136 1137 102 1137 
1138 98 1138 1139 100 1139 1140 41 1140 1141 95 1141 1142 91 1142 41 34 1143 1144 352.6 1144 1145 351.82 1145 1146 351.73 1146 1147 343.47 1147 1148 323.5 1148 1149 172.05 1149 1150 171.57 1150 1151 179.87 1151 1152 154.08 1152 1153 157.73 1153 1154 170.23 1154 42 55.24 1155 1156 185.3 1156 1157 184.13 1157 1158 183.56 1158 1159 183.14 1159 1160 42.16 1160 1161 40.43 1162 1163 40.61 1163 1164 32.05 1164 1165 34.36 1165 205 31.2 1166 1167 39 1167 1168 36 1168 1169 0 1169 1170 38 1170 924 38 1171 1172 240.93 1172 1173 240.24 1173 1174 238.02 1174 1074 96.87 1175 1176 107 1176 1177 104 1177 1178 104 1178 1179 104 1179 1180 102 1180 1181 84 1181 1182 84 1182 28 74 1183 1184 108.07 1184 1185 107.53 1185 1186 106.58 1186 709 106.38 1187 1188 119.42 1188 1189 118.71 1189 1190 118.15 1190 1191 117.11 1191 1192 115.11 1192 1193 114.82 1193 1194 104.91 1194 41 103.99 1195 1196 37.68 1196 1197 37.15 1197 1198 37.19 1198 1199 33.94 1199 414 34.17 1200 1201 118.72 1201 1202 113.14 1202 1203 0 1203 1204 78.09 1204 1205 117.75 1205 1206 117.48 1206 1207 103.57 1207 1208 116.96 1208 1209 117.29 1209 1210 106.13 1210 10 113.75 1211 1212 95.92 1212 1213 95.3 1213 1214 93.98 1214 1215 93.98 1215 1216 93.93 1216 1217 89.21 1217 1218 85.06 1218 1219 65.28 1219 1220 65.3 1220 40 63.33 1221 1222 33 1222 1223 30 1223 1224 33 1224 187 33 1225 1226 216.31 1226 1227 215.75 1227 1228 207.33 1228 716 201.93 1229 1230 163.93 1230 1231 163.51 1231 1232 163.47 1232 1233 163.54 1233 1234 91.76 1234 59 88.9 1235 1236 155.59 1236 1237 154.73 1237 1238 155.15 1238 1239 155.18 1239 1240 155.17 1240 1241 155.12 1241 1242 154.86 1242 1243 75.58 1243 1244 75.58 1244 1245 75.6 1245 117 70.16 1246 1247 211.11 1247 1248 210.29 1248 1249 210.75 1249 1250 210.68 1250 1251 210.58 1251 1252 210.37 1252 1227 209.84 1253 1254 86.95 1254 1255 86.13 1255 1256 86.29 1256 1257 85.37 1257 1258 77.29 1258 1259 76.3 1259 215 75.57 1260 1261 117.66 1261 1262 109.11 1262 1263 101.86 1263 1264 92.95 1264 215 107.68 1265 
1266 69.35 1266 1267 68.74 1267 1268 67.13 1268 1269 67.59 1269 1270 66.29 1270 1031 67.14 1271 1272 95.06 1272 1273 94.48 1273 1274 93.89 1274 1275 93.92 1275 1276 76.55 1276 1277 93.73 1277 149 87.97 1278 1279 67 1279 1280 90 1280 1281 91 1281 1282 91 1282 1283 90 1283 1284 89 1284 1285 90 1285 1286 89 1286 1287 85 1287 9 85 1288 1289 123 1289 1290 120 1290 1291 122 1291 1292 115 1292 1293 119 1293 1294 116 1294 1295 113 1295 1296 116 1296 1297 44 1297 133 29 1298 1299 109 1299 1300 106 1300 1301 103 1301 1302 103 1302 1303 103 1303 1304 94 1304 1305 95 1305 1306 90 1306 1307 84 1307 1308 84 1308 1142 83 1309 1310 215 1310 1311 212 1311 1312 214 1312 1313 212 1313 1314 213 1313 1331 245.24 1314 188 37.52 1315 1316 227 1316 1317 224 1317 1318 226 1318 1319 226 1319 944 45 1320 1321 219.19 1321 1322 218.4 1322 1323 218.88 1323 1324 218.55 1324 1325 213.7 1325 1314 214.5 1326 1327 248.5 1327 1328 247.98 1328 1329 248.15 1329 1330 247.15 1330 1313 245.15 1331 188 64.76 1332 1333 140 1333 1334 137 1334 1335 139 1335 1336 139 1336 1337 139 1337 1338 138 1338 1339 70 1339 1340 119 1340 1341 121 1341 1342 121 1342 274 107 1343 1344 97 1344 1345 93 1345 1346 96 1346 577 96 1347 580 95 1348 1349 51 1349 1350 48 1350 1351 50 1351 203 47 1352 1353 117 1353 1354 114 1354 267 116 1355 1356 116 1356 1357 116 1357 271 116 networkx-1.11/examples/drawing/edge_colormap.py0000644000175000017500000000073712653165361021635 0ustar aricaric00000000000000#!/usr/bin/env python """ Draw a graph with matplotlib, color edges. You must have matplotlib>=87.7 for this to work. 
""" # Author: Aric Hagberg (hagberg@lanl.gov) try: import matplotlib.pyplot as plt except: raise import networkx as nx G=nx.star_graph(20) pos=nx.spring_layout(G) colors=range(20) nx.draw(G,pos,node_color='#A0CBE2',edge_color=colors,width=4,edge_cmap=plt.cm.Blues,with_labels=False) plt.savefig("edge_colormap.png") # save as png plt.show() # display networkx-1.11/examples/drawing/knuth_miles.py0000644000175000017500000000561012653165361021352 0ustar aricaric00000000000000#!/usr/bin/env python """ An example using networkx.Graph(). miles_graph() returns an undirected graph over the 128 US cities from the datafile miles_dat.txt. The cities each have location and population data. The edges are labeled with the distance betwen the two cities. This example is described in Section 1.1 in Knuth's book [1,2]. References. ----------- [1] Donald E. Knuth, "The Stanford GraphBase: A Platform for Combinatorial Computing", ACM Press, New York, 1993. [2] http://www-cs-faculty.stanford.edu/~knuth/sgb.html """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx def miles_graph(): """ Return the cites example graph in miles_dat.txt from the Stanford GraphBase. 
""" # open file miles_dat.txt.gz (or miles_dat.txt) import gzip fh = gzip.open('knuth_miles.txt.gz','r') G=nx.Graph() G.position={} G.population={} cities=[] for line in fh.readlines(): line = line.decode() if line.startswith("*"): # skip comments continue numfind=re.compile("^\d+") if numfind.match(line): # this line is distances dist=line.split() for d in dist: G.add_edge(city,cities[i],weight=int(d)) i=i+1 else: # this line is a city, position, population i=1 (city,coordpop)=line.split("[") cities.insert(0,city) (coord,pop)=coordpop.split("]") (y,x)=coord.split(",") G.add_node(city) # assign position - flip x axis for matplotlib, shift origin G.position[city]=(-int(x)+7500,int(y)-3000) G.population[city]=float(pop)/1000.0 return G if __name__ == '__main__': import networkx as nx import re import sys G=miles_graph() print("Loaded miles_dat.txt containing 128 cities.") print("digraph has %d nodes with %d edges"\ %(nx.number_of_nodes(G),nx.number_of_edges(G))) # make new graph of cites, edge if less then 300 miles between them H=nx.Graph() for v in G: H.add_node(v) for (u,v,d) in G.edges(data=True): if d['weight'] < 300: H.add_edge(u,v) # draw with matplotlib/pylab try: import matplotlib.pyplot as plt plt.figure(figsize=(8,8)) # with nodes colored by degree sized by population node_color=[float(H.degree(v)) for v in H] nx.draw(H,G.position, node_size=[G.population[v] for v in H], node_color=node_color, with_labels=False) # scale the axes equally plt.xlim(-5000,500) plt.ylim(-2000,3500) plt.savefig("knuth_miles.png") except: pass networkx-1.11/examples/drawing/knuth_miles.txt.gz0000644000175000017500000004753512637544450022176 0ustar aricaric00000000000000IFmreǑybO]uaT+"!Jե5mcP&Ȅh X|ةR\({C/_}x||p>y߿?.?]^Ǘ{~_>=,x㻇ׇ,>O>߿r _˯^_oO]~׌|x{??^>~O^~ᛷz?S|om՗o߼||׏?|yyxxpm.q~zzxx| O.bvg/?>?Oq?<5p_c~㳗oqxdvwqmݢ}S,B<c|1C# ty|,N /><,{,jl]7_o/?>]~?JYzR˿?_oqq-u-ex?8ֻ^X^ڥne]k`Yz-G9X8Kz9mkk/^ڿ:q׺ײKl\Ǘ-/xVzߗQYǥ[eo~u<7oX~lٖxr97kR>|a՟~ueܝ׺r륎Kqc+Ry{,ߺ]_qu-&' 
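The thresholding step in knuth_miles.py above (keep every city as a node, but keep only edges whose weight is under 300 miles) can be exercised without the miles_dat.txt data. The sketch below uses the same two-pass pattern on a tiny hand-built graph; the city names and distances are invented purely for illustration.

```python
import networkx as nx

# Toy distance graph standing in for the Knuth miles data.
# Node names and weights here are made up for the demo.
G = nx.Graph()
G.add_edge('Albany', 'Boston', weight=166)
G.add_edge('Boston', 'Chicago', weight=963)
G.add_edge('Albany', 'Buffalo', weight=290)

# Same filtering as knuth_miles.py: keep every node,
# but only edges shorter than 300 miles.
H = nx.Graph()
for v in G:
    H.add_node(v)
for (u, v, d) in G.edges(data=True):
    if d['weight'] < 300:
        H.add_edge(u, v)
```

Adding the nodes first matters: cities whose every road exceeds the threshold still appear in H as isolated nodes, which is what the drawing code in knuth_miles.py relies on.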
networkx-1.11/examples/drawing/random_geometric_graph.py0000644000175000017500000000147412637544500023531 0ustar aricaric00000000000000import networkx as nx import matplotlib.pyplot as plt G=nx.random_geometric_graph(200,0.125) # position is stored as node attribute data for random_geometric_graph pos=nx.get_node_attributes(G,'pos') # find node near center (0.5,0.5) dmin=1 ncenter=0 for n in pos: x,y=pos[n] d=(x-0.5)**2+(y-0.5)**2 if d<dmin: ncenter=n dmin=d import matplotlib.pyplot as plt import networkx as nx G = nx.gnp_random_graph(100,0.02) degree_sequence=sorted(nx.degree(G).values(),reverse=True) # degree sequence #print "Degree sequence", degree_sequence dmax=max(degree_sequence) plt.loglog(degree_sequence,'b-',marker='o') plt.title("Degree rank plot") plt.ylabel("degree") plt.xlabel("rank") # draw graph in inset plt.axes([0.45,0.45,0.45,0.45]) Gcc=sorted(nx.connected_component_subgraphs(G), key = len, reverse=True)[0] pos=nx.spring_layout(Gcc) plt.axis('off') nx.draw_networkx_nodes(Gcc,pos,node_size=20) nx.draw_networkx_edges(Gcc,pos,alpha=0.4) plt.savefig("degree_histogram.png") plt.show() networkx-1.11/examples/drawing/sampson.py0000644000175000017500000000253512653165361020513 0ustar aricaric00000000000000#!/usr/bin/env python """ Sampson's monastery data. Shows how to read data from a zip file and plot multiple frames. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2010-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license.
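The zip-reading pattern sampson.py describes — extract a member's bytes, wrap them in an in-memory file object, and hand that to `read_edgelist` — can be sketched without sampson_data.zip by building the archive in memory first. The member name 'toy.txt' and its edges below are invented for the demo, and `io.BytesIO` is used as a Python 2/3-portable stand-in for the Python 2-only cStringIO.

```python
import io
import zipfile

import networkx as nx

# Build a small zip archive in memory so this sketch needs no file on
# disk; 'toy.txt' holds a tab-delimited edge list (made up for the demo).
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as archive:
    archive.writestr('toy.txt', 'a\tb\nb\tc\n')

# Read it back the way sampson.py does: wrap the extracted bytes in a
# file-like object and pass that straight to read_edgelist.
zf = zipfile.ZipFile(buf)
e1 = io.BytesIO(zf.read('toy.txt'))
G1 = nx.read_edgelist(e1, delimiter='\t')
```

`read_edgelist` accepts any file-like object and decodes byte lines itself, so no temporary files or manual decoding are needed.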
# io.BytesIO works on both Python 2 and 3 (cStringIO is Python 2 only) import io import zipfile import networkx as nx import matplotlib.pyplot as plt zf = zipfile.ZipFile('sampson_data.zip') # zipfile object e1=io.BytesIO(zf.read('samplike1.txt')) # read info file e2=io.BytesIO(zf.read('samplike2.txt')) # read info file e3=io.BytesIO(zf.read('samplike3.txt')) # read info file G1=nx.read_edgelist(e1,delimiter='\t') G2=nx.read_edgelist(e2,delimiter='\t') G3=nx.read_edgelist(e3,delimiter='\t') pos=nx.spring_layout(G3,iterations=100) plt.clf() plt.subplot(221) plt.title('samplike1') nx.draw(G1,pos,node_size=50,with_labels=False) plt.subplot(222) plt.title('samplike2') nx.draw(G2,pos,node_size=50,with_labels=False) plt.subplot(223) plt.title('samplike3') nx.draw(G3,pos,node_size=50,with_labels=False) plt.subplot(224) plt.title('samplike1,2,3') nx.draw(G3,pos,edgelist=G3.edges(),node_size=50,with_labels=False) nx.draw_networkx_edges(G1,pos,alpha=0.25) nx.draw_networkx_edges(G2,pos,alpha=0.25) plt.savefig("sampson.png") # save as png plt.show() # display networkx-1.11/examples/drawing/simple_path.py0000644000175000017500000000043712637544450021341 0ustar aricaric00000000000000#!/usr/bin/env python """ Draw a graph with matplotlib. You must have matplotlib for this to work. """ try: import matplotlib.pyplot as plt except: raise import networkx as nx G=nx.path_graph(8) nx.draw(G) plt.savefig("simple_path.png") # save as png plt.show() # display networkx-1.11/examples/drawing/four_grids.py0000644000175000017500000000150612653165361021173 0ustar aricaric00000000000000#!/usr/bin/env python """ Draw a graph with matplotlib. You must have matplotlib for this to work. """ # Author: Aric Hagberg (hagberg@lanl.gov) # Copyright (C) 2004-2016 # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license.
try: import matplotlib.pyplot as plt except: raise import networkx as nx G=nx.grid_2d_graph(4,4) #4x4 grid pos=nx.spring_layout(G,iterations=100) plt.subplot(221) nx.draw(G,pos,font_size=8) plt.subplot(222) nx.draw(G,pos,node_color='k',node_size=0,with_labels=False) plt.subplot(223) nx.draw(G,pos,node_color='g',node_size=250,with_labels=False,width=6) plt.subplot(224) H=G.to_directed() nx.draw(H,pos,node_color='b',node_size=20,with_labels=False) plt.savefig("four_grids.png") plt.show() networkx-1.11/examples/drawing/node_colormap.py0000644000175000017500000000067212653165361021654 0ustar aricaric00000000000000#!/usr/bin/env python """ Draw a graph with matplotlib, color by degree. You must have matplotlib for this to work. """ # Author: Aric Hagberg (hagberg@lanl.gov) try: import matplotlib.pyplot as plt except: raise import networkx as nx G=nx.cycle_graph(24) pos=nx.spring_layout(G,iterations=200) nx.draw(G,pos,node_color=range(24),node_size=800,cmap=plt.cm.Blues) plt.savefig("node_colormap.png") # save as png plt.show() # display networkx-1.11/examples/drawing/chess_masters.py0000644000175000017500000001176012653165361021676 0ustar aricaric00000000000000#!/usr/bin/env python """ An example of the MultiDiGraph class. The function chess_pgn_graph reads a collection of chess matches stored in the specified PGN file (PGN = "Portable Game Notation"). Here the (compressed) default file --- chess_masters_WCC.pgn.bz2 --- contains all 685 World Chess Championship matches from 1886 - 1985. (data from http://chessproblem.my-free-games.com/chess/games/Download-PGN.php) The chess_pgn_graph() function returns a MultiDiGraph with multiple edges. Each node is the last name of a chess master. Each edge is directed from white to black and contains selected game info. The key statement in chess_pgn_graph below is G.add_edge(white, black, game_info) where game_info is a dict describing each game.
""" # Copyright (C) 2006-2016 by # Aric Hagberg # Dan Schult # Pieter Swart # All rights reserved. # BSD license. import networkx as nx # tag names specifying what game info should be # stored in the dict on each digraph edge game_details=["Event", "Date", "Result", "ECO", "Site"] def chess_pgn_graph(pgn_file="chess_masters_WCC.pgn.bz2"): """Read chess games in pgn format in pgn_file. Filenames ending in .gz or .bz2 will be uncompressed. Return the MultiDiGraph of players connected by a chess game. Edges contain game data in a dict. """ import bz2 G=nx.MultiDiGraph() game={} datafile = bz2.BZ2File(pgn_file) lines = (line.decode().rstrip('\r\n') for line in datafile) for line in lines: if line.startswith('['): tag,value=line[1:-1].split(' ',1) game[str(tag)]=value.strip('"') else: # empty line after tag set indicates # we finished reading game info if game: white=game.pop('White') black=game.pop('Black') G.add_edge(white, black, **game) game={} return G if __name__ == '__main__': G=chess_pgn_graph() ngames=G.number_of_edges() nplayers=G.number_of_nodes() print("Loaded %d chess games between %d players\n"\ % (ngames,nplayers)) # identify connected components # of the undirected version Gcc=list(nx.connected_component_subgraphs(G.to_undirected())) if len(Gcc)>1: print("Note the disconnected component consisting of:") print(Gcc[1].nodes()) # find all games with B97 opening (as described in ECO) openings=set([game_info['ECO'] for (white,black,game_info) in G.edges(data=True)]) print("\nFrom a total of %d different openings,"%len(openings)) print('the following games used the Sicilian opening') print('with the Najdorff 7...Qb6 "Poisoned Pawn" variation.\n') for (white,black,game_info) in G.edges(data=True): if game_info['ECO']=='B97': print(white,"vs",black) for k,v in game_info.items(): print(" ",k,": ",v) print("\n") try: import matplotlib.pyplot as plt except ImportError: import sys print("Matplotlib needed for drawing. 
Skipping") sys.exit(0) # make new undirected graph H without multi-edges H=nx.Graph(G) # edge width is proportional number of games played edgewidth=[] for (u,v,d) in H.edges(data=True): edgewidth.append(len(G.get_edge_data(u,v))) # node size is proportional to number of games won wins=dict.fromkeys(G.nodes(),0.0) for (u,v,d) in G.edges(data=True): r=d['Result'].split('-') if r[0]=='1': wins[u]+=1.0 elif r[0]=='1/2': wins[u]+=0.5 wins[v]+=0.5 else: wins[v]+=1.0 try: pos=nx.nx_agraph.graphviz_layout(H) except: pos=nx.spring_layout(H,iterations=20) plt.rcParams['text.usetex'] = False plt.figure(figsize=(8,8)) nx.draw_networkx_edges(H,pos,alpha=0.3,width=edgewidth, edge_color='m') nodesize=[wins[v]*50 for v in H] nx.draw_networkx_nodes(H,pos,node_size=nodesize,node_color='w',alpha=0.4) nx.draw_networkx_edges(H,pos,alpha=0.4,node_size=0,width=1,edge_color='k') nx.draw_networkx_labels(H,pos,fontsize=14) font = {'fontname' : 'Helvetica', 'color' : 'k', 'fontweight' : 'bold', 'fontsize' : 14} plt.title("World Chess Championship Games: 1886 - 1985", font) # change font and write text (using data coordinates) font = {'fontname' : 'Helvetica', 'color' : 'r', 'fontweight' : 'bold', 'fontsize' : 14} plt.text(0.5, 0.97, "edge width = # games played", horizontalalignment='center', transform=plt.gca().transAxes) plt.text(0.5, 0.94, "node size = # games won", horizontalalignment='center', transform=plt.gca().transAxes) plt.axis('off') plt.savefig("chess_masters.png",dpi=75) print("Wrote chess_masters.png") plt.show() # display networkx-1.11/examples/drawing/weighted_graph.py0000644000175000017500000000176112653165361022014 0ustar aricaric00000000000000#!/usr/bin/env python """ An example using Graph as a weighted network. 
""" # Author: Aric Hagberg (hagberg@lanl.gov) try: import matplotlib.pyplot as plt except: raise import networkx as nx G=nx.Graph() G.add_edge('a','b',weight=0.6) G.add_edge('a','c',weight=0.2) G.add_edge('c','d',weight=0.1) G.add_edge('c','e',weight=0.7) G.add_edge('c','f',weight=0.9) G.add_edge('a','d',weight=0.3) elarge=[(u,v) for (u,v,d) in G.edges(data=True) if d['weight'] >0.5] esmall=[(u,v) for (u,v,d) in G.edges(data=True) if d['weight'] <=0.5] pos=nx.spring_layout(G) # positions for all nodes # nodes nx.draw_networkx_nodes(G,pos,node_size=700) # edges nx.draw_networkx_edges(G,pos,edgelist=elarge, width=6) nx.draw_networkx_edges(G,pos,edgelist=esmall, width=6,alpha=0.5,edge_color='b',style='dashed') # labels nx.draw_networkx_labels(G,pos,font_size=20,font_family='sans-serif') plt.axis('off') plt.savefig("weighted_graph.png") # save as png plt.show() # display networkx-1.11/examples/drawing/labels_and_colors.py0000644000175000017500000000246012653165361022475 0ustar aricaric00000000000000#!/usr/bin/env python """ Draw a graph with matplotlib, color by degree. You must have matplotlib for this to work. 
""" # Author: Aric Hagberg (hagberg@lanl.gov) import matplotlib.pyplot as plt import networkx as nx G=nx.cubical_graph() pos=nx.spring_layout(G) # positions for all nodes # nodes nx.draw_networkx_nodes(G,pos, nodelist=[0,1,2,3], node_color='r', node_size=500, alpha=0.8) nx.draw_networkx_nodes(G,pos, nodelist=[4,5,6,7], node_color='b', node_size=500, alpha=0.8) # edges nx.draw_networkx_edges(G,pos,width=1.0,alpha=0.5) nx.draw_networkx_edges(G,pos, edgelist=[(0,1),(1,2),(2,3),(3,0)], width=8,alpha=0.5,edge_color='r') nx.draw_networkx_edges(G,pos, edgelist=[(4,5),(5,6),(6,7),(7,4)], width=8,alpha=0.5,edge_color='b') # some math labels labels={} labels[0]=r'$a$' labels[1]=r'$b$' labels[2]=r'$c$' labels[3]=r'$d$' labels[4]=r'$\alpha$' labels[5]=r'$\beta$' labels[6]=r'$\gamma$' labels[7]=r'$\delta$' nx.draw_networkx_labels(G,pos,labels,font_size=16) plt.axis('off') plt.savefig("labels_and_colors.png") # save as png plt.show() # display networkx-1.11/examples/drawing/sampson_data.zip0000644000175000017500000000172012637544450021653 0ustar aricaric00000000000000PKUeg~ixsǒJc*AABP4TWoΆ;CICqd"f3}dYٜ~kAb}#3XXFfguUUQ"n^j^jn7g;wPKUe # Dan Schult # Pieter Swart # All rights reserved. # BSD license. 
import networkx as NX
try:
    import pylab as P
except ImportError:
    pass

try:  # Python 2: build unicode band names with unichr()
    hd = 'H' + unichr(252) + 'sker D' + unichr(252)
    mh = 'Mot' + unichr(246) + 'rhead'
    mc = 'M' + unichr(246) + 'tley Cr' + unichr(252) + 'e'
    st = 'Sp' + unichr(305) + 'n' + unichr(776) + 'al Tap'
    q = 'Queensr' + unichr(255) + 'che'
    boc = 'Blue ' + unichr(214) + 'yster Cult'
    dt = 'Deatht' + unichr(246) + 'ngue'
except NameError:  # Python 3: chr() is already unicode-aware
    hd = 'H' + chr(252) + 'sker D' + chr(252)
    mh = 'Mot' + chr(246) + 'rhead'
    mc = 'M' + chr(246) + 'tley Cr' + chr(252) + 'e'
    st = 'Sp' + chr(305) + 'n' + chr(776) + 'al Tap'
    q = 'Queensr' + chr(255) + 'che'
    boc = 'Blue ' + chr(214) + 'yster Cult'
    dt = 'Deatht' + chr(246) + 'ngue'

G = NX.Graph()
G.add_edge(hd, mh)
G.add_edge(mc, st)
G.add_edge(boc, mc)
G.add_edge(boc, dt)
G.add_edge(st, dt)
G.add_edge(q, st)
G.add_edge(dt, mh)
G.add_edge(st, mh)

# write in UTF-8 encoding
fh = open('edgelist.utf-8', 'wb')
fh.write('# -*- coding: utf-8 -*-\n'.encode('utf-8'))  # encoding hint for emacs
NX.write_multiline_adjlist(G, fh, delimiter='\t', encoding='utf-8')
fh.close()  # flush before reading the file back

# read and store in UTF-8
fh = open('edgelist.utf-8', 'rb')
H = NX.read_multiline_adjlist(fh, delimiter='\t', encoding='utf-8')

for n in G.nodes():
    if n not in H:
        print(False)

print(G.nodes())

try:
    pos = NX.spring_layout(G)
    NX.draw(G, pos, font_size=16, with_labels=False)
    for p in pos:  # raise text positions
        pos[p][1] += 0.07
    NX.draw_networkx_labels(G, pos)
    P.show()
except Exception:
    pass

networkx-1.11/examples/advanced/eigenvalues.py
#!/usr/bin/env python
"""
Create a G(n,m) random graph and compute the eigenvalues.
Requires numpy and matplotlib.
""" import networkx as nx import numpy.linalg import matplotlib.pyplot as plt n = 1000 # 1000 nodes m = 5000 # 5000 edges G = nx.gnm_random_graph(n,m) L = nx.normalized_laplacian_matrix(G) e = numpy.linalg.eigvals(L.A) print("Largest eigenvalue:", max(e)) print("Smallest eigenvalue:", min(e)) plt.hist(e,bins=100) # histogram with 100 bins plt.xlim(0,2) # eigenvalues between 0 and 2 plt.show() networkx-1.11/examples/advanced/iterated_dynamical_systems.py0000644000175000017500000001357512637544450024566 0ustar aricaric00000000000000""" Digraphs from Integer-valued Iterated Functions =============================================== Sums of cubes on 3N ------------------- The number 153 has a curious property. Let 3N={3,6,9,12,...} be the set of positive multiples of 3. Define an iterative process f:3N->3N as follows: for a given n, take each digit of n (in base 10), cube it and then sum the cubes to obtain f(n). When this process is repeated, the resulting series n, f(n), f(f(n)),... terminate in 153 after a finite number of iterations (the process ends because 153 = 1**3 + 5**3 + 3**3). In the language of discrete dynamical systems, 153 is the global attractor for the iterated map f restricted to the set 3N. For example: take the number 108 f(108) = 1**3 + 0**3 + 8**3 = 513 and f(513) = 5**3 + 1**3 + 3**3 = 153 So, starting at 108 we reach 153 in two iterations, represented as: 108->513->153 Computing all orbits of 3N up to 10**5 reveals that the attractor 153 is reached in a maximum of 14 iterations. In this code we show that 13 cycles is the maximum required for all integers (in 3N) less than 10,000. The smallest number that requires 13 iterations to reach 153, is 177, i.e., 177->687->1071->345->216->225->141->66->432->99->1458->702->351->153 The resulting large digraphs are useful for testing network software. 
The general problem
-------------------

Given numbers n, a power p and base b, define F(n; p, b) as the sum of
the digits of n (in base b) raised to the power p.  The above example
corresponds to f(n)=F(n; 3,10), and below F(n; p, b) is implemented as
the function powersum(n,p,b).

The iterative dynamical system defined by the mapping n:->f(n) above
(over 3N) converges to a single fixed point; 153.  Applying the map to
all positive integers N leads to a discrete dynamical process with 5
fixed points: 1, 153, 370, 371, 407.  Modulo 3 those numbers are
1, 0, 1, 2, 2.

The function f above has the added property that it maps a multiple of
3 to another multiple of 3; i.e. it is invariant on the subset 3N.

The squaring of digits (in base 10) results in cycles and the single
fixed point 1.  I.e., from a certain point on, the process starts
repeating itself.

keywords: "Recurring Digital Invariant", "Narcissistic Number",
"Happy Number"

The 3n+1 problem
----------------

There is a rich history of mathematical recreations associated with
discrete dynamical systems.  The most famous is the Collatz 3n+1
problem.  See the function collatz_problem_digraph below.  The Collatz
conjecture --- that every orbit returns to the fixed point 1 in finite
time --- is still unproven.  Even the great Paul Erdos said
"Mathematics is not yet ready for such problems", and offered $500 for
its solution.

keywords: "3n+1", "3x+1", "Collatz problem", "Thwaite's conjecture"
"""
from networkx import *
from math import *

nmax = 10000
p = 3
mach_eps = 0.00000000001

def digitsrep(n, b=10):
    """Return list of digits comprising n represented in base b.
    n must be a nonnegative integer"""
    # very inefficient if you only work with base 10
    dlist = []
    if n <= 0:
        return [0]
    maxpow = int(floor(log(n)/log(b) + mach_eps))
    pow = maxpow
    while pow >= 0:
        x = int(floor(n // b**pow))
        dlist.append(x)
        n = n - x*b**pow
        pow = pow - 1
    return dlist

def powersum(n, p, b=10):
    """Return sum of digits of n (in base b) raised to the power p."""
    dlist = digitsrep(n, b)
    sum = 0
    for k in dlist:
        sum += k**p
    return sum

def attractor153_graph(n, p, multiple=3, b=10):
    """Return digraph of iterations of powersum(n,3,10)."""
    G = DiGraph()
    for k in range(1, n+1):
        if k % multiple == 0 and k not in G:
            k1 = k
            knext = powersum(k1, p, b)
            while k1 != knext:
                G.add_edge(k1, knext)
                k1 = knext
                knext = powersum(k1, p, b)
    return G

def squaring_cycle_graph_old(n, b=10):
    """Return digraph of iterations of powersum(n,2,10)."""
    G = DiGraph()
    for k in range(1, n+1):
        k1 = k
        G.add_node(k1)  # case k1==knext, at least add node
        knext = powersum(k1, 2, b)
        G.add_edge(k1, knext)
        while k1 != knext:  # stop if fixed point
            k1 = knext
            knext = powersum(k1, 2, b)
            G.add_edge(k1, knext)
            if G.out_degree(knext) >= 1:
                # knext has already been iterated in and out
                break
    return G

def sum_of_digits_graph(nmax, b=10):
    def f(n):
        return powersum(n, 1, b)
    return discrete_dynamics_digraph(nmax, f)

def squaring_cycle_digraph(nmax, b=10):
    def f(n):
        return powersum(n, 2, b)
    return discrete_dynamics_digraph(nmax, f)

def cubing_153_digraph(nmax):
    def f(n):
        return powersum(n, 3, 10)
    return discrete_dynamics_digraph(nmax, f)

def discrete_dynamics_digraph(nmax, f, itermax=50000):
    G = DiGraph()
    for k in range(1, nmax+1):
        kold = k
        G.add_node(kold)
        knew = f(kold)
        G.add_edge(kold, knew)
        while kold != knew and kold < itermax:
            # iterate until a fixed point is reached or itermax is exceeded
            kold = knew
            knew = f(knew)
            G.add_edge(kold, knew)
            if G.out_degree(knew) >= 1:
                # knew has already been iterated in and out
                break
    return G

def collatz_problem_digraph(nmax):
    def f(n):
        if n % 2 == 0:
            return n // 2
        else:
            return 3*n + 1
    return discrete_dynamics_digraph(nmax, f)

def fixed_points(G):
    """Return a list of fixed points for the discrete dynamical
    system represented by the digraph G.
""" return [n for n in G if G.out_degree(n)==0] if __name__ == "__main__": nmax=10000 print("Building cubing_153_digraph(%d)"% nmax) G=cubing_153_digraph(nmax) print("Resulting digraph has", len(G), "nodes and", G.size()," edges") print("Shortest path from 177 to 153 is:") print(shortest_path(G,177,153)) print("fixed points are %s" % fixed_points(G)) networkx-1.11/examples/3d_drawing/0000755000175000017500000000000012653231454017037 5ustar aricaric00000000000000networkx-1.11/examples/3d_drawing/mayavi2_spring.py0000644000175000017500000000210712637544500022344 0ustar aricaric00000000000000# needs mayavi2 # run with ipython -wthread import networkx as nx import numpy as np from enthought.mayavi import mlab # some graphs to try #H=nx.krackhardt_kite_graph() #H=nx.Graph();H.add_edge('a','b');H.add_edge('a','c');H.add_edge('a','d') #H=nx.grid_2d_graph(4,5) H=nx.cycle_graph(20) # reorder nodes from 0,len(G)-1 G=nx.convert_node_labels_to_integers(H) # 3d spring layout pos=nx.spring_layout(G,dim=3) # numpy array of x,y,z positions in sorted node order xyz=np.array([pos[v] for v in sorted(G)]) # scalar colors scalars=np.array(G.nodes())+5 mlab.figure(1, bgcolor=(0, 0, 0)) mlab.clf() pts = mlab.points3d(xyz[:,0], xyz[:,1], xyz[:,2], scalars, scale_factor=0.1, scale_mode='none', colormap='Blues', resolution=20) pts.mlab_source.dataset.lines = np.array(G.edges()) tube = mlab.pipeline.tube(pts, tube_radius=0.01) mlab.pipeline.surface(tube, color=(0.8, 0.8, 0.8)) mlab.savefig('mayavi2_spring.png') # mlab.show() # interactive window networkx-1.11/README.rst0000644000175000017500000000251512653165361014675 0ustar aricaric00000000000000NetworkX -------- NetworkX is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks. Documentation http://networkx.github.io Mailing List https://groups.google.com/forum/#!forum/networkx-discuss Development https://github.com/networkx/networkx .. 
image:: https://travis-ci.org/networkx/networkx.png?branch=master :target: https://travis-ci.org/networkx/networkx .. image:: https://readthedocs.org/projects/networkx/badge/?version=latest :target: https://readthedocs.org/projects/networkx/?badge=latest :alt: Documentation Status .. image:: https://coveralls.io/repos/networkx/networkx/badge.png?branch=master :target: https://coveralls.io/r/networkx/networkx?branch=master A quick example that finds the shortest path between two nodes in an undirected graph:: >>> import networkx as nx >>> G = nx.Graph() >>> G.add_edge('A', 'B', weight=4) >>> G.add_edge('B', 'D', weight=2) >>> G.add_edge('A', 'C', weight=3) >>> G.add_edge('C', 'D', weight=4) >>> nx.shortest_path(G, 'A', 'D', weight='weight') ['A', 'B', 'D'] Distributed with a BSD license; see LICENSE.txt:: Copyright (C) 2004-2016 NetworkX Developers Aric Hagberg Dan Schult Pieter Swart
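As a closing aside, the Collatz map used by collatz_problem_digraph in
examples/advanced/iterated_dynamical_systems.py can be exercised without
networkx at all; a minimal stdlib-only sketch (the helper names
``collatz_step`` and ``orbit_length`` are illustrative, not part of the
NetworkX API):

```python
def collatz_step(n):
    """One application of the Collatz map: halve if even, else 3n+1."""
    return n // 2 if n % 2 == 0 else 3 * n + 1

def orbit_length(n):
    """Number of steps for n to reach the (conjectured) fixed point 1."""
    steps = 0
    while n != 1:
        n = collatz_step(n)
        steps += 1
    return steps

# The famous orbit of 27 takes 111 steps to reach 1.
print(orbit_length(27))  # 111
```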