Berkeley Notes
  • Introduction
  • EE120
    • Introduction to Signals and Systems
    • The Fourier Series
    • The Fourier Transform
    • Generalized transforms
    • Linear Time-Invariant Systems
    • Feedback Control
    • Sampling
    • Appendix
  • EE123
    • The DFT
    • Spectral Analysis
    • Sampling
    • Filtering
  • EECS126
    • Introduction to Probability
    • Random Variables and their Distributions
    • Concentration
    • Information Theory
    • Random Processes
    • Random Graphs
    • Statistical Inference
    • Estimation
  • EECS127
    • Linear Algebra
    • Fundamentals of Optimization
    • Linear Algebraic Optimization
    • Convex Optimization
    • Duality
  • EE128
    • Introduction to Control
    • Modeling Systems
    • System Performance
    • Design Tools
    • Cascade Compensation
    • State-Space Control
    • Digital Control Systems
    • Cayley-Hamilton
  • EECS225A
    • Hilbert Space Theory
    • Linear Estimation
    • Discrete Time Random Processes
    • Filtering
  • EE222
    • Real Analysis
    • Differential Geometry
    • Nonlinear System Dynamics
    • Stability of Nonlinear Systems
    • Nonlinear Feedback Control

Random Graphs

A random graph is a graph whose edge set (and possibly vertex set) is generated by a random process.

Definition 76

An Erdos-Renyi random graph $G(n, p)$ is an undirected graph on $n \geq 1$ vertices where each possible edge exists independently with probability $p$.
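
To make the definition concrete, here is a minimal Python sketch (standard library only; the function name sample_gnp is illustrative, not from the notes) that samples from $G(n, p)$ by flipping an independent coin for each of the $\binom{n}{2}$ possible edges.

```python
import itertools
import random

def sample_gnp(n, p, rng=random):
    """Sample an Erdos-Renyi random graph G(n, p).

    Vertices are 0, ..., n-1, and each of the C(n, 2) possible undirected
    edges is included independently with probability p. Returns the graph
    as a dictionary mapping each vertex to its set of neighbors.
    """
    adj = {v: set() for v in range(n)}
    for u, v in itertools.combinations(range(n), 2):
        if rng.random() < p:  # independent Bernoulli(p) coin flip per edge
            adj[u].add(v)
            adj[v].add(u)
    return adj
```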

With random graphs, we often ask what happens to a particular property as $n \to \infty$ while $p$ scales with some relationship to $n$. In particular, we want the property to hold with high probability (i.e., as $n \to \infty$, the probability that $G(n, p)$ has the property approaches 1).
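
One way to see what "with high probability" looks like numerically is to estimate $P(G(n, p) \text{ has the property})$ by repeated sampling and watch the estimate approach 1 as $n$ grows. A minimal sketch, reusing sample_gnp from above; the name estimate_probability and the has_property argument are illustrative.

```python
def estimate_probability(n, p, has_property, trials=200):
    """Monte Carlo estimate of P(G(n, p) satisfies has_property).

    has_property should map an adjacency dictionary (as returned by
    sample_gnp) to True or False.
    """
    hits = sum(has_property(sample_gnp(n, p)) for _ in range(trials))
    return hits / trials
```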

Theorem 42

Every monotone graph property (one that is preserved when edges are added) has a sharp threshold $t_n$: if $p \gg t_n$, then $G(n, p)$ has the property with high probability, and if $p \ll t_n$, then $G(n, p)$ does not have the property with high probability.

One example of a threshold is the connectivity threshold.

Theorem 43 (Erdos-Renyi Connectivity Theorem)

Fix $\lambda > 0$ and let $p_n = \lambda \frac{\log n}{n}$. If $\lambda > 1$, then $P(G(n, p_n) \text{ is connected}) \to 1$ as $n \to \infty$, and if $\lambda < 1$, then $P(G(n, p_n) \text{ is disconnected}) \to 1$ as $n \to \infty$.
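
A rough numerical check of this theorem (a sketch, not part of the notes, reusing sample_gnp and estimate_probability from the sketches above; is_connected is a hypothetical helper based on breadth-first search): fix $\lambda$ on either side of 1, set $p_n = \lambda \frac{\log n}{n}$, and watch the estimated connectivity probability as $n$ grows.

```python
import math
from collections import deque

def is_connected(adj):
    """Check connectivity with a breadth-first search from vertex 0."""
    if len(adj) <= 1:
        return True
    seen = {0}
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(adj)

# Estimate P(G(n, p_n) is connected) for lambda on either side of 1.
for lam in (0.5, 1.5):
    for n in (50, 200, 400):
        p_n = lam * math.log(n) / n
        est = estimate_probability(n, p_n, is_connected, trials=100)
        print(f"lambda={lam}, n={n}: P(connected) ~ {est:.2f}")
```

For $\lambda = 0.5$ the estimate should drift toward 0 as $n$ increases, and for $\lambda = 1.5$ it should drift toward 1, consistent with the theorem.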
