invited talk presented for the Distinguished Lecturer Series of the Department of Computer Science at the University of Illinois at Chicago, 10 April 2014
1. Jogging While Driving and Other Software Engineering Research Problems
David S. Rosenblum
Dean, School of Computing
National University of Singapore
8. NUS
School of Computing
✓Ranked #1 in Asia, #9 in the world [QS World University Rankings by Subject]
✓2 Departments: Computer Science and Information Systems
✓111 Academic Staff (tenure-track & teaching track)
✓115 Research Staff
✓1800 Undergraduate Students
✓180 Master's Students
✓350 PhD Students
✓S$25 million operating budget
✓S$10 million+ in research income per annum
10. Certainty in
Software Engineering
Engineering of software is centered around simplistic, “yes/no” characterizations of artifacts:
Program is correct/incorrect
Program execution finished/crashed
Compilation completed/aborted
Test suite succeeded/failed
Specification is satisfied/violated
20. Adaptation in CAAAs
[Diagram: layered CAAA architecture. The Environment exhibits Physical Context; the Context Manager in the Middleware derives Sensed and Inferred Context; the Application and its Adaptation Manager act on Presumed Context.]
M. Sama, D.S. Rosenblum, Z. Wang and S. Elbaum, “Multi-Layer Faults in the Architectures of Mobile, Context-Aware Adaptive Applications”, Journal of Systems and Software, Vol. 83, Issue 6, Jun. 2010, pp. 906–914.
21. Adaptation in CAAAs
[Diagram: the same layered architecture, now showing a Rule Engine inside the Adaptation Manager.]
22. Adaptation in CAAAs
[Diagram: the same layered architecture, now also showing 3rd-Party Libraries alongside the Rule Engine.]
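As a minimal sketch of the rule-based adaptation at the heart of this architecture (all names and thresholds below are hypothetical illustrations, not the actual PhoneAdapter code), a rule engine fires every rule whose guard holds in the presumed context:

```python
# Minimal sketch of rule-based adaptation in a CAAA. All names here
# (RuleEngine, activate_driving, the speed thresholds) are hypothetical
# illustrations, not the PhoneAdapter implementation.

class RuleEngine:
    def __init__(self):
        self.rules = []  # list of (guard, action) pairs

    def add_rule(self, guard, action):
        self.rules.append((guard, action))

    def evaluate(self, context):
        # Fire every rule whose guard holds in the presumed context.
        # Overlapping guards are exactly where nondeterministic
        # adaptation faults creep in.
        return [action for guard, action in self.rules if guard(context)]

engine = RuleEngine()
engine.add_rule(lambda c: c["speed"] > 20, "activate_driving")
engine.add_rule(lambda c: c["speed"] > 8, "activate_jogging")

# At speed 25 both guards hold, so two conflicting adaptations fire.
print(engine.evaluate({"speed": 25}))
```

The overlapping guards above are the "jogging while driving" problem of the title: both adaptations are simultaneously enabled, and the rule semantics alone does not say which one wins.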
23. Approach
1. Derive an Adaptation Finite-State Machine (A-FSM) from the rule logic
2. Explore the state space of the A-FSM to discover all potential faults
✓Enumerative algorithms
✓Symbolic algorithms
3. (Confirm the existence of discovered faults)
M. Sama, S. Elbaum, F. Raimondi and D.S. Rosenblum, “Context-Aware Adaptive Applications: Fault Patterns and Their Automated Identification”, IEEE Transactions on Software Engineering, Vol. 36, No. 5, Sep./Oct. 2010, pp. 644–661.
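The enumerative flavor of step 2 can be sketched for a single A-FSM state: enumerate every truth assignment to the state's context predicates and flag assignments that enable more than one adaptation rule. The three rule guards below are hypothetical stand-ins, not PhoneAdapter's actual rules.

```python
from itertools import product

# Hedged sketch of step 2 (enumerative exploration): for one A-FSM
# state, enumerate every truth assignment to its context predicates
# and flag assignments that enable more than one adaptation rule,
# i.e. a nondeterministic-adaptation fault. The three guards below
# are hypothetical, not PhoneAdapter's actual rules.

rules = [
    lambda p: p[0] and not p[1],  # e.g. "moving fast and not at home"
    lambda p: p[0] and p[2],      # e.g. "moving fast and GPS on"
    lambda p: not p[0],           # e.g. "not moving fast"
]

faults = []
for assignment in product([False, True], repeat=3):
    enabled = [i for i, guard in enumerate(rules) if guard(assignment)]
    if len(enabled) > 1:
        faults.append((assignment, enabled))

# Assignment (True, False, True) enables rules 0 and 1 simultaneously.
print(faults)
```

With n predicates this scans 2^n assignments per state, which is exactly why the papers also develop symbolic algorithms.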
29. PhoneAdapter A-FSM
Global constraints:
checking location implies GPS is on
locations are mutually exclusive
speeds monotonically increase
a meeting’s end time is later than its start time
[Diagram: the A-FSM, with states General, Home, Office, Outdoor, Jogging, Driving, DrivingFast, Meeting and Sync, including ActivateMeeting/DeactivateMeeting transitions for the Meeting state.]
40. PhoneAdapter Results
Hazards found by the enumerative algorithms in PhoneAdapter (one row per A-FSM state):

              Adaptation Races and Cycles    Context Hazards
Assignments     Race    Cycle                Paths    Hold    Activ.    Prior.
3968            45      13                   14085    0       11        3182
3968            135     23                   161      0       0         52
3072            97      19                   2        0       0         0
2560            36      13                   16       2       2         4
3072            58      19                   2        0       0         0
2816            76      19                   104      8       0         13
2848            29      1                    82634    1828    368       2164
2048            32      1                    0        0       0         0
1024            27      5                    2        2       0         0

Table 2: Faults

State          Vars.    Assignments    Nondet. Adaptation Faults    Dead Pred. Assignments
General        7        128            37                           128
Outdoor        5        32             3                            17
Jogging        2        4              0                            1
Driving        3        8              0                            7
DrivingFast    2        4              0                            2
Home           4        16             0                            9
Office         7        128            1                            65
Meeting        1        2              0                            2
Sync           2        4              0                            1
41. CAAAs Summary
✓Rule-based CAAAs can be extremely fault-prone, even with a small set of rules
✓The model checking algorithms find many actual faults, with different tradeoffs
✓Some alternative to rule-based adaptation may be preferable
46. Probabilistic Model Checking
[Diagram: the System yields a probabilistic State Machine Model (e.g. with transition probabilities 0.4 and 0.6); the Requirements yield probabilistic temporal properties such as P≥0.95 [ □(¬p → ◊q) ∧ … ]; the Model Checker produces Results (✓/✕) and a Counterexample Trace.]
47. Probabilistic Model Checking
[Diagram: the same setup with a query of the form P=? [ □(¬p → ◊q) ∧ … ], for which the Model Checker produces Quantitative Results, e.g. 0.9732.]
48. Example: Die Tossing Simulated by Coin Flipping
Knuth-Yao algorithm, from the PRISM group (Kwiatkowska et al.)
[Diagram: the Knuth-Yao DTMC, with internal states 0–6 and die outcomes 1–6; every transition has probability 0.5.]
49. Example: Die Tossing Simulated by Coin Flipping
Knuth-Yao algorithm, from the PRISM group (Kwiatkowska et al.)
The behavior is governed by a theoretical probability distribution
[Diagram repeated: the Knuth-Yao DTMC, with every transition having probability 0.5.]
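The Knuth-Yao construction can be sketched directly as a random walk over the coin-flip tree; the state transitions below are one plausible encoding of the diagram, not necessarily the exact PRISM model.

```python
import random
from collections import Counter

# Sketch of the Knuth-Yao die (after the PRISM benchmark): each fair
# coin flip moves through a small DTMC until one of the six faces is
# reached. States 0-6 are internal; the state encoding below is one
# plausible reading of the diagram, not necessarily the PRISM model.

def knuth_yao_die(flip=lambda: random.random() < 0.5):
    state = 0
    while True:
        heads = flip()
        if state == 0:
            state = 1 if heads else 2
        elif state == 1:
            state = 3 if heads else 4
        elif state == 2:
            state = 5 if heads else 6
        elif state == 3:
            if heads:
                state = 1        # loop back
            else:
                return 1
        elif state == 4:
            return 2 if heads else 3
        elif state == 5:
            return 4 if heads else 5
        elif state == 6:
            if heads:
                state = 2        # loop back
            else:
                return 6

random.seed(0)
counts = Counter(knuth_yao_die() for _ in range(60000))
# Each face should appear with empirical frequency close to 1/6.
print({face: round(counts[face] / 60000, 3) for face in sorted(counts)})
```

Because of the two loop-back edges, termination is only probabilistic (it holds with probability 1), which is exactly the kind of property a probabilistic model checker verifies.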
50. Probabilistic Model Checking
[Diagram: with the 0.4/0.6 model, the check of P≥0.95 [ □(¬p → ◊q) ∧ … ] yields the quantitative result 0.9732, and the property is satisfied (✓).]
51. Probabilistic Model Checking
[Diagram: the transition probabilities are perturbed from 0.4/0.6 to 0.41/0.59, while the property P≥0.95 [ □(¬p → ◊q) ∧ … ] is unchanged.]
52. Probabilistic Model Checking
[Diagram: under the perturbed 0.41/0.59 model, the quantitative result drops to 0.6211 and the property P≥0.95 [ □(¬p → ◊q) ∧ … ] is violated (✕).]
53. Example: Zeroconf Protocol
from the PRISM group (Kwiatkowska et al.)
[Diagram: DTMC with states s0 ({start}) through s8 and absorbing outcomes {ok} and {error}; the initial branch has probabilities q and 1−q, and each probe step has probabilities p and 1−p.]
54. Example: Zeroconf Protocol
from the PRISM group (Kwiatkowska et al.)
The behavior is governed by an empirically estimated probability distribution
[Diagram repeated: the same Zeroconf DTMC, with p annotated as the packet-loss rate.]
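As a hedged sketch, assuming the standard small Zeroconf chain with four probes (an in-use address is chosen with probability q, and each probe is lost with probability p, the packet-loss rate), the probability of reaching {error} can be computed both in closed form and by iterating the one-step equation:

```python
# Hedged sketch of the small Zeroconf DTMC (assuming the 4-probe chain
# of the PRISM example): with probability q the chosen address is in
# use; reaching {error} requires losing all 4 probes (p**4); any probe
# reply sends the host back to {start} to pick a new address.

def p_error(p, q):
    # Closed form of E = q * (p**4 + (1 - p**4) * E):
    # probability of eventually reaching {error} from {start}.
    return q * p**4 / (1 - q * (1 - p**4))

def p_error_iterative(p, q, iterations=1000):
    # Value-iteration-style check: repeatedly apply the one-step
    # equation until it converges to the same fixed point.
    e = 0.0
    for _ in range(iterations):
        e = q * (p**4 + (1 - p**4) * e)
    return e

p, q = 0.1, 0.5  # illustrative values; p is empirically estimated
print(p_error(p, q), p_error_iterative(p, q))
```

The key point of the slide survives in the code: p is an empirically estimated quantity, so the verification result is only as trustworthy as the estimate of p.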
55. Perturbed Probabilistic Systems
• Starting Points
✓Discrete-Time Markov Chains (DTMCs)
✓… with one or more probability parameters
✓… verified against reachability properties, stated over the state sets S? and S!
✓… and (more recently) LTL properties
Guoxin Su and David S. Rosenblum, “Asymptotic Bounds for Quantitative Verification of Perturbed Probabilistic Systems”, Proc. ICFEM 2013
Guoxin Su and David S. Rosenblum, “Perturbation Analysis of Stochastic Systems with Empirical Distribution Parameters”, Proc. ICSE 2014
56. Parametric Markov Chains
• A distribution parameter in a DTMC is represented as a vector x of parameters xᵢ
• The total variation norm represents the amount of perturbation:
  ‖v‖ = Σᵢ |vᵢ|
• The parameter is allowed a “sufficiently small” perturbation with respect to ideal reference values r:
  ‖x − r‖ ≤ Δ
• Can generalize to multiple parameters
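A numeric sketch of the idea, on a hypothetical one-parameter DTMC: from the initial state the chain reaches the target with probability x, fails with probability 0.3, and loops otherwise, so the reachability probability is reach(x) = x / (x + 0.3). The worst-case change of the verification result over perturbations of size δ around the reference value r, scaled by δ, tends to a finite limit as δ → 0 (the condition number of the next slide).

```python
# Hypothetical one-parameter DTMC: reach the target w.p. x, fail
# w.p. 0.3, loop otherwise, so reach(x) = x / (x + 0.3) is the
# probability of eventually reaching the target.

def reach(x):
    return x / (x + 0.3)

r = 0.5  # ideal reference value of the parameter
for delta in (0.1, 0.01, 0.001):
    # Worst-case change of the verification result over perturbations
    # |x - r| <= delta, scaled by delta.
    worst = max(abs(reach(r + d) - reach(r)) for d in (-delta, delta))
    print(delta, worst / delta)

# For a single scalar parameter the limit is |reach'(r)|:
print(0.3 / (r + 0.3) ** 2)
```

Here the limit works out analytically to 0.3 / (r + 0.3)², illustrating that a small estimation error in the parameter is amplified by a factor determined by the chain's structure.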
57. Perturbation Bounds
• Perturbation Function
  ρ(x) = ι? · Σᵢ₌₀^∞ (A(x)ⁱ b(x) − Aⁱ b)
  where A is the transition probability sub-matrix for S? and b is the vector of one-step probabilities from S? to S!
• Condition Numbers [ICFEM 2013]:
  κ = lim_{δ→0} sup { ρ(x − r) / δ : ‖x − r‖ ≤ δ, δ > 0 }
• Quadratic Bounds [ICSE 2014]:
  |f⁻(δ) − inf ρ(x − r)| + |f⁺(δ) − sup ρ(x − r)| = o(δ²), where inf and sup range over ‖x − r‖ ≤ δ
59. Additional Aspects
• Models
✓Markov Decision Processes (MDPs)!
✓Continuous-Time Markov Chains (CTMCs)
• Verification
✓PCTL Model Checking!
with singularities due to nested P[ ] operators!
✓Reward Properties!
✓Alternative Norms and Bounds!
Kullback-Leibler Divergence!
✓Parameters as random variables
60. Other Forms of Uncertainty
“There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.”
— Donald Rumsfeld
61. Uncertainty in Testing
1982: Elaine Weyuker: Non-Testable Programs
- Impossible/too costly to efficiently check results
- Example: mathematical software
2010: David Garlan: Intrinsic Uncertainty
- Systems embody intrinsic uncertainty/imprecision
- Cannot easily distinguish bugs from “features”
- Example: ubiquitous computing
69. Sources of Uncertainty
✓Output: results, characteristics of results
✓Sensors: redundancy, reliability, resolution
✓Context: sensing, inferring, fusing
✓Machine learning: imprecision, user-specificity
These create significant challenges for software engineering research and practice
70. Conclusion
✓Software engineering (certainly) suffers from excessive certainty
✓A probabilistic mindset offers some insight
✓But significant challenges remain for probabilistic verification
✓And other forms of uncertainty remain a challenge to address
71. Jogging While Driving and Other Software Engineering Research Problems
David S. Rosenblum
Dean, School of Computing
National University of Singapore