Machine learning has become a useful tool for many data-rich problems. However, its adoption in cyber-physical systems has been limited by the need for large amounts of well-labeled data tailored to each deployment, and by the many physical-world variables that can affect the data (e.g., weather, time of day). This talk introduces the problem through the concept of Structures as Sensors (SaS), in which infrastructure (e.g., a building or a vehicle fleet) acts as the physical element of the sensor, and its response is interpreted to obtain information about the occupants and the surrounding environment. We present three physics-based approaches to reduce the data demand for robust learning in SaS: 1) generating data through physical models, 2) improving sensed data through actuation of the sensing system, and 3) combining and transferring data across multiple deployments using physical understanding.