Biblio
Filters: Author is Yang, Fan
Research on Automatic Demagnetization for Cylindrical Magnetic Shielding. 2021 IEEE 4th International Electrical and Energy Conference (CIEEC). :1–6.
2021. Magnetic shielding is an important part of an atomic clock's physical system, and demagnetization of the assembled magnetic shielding system plays an important role in improving the atomic clock's performance. To address the drawbacks of the traditional attenuated alternating-current demagnetization method, this paper proposes a novel, automatically attenuated alternating-current demagnetization method. It is implemented by controlling the demagnetization current waveform through the signal source's modulation, so that parameters such as the demagnetizing current frequency, amplitude, transformation mode, and demagnetization period are precisely adjustable. At the same time, the demagnetization proceeds automatically, is easy to operate, and works stably. A demagnetization experiment was performed on the magnetic shielding system of a pulsed optically pumped (POP) rubidium atomic clock, and the measured magnetic field reached 1 nT/7 cm. Experiments show that the novel method can effectively demagnetize the magnetic shielding system and meets the atomic clock's working requirements.
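Illustrative sketch (not from the paper): one way an automatically attenuated AC demagnetization current could be generated in software before modulating a signal source. The function name, parameter names, and envelope shapes below are hypothetical assumptions, not the authors' waveform.

```python
import numpy as np

def demag_waveform(freq_hz=50.0, peak_amp=5.0, period_s=60.0,
                   sample_rate=10_000, envelope="exponential"):
    """Attenuated AC demagnetization current waveform (illustrative sketch).

    The amplitude decays from peak_amp toward zero over period_s while the
    carrier alternates at freq_hz, driving the shield around ever smaller
    hysteresis loops.  Frequency, amplitude, envelope shape, and period are
    all adjustable, mirroring the kind of programmable parameters the
    abstract attributes to the signal-source modulation.
    """
    t = np.arange(0.0, period_s, 1.0 / sample_rate)
    if envelope == "exponential":
        # decay to ~1% of the initial amplitude by the end of the period
        env = peak_amp * np.exp(-t * np.log(100.0) / period_s)
    else:  # linear attenuation
        env = peak_amp * (1.0 - t / period_s)
    return t, env * np.sin(2.0 * np.pi * freq_hz * t)

t, i_demag = demag_waveform()
print(f"{len(t)} samples, final amplitude {abs(i_demag[-1]):.3f} A")
```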
An Automated Composite Scanning Tool with Multiple Vulnerabilities. 2019 IEEE 3rd Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC). :1060–1064.
2019. Detecting system vulnerabilities is an indispensable part of effective network security protection. Here, three vulnerability detection modules are assembled into one device, and a composite detection tool with multiple functions is proposed to deal with several frequent vulnerabilities. The tool covers three types of vulnerability detection: cross-site scripting (XSS), SQL injection, and directory traversal. The paper first introduces the principle of each vulnerability type, then presents the corresponding detection method, and finally details the defenses against each. The benefits are threefold: the cost of manual testing is eliminated, work efficiency is greatly improved, and the network can be kept operating safely at the earliest possible time.
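Illustrative sketch only, not the authors' tool: a toy probe loop showing the kind of signature checks a composite scanner for these three vulnerability classes might apply. The target URL, parameter name, and payloads are placeholders, and the heuristics are deliberately crude.

```python
import requests

# Hypothetical probe payloads for the three vulnerability classes covered.
PROBES = {
    "xss": "<script>alert(1)</script>",
    "sqli": "' OR '1'='1' -- ",
    "traversal": "../../../../etc/passwd",
}

def probe(url, param="q"):
    """Send one crude probe per class and apply a simple response heuristic.

    Reflected payload -> possible XSS; database error string -> possible SQL
    injection; 'root:' in the body -> possible directory traversal.  Real
    scanners use many payloads and much stronger evidence than this.
    """
    findings = {}
    for kind, payload in PROBES.items():
        resp = requests.get(url, params={param: payload}, timeout=5)
        body = resp.text.lower()
        if kind == "xss":
            findings[kind] = payload.lower() in body
        elif kind == "sqli":
            findings[kind] = any(err in body for err in
                                 ("sql syntax", "sqlstate", "mysql_fetch"))
        else:
            findings[kind] = "root:" in body
    return findings

# Example (placeholder target, for illustration only):
# print(probe("http://scan-target.example/search", param="q"))
```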
The Survey on Intellectual Property Based on Blockchain Technology. 2019 IEEE International Conference on Industrial Cyber Physical Systems (ICPS). :743–748.
2019. The decentralization, tamper resistance, and transaction anonymity of blockchain can effectively resolve problems in traditional intellectual property protection, such as the difficulty of obtaining electronic evidence and the high cost and low compensation of safeguarding copyrights. Blockchain records information through cryptographic algorithms, removes the third party, and stores the information on all nodes to prevent it from being tampered with, thereby protecting intellectual property. Starting from the bottom layer of blockchain, this paper expounds in detail the characteristics and technical framework of blockchain. It then analyzes optimization mechanisms such as cross-chain interoperation and proof of stake, which address existing problems in the transaction throughput, latency, and resource consumption of blockchain systems. Finally, combining the characteristics of blockchain technology with existing application frameworks, the paper summarizes the remaining problems in the industry and forecasts the development trend of intellectual property based on blockchain technology.
High sensitive surface-acoustic-wave optical sensor based on two-dimensional perovskite. 2019 International Conference on IC Design and Technology (ICICDT). :1–4.
2019. Surface acoustic wave (SAW) optical sensors based on two-dimensional (2D) sensing layers can provide extremely high sensitivity. As an attractive option, the application of exfoliated 2D perovskite in acousto-optic coupled optical sensors is investigated. In this work, an exfoliated 2D (PEA)2PbI4 sheet was transferred as a sensing layer onto the delay area of a dual-port SAW resonator with a resonant frequency of 497 MHz. Under 532 nm laser illumination with an intensity of 0.9 mW/cm2, a maximum frequency shift of 13.92 MHz was observed. An ultrahigh sensitivity of up to 31.6 ppm/(μW/cm2) was calculated from the experimental results. We also carried out theoretical analysis and finite-element simulation of a 3D model to demonstrate the mechanism and validity of the optical sensing. The fabricated optical sensor shows great potential for a variety of optical applications.
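As an illustrative back-of-the-envelope check only (the paper's exact sensitivity definition may differ), the quoted figure is roughly consistent with reading the sensitivity as the relative frequency shift per unit optical intensity:

```python
# Assumed definition: sensitivity = (delta_f / f0), in ppm, per uW/cm^2 of
# optical intensity.  This is an interpretation for illustration, not the
# paper's stated calculation.
f0_hz = 497e6              # SAW resonant frequency
df_hz = 13.92e6            # largest observed frequency shift
intensity_uw_cm2 = 0.9e3   # 0.9 mW/cm^2 expressed in uW/cm^2

rel_shift_ppm = df_hz / f0_hz * 1e6
sensitivity = rel_shift_ppm / intensity_uw_cm2
print(f"~{sensitivity:.1f} ppm/(uW/cm^2)")  # ~31.1, close to the reported 31.6
```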
Resilient Cloud in Dynamic Resource Environments. Proceedings of the 2017 Symposium on Cloud Computing. :627–627.
2017. Traditional cloud stacks are designed to tolerate random, small-scale failures, and can successfully deliver highly-available cloud services and interactive services to end users. However, they fail to survive large-scale disruptions caused by major power outages, cyber-attacks, or region/zone failures. Such changes trigger cascading failures and significant service outages. We propose to understand the reasons for these failures and to create reliable data services that can efficiently and robustly tolerate such large-scale resource changes. We believe cloud services will need to survive frequent, large dynamic resource changes in the future to be highly available. (1) Significant new challenges to cloud reliability are emerging, including cyber-attacks, power/network outages, and so on. For example, human error disrupted the Amazon S3 service on 02/28/17 [2]. Recently, hackers have even attacked electric utilities, which may lead to more outages [3, 6]. (2) Increased attention on resource cost optimization will increase usage dynamism, such as Amazon Spot Instances [1]. (3) Availability-focused cloud applications will increasingly practice continuous testing to ensure they have no hidden source of catastrophic failure. For example, the Netflix Simian Army can simulate the outage of individual servers, and even of an entire AWS region [4]. (4) Cloud applications with dynamic flexibility will reap numerous benefits, such as flexible deployments and managing cost arbitrage and reliability arbitrage across cloud providers and datacenters. Using Apache Cassandra [5] as the model system, we characterize its failure behavior under dynamic datacenter-scale resource changes. Each datacenter is volatile and randomly shut down with a given duty factor. We simulate a read-only workload on a quorum-based system deployed across multiple datacenters, varying (1) system scale, (2) the fraction of volatile datacenters, and (3) the duty factor of volatile datacenters. We explore the space of configurations, including replication factors and consistency levels, and measure service availability (% of succeeded requests) and replication overhead (number of total replicas). Our results show that, in a volatile resource environment, the current replication and quorum protocols in Cassandra-like systems cannot achieve high availability and consistency with low replication overhead. Our contributions include: (1) a detailed characterization of failures under dynamic datacenter-scale resource changes, showing that the existing protocols in quorum-based systems cannot achieve high availability and consistency with low replication cost; and (2) a study of the best achievable availability of data services in dynamic datacenter-scale resource environments.
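A minimal Monte Carlo sketch of the kind of availability question the abstract studies, under simplified assumptions (one replica per datacenter, independent datacenter outages with probability 1 minus the duty factor, and a fixed read quorum); this is not the authors' Cassandra experiment, and all parameter names are hypothetical.

```python
import random

def availability(n_dcs=5, volatile_frac=0.6, duty=0.5,
                 replication=3, read_quorum=2, trials=100_000, seed=1):
    """Estimate read availability of a quorum-based store whose replicas are
    spread across datacenters that shut down at random.

    Assumptions (not from the paper): each replica lives in a distinct
    datacenter, a volatile datacenter is up with probability `duty`, and a
    read succeeds if at least `read_quorum` replicas are reachable.
    """
    rng = random.Random(seed)
    n_volatile = int(n_dcs * volatile_frac)
    ok = 0
    for _ in range(trials):
        up = [True] * (n_dcs - n_volatile) + \
             [rng.random() < duty for _ in range(n_volatile)]
        rng.shuffle(up)
        replicas = rng.sample(range(n_dcs), replication)
        if sum(up[dc] for dc in replicas) >= read_quorum:
            ok += 1
    return ok / trials

# Compare two replication/quorum configurations under the same volatility.
for rf, q in [(3, 2), (5, 3)]:
    print(f"RF={rf}, quorum={q}: availability {availability(replication=rf, read_quorum=q):.3f}")
```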