ISSN 0253-2778 | CN 34-1054/N | 2024 Vol. 54, No. 4

Information Science and Technology
A statistical characteristics preserving watermarking scheme for time series databases
Yelu Yu, Zehua Ma, Jie Zhang, Han Fang, Weiming Zhang, Nenghai Yu
2024, 54(4): 0401. doi: 10.52396/JUSTC-2023-0091
Abstract:
Database watermarking is one of the most effective methods for protecting the copyright of databases. However, traditional database watermarking has a potential drawback: embedding a watermark changes the distribution of the data, which may affect the use and analysis of the database. Considering that most a...
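A toy sketch of why this matters (a hypothetical illustration, not the scheme proposed in the paper): perturbing a numeric attribute in one direction shifts the column mean, whereas pairing each +delta change with a compensating -delta change leaves the mean intact.

```python
import numpy as np

rng = np.random.default_rng(0)
column = rng.normal(loc=100.0, scale=5.0, size=10_000)  # numeric attribute of a table

def embed_naive(col, delta=0.5):
    """One-sided perturbation: shifts the column mean by roughly delta/2."""
    marked = col.copy()
    marked[::2] += delta                 # every selected tuple moves up
    return marked

def embed_paired(col, delta=0.5):
    """Paired +delta/-delta perturbation: the column mean is unchanged."""
    marked = col.copy()
    marked[0::2] += delta                # carrier tuples
    marked[1::2] -= delta                # compensating tuples
    return marked

print("original mean: ", column.mean())
print("one-sided mean:", embed_naive(column).mean())   # noticeably shifted
print("paired mean:   ", embed_paired(column).mean())  # essentially unchanged
```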
Toward 3D scene reconstruction from locally scale-aligned monocular video depth
Guangkai Xu, Feng Zhao
2024, 54(4): 0402. doi: 10.52396/JUSTC-2023-0061
Abstract:
Monocular depth estimation methods have achieved excellent robustness across diverse scenes, usually by predicting affine-invariant depth, which is defined only up to an unknown scale and shift, rather than metric depth, because it is much easier to collect large-scale affine-invariant depth training data. However, in some vi...
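As a point of reference, one common way to make an affine-invariant prediction metric is a closed-form least-squares alignment against a few metric depth anchors. The sketch below illustrates that idea under my own assumptions; the function names are hypothetical, and this is not the paper's pipeline.

```python
import numpy as np

def align_scale_shift(pred, metric, mask):
    """Solve min_{s,t} ||s * pred[mask] + t - metric[mask]||^2 in closed form."""
    p = pred[mask].ravel()
    m = metric[mask].ravel()
    A = np.stack([p, np.ones_like(p)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, m, rcond=None)
    return s * pred + t, s, t

# Toy usage: a prediction that differs from metric depth by an unknown affine map,
# aligned using a sparse (about 5%) set of metric anchors.
rng = np.random.default_rng(0)
metric = rng.uniform(1.0, 10.0, size=(64, 64))                    # "ground-truth" metric depth
pred = 0.3 * metric + 2.0 + rng.normal(0.0, 0.01, metric.shape)   # affine-invariant prediction
mask = rng.random(metric.shape) < 0.05                            # sparse metric anchors
aligned, s, t = align_scale_shift(pred, metric, mask)
print(s, t, np.abs(aligned - metric).mean())                      # s ~ 1/0.3, t ~ -2/0.3
```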
Physically plausible and conservative solutions to Navier–Stokes equations using physics-informed CNNs
Jianfeng Li, Liangying Zhou, Jingwei Sun, Guangzhong Sun
2024, 54(4): 0403. doi: 10.52396/JUSTC-2022-0174
Abstract:
The physics-informed neural network (PINN) is an emerging approach for efficiently solving partial differential equations (PDEs) using neural networks. The physics-informed convolutional neural network (PICNN), a variant of PINN enhanced by convolutional neural networks (CNNs), has achieved better r...
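To make the conservation requirement concrete, here is a minimal numpy sketch of the kind of grid-based physics residual a convolutional model can penalize: the divergence of a predicted 2D velocity field, computed with central finite differences, so that the solution respects mass conservation (div u = 0) for incompressible flow. This is my own illustration under assumed conventions, not the loss used in the paper.

```python
import numpy as np

def divergence_residual(u, v, dx, dy):
    """Central-difference div(u, v) on interior grid points (axis 0 = x, axis 1 = y)."""
    du_dx = (u[2:, 1:-1] - u[:-2, 1:-1]) / (2 * dx)
    dv_dy = (v[1:-1, 2:] - v[1:-1, :-2]) / (2 * dy)
    return du_dx + dv_dy

def physics_loss(u, v, dx, dy):
    """Mean squared divergence, added to the data loss during training."""
    r = divergence_residual(u, v, dx, dy)
    return float(np.mean(r ** 2))

# Toy check: an analytically divergence-free field gives a near-zero loss.
n, h = 65, 1.0 / 64
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
u = np.sin(np.pi * x) * np.cos(np.pi * y)
v = -np.cos(np.pi * x) * np.sin(np.pi * y)
print(physics_loss(u, v, h, h))   # ~0 up to discretization error
```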
A feature transfer model with Mixup and contrastive loss in domain generalization
Yuesong Wang, Hong Zhang
2024, 54(4): 0404. doi: 10.52396/JUSTC-2023-0010
Abstract:
When domains, which represent underlying data distributions, differ between training and test datasets, traditional deep neural networks suffer from a substantial drop in their performance. Domain generalization methods aim to boost generalizability on an unseen target domain by using only training ...
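For readers unfamiliar with Mixup, the sketch below shows the core operation: convex combinations of shuffled sample pairs drawn with a Beta-distributed coefficient, which the paper combines with a contrastive loss. The feature-space setting and the function names here are my assumptions, not the paper's exact model.

```python
import numpy as np

def mixup(features, labels, alpha=0.4, rng=None):
    """Blend shuffled sample pairs; the loss becomes lam*CE(y_a) + (1-lam)*CE(y_b)."""
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)                     # mixing coefficient
    perm = rng.permutation(len(features))
    mixed = lam * features + (1.0 - lam) * features[perm]
    return mixed, labels, labels[perm], lam

# Toy usage: 8 samples with 16-dimensional features and integer class labels.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
y = rng.integers(0, 3, size=8)
x_mix, y_a, y_b, lam = mixup(x, y, rng=rng)
```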
LightAD: accelerating AutoDebias with adaptive sampling
Yang Qiu, Hande Dong, Jiawei Chen, Xiangnan He
2024, 54(4): 0405. doi: 10.52396/JUSTC-2022-0100
Abstract:
In recommendation systems, bias is ubiquitous because the data are collected from user behaviors rather than from controlled experiments. AutoDebias, which resorts to meta-learning to find appropriate debiasing configurations, i.e., pseudolabels and confidence weights for all user-item pairs, has bee...
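The bottleneck the title alludes to is easy to see in code: the debiasing objective assigns a confidence-weighted loss term to every user-item pair. The sketch below, my own illustration of the general idea rather than the LightAD algorithm, shows how drawing pairs non-uniformly and reweighting each sampled term keeps the loss estimate unbiased while evaluating far fewer pairs.

```python
import numpy as np

rng = np.random.default_rng(0)
num_pairs = 100_000
per_pair_loss = rng.gamma(2.0, 1.0, size=num_pairs)   # hypothetical confidence-weighted losses

full_objective = per_pair_loss.sum()                   # exact, but touches every pair

# Adaptive sampling: draw pairs with probability roughly tracking their contribution,
# then reweight each sampled term by 1/p so the estimator stays unbiased.
probs = per_pair_loss / per_pair_loss.sum()
idx = rng.choice(num_pairs, size=5_000, p=probs)
estimate = np.mean(per_pair_loss[idx] / probs[idx])

print(full_objective, estimate)   # close, at a fraction of the cost
# (With p exactly proportional to the loss the estimator has zero variance;
#  in practice the sampling weights are only approximations.)
```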
Management
The impact of external search, tie strength, and absorptive capacity on new product development performance
Huijun Yang, Wei Wang
2024, 54(4): 0406. doi: 10.52396/JUSTC-2022-0170
Abstract:
This study examines the influences of external search breadth and depth on new product development performance from a knowledge-based view. In particular, we introduce tie strength and absorptive capacity as two contextual variables in this study. The findings from data on 281 Chinese firms indicate...
Article
Hybrid fault tolerance in distributed in-memory storage systems
Zheng Gong, Si Wu, Yinlong Xu
2024, 54(4): 0406. doi: 10.52396/JUSTC-2022-0125
Abstract:
An in-memory storage system provides submillisecond latency and improves the concurrency of user applications by caching data into memory from external storage. Fault tolerance of in-memory storage systems is essential, as the loss of cached data requires access to data from external storage, which ...
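Purely as an illustration of the read path described here, and not of the system studied in the paper, the toy sketch below shows why losing cached entries hurts: a hit is an in-memory lookup, while a lost or missing entry falls back to a much slower read from external storage.

```python
import time

class InMemoryCache:
    """Toy read path: serve hits from memory, fall back to slow external storage."""

    def __init__(self, backend_read, backend_latency=0.01):
        self.data = {}                            # in-memory copy of hot data
        self.backend_read = backend_read          # callable that reads external storage
        self.backend_latency = backend_latency    # simulated external-storage latency (s)

    def get(self, key):
        if key in self.data:                      # in-memory hit: microseconds
            return self.data[key]
        time.sleep(self.backend_latency)          # miss or lost entry: slow external read
        value = self.backend_read(key)
        self.data[key] = value
        return value

# Toy usage with a dict standing in for external storage.
external = {f"key{i}": i for i in range(100)}
cache = InMemoryCache(backend_read=external.__getitem__)
cache.get("key7")   # slow first read from "external storage"
cache.get("key7")   # fast repeat read from memory
```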