xpark.dataset.ImageNSFWScore
- class xpark.dataset.ImageNSFWScore(_local_model: str = 'Falconsai/nsfw_image_detection')
Image NSFW score calculation processor for CPU and GPU.
- Parameters:
_local_model – The NSFW model name for CPU or GPU. Default is "Falconsai/nsfw_image_detection". Available models: ['Falconsai/nsfw_image_detection'].
Examples
```python
from xpark.dataset.expressions import col
from xpark.dataset import ImageNSFWScore, from_items
import numpy as np

ds = from_items([
    {"image": np.random.randint(0, 255, (256, 256, 3)).astype(np.uint8), "path": "test.jpg"}
])
ds = ds.with_column(
    "image_nsfw_score",
    ImageNSFWScore().options(num_workers={"CPU": 4}, batch_size=1).with_column(col("item")),
)
print(ds.take(1))
```
Methods
- __call__(images: pa.ChunkedArray) → pa.Array
  Call self as a function.
- options(**kwargs: Unpack[ExprUDFOptions]) → Self
- with_column(images: pa.ChunkedArray) → pa.Array
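The batch interface above (a column of images in, one score per image out) can be sketched with a placeholder scorer. The real class runs the Falconsai/nsfw_image_detection model; the `score_batch` function below is a hypothetical stand-in (mean pixel brightness instead of a model) so the sketch stays self-contained and only illustrates the shape of the mapping, not the actual scoring logic.

```python
import numpy as np

def score_batch(images):
    """Hypothetical stand-in for a batch NSFW scorer: maps a list of
    HWC uint8 images to one float score in [0, 1] per image. A real
    implementation would run an image-classification model; here we
    use normalized mean brightness as a placeholder."""
    return [float(img.mean() / 255.0) for img in images]

batch = [
    np.zeros((4, 4, 3), dtype=np.uint8),        # all-black image
    np.full((4, 4, 3), 255, dtype=np.uint8),    # all-white image
]
print(score_batch(batch))  # [0.0, 1.0]
```

Like the `__call__` signature above, the stand-in consumes a whole batch at once, which is what lets options such as `batch_size` and `num_workers` control throughput.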