🧯 Production Ops Mode – Error Detection, Slack Alerts, Backup Strategy (The Complete Ops Automation Kit)

You've built development, backtesting, even automated trading... but if the system stops for a single day in production, all of your performance gets shaken.
This installment attaches an operations automation (Ops) module so that, like a real fund, you have a **system that makes money reliably**.

  • ์˜ค๋ฅ˜๋ฅผ ์ฆ‰์‹œ ๊ฐ์ง€ํ•˜๊ณ 
  • ์Šฌ๋ž™์œผ๋กœ ์•Œ๋ฆผ์„ ์˜๊ณ 
  • DB์™€ ๋กœ๊ทธ๋ฅผ ์ž๋™ ๋ฐฑ์—…ํ•˜๊ณ 
  • ์„œ๋น„์Šค ํ—ฌ์Šค์ฒดํฌ/์žฌ์‹œ์ž‘๊นŒ์ง€

The code below is a single executable Python script that I verified myself. It is designed so you can paste it in as-is, fill in .env, and it will run.


The setup at a glance

[App logic (rebalancing/orders)]
   ├─ Structured logging (JSON)
   ├─ Exception detection → Slack alert (+ retry with backoff)
   ├─ Health check /health
   └─ Daily automatic backup (DB/log snapshots with rotation)

0) Example environment variables (.env)

# .env
ENV=prod
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/XXX/YYY/ZZZ
DB_URL=sqlite:///quant.db
BACKUP_DIR=./backups
RETENTION_DAYS=14
HEALTH_PORT=8080

⚠️ Never hard-code secrets into your code. Inject them only through .env + environment variables!
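One small habit that pairs well with this: fail fast at startup if a required variable is missing, rather than finding out mid-run. A minimal sketch, assuming the variable names from the .env example above (this check is not part of the main script):

import os
from dotenv import load_dotenv

load_dotenv()

# Abort immediately if anything the ops script depends on is missing.
REQUIRED_VARS = ["SLACK_WEBHOOK_URL", "DB_URL", "BACKUP_DIR"]
missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing required environment variables: {', '.join(missing)}")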


1) Integrated ops automation script (production main)

The single script below bundles logging + alerts + retries + backup + health check + scheduler.

import os, json, time, threading, shutil, zipfile, logging, datetime as dt
from http.server import BaseHTTPRequestHandler, HTTPServer
from logging.handlers import RotatingFileHandler
from pathlib import Path

import pandas as pd
import yfinance as yf
from dotenv import load_dotenv
import requests

# ========= Configuration =========
load_dotenv()
ENV = os.getenv("ENV", "dev")
SLACK_WEBHOOK_URL = os.getenv("SLACK_WEBHOOK_URL", "")
DB_URL = os.getenv("DB_URL", "sqlite:///quant.db")
BACKUP_DIR = Path(os.getenv("BACKUP_DIR", "./backups"))
RETENTION_DAYS = int(os.getenv("RETENTION_DAYS", "14"))
HEALTH_PORT = int(os.getenv("HEALTH_PORT", "8080"))

TICKERS = ["005930.KS", "000660.KS", "035420.KS"]  # example tickers

# ========= Logging (structured JSON + file rotation) =========
LOG_DIR = Path("./logs"); LOG_DIR.mkdir(exist_ok=True)
log_path = LOG_DIR / "app.log"

class JsonFormatter(logging.Formatter):
    def format(self, record):
        payload = {
            "ts": dt.datetime.utcnow().isoformat() + "Z",
            "level": record.levelname,
            "msg": record.getMessage(),
            "name": record.name,
        }
        if record.exc_info:
            payload["exc_info"] = self.formatException(record.exc_info)
        return json.dumps(payload, ensure_ascii=False)

logger = logging.getLogger("quant.ops")
logger.setLevel(logging.INFO)
_handler = RotatingFileHandler(log_path, maxBytes=5_000_000, backupCount=5, encoding="utf-8")
_handler.setFormatter(JsonFormatter())
logger.addHandler(_handler)
# also log to the console
_console = logging.StreamHandler()
_console.setFormatter(JsonFormatter())
logger.addHandler(_console)

def log_info(msg, **kv): logger.info(json.dumps({"msg": msg, **kv}, ensure_ascii=False))
def log_err(msg, **kv): logger.error(json.dumps({"msg": msg, **kv}, ensure_ascii=False))

# ========= Slack alerts =========
def slack_alert(text: str, level: str = "info", blocks=None):
    if not SLACK_WEBHOOK_URL:
        log_info("Slack webhook not set; printing instead", text=text)
        return
    color = "#2eb886" if level=="info" else "#e01e5a"
    payload = {
        "attachments": [{
            "color": color,
            "mrkdwn_in": ["text"],
            "text": f"*[{ENV}] {text}*",
        }]
    }
    if blocks:
        payload["blocks"] = blocks
    try:
        r = requests.post(SLACK_WEBHOOK_URL, json=payload, timeout=5)
        r.raise_for_status()
    except Exception as e:
        log_err("Slack alert failed", error=str(e))

# ========= Simple retry with backoff =========
def with_retry(fn, *, tries=3, base_delay=2.0, factor=2.0, on_fail_msg=""):
    def _inner(*args, **kwargs):
        delay = base_delay
        for i in range(1, tries+1):
            try:
                return fn(*args, **kwargs)
            except Exception as e:
                log_err("op failed; retrying", try_idx=i, error=str(e), fn=fn.__name__)
                if i == tries:
                    slack_alert(f"🚨 Failed: {on_fail_msg or fn.__name__} ({e})", level="error")
                    raise
                time.sleep(delay)
                delay *= factor
    return _inner

# ========= Core jobs: data refresh & ranking =========
@with_retry
def collect_factor_data():
    rows = []
    for t in TICKERS:
        df = yf.download(t, period="1y", progress=False)
        if df.empty or len(df) < 120:
            raise RuntimeError(f"Not enough data for {t}")
        close = df["Close"].squeeze()  # squeeze() guards against multi-level columns from newer yfinance
        ret6m = (close.iloc[-1] / close.iloc[-120]) - 1
        vol = close.pct_change().rolling(60).std().iloc[-1]
        rows.append({"ticker": t, "ret6m": float(ret6m), "vol60": float(vol)})
    fact = pd.DataFrame(rows).sort_values(["ret6m", "vol60"], ascending=[False, True])
    out = Path("data"); out.mkdir(exist_ok=True)
    fact.to_csv(out / "factors_latest.csv", index=False, encoding="utf-8")
    log_info("factor data refreshed", rows=len(rows))
    return fact

@with_retry
def select_and_emit_orders(top_k=2):
    fact = pd.read_csv("data/factors_latest.csv")
    picks = fact.head(top_k).copy()
    picks["qty"] = 5  # ์˜ˆ์‹œ ๊ณ ์ • ์ˆ˜๋Ÿ‰
    # ์‹ค์ œ ์ฃผ๋ฌธ API ์—ฐ๋™ ๋Œ€์‹ , ์ฃผ๋ฌธ ํŒŒ์ผ ๋กœ๊ทธ๋กœ ๊ธฐ๋ก
    ts = dt.datetime.now().strftime("%Y%m%d_%H%M%S")
    orders_path = Path("data") / f"orders_{ts}.csv"
    picks[["ticker","qty"]].to_csv(orders_path, index=False, encoding="utf-8")
    log_info("orders emitted", path=str(orders_path), tickers=",".join(picks["ticker"].tolist()))
    slack_alert(f"✅ Order candidates generated: {', '.join(picks['ticker'].tolist())}")
    return orders_path

# ========= Backup (snapshot + retention rotation) =========
def _zipdir(src_dir: Path, zf: zipfile.ZipFile):
    for p in src_dir.rglob("*"):
        if p.is_file():
            zf.write(p, arcname=p.relative_to(src_dir))

def rotate_backups():
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = dt.datetime.now().strftime("%Y%m%d_%H%M%S")
    # snapshot of data / logs / DB (when the DB is a plain file)
    archive = BACKUP_DIR / f"snapshot_{stamp}.zip"
    with zipfile.ZipFile(archive, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for d in [Path("data"), Path("logs")]:
            if d.exists(): _zipdir(d, zf)
        # archive the SQLite file directly (only when DB_URL is sqlite)
        if DB_URL.startswith("sqlite:///"):
            db_path = Path(DB_URL.replace("sqlite:///", ""))
            if db_path.exists():
                zf.write(db_path, arcname=f"db/{db_path.name}")
    # delete backups older than the retention window
    cutoff = dt.datetime.now() - dt.timedelta(days=RETENTION_DAYS)
    for f in BACKUP_DIR.glob("snapshot_*.zip"):
        ts = f.stem.replace("snapshot_", "")  # keep the full YYYYMMDD_HHMMSS stamp
        try:
            dt_obj = dt.datetime.strptime(ts, "%Y%m%d_%H%M%S")
            if dt_obj < cutoff:
                f.unlink()
        except ValueError:  # skip files whose name doesn't match the expected format
            pass
    log_info("backup rotated", archive=str(archive))
    return archive

# ========= Health check (HTTP) =========
LAST_OK = {"collect": None, "orders": None, "backup": None}

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/health"):
            ok = all(LAST_OK.values())  # true only once every job has succeeded at least once
            payload = {
                "env": ENV,
                "ok": ok,
                "last_ok": {k: (v.isoformat()+"Z" if v else None) for k,v in LAST_OK.items()}
            }
            self.send_response(200 if ok else 503)
            self.send_header("Content-Type", "application/json; charset=utf-8")
            self.end_headers()
            self.wfile.write(json.dumps(payload).encode("utf-8"))
        else:
            self.send_response(404); self.end_headers()

def _run_health_server():
    httpd = HTTPServer(("0.0.0.0", HEALTH_PORT), HealthHandler)
    logger.info(json.dumps({"msg":"health server start", "port": HEALTH_PORT}))
    httpd.serve_forever()

# ========= Scheduler loop =========
def scheduler_loop():
    """Very simple scheduler (minute-level polling). Use APScheduler/cron in production."""
    last_fired = {}  # remembers the date each job last ran, so it fires only once per day
    while True:
        now = dt.datetime.now()
        today = now.date()
        try:
            # 08:55 data refresh, 09:01 order generation, 18:30 backup
            hhmm = now.strftime("%H:%M")
            if hhmm == "08:55" and last_fired.get("collect") != today:
                collect_factor_data()
                LAST_OK["collect"] = dt.datetime.utcnow()
                last_fired["collect"] = today
            if hhmm == "09:01" and last_fired.get("orders") != today:
                select_and_emit_orders()
                LAST_OK["orders"] = dt.datetime.utcnow()
                last_fired["orders"] = today
            if hhmm == "18:30" and last_fired.get("backup") != today:
                rotate_backups()
                LAST_OK["backup"] = dt.datetime.utcnow()
                last_fired["backup"] = today
        except Exception as e:
            log_err("scheduler task error", error=str(e))
        time.sleep(30)  # poll every 30 seconds; last_fired prevents double-firing within the same minute

if __name__ == "__main__":
    # run each job once at startup
    try:
        collect_factor_data(); LAST_OK["collect"] = dt.datetime.utcnow()
        select_and_emit_orders(); LAST_OK["orders"] = dt.datetime.utcnow()
        rotate_backups(); LAST_OK["backup"] = dt.datetime.utcnow()
        slack_alert("🚀 Ops bot started (Ops stack up)")
    except Exception as init_e:
        log_err("init failure", error=str(init_e))
        slack_alert(f"🚨 Initialization failed: {init_e}", level="error")

    # health-server thread
    t = threading.Thread(target=_run_health_server, daemon=True)
    t.start()

    # scheduler loop
    scheduler_loop()

Code walkthrough highlights

๋ฐ˜์‘ํ˜•
  • Structured JSON logging: RotatingFileHandler caps file size and rotates; easy to parse with analysis tools
  • Slack alerts: immediate notification on failure; network errors are logged and ignored so the service keeps running
  • Retry with backoff: recovers automatically even when APIs/network are flaky
  • Backup rotation: snapshots older than RETENTION_DAYS are deleted automatically
  • Health check: /health plugs easily into Prometheus, uptime robots, or cloud monitors
  • Scheduling: minimal-dependency polling version; for production deployment, switching to APScheduler/cron is recommended (see the sketch after this list)
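If you do swap the polling loop for APScheduler, the three daily jobs map directly onto cron triggers. A minimal sketch, assuming the job functions from the script above and pip install apscheduler:

from apscheduler.schedulers.blocking import BlockingScheduler

sched = BlockingScheduler(timezone="Asia/Seoul")

# Same daily slots as the polling loop: data refresh, order generation, backup.
sched.add_job(collect_factor_data, "cron", hour=8, minute=55)
sched.add_job(select_and_emit_orders, "cron", hour=9, minute=1)
sched.add_job(rotate_backups, "cron", hour=18, minute=30)

sched.start()  # blocks; keep running the health server in its own thread as before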

2) Recovery scenarios for operational failures

| Situation | Automatic response | Manual check points |
| --- | --- | --- |
| Data collection fails | Retry (backoff) + Slack alert | yfinance/network status, ticker/period validity |
| Order candidate generation fails | Retry + Slack | input CSV, folder permissions |
| Disk running low | Backup rotation + log rotation | server capacity, log-level tuning |
| App down | Restarted by the process manager | systemd/Docker restart policy |
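For the "app down" row, the restart itself belongs to systemd/Docker, but it helps to have an external watchdog confirm that /health is still answering. A rough sketch, meant to run on a separate machine or cron job; the host name and interval here are assumptions:

import time
import requests

HEALTH_URL = "http://quant-host:8080/health"  # hypothetical host running the ops script

def health_ok() -> bool:
    """Return True only when /health answers 200 within the timeout."""
    try:
        return requests.get(HEALTH_URL, timeout=5).status_code == 200
    except requests.RequestException:
        return False

while True:
    if not health_ok():
        # Reuse the same incoming-webhook pattern as slack_alert() in the main script,
        # e.g. requests.post(SLACK_WEBHOOK_URL, json={"text": "🚨 /health is down"})
        print("health check failed")
    time.sleep(60)  # poll once a minute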

3) Docker & restart policy (optional)

# Dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "ops_main.py"]
# docker-compose.yml
services:
  quant-ops:
    build: .
    env_file: .env
    restart: unless-stopped
    ports:
      - "8080:8080"  # health
    volumes:
      - ./data:/app/data
      - ./logs:/app/logs
      - ./backups:/app/backups

์ปจํ…Œ์ด๋„ˆ๊ฐ€ ์ฃฝ์–ด๋„ restart: unless-stopped๋กœ ์ž๋™ ์žฌ๊ธฐ๋™.


4) Security & availability checklist

  • 🔐 Secret management: use .env only, never commit it to the repository
  • 🧪 Dry-run mode: validate in "file output mode" before executing orders, then swap in the real API later (see the sketch after this list)
  • 🧭 Observability: Slack alerts + /health + structured logs
  • 🗂️ Backup policy: keep DB/logs/reports for at least 14 days (adjust to your scale)
  • 🧰 Failure drills: periodically rehearse network outage / disk-full / missing-data scenarios
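For the dry-run item, one simple pattern is an environment flag that decides whether orders stay as files or go to a broker. A sketch only: DRY_RUN and submit_order_via_api are hypothetical names, and the script above deliberately has no broker integration yet.

import os

DRY_RUN = os.getenv("DRY_RUN", "true").lower() == "true"

def execute_orders(picks):
    """Route order candidates either to a CSV (dry run) or to a broker API (live)."""
    if DRY_RUN:
        # same behaviour as the current script: just record the candidates
        picks[["ticker", "qty"]].to_csv("data/orders_dryrun.csv", index=False)
        return
    for _, row in picks.iterrows():
        # submit_order_via_api is a placeholder for a real broker call
        submit_order_via_api(row["ticker"], row["qty"])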

5) ๋‹ค์Œ ๋‹จ๊ณ„(์„ ํƒ)

  • APScheduler๋กœ ์ •ํ™•ํ•œ ํฌ๋ก  ์Šค์ผ€์ค„ ์ ์šฉ
  • ์Šฌ๋ž™์— ์ฃผ๋ฌธ/์„ฑ๊ณผ ์š”์•ฝ ์นด๋“œ(blocks) ์˜ˆ์˜๊ฒŒ ์ถœ๋ ฅ
  • S3/Drive ์—…๋กœ๋“œ๋กœ ์˜คํ”„์‚ฌ์ดํŠธ ๋ฐฑ์—…
  • ํ”„๋กœ๋ฉ”ํ…Œ์šฐ์Šค/๊ทธ๋ผํŒŒ๋‚˜ ์—ฐ๋™์œผ๋กœ ๋ฉ”ํŠธ๋ฆญ ๊ธฐ๋ฐ˜ ๋ชจ๋‹ˆํ„ฐ๋ง
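The slack_alert() helper above already takes a blocks argument, so a summary card is mostly a matter of building a Block Kit payload. A rough sketch of what a daily order card could look like; the field contents are assumptions:

def order_summary_blocks(tickers, total_value):
    """Build a minimal Slack Block Kit payload for a daily order summary card."""
    lines = "\n".join(f"• {t}" for t in tickers)
    return [
        {"type": "header", "text": {"type": "plain_text", "text": "Daily order summary"}},
        {"type": "section",
         "text": {"type": "mrkdwn",
                  "text": f"*Candidates*\n{lines}\n*Estimated value*: {total_value:,.0f} KRW"}},
    ]

# usage: slack_alert("Order summary", blocks=order_summary_blocks(["005930.KS", "000660.KS"], 1_250_000))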

Summary

  • Problems will happen eventually, without fail.
  • With routines for immediate detection, alerting, backup, and recovery, the money-making hours keep flowing.
  • Today's single script automates 80% of operations. The remaining 20% is incremental hardening!

 

Tags: quant ops automation, Slack alerts, backup strategy, health check, ops monitoring, quant bot, Python logging, automatic rebalancing, DB snapshots, retry backoff

 
