Compare commits: 6f49ee0bc8...7f1a1afaf7 (5 commits: 7f1a1afaf7, b8a73bfb3e, bf700a15af, b653590c36, e15e6f1053)

11 changed files with 290 additions and 10 deletions
@@ -49,11 +49,20 @@ Markdown files are named with a timestamp: `YYYYMMDD-HHMMSS [markers].md`
 
 For example: `20260131-210000 Task Streamd.md`
 
+An optional `_file_type` segment can follow the timestamp to classify the file:
+
+```
+YYYYMMDD-HHMMSS_<file_type> [markers].md
+```
+
+For example: `20260413-083000_daily.md` — the `daily` prefix is stored as the `file_type` dimension and propagates to all child shards.
+
 Within files, `@`-prefixed markers at the beginning of paragraphs or headings define how a shard is categorized.
 
 ## Commands
 
 - `streamd` / `streamd new` — Create a new timestamped markdown entry, opening your editor
+- `streamd daily [YYYYMMDD]` — Open today's daily file (or create it if missing); pass a date to open that day's file instead
 - `streamd todo` — Show all open tasks (shards with `@Task` markers), numbered for easy reference
 - `streamd todo N edit` — Edit task N in your editor, jumping to the task's line
 - `streamd todo N done` — Mark task N as done by inserting `@Done` after `@Task`
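The `todo N done` edit described above is, at its core, a textual insertion of `@Done` after the first `@Task` marker on the task's line. A minimal sketch of that edit (the helper name `mark_done` is hypothetical; the real implementation works on shards, not raw lines):

```rust
// Hedged sketch: insert `@Done` right after the first `@Task` marker.
// Assumes markers are plain text on the task's line.
fn mark_done(line: &str) -> String {
    match line.find("@Task") {
        Some(i) => {
            let end = i + "@Task".len();
            format!("{} @Done{}", &line[..end], &line[end..])
        }
        // No marker found: leave the line untouched.
        None => line.to_string(),
    }
}

fn main() {
    assert_eq!(mark_done("- @Task Buy milk"), "- @Task @Done Buy milk");
    assert_eq!(mark_done("no marker here"), "no marker here");
    println!("ok");
}
```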
@@ -275,13 +275,18 @@ This allows conditional placements to override base placements.
 
 ### R15: File Name Format
 
-Files follow the pattern: `YYYYMMDD-HHMMSS [markers].md`
+Files follow the pattern: `YYYYMMDD-HHMMSS[_file_type] [markers].md`
 
 - `YYYYMMDD`: Date (8 digits, required)
 - `HHMMSS`: Time (4-6 digits, optional, pads with zeros)
+- `_file_type`: Optional alphanumeric prefix identifying the file type (e.g. `_daily`)
 - `[markers]`: Space-separated marker names extracted from file content
 
-**Extraction regex:** `^(?P<date>\d{8})(?:-(?P<time>\d{4,6}))?.+\.md$`
+**Extraction regex for datetime:** `^(?P<date>\d{8})(?:-(?P<time>\d{4,6}))?.+\.md$`
+
+**Extraction regex for file type:** `^\d{8}(?:-\d{4,6})?_([a-zA-Z0-9]+)`
+
+When a `_file_type` prefix is present it is stored in the `file_type` dimension of the root shard and propagates to all child shards.
 
 ### R16: Temporal Markers
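The file-type rule in R15 can be exercised without the `regex` crate. A hedged sketch, assuming only the spec's grammar (the helper name `extract_file_type` is hypothetical; the real code uses the regex shown above):

```rust
/// Sketch of extracting the optional `_file_type` prefix from a filename
/// such as `20260413-083000_daily.md`, mirroring the spec's
/// `^\d{8}(?:-\d{4,6})?_([a-zA-Z0-9]+)` rule by hand.
fn extract_file_type(name: &str) -> Option<String> {
    // Strip any leading path components.
    let base = name.rsplit('/').next().unwrap_or(name);
    let bytes = base.as_bytes();
    // Require exactly 8 leading digits (YYYYMMDD).
    if bytes.len() < 8 || !bytes[..8].iter().all(|b| b.is_ascii_digit()) {
        return None;
    }
    let mut i = 8;
    // Optional `-HHMMSS` time component (4 to 6 digits).
    if bytes.get(i) == Some(&b'-') {
        let digits = bytes[i + 1..].iter().take_while(|b| b.is_ascii_digit()).count();
        if !(4..=6).contains(&digits) {
            return None;
        }
        i += 1 + digits;
    }
    // The file type itself: `_` followed by alphanumerics.
    if bytes.get(i) != Some(&b'_') {
        return None;
    }
    let ty: String = base[i + 1..]
        .chars()
        .take_while(|c| c.is_ascii_alphanumeric())
        .collect();
    if ty.is_empty() { None } else { Some(ty) }
}

fn main() {
    assert_eq!(extract_file_type("20260413-083000_daily.md"), Some("daily".to_string()));
    assert_eq!(extract_file_type("20260412_daily Some Title.md"), Some("daily".to_string()));
    assert_eq!(extract_file_type("20260412-123456 Some Title.md"), None);
    println!("ok");
}
```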
@@ -387,6 +392,7 @@ Provide recursive search through the shard tree:
 | Command | Description |
 |---------|-------------|
 | `streamd new` | Create new timestamped file, open editor, rename with markers on close |
+| `streamd daily [YYYYMMDD]` | Open the earliest daily file for the given date (default: today in configured timezone), or create a new `_daily` file if none exists |
 | `streamd todo` | List all shards with `task: "open"`, numbered, hiding future tasks |
 | `streamd todo --show-future` | Include tasks with future dates in the todo listing |
 | `streamd todo N edit` | Edit task N in editor, cursor positioned at task line |
@@ -395,6 +401,23 @@ Provide recursive search through the shard tree:
 | `streamd timesheet` | Generate formatted timesheet report with expected/actual hours |
 | `streamd completions <shell>` | Generate shell completions (bash, zsh, fish, elvish, powershell) |
 
+### R21a: Daily Command Behavior
+
+`streamd daily [YYYYMMDD]` provides quick access to the daily journal entry for a given date.
+
+**Date resolution:**
+- If a `YYYYMMDD` argument is provided, it is parsed as the target date.
+- If no argument is given, today's date is used, interpreted in the repository timezone (from `.streamd.toml`, defaulting to UTC).
+
+**File lookup:**
+- All markdown files in the base folder are localized.
+- Files with `file_type = "daily"` whose root shard `moment` falls within the target date (in the configured timezone) are collected.
+- The file with the earliest `moment` is opened in `$EDITOR` (defaults to `vi`).
+
+**File creation:**
+- If no matching file is found, a new file is created at `<now_local>_daily.md` (e.g. `20260413-083000_daily.md`) containing `# ` and opened in the editor.
+- The `_daily` suffix is permanent — it identifies the file type and is not renamed after editing.
+
 ### R21: Todo Command Behavior
 
 **Task Numbering:**
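The fallback creation step of R21a can be sketched in shell (a hypothetical stand-in for the Rust implementation: it builds `<now>_daily.md`, seeds it with `# `, and reports the new file; the `$EDITOR` launch is omitted):

```shell
# Sketch only: assumes a writable current directory and UTC as the
# configured timezone. Mirrors the "<now_local>_daily.md" naming rule.
name="$(date -u +%Y%m%d-%H%M%S)_daily.md"
printf '# ' > "$name"
echo "Created $name"
```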
@@ -60,6 +60,12 @@ pub enum Commands {
         debug: bool,
     },
 
+    /// Open or create the daily entry for a given date
+    Daily {
+        /// Date in YYYYMMDD format (defaults to today in configured timezone)
+        date: Option<String>,
+    },
+
     /// Generate shell completions
     Completions {
         /// Shell to generate completions for
src/cli/commands/daily.rs (new file, 100 lines)
@@ -0,0 +1,100 @@
+use std::fs;
+use std::path::Path;
+use std::process::Command;
+
+use chrono::{Days, NaiveDate, NaiveDateTime, NaiveTime, TimeZone, Utc};
+use chrono_tz::Tz;
+use walkdir::WalkDir;
+
+use crate::config::Settings;
+use crate::error::StreamdError;
+use crate::extract::parse_markdown_file;
+use crate::localize::localize_stream_file;
+use crate::models::{LocalizedShard, RepositoryConfiguration};
+use crate::timesheet::load_repository_config;
+
+fn load_all_shards(base_folder: &Path, tz: Tz) -> Result<Vec<LocalizedShard>, StreamdError> {
+    let config = RepositoryConfiguration::new();
+    let mut shards = Vec::new();
+
+    for entry in WalkDir::new(base_folder)
+        .max_depth(1)
+        .into_iter()
+        .filter_map(|e| e.ok())
+    {
+        let path = entry.path();
+        if path.extension().map(|e| e == "md").unwrap_or(false) {
+            let file_name = path.to_string_lossy().to_string();
+            let content = fs::read_to_string(path)?;
+            let stream_file = parse_markdown_file(&file_name, &content);
+
+            if let Ok(shard) = localize_stream_file(&stream_file, &config, tz) {
+                shards.push(shard);
+            }
+        }
+    }
+
+    Ok(shards)
+}
+
+pub fn run(date: Option<String>) -> Result<(), StreamdError> {
+    let settings = Settings::load()?;
+    let base_folder = Path::new(&settings.base_folder);
+
+    let repo_config = load_repository_config(base_folder)?;
+    let tz: Tz = repo_config
+        .timezone
+        .as_deref()
+        .and_then(|s| s.parse().ok())
+        .unwrap_or(chrono_tz::UTC);
+
+    let target_date: NaiveDate = match date {
+        Some(s) => NaiveDate::parse_from_str(&s, "%Y%m%d").map_err(|_| {
+            StreamdError::ConfigError("Invalid date format, expected YYYYMMDD".into())
+        })?,
+        None => Utc::now().with_timezone(&tz).date_naive(),
+    };
+
+    let day_start = tz
+        .from_local_datetime(&NaiveDateTime::new(target_date, NaiveTime::MIN))
+        .earliest()
+        .unwrap()
+        .with_timezone(&Utc);
+    let day_end = tz
+        .from_local_datetime(&NaiveDateTime::new(
+            target_date + Days::new(1),
+            NaiveTime::MIN,
+        ))
+        .earliest()
+        .unwrap()
+        .with_timezone(&Utc);
+
+    let all_shards = load_all_shards(base_folder, tz)?;
+    let mut daily_shards: Vec<_> = all_shards
+        .into_iter()
+        .filter(|s| {
+            s.location
+                .get("file_type")
+                .map(|v| v == "daily")
+                .unwrap_or(false)
+        })
+        .filter(|s| s.moment >= day_start && s.moment < day_end)
+        .collect();
+    daily_shards.sort_by_key(|s| s.moment);
+
+    let editor = std::env::var("EDITOR").unwrap_or_else(|_| "vi".to_string());
+
+    if let Some(shard) = daily_shards.first() {
+        let file_path = shard.location.get("file").unwrap();
+        Command::new(&editor).arg(file_path).status()?;
+    } else {
+        let now_local = Utc::now().with_timezone(&tz);
+        let file_name = now_local.format("%Y%m%d-%H%M%S_daily.md").to_string();
+        let file_path = base_folder.join(&file_name);
+        fs::write(&file_path, "# ")?;
+        Command::new(&editor).arg(&file_path).status()?;
+        println!("Created {}", file_name);
+    }
+
+    Ok(())
+}
@@ -1,4 +1,5 @@
 pub mod completions;
+pub mod daily;
 pub mod edit;
 pub mod new;
 pub mod timesheet;
@@ -12,6 +12,8 @@ struct BlockInfo {
     end_line: usize,
     block_type: BlockType,
     events: Vec<Event<'static>>,
+    /// Nested list items contained within this block (for ListItem blocks with sub-lists).
+    nested_items: Vec<BlockInfo>,
 }
 
 #[derive(Debug, Clone, PartialEq)]
@@ -110,12 +112,14 @@ pub fn parse_markdown_file(file_name: &str, file_content: &str) -> StreamFile {
 fn collect_blocks(content: &str, parser: Parser) -> Vec<BlockInfo> {
     let mut blocks = Vec::new();
     let mut current_block: Option<BlockInfo> = None;
-    let _current_events: Vec<Event<'static>> = Vec::new();
     let mut depth = 0;
     let mut list_items: Vec<BlockInfo> = Vec::new();
     let mut in_list = false;
     let mut list_start_line = 0;
 
+    // Stack for nested lists: (saved current_block, saved list_items, saved list_start_line)
+    let mut list_nesting_stack: Vec<(Option<BlockInfo>, Vec<BlockInfo>, usize)> = Vec::new();
+
     // Pre-compute line starts for offset-to-line mapping
     let line_starts: Vec<usize> = std::iter::once(0)
         .chain(content.match_indices('\n').map(|(i, _)| i + 1))
@@ -135,6 +139,7 @@ fn collect_blocks(content: &str, parser: Parser) -> Vec<BlockInfo> {
                     end_line: line,
                     block_type: BlockType::Paragraph,
                     events: Vec::new(),
+                    nested_items: Vec::new(),
                 });
             }
             depth += 1;
@@ -166,6 +171,7 @@ fn collect_blocks(content: &str, parser: Parser) -> Vec<BlockInfo> {
                     end_line: line,
                     block_type: BlockType::Heading(heading_level),
                     events: Vec::new(),
+                    nested_items: Vec::new(),
                 });
             }
             depth += 1;
@@ -186,7 +192,15 @@ fn collect_blocks(content: &str, parser: Parser) -> Vec<BlockInfo> {
                 }
             }
             Event::Start(Tag::List(_)) => {
-                if !in_list {
+                if in_list {
+                    // Entering a nested list: save current list item and collected items
+                    list_nesting_stack.push((
+                        current_block.take(),
+                        std::mem::take(&mut list_items),
+                        list_start_line,
+                    ));
+                    list_start_line = line;
+                } else {
                     in_list = true;
                     list_start_line = line;
                     list_items.clear();
@@ -195,7 +209,18 @@ fn collect_blocks(content: &str, parser: Parser) -> Vec<BlockInfo> {
             }
             Event::End(TagEnd::List(_)) => {
                 depth -= 1;
-                if depth == 0 && in_list {
+                if let Some((parent_block, parent_items, parent_start_line)) =
+                    list_nesting_stack.pop()
+                {
+                    // Nested list ended: attach collected items as nested children of parent item
+                    let nested = std::mem::take(&mut list_items);
+                    list_start_line = parent_start_line;
+                    list_items = parent_items;
+                    current_block = parent_block.map(|mut item| {
+                        item.nested_items = nested;
+                        item
+                    });
+                } else if depth == 0 && in_list {
                     in_list = false;
                     // Create a list block containing all list items
                     if !list_items.is_empty() {
@@ -204,6 +229,7 @@ fn collect_blocks(content: &str, parser: Parser) -> Vec<BlockInfo> {
                         end_line: line,
                         block_type: BlockType::List,
                         events: vec![], // List events are handled through list_items
+                        nested_items: vec![],
                     });
                     // Store list items for later processing
                     for item in list_items.drain(..) {
@@ -222,6 +248,7 @@ fn collect_blocks(content: &str, parser: Parser) -> Vec<BlockInfo> {
                     end_line: line,
                     block_type: BlockType::ListItem,
                     events: Vec::new(),
+                    nested_items: Vec::new(),
                 });
             }
         }
@@ -240,6 +267,7 @@ fn collect_blocks(content: &str, parser: Parser) -> Vec<BlockInfo> {
                     end_line: line,
                     block_type: BlockType::CodeBlock,
                     events: Vec::new(),
+                    nested_items: Vec::new(),
                 });
             }
             depth += 1;
@@ -507,13 +535,21 @@ fn parse_single_block_shard(
             }
         }
         BlockType::List | BlockType::ListItem => {
-            // List handling is complex - for now, extract any markers/tags
             let (markers, tags) = extract_block_markers_and_tags(block);
-            if markers.is_empty() {
+            // Recursively build child shards from nested list items
+            let children: Vec<Shard> = block
+                .nested_items
+                .iter()
+                .filter_map(|item| {
+                    let (child, _) = parse_single_block_shard(item, item.start_line, item.end_line);
+                    child
+                })
+                .collect();
+            if markers.is_empty() && children.is_empty() {
                 (None, tags)
             } else {
                 (
-                    Some(build_shard(start_line, end_line, markers, tags, vec![])),
+                    Some(build_shard(start_line, end_line, markers, tags, children)),
                     vec![],
                 )
             }
@@ -716,6 +752,26 @@ mod tests {
         );
     }
 
+    #[test]
+    fn test_parse_nested_list_creates_three_shards() {
+        let content = "* @Task 1\n * @Task 2\n* @Task 3";
+        let result = parse_markdown_file(&make_file_name(), content);
+        let root = result.shard.unwrap();
+        // The root shard should have two top-level children: @Task 1 and @Task 3
+        assert_eq!(root.children.len(), 2, "expected 2 top-level shards");
+        let task1 = &root.children[0];
+        let task3 = &root.children[1];
+        // @Task 1 must carry its marker and contain @Task 2 as a child
+        assert_eq!(task1.markers, vec!["Task"], "@Task 1 marker");
+        assert_eq!(task1.children.len(), 1, "@Task 1 should have one child");
+        let task2 = &task1.children[0];
+        assert_eq!(task2.markers, vec!["Task"], "@Task 2 marker");
+        assert!(task2.children.is_empty(), "@Task 2 should have no children");
+        // @Task 3 is a sibling of @Task 1
+        assert_eq!(task3.markers, vec!["Task"], "@Task 3 marker");
+        assert!(task3.children.is_empty(), "@Task 3 should have no children");
+    }
+
     #[test]
     fn test_parse_continues_looking_for_markers_after_first_link_marker() {
         let result = parse_markdown_file(
@@ -9,6 +9,11 @@ use std::path::Path;
 static FILE_NAME_REGEX: Lazy<Regex> =
     Lazy::new(|| Regex::new(r"^(?P<date>\d{8})(?:-(?P<time>\d{4,6}))?.+\.md$").unwrap());
 
+/// Regex for extracting a file-type prefix from file names.
+/// Matches filenames like `20260412-123456_daily.md` or `20260412_daily Some Title.md`.
+static FILE_TYPE_REGEX: Lazy<Regex> =
+    Lazy::new(|| Regex::new(r"^\d{8}(?:-\d{4,6})?_([a-zA-Z0-9]+)").unwrap());
+
 /// Regex for validating datetime marker format (14 digits).
 static DATETIME_MARKER_REGEX: Lazy<Regex> = Lazy::new(|| Regex::new(r"^\d{14}$").unwrap());
@@ -62,6 +67,28 @@ pub fn extract_datetime_from_file_name(file_name: &str, tz: Tz) -> Option<DateTi
         .and_then(|dt| naive_to_utc(dt, tz))
 }
 
+/// Extract the file-type prefix from a filename.
+///
+/// Filenames with a `_prefix` segment after the timestamp (and optional time component)
+/// are recognised. The prefix must consist of alphanumeric characters only.
+///
+/// # Examples
+/// - `"20260412-123456_daily.md"` → `Some("daily")`
+/// - `"20260412_daily Some Title.md"` → `Some("daily")`
+/// - `"20260412-123456 Some Title.md"` → `None`
+/// - `"/path/to/20260412-123456_daily.md"` → `Some("daily")`
+pub fn extract_file_type_from_file_name(file_name: &str) -> Option<String> {
+    let base_name = Path::new(file_name)
+        .file_name()
+        .and_then(|s| s.to_str())
+        .unwrap_or(file_name);
+
+    FILE_TYPE_REGEX
+        .captures(base_name)
+        .and_then(|c| c.get(1))
+        .map(|m| m.as_str().to_string())
+}
+
 /// Parse a 14-digit marker string as a NaiveDateTime without timezone conversion.
 fn parse_naive_datetime_from_marker(marker: &str) -> Option<NaiveDateTime> {
     if !DATETIME_MARKER_REGEX.is_match(marker) {
@@ -155,6 +182,51 @@ mod tests {
    use chrono::TimeZone;
    use chrono_tz::UTC;
 
+    #[test]
+    fn test_extract_file_type_with_time() {
+        assert_eq!(
+            extract_file_type_from_file_name("20260412-123456_daily.md"),
+            Some("daily".to_string())
+        );
+    }
+
+    #[test]
+    fn test_extract_file_type_with_time_and_title() {
+        assert_eq!(
+            extract_file_type_from_file_name("20260412-123456_daily Some Title.md"),
+            Some("daily".to_string())
+        );
+    }
+
+    #[test]
+    fn test_extract_file_type_without_time() {
+        assert_eq!(
+            extract_file_type_from_file_name("20260412_daily.md"),
+            Some("daily".to_string())
+        );
+    }
+
+    #[test]
+    fn test_extract_file_type_without_prefix() {
+        assert_eq!(
+            extract_file_type_from_file_name("20260412-123456 Some Title.md"),
+            None
+        );
+    }
+
+    #[test]
+    fn test_extract_file_type_with_full_path() {
+        assert_eq!(
+            extract_file_type_from_file_name("/path/to/20260412-123456_daily.md"),
+            Some("daily".to_string())
+        );
+    }
+
+    #[test]
+    fn test_extract_file_type_no_timestamp() {
+        assert_eq!(extract_file_type_from_file_name("notes.md"), None);
+    }
+
    #[test]
    fn test_extract_date_from_file_name_valid() {
        let file_name = "20230101-123456 Some Text.md";
@@ -9,7 +9,7 @@ pub use configuration::{
 };
 pub use datetime::{
     extract_date_from_marker, extract_datetime_from_file_name, extract_datetime_from_marker,
-    extract_datetime_from_marker_list, extract_time_from_marker,
+    extract_datetime_from_marker_list, extract_file_type_from_file_name, extract_time_from_marker,
 };
 pub use preconfigured::TaskConfiguration;
 pub use shard::{localize_shard, localize_stream_file};
@@ -20,6 +20,12 @@ pub static TaskConfiguration: Lazy<RepositoryConfiguration> = Lazy::new(|| {
                 .with_comment("Project the task is attached to")
                 .with_propagate(true),
         )
+        .with_dimension(
+            "file_type",
+            Dimension::new("File Type")
+                .with_comment("Type of file derived from filename prefix (e.g. 'daily')")
+                .with_propagate(true),
+        )
         .with_marker(
             "Task",
             Marker::new("Task").with_placements(vec![
@@ -5,7 +5,10 @@ use indexmap::{IndexMap, IndexSet};
 use crate::error::StreamdError;
 use crate::models::{LocalizedShard, RepositoryConfiguration, Shard, StreamFile};
 
-use super::datetime::{extract_datetime_from_file_name, extract_datetime_from_marker_list};
+use super::datetime::{
+    extract_datetime_from_file_name, extract_datetime_from_marker_list,
+    extract_file_type_from_file_name,
+};
 
 /// Localize a shard within the repository's coordinate system.
 ///
@@ -102,6 +105,9 @@ pub fn localize_stream_file(
 
     let mut initial_location = IndexMap::new();
     initial_location.insert("file".to_string(), stream_file.file_name.clone());
+    if let Some(file_type) = extract_file_type_from_file_name(&stream_file.file_name) {
+        initial_location.insert("file_type".to_string(), file_type);
+    }
 
     Ok(localize_shard(
         shard,
@@ -18,6 +18,7 @@ fn main() -> miette::Result<()> {
         Some(Commands::Timesheet { decimal, debug }) => {
             streamd::cli::commands::timesheet::run(decimal, debug)?
         }
+        Some(Commands::Daily { date }) => streamd::cli::commands::daily::run(date)?,
         Some(Commands::Completions { shell }) => {
            streamd::cli::commands::completions::run(shell);
        }